In video and image analysis, imagine that a massive amount of data from video sensors is collected in theater and there are not enough analysts or hours available to review it. Reducing the amount of data or the number of sensors is not the answer, and there will never be enough analysts. The solution lies in better automated capabilities that can identify the areas and activities that require a human analyst's attention.

The U.S. Defense Advanced Research Projects Agency (DARPA) is now working on the Video and Image Retrieval and Analysis Tool (VIRAT) and Persistent Stare Exploitation and Analysis System (PerSEAS) programs, which may soon enable warfighters to better analyze the huge amounts of data generated by multiple types of sensors. “Bad guys do bad things, such as all the actions involved in burying an IED – so it is activity that matters. This is especially true when bad guys look, dress, and drive vehicles like those around them,” said Mita Desai, DARPA program manager for VIRAT and PerSEAS. “The analysis tools to find these activities and the underlying actions just don’t exist, which is why there is such interest in activity-based analysis and exploitation.”

The Video and Image Retrieval and Analysis Tool (VIRAT) program is developing a system that will provide military imagery analysts with important new capabilities. VIRAT will enable analysts to establish alerts that continuously query a real-time video stream to detect activities and events of interest as they occur. VIRAT will also provide tools that enable analysts to rapidly retrieve, with high precision and recall, video content from extremely large video libraries.
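The standing-alert workflow described above can be pictured as a loop that scores each incoming frame (or short window of frames) against an activity detector and notifies the analyst whenever a score crosses a threshold. The sketch below is purely illustrative: the `Alert` structure and the `score_activity` detector are hypothetical stand-ins, not VIRAT's actual interface or algorithms.

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Alert:
    """A standing query an analyst registers against a live video stream."""
    activity: str      # e.g. "digging near roadside"
    threshold: float   # minimum detector confidence that triggers a notification

def monitor_stream(frames: Iterable, alerts: list[Alert],
                   score_activity: Callable[[object, str], float]) -> None:
    """Continuously score each frame against every registered alert.

    `score_activity(frame, activity)` is a hypothetical detector returning a
    confidence in [0, 1]; a real system would score short windows of frames
    rather than single frames.
    """
    for i, frame in enumerate(frames):
        for alert in alerts:
            score = score_activity(frame, alert.activity)
            if score >= alert.threshold:
                print(f"frame {i}: possible '{alert.activity}' (score {score:.2f})")

# Toy usage with a stand-in detector and a synthetic "stream" of frames.
if __name__ == "__main__":
    import random
    alerts = [Alert("digging near roadside", threshold=0.95)]
    monitor_stream(range(100), alerts,
                   score_activity=lambda frame, activity: random.random())
```

In a retrieval setting, "high precision" means that most of the clips returned for a query really do contain the activity of interest, while "high recall" means that few true occurrences in the archive are missed; both matter when an analyst cannot afford either false alarms or overlooked events.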

Text searching and algorithms for facial and other object recognition already exist. Up to now, finding actions of interest within previously untagged, raw video has been a resource drain and such a technical challenge as to seem ‘impossible’. Desai explained, “The objectives of VIRAT and PerSEAS are NOT to replace human analysts, but to make them more effective and efficient by reducing their cognitive load and enabling them to search for activities and threats quickly and easily.”

VIRAT is focused on full-motion video from platforms such as Predator or aerostats, allowing analysts either to monitor the live downlink for specific actions of interest or to search an existing archive for past occurrences. These searches are conducted using a video clip as the input query. VIRAT finds actions that are short in duration and occur in small geographic areas. PerSEAS focuses on wide-area coverage, such as data from Constant Hawk, Gorgon Stare, ARGUS-IS, and other persistent sensors. PerSEAS observes multiple actions over long durations and large geographic regions to postulate complex threat activities. Algorithms from VIRAT provide some of the underlying capabilities within PerSEAS.
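Query-by-clip retrieval of the kind described above is commonly sketched as nearest-neighbor search over fixed-length activity descriptors: the query clip is mapped to a feature vector and archived clips are ranked by similarity to it. The example below assumes a hypothetical `embed_clip` feature extractor and uses cosine similarity; it illustrates the general pattern, not VIRAT's actual retrieval algorithms.

```python
import numpy as np

def embed_clip(clip: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for an activity descriptor.

    Here we simply average the frames and L2-normalize the result; a real
    system would use learned spatio-temporal features.
    """
    v = clip.mean(axis=0).ravel().astype(np.float64)
    return v / (np.linalg.norm(v) + 1e-12)

def rank_archive(query_clip: np.ndarray, archive: list[np.ndarray], top_k: int = 5):
    """Rank archived clips by cosine similarity to the query clip's descriptor."""
    q = embed_clip(query_clip)
    scores = [float(np.dot(q, embed_clip(c))) for c in archive]
    order = np.argsort(scores)[::-1][:top_k]
    return [(int(i), scores[i]) for i in order]

# Toy usage with random "clips" of shape (frames, height, width).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    archive = [rng.random((16, 8, 8)) for _ in range(20)]
    query = rng.random((16, 8, 8))
    for idx, score in rank_archive(query, archive):
        print(f"clip {idx}: similarity {score:.3f}")
```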