ISSN 2330-717X

AI System Finds And Predicts Criminal Patterns In Vast Troves Of Surveillance Footage


Security organisations and agencies around the world deploy increasing amounts of video surveillance to monitor and protect people, property and public infrastructure. The amount of available footage is exploding, thanks to growing numbers of cameras operating at higher resolutions, making it difficult for human surveillance teams – who often have other duties – to analyse all of it.

Automated surveillance could help but requires advanced analytics to be effective in fighting crime. Many organisations have invested heavily in surveillance systems and are keen to exploit the video footage they have collected to this end. The EU-funded SURVANT project created a system to do exactly this.

“SURVANT addresses system scalability issues that emerge from the explosion in the amount of available video content,” says Mr Giuseppe Vella, SURVANT project coordinator. SURVANT analyses relevant surveillance videos to extract inter/intra-camera video analytics, then enriches this information with reasoning and inference. It helps investigators search video archives efficiently and effectively, finding critical evidence of criminal activity in the sea of footage.

Smart Surveillance

The system uses deep learning algorithms – Convolutional Neural Networks and Recurrent Neural Networks – to analyse static and motion content, respectively. The algorithms sift through inter-camera tracking data and pick up recurring and dangerous subjects, striking a balance between speed and accuracy. The system can identify and track several objects or subjects of interest, incorporating many layers of information into the mix.

SURVANT extracts a predefined list of objects relevant to criminal investigations, along with their trajectories in time and space. It picks up appearance characteristics (e.g. a person wearing red trousers) and, for each object, can detect a list of actions (a person may walk, run or loiter, for example). Certain predetermined crimes – such as graffiti painting, fighting and pickpocketing – are automatically flagged, too.
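
The record the article describes for each detected object – its class, appearance attributes, spatio-temporal trajectory and detected actions – could be modelled with a structure like the following. This is a hypothetical sketch of such a data structure, not SURVANT's actual schema; all field names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    """One tracked object from the video analytics (illustrative only)."""
    object_class: str                 # e.g. "person", "car"
    appearance: dict                  # e.g. {"trousers": "red"}
    trajectory: list = field(default_factory=list)  # (t, x, y) samples
    actions: list = field(default_factory=list)     # e.g. ["walk", "loiter"]

# A person in red trousers, tracked across two time steps, seen loitering.
d = Detection("person", {"trousers": "red"})
d.trajectory.append((0.0, 12.5, 3.1))
d.trajectory.append((1.0, 12.6, 3.0))
d.actions.append("loiter")
```

Keeping trajectory and actions alongside appearance is what lets later queries combine all three, e.g. "person in red trousers loitering in this area".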

The inference framework uses all of this information to discover high-level events, such as criminal activity, and to generate investigative hypotheses. The system can reconstruct events in narrative form, taking spatial-temporal coordinates into account to build up a bigger picture, and to track and even predict the evolution of a crime.
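
The kind of inference described – combining low-level detections into a high-level event hypothesis – can be sketched with a single toy rule. The rule below (a loitering person close to a car suggests possible vehicle tampering) is invented for illustration and is not one of SURVANT's actual rules.

```python
def infer_events(detections):
    """Toy inference rule: emit a 'possible_vehicle_tampering' hypothesis when
    a person with a 'loiter' action is within 5 units of a car at the same
    timestamp (Manhattan distance, for simplicity)."""
    events = []
    cars = [d for d in detections if d["class"] == "car"]
    loiterers = [d for d in detections
                 if d["class"] == "person" and "loiter" in d["actions"]]
    for p in loiterers:
        for c in cars:
            if p["t"] == c["t"] and abs(p["x"] - c["x"]) + abs(p["y"] - c["y"]) < 5.0:
                events.append(("possible_vehicle_tampering", p["id"], c["id"]))
    return events

detections = [
    {"id": 1, "class": "person", "actions": ["loiter"], "t": 10, "x": 2.0, "y": 1.0},
    {"id": 2, "class": "car", "actions": [], "t": 10, "x": 3.5, "y": 1.5},
]
events = infer_events(detections)
```

A real system would chain many such rules, and weight them, before surfacing an event to an investigator.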

With an intuitive interface that takes just a couple of hours' training to learn, users can search through videos and receive results produced with advanced visualisation tools. They can, say, search for a person wearing a red blouse in a given location during a user-selected time period, or for anyone loitering next to a car.
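
A search combining an appearance attribute, an area and a time window, as in the example above, amounts to filtering the indexed detections on all three criteria at once. The function below is a minimal sketch under that assumption; the parameter names and record layout are invented, not SURVANT's API.

```python
def search(archive, appearance=None, area=None, time_range=None):
    """Toy archive query: keep records matching every appearance attribute,
    falling inside the bounding box (x0, y0, x1, y1), and inside the
    (t0, t1) time window. Any criterion left as None is ignored."""
    results = []
    for rec in archive:
        if appearance and any(rec["appearance"].get(k) != v
                              for k, v in appearance.items()):
            continue
        if area:
            x0, y0, x1, y1 = area
            if not (x0 <= rec["x"] <= x1 and y0 <= rec["y"] <= y1):
                continue
        if time_range and not (time_range[0] <= rec["t"] <= time_range[1]):
            continue
        results.append(rec)
    return results

archive = [
    {"appearance": {"blouse": "red"}, "x": 5, "y": 5, "t": 100},
    {"appearance": {"blouse": "blue"}, "x": 5, "y": 5, "t": 100},
]
hits = search(archive, appearance={"blouse": "red"},
              area=(0, 0, 10, 10), time_range=(90, 110))
```

Only the red-blouse record survives all three filters; dropping a criterion (passing None) widens the search accordingly.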

Building on success

SURVANT is the follow-up to a previous EU co-financed research project, ADVISE. The goal here was to prove the final system in an operational environment. During the project, the team studied all aspects of video and image analysis and indexing: from object detection and tracking, to person recognition, to data protection and privacy assurance requirements.

Legal and ethical issues were also explored in detail, focussing on best practices to ensure protection of privacy and personal data – something now integrated into the system design.

“SURVANT delivers an intelligent and extensible framework for automated video analysis and analytics respectful of human rights according to European legal framework for privacy and data protection, avoiding disproportional use of personal data,” says Vella.

“The whole consortium is proud of the overall results of the project,” he adds.

