Eye in the Sky: the Cambridge University drone surveillance system that can spot violent individuals
PUBLISHED: 17:57 20 June 2018
Artificial intelligence-powered technology could be used to aid policing of crowds
A drone surveillance system that can identify violent individuals has been developed by University of Cambridge researchers.
Powered by artificial intelligence, Eye in the Sky could be used by police and security services to spot and track offenders within crowd scenes in real time.
Amarjot Singh, of the university’s Department of Engineering, told the Cambridge Independent: “The AI-based drone surveillance system is unique and can identify violent individuals from aerial videos at large public gatherings such as marathons, music festivals, etc. This system is especially useful for such events as it is not practical to place enough cameras to surveil such crowds.
“We are hoping that the drone can detect the violent individuals, which can be sent to an operator for further inspection. Once the operator confirms that the activity needs attention, the police can go and investigate the matter. This can act as a prescreening tool for security agencies.”
He added: “The Boston Marathon bombings in 2013 and the Manchester Arena bombing in 2017 inspired me to develop this system. The current CCTV type surveillance systems were not enough to identify these attackers in time. This is due to the limited field of view of the CCTV camera which makes it possible for aggressive individuals to avoid detection. A surveillance system mounted on the drone is likely to capture these individuals due to its large field of view.
“Attacks like these could be prevented in future if surveillance cameras can automatically spot suspicious behavior, like someone leaving a bag unattended for a long period. We are extending the system to detect illegal border crossing, vandals and people involved in communal riots, as well as the detection of child kidnappers.”
The system first identifies humans within a scene. It was able to identify 10,558 humans out of 10,863 in test footage (97.2 per cent).
Then it uses a ‘ScatterNet Hybrid Deep Learning (SHDL) network’ to monitor the angles between their limbs via 14 key points, and identify violent activity.
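The SHDL network itself is not described in the article, but the idea of classifying actions from the angles between limbs can be sketched with elementary geometry. The helper below is purely illustrative: the point layout, the `looks_like_punch` heuristic and its 150° threshold are assumptions for the sake of example, not the researchers' method.

```python
import math

def angle_at(a, b, c):
    """Angle in degrees at joint b, formed by the points a-b-c
    (e.g. shoulder-elbow-wrist for an arm)."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos = dot / (math.hypot(*v1) * math.hypot(*v2))
    cos = max(-1.0, min(1.0, cos))  # guard against rounding error
    return math.degrees(math.acos(cos))

def looks_like_punch(shoulder, elbow, wrist):
    # One crude cue a pose-based classifier might use: a punching
    # arm is nearly fully extended, i.e. the elbow angle is close
    # to 180 degrees. Threshold is an illustrative assumption.
    return angle_at(shoulder, elbow, wrist) > 150
```

A straight arm such as `looks_like_punch((0, 0), (1, 0), (2, 0))` triggers the cue, while a bent arm like `(0, 0), (1, 0), (1, 1)` (a 90° elbow) does not. The real system feeds all 14 key points per person into a learned network rather than hand-written rules like this.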
The researchers taught the system to identify five actions – punching, kicking, strangling, shooting and stabbing.
Its accuracy at doing so varied from 82 per cent for shooting to 94 per cent for kicking, although the success rate declined as more people were featured.
To create their test footage – called the ‘aerial violent individual dataset’ – the researchers used 25 volunteers to act out violent scenes featuring between two and 10 people at a time. The footage was captured using a Parrot AR drone featuring two cameras, flying two to eight metres away.
“The crowd size we tested on is rather small and more indicative of CCTV scenarios,” said Amarjot. “However, the system was very successful and can detect the violent individuals in real-time.
“We are extending the system so that it can run on more dense crowds. We will be conducting one such test in October in NIT Warangal, India, where the drone system will be tested on a group of 3,000-4,000 people. Once the system can do well in the test runs we will be bringing it to market. We are also planning to extend this system to monitor the borders of India.”
Amarjot, a PhD student in deep learning and object recognition under Prof Nick Kingsbury in the Signal Processing and Communications Laboratory, co-authored a paper on the work with Devendra Patil, of the National Institute of Technology in Warangal, and SN Omkar of the Indian Institute of Science in Bangalore.