Under Surveillance—Department head Young-Jun Son, second from right, and PhD students from the Department of Systems and Industrial Engineering flight-test a drone. (Photo: Chris Richards)
Smarter Control for Border Patrol
UA engineers are designing an autonomous border-surveillance system to collect, assess and act on data in real time — and deploy drones on its own.
Young-Jun Son, head of systems and industrial engineering, has received a three-year, $750,000 grant from the Air Force Office of Scientific Research to build an integrated, autonomous surveillance system for land and aerial vehicles monitoring the nation's southern border. The research is expected to give the U.S. Department of Homeland Security a clearer picture of activities along the 1,900-mile-long border with Mexico for swifter, better-coordinated responses. "By integrating multiple surveillance technologies, we can far surpass their individual capabilities," said Son.
Sum Bigger Than Its Parts
Homeland Security's Border Patrol unit uses unmanned aerial vehicles equipped with cameras and radar to identify suspicious activities over broad swaths of remote and mountainous areas. The unit's fixed and mobile ground sensors are better at detecting objects and people on cloudy days or beneath trees, and at producing higher-quality images.
The challenge for the UA researchers is choosing the right combination of aerial and ground vehicles — given different terrain, weather conditions and predicted crowd movement — then activating the surveillance team at just the right time to precisely locate and safely respond to targets. "Once we have detected, located and identified our targets of interest, we must decide which vehicles to deploy, and how many of each, to best meet objectives while considering tradeoffs of performance, cost and safety," Son said.
Weighing the Tradeoffs
Deciding when and where to send UAVs versus personnel on foot or in trucks is a balancing act. Factors considered include fuel consumption, accessibility, weather conditions and the possibility of armed smugglers.
"To track a group of people moving in mountainous areas under clear blue skies, the optimal solution might be to deploy six UAVs and two trucks driven by border patrol agents," Son explained. "For monitoring a group of the same size traveling in an urban area on a cloudy day, two UAVs and six ground patrol vehicles might be more effective."
The border-surveillance framework uses artificial intelligence, based on realistic computer simulations, to integrate information from different sources, including NASA geographical data.
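The vehicle-mix decision Son describes — trading off coverage, cost and conditions — can be sketched as a small optimization. The weights, condition flags and scoring rule below are invented for illustration only; they are not taken from the team's actual models.

```python
# Hypothetical sketch of the UAV-versus-truck tradeoff described above.
# All weights and rules are illustrative assumptions, not Son's models.

def mission_score(uavs, trucks, cloudy, mountainous):
    """Toy objective: reward coverage, penalize cost and condition mismatches."""
    # Assume UAV cameras lose most of their value under cloud cover.
    uav_value = uavs * (0.4 if cloudy else 1.0)
    # Assume trucks are slower in mountains but unaffected by clouds.
    truck_value = trucks * (0.5 if mountainous else 1.0)
    cost = 0.3 * uavs + 0.2 * trucks  # fuel and personnel cost per vehicle
    return uav_value + truck_value - cost

def best_mix(total_vehicles, cloudy, mountainous):
    """Exhaustively score every UAV/truck split of a fixed fleet size."""
    options = [(u, total_vehicles - u) for u in range(total_vehicles + 1)]
    return max(options, key=lambda m: mission_score(*m, cloudy, mountainous))

# Clear skies over mountains favor UAVs; a cloudy urban scene favors trucks.
print(best_mix(8, cloudy=False, mountainous=True))   # -> (8, 0)
print(best_mix(8, cloudy=True, mountainous=False))   # -> (0, 8)
```

With this toy scoring the optimum sits at a corner of the search space; richer models like the team's simulations would add diminishing returns and safety constraints that push the answer toward mixed fleets such as the six-and-two split Son mentions.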
From Simulation to AI
The researchers have written hundreds of motion-detection and geolocalization algorithms to simulate and predict how groups of people move when traveling on flat desert and mountains, in uninhabited areas and cities, and in dry, dusty conditions or during monsoons. While the researchers are not field-testing at the U.S.-Mexico border, they are conducting experiments outside the lab — with sensorized, remote-controlled quadcopter drones, a ground vehicle model resembling a toy car, and human volunteers — to help understand and predict crowd behavior, such as gathering and splitting. Son's team is also including aerostats in its simulations — blimps with radar used to detect low-flying aircraft and drones carrying drugs across the border. And the researchers are analyzing and testing wireless network technologies that let the surveillance drones communicate and cooperate over varied distances.

40:2 | Fall 2017