Smarter Control For Border Patrol
As the United States expands surveillance technologies on, above and below its 1,900-mile-long border with Mexico, operating them effectively grows more challenging.
Systems and industrial engineers at the University of Arizona are building a border surveillance framework that uses artificial intelligence, grounded in realistic computer simulations, to integrate data from different sources and respond in real time.
“Our goal is to devise a system to most effectively, efficiently and safely deploy border patrol resources,” said Young-Jun Son, professor and head of the UA Department of Systems and Industrial Engineering and principal investigator of the project.
With some unmanned aerial vehicles at the border starting at $18 million apiece, their performance has implications for taxpayers as well as national security.
Air Force Funding for More Focused Surveillance
Son has received a three-year, $750,000 grant from the Air Force Office of Scientific Research to build an integrated and autonomous surveillance system for land and aerial vehicles monitoring the nation’s southern border. The project began in March 2017 and continues his previous AFOSR award of nearly the same amount for work in this area.
Young-Jun Son and his co-principal investigator, UA associate professor of systems and industrial engineering Jian Liu, specialize in helping manufacturers implement smart production systems, with Son’s main expertise in computer modeling and simulation, and Liu’s in statistics and data analysis.
With the Air Force funds, the researchers are applying these skills to help the federal government — ultimately, the U.S. Department of Homeland Security’s Customs and Border Protection unit — gain a clearer picture of border activities for swifter, better-coordinated responses.
Homeland Security has used unmanned aerial vehicles equipped with cameras and radar for border surveillance since 2005. Flying at altitudes of 100 feet and far higher, the UAVs, or drones, can cover broad swaths of land and quickly detect activities that might be missed by fixed or mobile ground sensors, particularly in remote or mountainous areas.
Ground-based vehicles have their own advantages. Their sensors are better at detecting objects on cloudy days or beneath trees, and they produce higher-quality images for identifying individual objects or people.
The challenge for the UA researchers is to choose the right combination of aerial and ground vehicles, given different terrain and weather conditions, and activate them at just the right time.
“A major task of unmanned vehicles in patrol missions is to detect and find their targets’ locations in real time,” said research collaborator Sara Minaeian, a UA doctoral candidate in systems and industrial engineering. “This can be challenging for many reasons: for example, the surveillance vehicles and targets are all moving, and the landscape’s uneven nature may alter how targets appear.”
In a paper with Liu and Son published in the July 2016 edition of “IEEE Transactions on Systems, Man and Cybernetics: Systems,” she describes their novel motion-detection and geo-localization algorithms for enabling aerial and ground vehicles to work in teams to precisely locate targets and decide how to respond.
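The paper's algorithms are designed for moving cameras over uneven terrain; as a rough illustration of the two basic steps involved, flagging motion between frames and mapping a detection to ground coordinates, consider the minimal Python sketch below. The function names and the simplifying assumptions (pre-aligned frames, a downward-pointing camera over flat ground) are ours for illustration, not the paper's.

    import numpy as np

    def detect_motion(prev_frame, frame, threshold=25):
        """Flag pixels whose brightness changed between two frames.

        Assumes the frames were already aligned to cancel the UAV's own
        motion; a real system needs that registration step first.
        """
        diff = np.abs(frame.astype(int) - prev_frame.astype(int))
        return diff > threshold  # boolean mask of candidate moving pixels

    def geolocate(pixel, image_size, drone_pos, altitude_m, focal_px):
        """Project an image pixel to ground coordinates.

        Simplified pinhole model: camera pointing straight down over
        flat terrain; drone_pos is the UAV's (east, north) in meters.
        """
        u = pixel[0] - image_size[0] / 2.0     # offset from image center
        v = pixel[1] - image_size[1] / 2.0
        meters_per_px = altitude_m / focal_px  # ground distance per pixel
        return (drone_pos[0] + u * meters_per_px,
                drone_pos[1] + v * meters_per_px)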
The researchers have also been analyzing and testing different wireless network technologies for drones to communicate and cooperate over varied distances.
Balancing Act
Establishing when and where to send unmanned aerial vehicles versus personnel on foot or in trucks is a delicate balancing act. Factors to consider include fuel consumption at different altitudes, accessibility, weather conditions and whether subjects may be armed.
“Once we have detected, located and identified our targets of interest, we must decide which vehicles to deploy, and how many of each, to best meet objectives while considering tradeoffs of performance, cost and safety,” Son said.
“For example, to track a group of people moving in mountainous areas under clear blue skies, the optimal solution might be to deploy six UAVs and two trucks driven by border patrol agents; whereas for monitoring a group of the same size traveling in an urban area on a cloudy day, two UAVs and six ground patrol vehicles might be more effective.”
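In the simplest terms, this is a search over candidate fleet mixes scored against conditions. The Python sketch below illustrates the idea; the effectiveness weights, costs and budget are made-up placeholders, and the project's actual recommendations come from detailed simulations, not a lookup like this.

    # Illustrative effectiveness weights; these numbers are placeholders,
    # not values from the team's simulations.
    UAV_SCORE   = {"clear": 1.0, "cloudy": 0.4}    # cameras degrade under clouds
    TRUCK_SCORE = {"mountain": 0.2, "urban": 1.0}  # trucks favor road networks

    def best_mix(weather, terrain, fleet_size=8,
                 uav_cost=3.0, truck_cost=1.0, budget=20.0):
        """Enumerate affordable fleet mixes and return the highest-scoring one."""
        best = None
        for uavs in range(fleet_size + 1):
            for trucks in range(fleet_size + 1 - uavs):
                if uavs * uav_cost + trucks * truck_cost > budget:
                    continue
                score = uavs * UAV_SCORE[weather] + trucks * TRUCK_SCORE[terrain]
                if best is None or score > best[0]:
                    best = (score, uavs, trucks)
        return best  # (score, number of UAVs, number of trucks)

    print(best_mix("clear", "mountain"))  # leans toward UAVs
    print(best_mix("cloudy", "urban"))    # leans toward ground vehicles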
Son’s team will also add aerostats, which are increasingly used to track drug traffickers’ low-flying drones and to intercept the traffickers themselves, to its AFOSR simulations.
The Human Factor
Using NASA geographical data from the border, the UA researchers have written hundreds of algorithms to simulate and predict how groups of people may move when traveling on flat desert and mountains, uninhabited areas and cities, in dry, dusty conditions or during monsoons.
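A stripped-down version of such a simulation treats each traveler as an agent whose speed depends on the terrain underfoot. The Python sketch below is our own toy illustration; the grid, movement rule and parameters stand in for the team's far richer models built on NASA data.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy traversability grid (1.0 = open desert, lower = slower going).
    # Real runs would derive this from NASA elevation and land-cover data.
    terrain = np.ones((100, 100))
    terrain[40:60, :] *= 0.3  # a mountainous band that slows movement

    def step(positions, goal, terrain, noise=0.5):
        """Advance each agent toward the goal, slowed by local terrain."""
        direction = goal - positions
        direction /= np.linalg.norm(direction, axis=1, keepdims=True) + 1e-9
        cells = np.clip(positions.astype(int), 0, np.array(terrain.shape) - 1)
        speed = terrain[cells[:, 0], cells[:, 1]][:, None]
        jitter = rng.normal(scale=noise, size=positions.shape)
        return positions + speed * direction + jitter

    # Walk a group of 20 agents across the grid for 200 time steps.
    positions = rng.uniform(0, 10, size=(20, 2))
    goal = np.array([95.0, 95.0])
    for _ in range(200):
        positions = step(positions, goal, terrain)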
While the UA researchers are not doing field tests at the U.S.-Mexico border, they are conducting experiments outside the lab. They have two quadcopter drones, one purchased and the other built from off-the-shelf parts, and a ground vehicle resembling a toy car. All are remote-controlled and carry a variety of sensors.
In experiments this spring, the researchers used an aerial drone outside on the UA Mall and inside the Student Union Memorial Center to track 10 student volunteers walking in a group before randomly dispersing. They also deployed their unmanned ground vehicle to identify individual people and serve as a moving landmark that kept the UAV from losing sight of its subjects.
The researchers are using their experimental data to better understand various crowd behaviors, such as gathering and splitting, and refine their algorithms to more accurately predict and track the crowd’s movements. From experiments with a few drones and students, the researchers are scaling up their simulation models to involve hundreds of drones and thousands of people.
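One simple way to quantify behaviors like gathering and splitting is to cluster tracked positions and count how many connected groups remain. The sketch below is our own illustration of that idea, not the team's published method; the distance threshold is an arbitrary assumption.

    import numpy as np

    def count_groups(points, radius=5.0):
        """Count groups of people: two belong to the same group if a chain
        of neighbors, each within `radius` meters, connects them."""
        n = len(points)
        dist = np.linalg.norm(points[:, None] - points[None, :], axis=2)
        adjacent = dist <= radius
        labels = np.full(n, -1)
        groups = 0
        for i in range(n):
            if labels[i] == -1:           # start a new group from person i
                stack = [i]
                labels[i] = groups
                while stack:              # flood-fill everyone connected to it
                    j = stack.pop()
                    for k in np.where(adjacent[j] & (labels == -1))[0]:
                        labels[k] = groups
                        stack.append(k)
                groups += 1
        return groups, labels

    # A crowd that has split: one cluster near the origin, one far away.
    crowd = np.vstack([np.random.rand(5, 2), np.random.rand(5, 2) + 50])
    print(count_groups(crowd)[0])  # 2, i.e. the group has split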
“We believe that by integrating multiple surveillance technologies, we can far surpass their individual capabilities,” Son said. “In our integrated system, the sum is bigger than its parts.”