Study Finds No Evidence Of Racial Bias In Predictive Policing

While predictive policing aims to improve the effectiveness of police patrols, there is concern that these algorithms may lead police to target minority communities and result in discriminatory arrests. An IUPUI School of Science computer scientist conducted the first study to examine real-time field data from Los Angeles, California, and found that predictive policing did not result in biased arrests.

“Predictive policing is still a fairly new field. There have been several field trials of predictive policing where the crime rate reduction was measured, but there have been no empirical field trials to date looking at whether these algorithms, when deployed, target certain racial groups more than others and lead to biased stops or arrests,” said George Mohler, associate professor of computer and information science at the School of Science at IUPUI.

Mohler, along with researchers at UCLA and Louisiana State University, worked with the Los Angeles Police Department to conduct the experimental study. Each day, a human analyst produced one set of predictions of where officers should patrol and an algorithm produced another; which of the two sets officers used in the field that day was then selected at random.
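That day-to-day randomization can be illustrated with a short sketch. The patrol boxes and function names below are hypothetical; the study's actual implementation is not described in this article.

```python
import random

def choose_daily_predictions(algorithm_boxes, analyst_boxes, rng=random):
    """Flip a fair coin to decide whose predictions officers receive today."""
    if rng.random() < 0.5:
        return "algorithm", algorithm_boxes
    return "analyst", analyst_boxes

# Hypothetical sets of predicted patrol boxes for one day
algorithm_boxes = ["box_12", "box_47", "box_89"]
analyst_boxes = ["box_05", "box_47", "box_63"]

source, boxes = choose_daily_predictions(algorithm_boxes, analyst_boxes)
print(f"Today officers patrol the {source} predictions: {boxes}")
```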

The researchers measured the difference in arrest rates by ethnic groups between the predictive policing algorithm and hotspot maps created by LAPD analysts that were in use prior to the experiment.

“When we looked at the data, the differences in arrest rates by ethnic group between predictive policing and standard patrol practices were not statistically significant,” Mohler explained.
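As a rough illustration of that kind of comparison, arrests can be tabulated by group under each patrol condition and tested for a difference in composition. The counts and the choice of a chi-square test here are hypothetical, not the paper's actual data or method.

```python
from scipy.stats import chi2_contingency

# Hypothetical arrest counts by ethnic group under each patrol condition
arrests = {
    "algorithm": {"group_a": 120, "group_b": 80, "group_c": 40},
    "analyst":   {"group_a": 115, "group_b": 85, "group_c": 38},
}

groups = ["group_a", "group_b", "group_c"]
table = [[arrests[cond][g] for g in groups] for cond in ("algorithm", "analyst")]

# Chi-square test of independence: does the ethnic composition of arrests
# depend on which set of predictions was used?
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
# A large p-value is consistent with no detectable difference in the
# composition of arrests between the two conditions.
```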

The study examined data both at the district level and within the LAPD officers’ patrol areas and found no statistically significant difference in arrest rates by ethnic group at either geographical level. Finally, the researchers looked at overall arrest rates in patrol areas and found they were statistically higher in the algorithmically selected areas, but when adjusted for the higher crime rate in those areas, arrest rates were lower or unchanged. “The higher crime rate, and proportionally higher arrest rate, is what you would expect since the algorithm is designed to identify areas with high crime rates,” said Mohler.
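The adjustment Mohler describes can be sketched with hypothetical numbers: dividing arrests by the volume of reported crime in each condition turns a raw difference in arrest counts into a per-crime rate, which may be similar or lower in the algorithm-selected areas.

```python
# Hypothetical totals: algorithm-selected areas see more crime and more arrests
conditions = {
    "algorithm": {"arrests": 240, "reported_crimes": 1200},
    "analyst":   {"arrests": 180, "reported_crimes": 850},
}

for name, counts in conditions.items():
    per_crime = counts["arrests"] / counts["reported_crimes"]
    print(f"{name}: {counts['arrests']} arrests, "
          f"{per_crime:.3f} arrests per reported crime")
# algorithm: 240 arrests, 0.200 arrests per reported crime
# analyst:   180 arrests, 0.212 arrests per reported crime
# Raw arrests are higher under the algorithm, but the per-crime rate is not.
```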

Mohler notes that in the developing field of predictive policing, there continue to be lessons learned from each study and implementation. A recent simulation study of predictive policing with drug arrest data from Oakland, CA showed there is potential for bias when these algorithms are applied in certain contexts. Mohler hopes the Los Angeles study is a starting point to measure predictive policing bias in future field experiments.

“Every time you do one of these predictive policing deployments, departments should monitor the ethnic impact of these algorithms to check whether there is racial bias,” Mohler said. “I think the statistical methods we provide in this paper provide a framework to monitor that.”
