Flawed data complicates criminal justice AI
- By Chase Gunter
- Jun 08, 2016
Artificial intelligence has the potential to help reform policing and criminal justice practices nationwide, experts say. However, one challenge to widespread deployment of artificial intelligence is refining the underlying data so it does not reinforce historic biases.
The White House and the University of Chicago have teamed up with police departments across the country to "start fixing" biases in the data the criminal justice system has collected over the years by building new ways of looking at that data, said Rayid Ghani, director of the Center for Data Science and Public Policy at the University of Chicago. "We think that AI and machine learning… combined with all the data that exists can help solve these problems," Ghani said.
The United States locks up more people per capita than any other nation, and artificial intelligence and machine learning can support predictive modeling and find patterns in the behavior of officers and those being arrested, Ghani explained. Current policing data, however, contains a multitude of discrepancies and biases.
"I'm all for artificial intelligence and machine learning, but I'm afraid we're not ready for it in the criminal justice space until we really and truly clean up the data," said Roy Austin, director of the White House Office of Urban Affairs, Justice and Opportunity.
"The system does not affect people in the same way," said Lynn Overmann, senior policy adviser at the White House Office of the CTO. "At every stage in our criminal justice system, African-Americans and Hispanics are more likely to be stopped, they're more likely to be searched for evidence, they're more likely to be convicted, and when they are convicted, they tend to get longer sentences."
Trust in the system relies on fair treatment and justice, and inputting flawed data will return results that "reflect the system as we see it now, not how we want to see it," she added.
Furthermore, much of the data that law enforcement officials work with is incomplete.
Every year, the Uniform Crime Report gives a statistical picture of crime in America based on arrest records, while the National Crime Victimization Survey is conducted to provide a comparative crime index.
Based on discrepancies between the two, only a fraction of robberies, assaults and crimes of sexual violence "are even reported to the police," and arrests are made in only 65 percent of homicides, said Austin.
"This is the data that we're supposed to input into a machine to tell us" how to prevent and react to crime, Austin said. "So if we're talking about machine learning and artificial intelligence -- we all know about garbage-in, garbage-out -- the problem is the amount of bad information that we would have to feed into that machine, to learn what?"
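The feedback loop Austin describes can be made concrete with a toy calculation. The sketch below uses hypothetical numbers (the neighborhoods, offense counts and patrol rates are invented for illustration, not drawn from any real dataset): two neighborhoods with identical underlying crime generate very different arrest counts simply because one is patrolled more heavily, and a naive model trained on those arrests then directs even more patrols there.

```python
# Illustrative sketch with made-up numbers: how a naive predictive model
# trained on arrest counts can mirror enforcement intensity, not crime.

# Two neighborhoods with the same true offense count, but neighborhood A
# is patrolled more heavily, so a larger share of offenses end in arrest.
true_offenses = {"A": 100, "B": 100}      # identical underlying crime
arrest_rate = {"A": 0.8, "B": 0.4}        # share of offenses leading to arrest

# The recorded data the model sees is arrests, not offenses.
arrests = {n: round(true_offenses[n] * arrest_rate[n]) for n in true_offenses}

# A naive "predictive policing" rule allocates future patrols in
# proportion to past arrests -- garbage in, garbage out.
total = sum(arrests.values())
patrol_allocation = {n: arrests[n] / total for n in arrests}

print(arrests)            # {'A': 80, 'B': 40}
print(patrol_allocation)  # A gets two-thirds of patrols despite equal crime
```

The point of the sketch is that the model is not wrong about the data; the data is wrong about the world, which is exactly the distinction Austin draws.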
The flaws in the data touch on pervasive social issues, including trust fractured by racial and socioeconomic prejudices and use of excessive force. However, the criminal justice system is not doomed to reinforce existing biases, and initiatives that deploy artificial intelligence are in motion to address these problems.
The Police Data Initiative was launched in 2015 to help rebuild trust in law enforcement by opening up data and increasing transparency.
One example of the effort to rebuild trust is the expanded use of body-worn cameras by police officers. The cameras capture what policing looks like and give law enforcement the opportunity to review footage to identify both exemplary and inappropriate policing practices, as well as which circumstances or personnel may be more prone to violent encounters.
"The initial results are very promising," Overmann said. "The use of force reports went down, and the use of force complaints went down."
However, the cameras still face technical problems. They collect more hours of footage than departments have time to comb through, and the audio can be distorted by interference from wind, traffic and other noisy situations.
Artificial intelligence has a lot of promise, but at its core, policing is tough, and human error will always exist, Overmann said.
"What we ask our police officers to do every day is an extraordinarily hard job," she said. "How do we help these officers cope with the things they deal with so they themselves are safe...and they can take care of the community more effectively?"
Artificial intelligence and data transparency can also reveal information about who is being arrested, and help authorities understand how to treat people with possible mental health and substance abuse issues rather than cycling them through the criminal justice system.
Overmann said solving the problems with criminal justice data will require "deep and close relationships between the researchers who are trying to develop new technologies and the people who we actually need to use them to make these things effective in some of our most critical areas."
Chase Gunter is a staff writer covering civilian agencies, workforce issues, health IT, open data and innovation.
Prior to joining FCW, Gunter reported for the C-Ville Weekly in Charlottesville, Va., and served as a college sports beat writer for the South Boston (Va.) News and Record. He started at FCW as an editorial fellow before joining the team full-time as a reporter.
Gunter is a graduate of the University of Virginia, where his emphases were English, history and media studies.