When It Comes to Justice, Algorithms Are Far From Infallible
by Erika Posey
Early in Tuesday’s confirmation hearing, Neil Gorsuch suggested that the judiciary may be in danger of automation. When asked how political ideology can affect judicial decision-making, Judge Gorsuch joked that “they haven’t yet replaced judges with algorithms, though I think eBay is trying, and maybe successfully.” The joke fell flat, but Judge Gorsuch isn’t completely wrong, though eBay doesn’t seem to have anything to do with it.
Algorithms already play a role in courtrooms across the nation. “Risk assessment” software is used to predict whether a defendant is likely to commit crimes in the future. The software uses personal characteristics such as age, sex, socioeconomic status, and family background to generate a risk score that can influence decisions about bail, pretrial release, sentencing, and probation. The information fed into the system is drawn from defendant surveys or criminal records.
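The article does not describe any vendor’s actual formula, and commercial tools such as COMPAS are proprietary, but a minimal sketch can show the general shape of a points-based risk score built from personal characteristics. Every feature name and weight below is a hypothetical stand-in, not the real instrument.

```python
# Illustrative sketch only -- not any vendor's actual model. Commercial tools
# use proprietary questionnaires and weights; these are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class Defendant:
    age: int
    prior_arrests: int
    age_at_first_arrest: int
    employed: bool

def risk_score(d: Defendant) -> int:
    """Return a toy points-based score; higher means 'riskier'."""
    score = 0
    if d.age < 25:
        score += 2                      # youth scored as higher risk
    score += min(d.prior_arrests, 5)    # criminal history dominates many tools
    if d.age_at_first_arrest < 18:
        score += 2
    if not d.employed:
        score += 1                      # a socioeconomic proxy
    return score                        # typically bucketed into low/medium/high

print(risk_score(Defendant(age=22, prior_arrests=1,
                           age_at_first_arrest=20, employed=False)))  # 4
```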
Algorithms also help determine who ends up in the courtroom in the first place. Police departments are investing in “predictive policing” technology: powerful software that uses data on past crime to forecast where, when, and what crimes might occur. Police use the predictions to make deployment decisions. Some software even claims to predict who may be involved in a future crime. A pilot program in Chicago used software to identify roughly 400 people at high risk of being involved in violent crime within the next year; law enforcement notified those individuals and followed up with them in an attempt to cut the city’s crime rate. Facial recognition algorithms are already used with surveillance footage, and emerging technology will allow real-time facial recognition through police body cameras.
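As a rough illustration of the place-based forecasting described above, the sketch below simply ranks map-grid cells by recent incident counts. Real products use far more elaborate statistical models; the incident data here is invented, and the point is only that the forecast is driven by records of past crime.

```python
# Minimal sketch of place-based forecasting: rank grid cells by past incidents.
from collections import Counter

# Hypothetical past incidents, each recorded as a (grid_x, grid_y) cell.
past_incidents = [(3, 7), (3, 7), (3, 8), (5, 2), (3, 7), (5, 2), (9, 1)]

counts = Counter(past_incidents)
# "Forecast": direct patrols to the cells with the most recorded incidents.
hotspots = [cell for cell, _ in counts.most_common(3)]
print(hotspots)  # [(3, 7), (5, 2), (3, 8)]
```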
Proponents of the tools laud the software’s potential to cut costs, drive down prison populations, and reduce bias in the criminal justice system. The expensive and prejudicial outcomes of our human-driven criminal justice system are well documented. As Judge Gorsuch lamented, “I’m not here to tell you I’m perfect. I’m a human being, not an algorithm.”
Unfortunately, the algorithms aren’t perfect either. A ProPublica analysis of a widely used risk assessment algorithm found that only 20% of the people the software predicted would commit violent crimes went on to do so in the two years after the assessment was conducted. When all crimes, including misdemeanors, were taken into account, the algorithm was only slightly more accurate than a coin flip at predicting recidivism. Worse still, it was nearly twice as likely to mislabel black defendants as high risk as it was white defendants.
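To make the kind of disparity ProPublica measured concrete, here is a toy version of that sort of audit: for defendants who did not reoffend within the follow-up window, compare how often each group was nonetheless labeled high risk. The records below are synthetic, and the resulting numbers bear no relation to ProPublica’s actual findings.

```python
# Synthetic audit sketch: false positive rate of a "high risk" label by group.
records = [
    # (group, labeled_high_risk, reoffended_within_2_years)
    ("A", True,  False), ("A", True,  True),  ("A", True,  False),
    ("A", False, False), ("B", True,  True),  ("B", True,  False),
    ("B", False, False), ("B", False, True),
]

def false_positive_rate(group: str) -> float:
    """Share of non-reoffenders in a group who were still flagged high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for g in ("A", "B"):
    print(g, round(false_positive_rate(g), 2))  # A 0.67, B 0.5
```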