It has been part of a revolution in the criminal justice system: decisions are made not on judgment alone but on statistics. While the catchphrase "evidence-based" is hard to argue with, the challenge, perhaps, is that junk science may be driving the evidence. Human judgments, after all, can suffer from implicit bias.
But then there is this. Melissa Hamilton (University of Surrey School of Law) has posted The Biased Algorithm: Evidence of Disparate Impact on Hispanics (56 AM. CRIM L. REV. Forthcoming) on SSRN. Here is the abstract:
Algorithmic risk assessment holds the promise of reducing mass incarceration while remaining conscious of public safety. Yet presumptions of transparent and fair algorithms may be unwarranted. Critics warn that algorithmic risk assessment may exacerbate inequalities in the criminal justice system’s treatment of minorities. Further, calls for third party auditing contend that studies may reveal disparities in how risk assessment tools classify minorities. A recent audit found a popular risk tool overpredicted for Blacks.
An equally important minority group deserving of study is Hispanics. The study reported herein examines the risk outcomes of a widely used algorithmic risk tool using a large dataset with a two-year followup period. Results reveal cumulative evidence of (a) differential validity and prediction between Hispanics and non-Hispanics and (b) algorithmic unfairness and disparate impact in overestimating the general and violent recidivism of Hispanics.
Judge Burke,
This is really important. I cannot say I am surprised. The risk assessment tool may reflect bias and overprediction for minorities.
Thank you so much for the information!!
Judge Tracy Brandeis-Roman
Philadelphia, PA
Court of Common Pleas of Pennsylvania, Philadelphia County, Criminal Division.