Algorithmic risk assessment holds the promise of reducing mass incarceration while safeguarding public safety. Yet presumptions that these algorithms are transparent and fair may be unwarranted. Critics warn that algorithmic risk assessment may exacerbate inequalities in the criminal justice system's treatment of minorities, and proponents of third-party auditing contend that independent studies may reveal disparities in how risk assessment tools classify minorities. Indeed, a recent audit found that a popular risk tool overpredicted recidivism for Blacks.
An equally important minority group deserving of study is Hispanics. The study reported herein examines the risk outcomes of a widely used algorithmic risk tool using a large dataset with a two-year follow-up period. Results reveal cumulative evidence of (a) differential validity and prediction between Hispanics and non-Hispanics and (b) algorithmic unfairness and disparate impact in overestimating the general and violent recidivism of Hispanics.
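One common way to quantify the overestimation described above is to compare false positive rates across groups: the share of people who did not recidivate but were nonetheless classified as high risk. The sketch below is purely illustrative (the data, group sizes, and threshold are invented, not the study's), showing how such a group-wise comparison might be computed:

```python
# Illustrative sketch only: hypothetical outcomes, not the study's data.
from dataclasses import dataclass

@dataclass
class GroupOutcomes:
    predicted: list  # 1 = tool classified as high risk, 0 = low risk
    actual: list     # 1 = recidivated during follow-up, 0 = did not

def false_positive_rate(g: GroupOutcomes) -> float:
    """Share of non-recidivists the tool classified as high risk."""
    fp = sum(1 for p, a in zip(g.predicted, g.actual) if p == 1 and a == 0)
    negatives = sum(1 for a in g.actual if a == 0)
    return fp / negatives if negatives else 0.0

# Made-up outcomes for two hypothetical groups.
hispanic = GroupOutcomes(predicted=[1, 1, 1, 1, 0, 1], actual=[0, 1, 0, 0, 0, 1])
non_hispanic = GroupOutcomes(predicted=[1, 0, 0, 0, 1, 0], actual=[0, 0, 0, 0, 1, 1])

fpr_h = false_positive_rate(hispanic)       # 3 of 4 non-recidivists flagged
fpr_nh = false_positive_rate(non_hispanic)  # 1 of 4 non-recidivists flagged
print(fpr_h, fpr_nh, fpr_h / fpr_nh)
```

A ratio well above 1 between the two rates would indicate the kind of disparate impact the abstract reports; real analyses would use the full dataset and formal tests of differential prediction rather than a point comparison.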