This Article analyzes the societal and cultural impacts of greater reliance on algorithms in the courtroom. Big-data analytics and algorithms are beginning to play a significant role in shaping judges’ sentencing and criminal-enforcement decisions. This Article situates this shift toward greater acceptance of algorithms as models for risk assessment and criminal forecasting within the context of the moral and social movements that have shaped the American justice system’s current approach to punishment and rehabilitation. By reviewing salient problems of scientific uncertainty that accompany the use of these models and algorithms, the Article calls into question the proposition that greater reliance on algorithms in the courtroom can lead to a more objective and fair criminal sentencing regime.
Far from liberating society from the biases and prejudices that might pollute judges’ decision-making, these tools can intensify, while simultaneously concealing, entrenched cultural biases that preexist in society. Using common themes from the field of Science and Technology Studies (STS), including boundary-work analysis and the Public Understanding of Science (PUS), this Article highlights unique technical characteristics of big-data analytics and algorithms that feed into undesirable and deeply held values and beliefs. The Article draws attention to specific gaps in technical understanding of algorithmic thinking, such as the black-box nature of algorithms, that can have a discordant impact on communicating uncertainty to the populace and can reduce accountability and transparency in regulating the use of algorithms. The Article also offers specific policy proposals that can ameliorate the adverse social and cultural effects of incorporating algorithms into the courtroom. The discussion of these proposals borrows from the STS literature on public participation in science and encourages adoption of a policy that incorporates diverse voices: political actors, the most affected communities, and offenders themselves.