
Justice By Algorithm? Computerized Sentencing Programs Yield Controversy Without Accountability


A machine, a robot, a computer program you never knew existed and likely can’t ask about–these are quickly replacing the people who make key decisions in the criminal justice system. Software algorithms protected as trade secrets are becoming the decision-makers we rely on to determine which people go into the state’s detention facilities and which come out of them.

If those sentences sound like an unlikely mishmash of techno-babble and legalese, you couldn’t be faulted for skepticism. It all rings a bit too dark, a bit too scary–even perverse, in more than one sense of the word. But a recent exposé in the Washington Monthly shows this isn’t some possible future or speculative fiction; it’s already here, and it’s unfortunately all too real.

The following paragraph from the report offers a glimpse of the state of crime and punishment in our late-capitalist techno-present:

“Proprietary algorithms are flooding the criminal justice system. Machine learning systems deploy police officers to ‘hot spot’ neighborhoods. Crime labs use probabilistic software programs to analyze forensic evidence. And judges rely on automated ‘risk assessment instruments’ to decide who should make bail, or even what sentence to impose.”

As it turns out, these systems can make mistakes that cost people their freedom.

Take, for example, Glenn Rodríguez, the main subject of the report. Rodríguez was convicted of second-degree murder at the age of sixteen, but after twenty-six years in prison, he was a model inmate and appeared fully rehabilitated. Yet the parole board turned him down last July on the strength of a single metric in the COMPAS system–a metric that was eventually admitted to be incorrect.

The ACLU’s Rachel Goodman weighed in on Twitter yesterday.

LawNewz.com reached out to Goodman for additional comment and is awaiting a response.

Originally developed by Northpointe, Inc., COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) is now produced by Equivant, a software firm based out of Dayton, Ohio. The system is used to inform initial sentencing decisions and to estimate the likelihood of recidivism in any given case. But beyond its reliance on standardized survey results–some self-reported, others filled in by an evaluator–the exact weighting and scoring are kept secret, so there are few, if any, ways to challenge a COMPAS determination.
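To see why that opacity matters, consider a toy sketch of how such an instrument might work. Everything below is hypothetical: the questionnaire items, weights, and cutoffs are invented for illustration, since COMPAS’s actual inputs and scoring are exactly the trade secret at issue.

```python
# Hypothetical questionnaire-based risk instrument (illustration only).
# The items, weights, and cutoffs are invented; COMPAS's real scoring
# is proprietary and undisclosed.

# Answers from the standardized survey: some self-reported,
# some entered by an evaluator.
answers = {
    "prior_arrests": 2,         # evaluator-entered count
    "age_at_first_arrest": 16,  # self-reported
    "currently_employed": 0,    # 1 = yes, 0 = no
}

# The secret part: per-item weights. While these stay sealed, the
# score below cannot be independently reproduced or audited.
weights = {
    "prior_arrests": 1.5,
    "age_at_first_arrest": -0.1,
    "currently_employed": -2.0,
}

# Weighted sum of the answers, then binned into a coarse risk level.
raw_score = sum(weights[item] * value for item, value in answers.items())
risk_level = "high" if raw_score > 1.0 else "medium" if raw_score > 0 else "low"

print(raw_score, risk_level)  # 1.4 high
```

The point of the sketch: a single wrong input, like the erroneous entry in Rodríguez’s file, flows straight through the weighted sum into the final risk level, and with the weights sealed, neither the defendant nor the court can trace the error or measure its effect.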

Presently, the Supreme Court is reviewing the use of COMPAS in sentencing proceedings in Loomis v. Wisconsin. The petitioner, Eric Loomis, was rated “high risk” after pleading no contest to evading a traffic stop and using a car without the owner’s consent. He received six years in prison because of his COMPAS score. Loomis alleges that this violated his constitutional due process rights.

Further undermining the algorithms’ ability to deliver justice is the fact that they mostly aren’t government creations at all–just corporate products hawked to police departments and prison systems as shortcuts designed to make their jobs easier.

As Rebecca Wexler writes for the Washington Monthly:

“[P]rivate companies increasingly purport to own the means by which the government decides what neighborhoods to police, whom to incarcerate, and for how long. And they refuse to reveal how these decisions are made—even to those whose life or liberty depends on them.”

This means no transparency. No follow-up. No public control over the most basic questions that strike at the heart of how law, liberty, and justice are dispensed in these United States. Of course, that’s about par for the American legal system–but the move toward tech-based solutions is being sold as some sort of corrective.

And, of course, these new systems do have their defenders: those who use them and those who produce them for profit. Typically liberal in their justifications, supporters claim that relying on technology to reinvigorate a system beyond reform will work real-life alchemy. But, it turns out, the algorithms appear to be quite racist–that is, they disproportionately impact black people. And, according to an analysis by the nonprofit newsroom ProPublica, COMPAS is among the worst offenders.
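ProPublica’s central finding concerned error rates rather than raw accuracy: among defendants who did not go on to reoffend, black defendants were roughly twice as likely as white defendants to have been labeled high risk. A minimal sketch of that false-positive-rate comparison follows, using invented records rather than ProPublica’s actual Broward County, Florida data.

```python
# False-positive-rate comparison of the kind at the center of
# ProPublica's COMPAS analysis. Records are invented for illustration;
# the real study used Broward County, FL court data.

# (group, labeled_high_risk, reoffended_within_two_years)
records = [
    ("black", True,  False),
    ("black", True,  True),
    ("black", False, False),
    ("white", False, False),
    ("white", True,  True),
    ("white", False, False),
]

def false_positive_rate(group):
    """Share of non-reoffenders in `group` who were flagged high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for group in ("black", "white"):
    print(group, false_positive_rate(group))
```

Two instruments can post identical overall accuracy and still diverge sharply on this metric, which is precisely the sense in which critics call the scores racially biased.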

The perpetual gap between the advertising and the reality of technology-induced salvation could be chalked up to the intrinsic ability of American liberals and tech-libertarians (who are basically liberals) to forever elide the difference between personal bigotry and institutional structures that produce racist outcomes. In any event, Minority Report-style policing and imprisonment is here with us now. And “predictive” tactics are increasingly being used–with predictably race-focused effects–by such trenchant civil rights organizations as the NYPD and LAPD.

These dystopian methods are doing more harm than good, and Americans deserve better. Anything more than a passing concern for justice demands that we do much better. Transparency and genuine understanding are not just buzzwords; they’re indispensable.

[Image via Vintage Tone/Shutterstock]

Follow Colin Kalmbacher on Twitter: @colinkalmbacher

This is an opinion piece. The views expressed in this article are those of the author alone.

