Letting Big Data Be the Judge May Be Criminal



Ben McDermott
April 9, 2018

Technology used to predict whether a convicted criminal will reoffend is coming under attack. Research suggests it is about as accurate as a poll of untrained laypeople.

Once a criminal, always a criminal? The technology used to predict recidivism, the tendency of a convicted criminal to reoffend, is coming under scrutiny for its risk assessment capabilities.

One of the most widely used programs for assessing offenders is the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS). COMPAS draws on 137 factors such as criminal history, gender and age to predict the likelihood that someone will commit a crime within two years. It has been used to assess over one million people since 1998, and it has become a major point of contention.
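COMPAS itself is proprietary, so its actual model is not public. But the general shape of a risk-assessment tool of this kind can be sketched: observed features are combined into a score, and the score is thresholded into a risk label. Here is a minimal Python illustration, with entirely invented weights and threshold:

```python
# Hypothetical sketch of a risk tool: features -> score -> label.
# All weights and the threshold are invented for illustration;
# COMPAS's real model and its 137 inputs are proprietary.

def risk_score(age: int, priors: int, is_male: bool) -> float:
    """Combine a few features into a raw risk score (made-up weights)."""
    score = 0.0
    if age < 25:
        score += 2.0                    # youth as a risk factor
    score += 0.5 * min(priors, 10)      # prior convictions, capped
    if is_male:
        score += 0.5
    return score

def risk_label(score: float, threshold: float = 2.5) -> str:
    """Threshold the raw score into the binary label a court would see."""
    return "high risk" if score >= threshold else "low risk"

print(risk_label(risk_score(age=22, priors=3, is_male=True)))   # high risk
print(risk_label(risk_score(age=45, priors=0, is_male=False)))  # low risk
```

The point of the sketch is only the pipeline: however sophisticated the underlying model, what a court ultimately sees is a single score or label.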

Researchers Julia Dressel and Hany Farid of Dartmouth College have found that COMPAS predicts recidivism no better than an online poll of untrained laypeople. Dressel and Farid – whose paper was published in Science Advances in January 2018 – asked around 400 participants recruited from an online marketplace to predict whether selected real defendants would reoffend, based on their sex, age and previous criminal history.

Remarkably, the untrained respondents predicted recidivism with an average accuracy of 67 percent, compared to COMPAS’ 65 percent. “There was essentially no difference between people responding to an online survey for a buck and this commercial software being used in the courts,” says Farid. “If this software is only as accurate as untrained people responding to an online survey, I think the courts should consider that when trying to decide how much weight to put on them in making decisions.”
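Those accuracy figures measure one simple thing: the fraction of defendants whose actual two-year outcome matched the prediction. A minimal sketch of that computation, on made-up data rather than the study's:

```python
# Accuracy is the fraction of predictions matching the actual outcome.
# The predictions and outcomes below are hypothetical, not study data.

predictions = [True, False, True, True, False, False, True, False]
outcomes    = [True, False, False, True, False, True, True, False]

correct = sum(p == o for p, o in zip(predictions, outcomes))
accuracy = correct / len(outcomes)
print(f"accuracy: {accuracy:.0%}")  # 75% on this toy data
```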

This isn’t the first time COMPAS has come under attack. In May 2016, the Pulitzer Prize-winning news organization ProPublica published a story claiming the software is “biased against blacks.” Having studied COMPAS’ risk assessments for over 10,000 people, the report found that, while the software “correctly predicted recidivism for black and white defendants at roughly the same rate,” blacks were “almost twice as likely as whites to be labeled a higher risk but not actually re-offend.”
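ProPublica's claim concerns error rates rather than overall accuracy: among defendants who did not go on to reoffend, how often was each group labeled high risk? That false-positive rate can differ sharply between groups even when overall accuracy is roughly equal. A sketch of the computation, again on invented records:

```python
# False-positive rate per group: among people who did NOT reoffend,
# the share who were nevertheless labeled high risk.
# All records below are invented for illustration.

records = [
    # (group, labeled_high_risk, actually_reoffended)
    ("A", True,  False),
    ("A", True,  False),
    ("A", False, False),
    ("A", False, True),
    ("B", True,  False),
    ("B", False, False),
    ("B", False, False),
    ("B", False, True),
]

for group in ("A", "B"):
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    false_positives = [r for r in non_reoffenders if r[1]]
    fpr = len(false_positives) / len(non_reoffenders)
    print(f"group {group}: false-positive rate {fpr:.0%}")
# group A: 67%, group B: 33% -- similar overall accuracy can hide this gap
```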

Algorithms like COMPAS process data in 0s and 1s, but justice is not binary. As long as the legal system continues to use risk assessment programs, it’s important to find out whether these algorithms are making the right kind of difference. The jury is still out, but so far, ironically, the data suggests not.
