Judge Delores (Dottie) Rickman was known as a thoughtful judge who treated each case she heard with sensitivity and common sense. Her sentencing decisions were remarkably effective. She had an innate sense of how to use sentencing both as a rehabilitation tool and as a crime-prevention tool. She could judge, uncannily, for whom a prison term would straighten out a life; those defendants received lighter sentences. She could also sense for whom prison would only harden a determination to lead a criminal life; those received harsher sentences. All of her sentencing decisions fell within the sentencing guidelines.
But Judge Dottie found herself no longer able to make informed sentencing decisions: she was now mandated to use a sentencing algorithm. The algorithm used factors such as prior arrest history, residential stability, employment status, drug use, and education level. The algorithm itself was considered a trade secret.
Judge Dottie was especially concerned about the sentence the algorithm gave to a 22-year-old African-American male; she would have given him a much lighter sentence. She decided to experiment with the algorithm. She removed from the data fed to it a prior arrest from when the man was a juvenile, a case involving the shoplifting of a loaf of bread for his family. With the prior arrest removed, the sentence was reduced by five years.
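The actual algorithm in the story is a trade secret, but Judge Dottie's experiment can be illustrated with a hypothetical factor-weighted score. Every factor name, weight, and threshold below is invented for illustration; the weights are chosen only so that dropping one juvenile arrest shifts the recommendation by five years, as in the story.

```python
# Hypothetical sketch of a factor-based sentencing score.
# All factor names, weights, and thresholds are invented;
# the real algorithm in the story is secret.

WEIGHTS = {
    "prior_arrests": 2.5,   # each prior arrest adds 2.5 risk points
    "unemployed": 3.0,      # employment status
    "no_diploma": 1.5,      # education level
}

def risk_score(record):
    """Weighted sum of the defendant's risk factors."""
    return sum(WEIGHTS[k] * record.get(k, 0) for k in WEIGHTS)

def recommended_sentence(record, base_years=3):
    """Map the risk score to years: two extra years per point above 5."""
    return base_years + max(0, round((risk_score(record) - 5) * 2))

defendant = {"prior_arrests": 2, "unemployed": 1, "no_diploma": 1}
with_prior = recommended_sentence(defendant)

# Judge Dottie's experiment: remove the juvenile arrest from the input.
defendant["prior_arrests"] = 1
without_prior = recommended_sentence(defendant)

print(with_prior - without_prior)  # the five-year swing from one factor
```

The point of the sketch is the sensitivity, not the numbers: a single input, entered or omitted, can swing the recommendation by years, yet the defendant never sees which factor did it.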
Judge Rickman decided to do research on the track record of the algorithm. Here is what she found:
- African-Americans who committed no crime after release from jail were twice as likely to have received longer sentences.
- Only 20 percent of the persons the algorithm predicted would commit violent crimes actually went on to commit one.
- Poverty was an underlying component in nearly every factor used in the algorithm.
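The third finding can be made concrete with a small simulation, entirely hypothetical: if poverty raises the likelihood of each factor the algorithm does use, then the algorithm penalizes the poor even though "poverty" is never an input. The probabilities and factor names below are invented for illustration.

```python
import random

random.seed(0)

def simulate_person(in_poverty):
    # Assumed: poverty raises the chance of each "risk" factor the
    # algorithm uses, so the factors act as proxies for poverty even
    # though poverty itself is never fed to the algorithm.
    p = 0.7 if in_poverty else 0.2
    return {
        "residential_moves": int(random.random() < p),
        "unemployed": int(random.random() < p),
        "no_diploma": int(random.random() < p),
    }

def score(person):
    # A deliberately simple score: one point per factor present.
    return sum(person.values())

poor = [score(simulate_person(True)) for _ in range(10_000)]
not_poor = [score(simulate_person(False)) for _ in range(10_000)]

avg_poor = sum(poor) / len(poor)
avg_not_poor = sum(not_poor) / len(not_poor)

# The group that grew up in poverty averages a markedly higher risk
# score, even though the scoring rule never mentions poverty.
print(avg_poor, avg_not_poor)
```

This is why removing a "poverty" variable from an algorithm does not remove the bias: the remaining variables carry the same information.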
Judge Dottie gave the sentence she thought best. As a result, she was placed on judicial probation. She decided to retire.
Are algorithms used in sentencing without human oversight democratic? Do these algorithms satisfy the Due Process and Equal Protection clauses of the 14th Amendment? Does a secret algorithm provide due process when the sentencing process itself is hidden from public scrutiny? Research has shown that the algorithm has an underlying bias against persons who grew up in poverty. Is that equal protection?
The algorithm was not developed through a democratic process; it was developed by a technocratic elite. And the way the sentencing algorithm was administered took away the judge's ability to make an informed decision.
Just imagine how sentencing decisions could be made democratic. Could respected senior judges mentor younger judges in sentencing judgements? Just imagine how judges' sentencing decisions could be evaluated on offender outcomes after the prison term is over. Could such reviews be a better source of information for algorithms? Just imagine how sentencing decisions might be improved if they were based upon principles and values rather than data and predictive models. Finally, just imagine how decisions that can have a major impact on a person's life could become an example of democracy.
* * *
“Algorithms don’t do a good job of detecting their own flaws.” – Clay Shirky (writer on the social effects of Internet technologies)