Sunday, August 7, 2016

High-tech and the courtroom series: Part Four: Algorithms and bail: Philadelphia Inquirer editorial against placing too much emphasis on an algorithm designed to help judges determine which defendants should be granted bail..."Good judges, most of the time, can tell who deserves that chance, with or without an algorithm to help them."


EDITORIAL: "Numbers may lie when setting bail," published by the Philadelphia Inquirer on July 15, 2016.

GIST: "About 60 percent of Philadelphia's prison inmates are awaiting trial, but in trying to reduce that population, officials should be careful not to put too much emphasis on an algorithm designed to help judges determine which defendants should be granted bail. The algorithm, based on the past behavior of previous inmates with similar characteristics, supposedly can calculate the likelihood that a person will commit a crime if he is released before trial. The judge can then use that calculation in deciding whether to set bail. Research suggests algorithms can be more accurate than judges in predicting behavior. Experts say the tool helps avoid unnecessarily harsh punishments for low- and medium-risk offenders. But lawyers and others have challenged the moral and legal validity of such algorithms. One glaring problem with some is their reliance on variables more strongly related to a person's race or income than their criminal background, including a defendant's zip code, education level, and leisure activities. Defendants who took IQ and reading tests during previous prison stays might see those results used against them...Philadelphia's Adult Probation and Parole Department uses an assessment tool developed by University of Pennsylvania criminologist Richard Berk, who is leading the effort to develop an algorithm for pretrial decisions. Asked about the possibility that an algorithm could lead to discriminatory treatment of black and poor defendants, Berk said, "People who stress that point forget about the victims. How many deaths are you willing to trade for something that's race-neutral?" Other criminologists largely agree with Berk, saying their job is only to create the most accurate predictions of criminal behavior. They also argue that algorithms are more transparent and consistent than judges, who, as humans, have biases.
Perhaps, but using an algorithm to determine a defendant's status prior to trial may be more detrimental than using a risk-assessment tool after a conviction. Studies show people detained before trial are more likely to be convicted and more likely to receive longer sentences. Where you live and how much you make isn't always predictive of criminal behavior. There are plenty of defendants from less-than-perfect neighborhoods who live up to the trust put in them when given the chance. Good judges, most of the time, can tell who deserves that chance, with or without an algorithm to help them."

The entire editorial can be found at:

http://articles.philly.com/2016-07-12/news/74396450_1_algorithm-criminal-behavior-prison-inmates 

PUBLISHER'S NOTE:

I have added a search box for content in this blog, which now encompasses several thousand posts. The search box is located near the bottom of the screen, just above the list of links. I am confident that this powerful search tool provided by "Blogger" will help our readers and me get more out of the site.

The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at:

http://www.thestar.com/topic/charlessmith

Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html

Please send any comments or information on other cases and issues of interest to the readers of this blog to:



hlevy15@gmail.com;

Harold Levy;

Publisher: The Charles Smith Blog;