Friday, March 23, 2018

Predictive policing: (2) Palantir: Writer/blogger Nicole Lindsey (CPO Magazine) focuses on the "important privacy and human rights concerns" raised by predictive policing - and on the role played by the giant data-mining company Palantir in New Orleans..."It almost sounds like a scene out of the Hollywood blockbuster movie “Minority Report” – police departments around the world, from Los Angeles to China, are embracing new predictive policing technology that will help them spot criminals before a crime ever takes place. However, communities often have little or no idea of why or how this technology is being used, and that raises some important privacy and human rights concerns. What data, exactly, are police departments using, and how is that impacting the way they do their job?" Palantir: Predictive policing as the future of law enforcement


PASSAGE OF THE DAY: "One of the biggest test cases took place in New Orleans, where predictive policing utilizing sophisticated data mining tools from Silicon Valley’s Palantir was able to uncover ties to other gang members, outline extensive criminal histories, and even flag individuals who might become future gang members. Needless to say, the arrests went up, the prosecutions went up, and the New Orleans Police Department won acclaim for its policing efforts. However, there were a few problems here from a privacy perspective. First and most importantly, someone forgot to tell New Orleans city council members or members of the local community. As one politician notes, “No one in New Orleans even knows about this.” And there was a good reason for this – the Palantir initiative was budgeted as a “philanthropic venture,” so there was no public vetting of the program. It flew under the radar without setting off any privacy fears. And there were other troubling problems with this Palantir predictive policing experiment in New Orleans. For example, it tended to have an outsized impact on poor communities of color. Moreover, as experts now point out, the Palantir experiment had the very potential to sweep up innocent people who are related to criminals through several degrees of separation."

STORY: "Predictive Policing Raises Important Privacy and Human Rights," by Nicole Lindsey, published by CPO Magazine on March 16, 2018. Nicole Lindsey is a writer and blogger for more than 10 years, focusing on the intersection of technology, innovation and privacy. She has a background in information technology and has worked with various software companies and tech startups on their public relations and communications initiatives. CPO self description: " We provide news, insights and resources to help data privacy, protection and cyber security leaders make sense of the evolving landscape to better protect their organizations and customers. CPO Magazine is a website owned by Data Privacy Asia Pte. Ltd.... a company registered in Singapore providing knowledge products positioned at the intersection of data privacy, protection and cyber security and serves as the focal point for professionals to learn, network and collaborate."

GIST: "It almost sounds like a scene out of the Hollywood blockbuster movie “Minority Report” – police departments around the world, from Los Angeles to China, are embracing new predictive policing technology that will help them spot criminals before a crime ever takes place. However, communities often have little or no idea of why or how this technology is being used, and that raises some important privacy and human rights concerns. What data, exactly, are police departments using, and how is that impacting the way they do their job?

The current predictive policing initiatives can be traced back to the mid-2000s, when the goal was to anticipate, prevent and reduce crime, not just respond to crime. The idea was that several cutting-edge crime forecasting data analysis tools – everything from data mining to geospatial prediction – could be used very effectively by law enforcement officials for deploying resources more effectively and predicting crimes. There was nothing particularly attention-getting about these early initiatives from a privacy perspective, mainly because they were seen as just an extension of what law enforcement agencies had already been doing for decades. For example, police departments around the world know that certain events – such as New Year’s Eve celebrations – can require additional policing efforts. So why not make use of data already on hand to “predict” potential hot spots and prevent crime before the trouble ever takes place? And sometimes this data analysis uncovered some unknown patterns and trends. For example, one predictive policing initiative in Texas discovered an unknown link between burglaries and housing code violations. “Fragile neighborhoods” where housing was sub-standard were suddenly flagged for greater police resources, and that led to a reduction in crime. Just by being more visible in these neighborhoods, police could send a warning signal to potential criminals.

The explosion of non-traditional data available for predictive policing: But something very interesting started to happen around 2009 – the big tech companies of Silicon Valley started to get involved. The “Big Data” trend was just underway, and police departments around the world started to realize that they had a wealth of information and non-traditional data that they could tap into as part of their new predictive policing efforts. For example, “social network analysis” suddenly became a powerful tool in the hands of police departments. Just by knowing whom criminals were talking to on social media, police officers could start to piece together some very intricate criminal networks.

Palantir and the privacy issues raised by predictive policing: One of the biggest test cases took place in New Orleans, where predictive policing utilizing sophisticated data mining tools from Silicon Valley’s Palantir was able to uncover ties to other gang members, outline extensive criminal histories, and even flag individuals who might become future gang members. Needless to say, the arrests went up, the prosecutions went up, and the New Orleans Police Department won acclaim for its policing efforts. However, there were a few problems here from a privacy perspective. First and most importantly, someone forgot to tell New Orleans city council members or members of the local community. As one politician notes, “No one in New Orleans even knows about this.” And there was a good reason for this – the Palantir initiative was budgeted as a “philanthropic venture,” so there was no public vetting of the program. It flew under the radar without setting off any privacy fears. And there were other troubling problems with this Palantir predictive policing experiment in New Orleans. For example, it tended to have an outsized impact on poor communities of color. Moreover, as experts now point out, the Palantir experiment had the very potential to sweep up innocent people who are related to criminals through several degrees of separation. For example, a cousin of a known drug dealer might be called in for questioning – despite having no criminal background and no reason to be suspected, other than casual social connections via Facebook. This is a clear invasion of privacy."
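PUBLISHER'S ILLUSTRATION: For readers wondering what "several degrees of separation" means in practice, the following is a minimal, hypothetical sketch in Python - not drawn from the article and not Palantir's actual software - of the graph idea the story describes: anyone within a few hops of a known suspect in a social-contact graph can end up on an analyst's list. All names and connections below are invented for illustration.

# Minimal, hypothetical sketch of "degrees of separation" flagging.
# This is NOT Palantir's software; it only illustrates the graph idea
# described in the article.
from collections import deque

# Hypothetical social graph: person -> set of direct contacts.
social_graph = {
    "known_dealer": {"cousin", "associate"},
    "cousin": {"known_dealer", "coworker"},
    "associate": {"known_dealer"},
    "coworker": {"cousin"},
}

def within_degrees(graph, start, max_degrees):
    """Return every person reachable from `start` in at most `max_degrees` hops."""
    seen = {start: 0}
    queue = deque([start])
    while queue:
        person = queue.popleft()
        if seen[person] == max_degrees:
            continue  # do not expand beyond the hop limit
        for contact in graph.get(person, ()):
            if contact not in seen:
                seen[contact] = seen[person] + 1
                queue.append(contact)
    seen.pop(start)
    return seen  # maps each flagged person to their degree of separation

# Two degrees of separation from the known dealer flags the cousin (1 hop)
# and the cousin's coworker (2 hops), even though the coworker has no
# connection to any crime - exactly the privacy risk the article describes.
print(within_degrees(social_graph, "known_dealer", 2))

The point of the sketch is that the flagging is purely structural: the coworker appears on the list because of whom they know, not because of anything they have done.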

The entire story can be found at:

https://www.cpomagazine.com/2018/03/16/predictive-policing-raises-important-privacy-and-human-rights-concerns/

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy; Publisher; The Charles Smith Blog.