Chasing down that list of potential Predpol customers reveals dozens of cities that have secretly experimented with "predictive policing"
Last October, I published a list of cities that appeared to have contracted with Predpol, a "predictive policing" company, based on research provided to me by an anonymous source who used clever methods to uncover the customer list.
The ever-useful Gartner Hype Cycle identifies an inflection point in the life of any new technology: the "Peak of Inflated Expectations," attained just before the sharp dropoff into the "Trough of Disillusionment." I've lived through the hype cycles of several kinds of technology, and one iron-clad correlate of the "Peak of Inflated Expectations" is the "Peak of Huckster Snake-Oil Salesmen": the moment at which con artists just add a tech buzzword to some crooked scam and head out into the market to net a fortune before everyone gets wise to the idea that the shiny new hypefodder isn't a magic bullet.
Pete Warden writes convincingly about computer scientists' focus on improving machine learning algorithms, to the exclusion of improving the training data that the algorithms interpret, and how that focus has slowed the progress of machine learning.
Palantir Technologies is a data-mining firm that loves it some predictive policing: computer-aided sorcery that uses data models to try to predict where crimes may occur and who has a reasonable chance of committing them.
For predictive policing to work well, the predictive model needs to be well fed with data on criminals, their first-, second-, and third-degree acquaintances, their social media accounts, and crime statistics for the area where the model is meant to see crimes before they happen. It sounds like shit right out of Minority Report, because it kinda is – just without the spooky kids in a swimming pool, and with a hell of a lot less accuracy.
Accurate or not, the notion of predictive policing raises a number of civil rights and privacy concerns. The ACLU isn't down with it, since stopping someone without reasonable suspicion violates the Fourth Amendment; in their eyes, computer-aided guesses don't cut it when it comes to justifying a stop-and-frisk. China's been using it to snoop on its citizens and has been sending suspected radicals and political dissidents off for re-education, just in case they decide to protest the ruling party's status quo. It's creepy shit.
Anyway, back to Palantir.
Did I mention that it was started up by Peter Thiel with money seeded by the CIA? No? How about the fact that they've been running an off-the-books program with the New Orleans Police so secretive that the city's own government didn't have a clue that it was going on?
Chelsea Manning spent seven years in federal prison for blowing the whistle on illegal actions by the US in Iraq and around the world. While imprisoned, she transitioned her gender and changed her name, and on her release she found herself unpersoned: unable to identify herself to the satisfaction of the state, despite being one of the most famous people in America, and despite the state's unquenchable thirst for our personal data (and hers especially).
UT Austin sociologist Sarah Brayne spent 2.5 years conducting field research with the LAPD as they rolled out Predpol, a software tool that is supposed to direct police to places where crime is likely to occur, but which has been shown to send cops out to overpolice brown and poor people at the expense of actual crimefighting.
Researchers from the Human Rights Data Analysis Group (previously) reimplemented the algorithm behind Predpol, the predictive policing system that police departments around America have spent a fortune on in order to find out where to set their patrols. They fed it Oakland's 2010 arrest data, then asked it to predict where the crime would be in 2011.
Tim O'Reilly writes about the reality that more and more of our lives -- including whether you end up seeing this very sentence! -- are in the hands of "black boxes": algorithmic decision-makers whose inner workings are a secret from the people they affect.
In Chicago, the "Heat List" system is used to direct policing resources, based on data-mining of social media to identify potential gang members; the model tells the cops where to go and who to arrest, and is supposed to reduce both violent crime and the likelihood that suspects themselves will be killed -- but a peer-reviewed analysis (Scihub mirror) of the program shows that while being on the Heat List increases your chances of being harassed and arrested by Chicago PD, it does not reduce crime.
Jan Chipchase has assembled a provocative, imaginative, excellent list of "driver behaviors in a world of autonomous mobility" that go far beyond the lazy exercise of porting the "trolley problem" to self-driving cars and other autonomous vehicles, including flying drones.
The Ford Foundation's Michael Brennan discusses the many studies showing how algorithms can magnify bias -- like the prevalence of police background check ads shown against searches for black names.
Santa Cruz, California police are testing prototype software that predicts where crimes may be committed in the next few days. The deputy chief of police thinks that it may help police patrol areas that aren't hotbeds of shady activity. Santa Clara University mathematician George Mohler developed the algorithm. From New Scientist:
Some crimes follow potentially predictable patterns. One burglary, for example, tends to trigger others nearby in the next few days, rather like aftershocks from an earthquake. In 2010, Mohler's team turned equations used to predict aftershocks into the basis for a program that uses the dates and times of reported crimes to predict when and where the "after crimes" will occur.
On average the program predicted the location and time of 25 per cent of actual burglaries that occurred on any particular day in an area of Los Angeles in 2004 and 2005, using just the data on burglaries that had occurred before that day…
Mohler and his colleagues will conduct a controlled experiment with the Los Angeles police department later this year. Officers will run the prediction algorithms as they do in Santa Cruz, but patrol only half of the locations it flags. They will then compare crime levels in the two groups.
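The earthquake-aftershock analogy above can be made concrete. Here is a minimal sketch of a self-exciting ("Hawkes"-style) point process, the class of model Mohler's team adapted from seismology, where each reported crime temporarily raises the predicted rate of follow-on crimes nearby. The parameter values (mu, alpha, beta) are illustrative assumptions, not the published ones:

```python
import math

def intensity(t, past_event_times, mu=0.5, alpha=0.8, beta=1.2):
    """Conditional intensity lambda(t) of a self-exciting point process.

    mu    -- constant background rate of events (e.g. burglaries per day)
    alpha -- fraction of events that trigger a follow-on "after crime"
    beta  -- decay rate: how quickly the excitation fades with time
    """
    # Each past event at time s adds an exponentially decaying boost;
    # long after the event, the rate relaxes back toward mu.
    boost = sum(alpha * beta * math.exp(-beta * (t - s))
                for s in past_event_times if s < t)
    return mu + boost

# Usage: a burglary at t=0 raises the predicted rate shortly afterward,
# and the elevated risk fades as time passes.
events = [0.0]
print(intensity(0.1, events) > intensity(5.0, events))  # True
```

Ranking locations by this intensity, computed per grid cell from that cell's recent crime reports, is one way such a program could flag the times and places most likely to see "after crimes."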
"Cops on the trail of crimes that haven't happened" (New Scientist)
Santa Cruz Experimental Predictive Policing Software (UC Santa Cruz)