Though crime happens everywhere, predictive policing tools send cops to poor/black neighborhoods

Researchers from the Human Rights Data Analysis Group (previously) reimplemented the algorithm behind Predpol, the predictive policing system that police departments around America have spent a fortune on in order to decide where to send their patrols. They fed it Oakland's 2010 arrest data and asked it to predict where crime would occur in 2011.

Predpol's algorithm munged the arrest data, then confidently asserted that the Oakland PD should concentrate the bulk of its resources in a neighborhood that is poor and black. However, data from the census and the National Survey on Drug Use and Health show that crime occurred all across Oakland. If the Oakland PD had followed Predpol's advice in 2011, they would simply have gone and rounded up and jailed a bunch of black people (remember, 97% of the people indicted in the USA plead guilty, an impossibly high number that guarantees that innocent people are pleading guilty to escape the extreme sentences that follow a conviction at trial).

The reason Predpol's model predicts that nearly all the crime will occur in these neighborhoods is that police concentrate their policing there, and you can only find crime in the places where you look for it. The algorithm distills the bias in its input data. Unsurprisingly, these are neighborhoods predominantly populated by low-income people of color. Predpol and tools like it are sold as data-driven ways to overcome this kind of police bias, but really, they're just ways of giving that bias a veneer of objectivity.
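To make the sampling-bias point concrete, here's a minimal toy sketch in Python (this is not PredPol's actual model, and every number here is invented for illustration): when arrests can only be recorded where officers are deployed, a predictor trained on arrest counts mostly recovers the patrol map, not the crime map.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: crime actually happens at the same rate everywhere,
# but patrols are concentrated in a few neighborhoods.
true_crime = np.full(10, 100.0)
patrol_share = np.array([0.30, 0.20, 0.15, 0.10, 0.07,
                         0.06, 0.05, 0.04, 0.02, 0.01])

# Arrests are only recorded where officers are looking, so the "data"
# reflects patrol allocation far more than the underlying crime rate.
observed_arrests = rng.poisson(true_crime * patrol_share)

# A naive "predictive" model: rank neighborhoods by historical arrests.
predicted_hotspots = np.argsort(observed_arrests)[::-1]

print("observed arrests per neighborhood:", observed_arrests)
print("predicted hot spots (the most-patrolled areas):", predicted_hotspots[:3])
```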

Oakland Mayor Libby Schaaf has repeatedly sought an appropriation of $150,000 to buy Predpol for the city.

Other cities are dumping Predpol. In Burbank, where I live, the police got rid of Predpol after it lowered officer morale to the point where 75% of Burbank cops had "low or extremely low" morale.

Predpol is a classic weapon of math destruction. It builds a model without regard to the bias in its input data, even though every scientist and statistician knows that sampling bias is a deadly pitfall in any kind of statistical analysis. It then predicts the future from that biased data and directs those in authority to act on the predictions in a way that is guaranteed to make them look correct (regardless of whether they are, in fact, correct). Finally, it re-ingests the data generated by the behavior its biased predictions dictated, and recommends behavior that produces even more biased outcomes.
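Here's a toy simulation of that feedback loop, again in Python with made-up numbers rather than PredPol's real model: crime is identical everywhere, one neighborhood starts out slightly over-policed, and each day the patrols are re-allocated to wherever yesterday's arrests were highest. The initial bias locks in, and the predictions appear to confirm themselves because arrests follow the patrols.

```python
import numpy as np

rng = np.random.default_rng(1)

true_crime = np.full(10, 100.0)   # crime happens everywhere at the same rate
patrols = np.full(10, 0.10)       # deployment starts out nearly even...
patrols[0] += 0.05                # ...except one neighborhood is slightly over-policed
patrols /= patrols.sum()

for day in range(30):
    # Arrests are only made where officers are deployed.
    arrests = rng.poisson(true_crime * patrols)
    # Tomorrow's "prediction" is yesterday's arrest map: the bulk of the
    # patrols go to the three highest-arrest neighborhoods.
    hotspots = np.argsort(arrests)[::-1][:3]
    patrols = np.full(10, 0.02)   # token coverage everywhere else
    patrols[hotspots] += 0.90 / 3
    patrols /= patrols.sum()

print(np.round(patrols, 2))
# A handful of neighborhoods (almost always including the one that started
# out over-policed) end up absorbing most of the patrols, and the model's
# predictions look "correct" because arrests follow the patrols, not the crime.
```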

To top it off, a bad prediction from the algorithm causes black people to be overpoliced and white people to be underpoliced, exacerbates the problem of coerced guilty pleas, and feeds a pipeline into the equally racially biased automated sentencing systems that send black people to prison for longer than white people.

Evaluating the fairness and efficacy of predictive crime algorithms would require outside audits. But most predictive policing technology exists in a black box of private-sector trade secrets: systems that should be open to public scrutiny are outsourced to private companies like PredPol that don't have to disclose their algorithms for public audit. The only way the researchers were able to use the software in this case was to pull a version of the algorithm from one of PredPol's own published studies.

"If predictive policing means some individuals are going to have more police involvement in their life, there needs to be a minimum of transparency," Adam Schwartz, a senior staff attorney with the Electronic Frontier Foundation, said in an interview "Until they do that, the public should have no confidence that the inputs and algorithms are a sound basis to predict anything."


Schwartz pointed out that some states, such as Illinois, have legal prohibitions on adopting systems that have a racially disparate impact. Without the ability to evaluate predictive policing systems, and without strong laws in place to prevent police technology from amplifying the worst biases in police work, he says, predictive policing isn't ready for actual police use.

"What we want for police to do is not to be putting in place new systems of predictive policing until a lot more study is done and safeguards are put in place," Schwartz said. "Frequently these systems shouldn't be adopted at all."

(Exclusive) Crime-prediction tool PredPol amplifies racially biased policing, study shows
[Jack Smith IV/Mic]