Predictive policing predicts police harassment, not crime

In Chicago, the "Heat List" system directs policing resources, based on data-mining of social media to identify potential gang members. The model tells the cops where to go and whom to arrest, and is supposed to reduce both violent crime and the likelihood that suspects themselves will be killed. But peer-reviewed analysis (Scihub mirror) of the program shows that while being on the Heat List increases your chances of being harassed and arrested by the Chicago PD, it does not improve crime rates.

In the paper, published in the Journal of Experimental Criminology, RAND Corporation researchers conclude that "once other demographics, criminal history variables, and social network risk have been controlled for using propensity score weighting and doubly-robust regression modeling, being on the SSL did not significantly reduce the likelihood of being a murder or shooting victim, or being arrested for murder," but "individuals on the list were more likely to be arrested for a shooting regardless of the increased contact."

In other words, predictive policing predicts the police, not the crime. Moreover, as is so often the case, racist training data produces racist predictive models, which allow racist institutions to claim to be undertaking objective and neutral measures while continuing to be totally racist.
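The "propensity score weighting and doubly-robust regression modeling" the RAND authors mention is a standard way to estimate a treatment effect from observational data: model who gets treated (here, who lands on the list), model the outcome, and combine the two so the estimate is valid if either model is right. The sketch below is a generic illustration of that technique on simulated data, not the study's actual code or data; all variable names and the simulated "effect" are assumptions for demonstration.

```python
# Hedged sketch of a doubly-robust (AIPW) effect estimate, the general
# technique named in the paper. Data are simulated: the true effect of
# "being on the list" on the outcome is set to zero by construction.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=(n, 2))                        # covariates (e.g. priors, age)
p_true = 1 / (1 + np.exp(-(x @ np.array([1.0, -0.5]))))
t = rng.binomial(1, p_true)                        # hypothetical "on the list" flag
y = 2.0 * x[:, 0] + 0.0 * t + rng.normal(size=n)   # outcome; true effect = 0

def logit_fit(X, z, steps=2000, lr=0.1):
    """Plain gradient-ascent logistic regression (no intercept, for brevity)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        w += lr * (X.T @ (z - p)) / len(z)
    return w

# Propensity model: estimated P(on the list | covariates)
e = 1 / (1 + np.exp(-(x @ logit_fit(x, t))))
e = np.clip(e, 0.01, 0.99)                         # avoid extreme inverse weights

def ols_predict(X, z, Xnew):
    """Least-squares outcome model, fit on a subset, predicted everywhere."""
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    return Xnew @ beta

m1 = ols_predict(x[t == 1], y[t == 1], x)          # outcome model, treated
m0 = ols_predict(x[t == 0], y[t == 0], x)          # outcome model, control

# AIPW (doubly-robust) average treatment effect estimate
ate = np.mean(t * (y - m1) / e + m1) - np.mean((1 - t) * (y - m0) / (1 - e) + m0)
```

Because the simulated effect is zero, the estimate lands near zero; the point of the doubly-robust form is that naive comparisons between the weighted and unweighted groups would not.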

The finding that the list had a direct effect on arrest, rather than victimization, raises privacy and civil rights considerations that must be carefully considered, especially for predictions that are targeted at vulnerable groups at high risk of victimization. Both local and national media openly ask whether the CPD SSL pilot constitutes racial profiling (Erbentraut 2014; Llenas 2014; Stroud 2014). A review of the legal and constitutional issues involved in using predictions for criminal justice purposes notes that, while it is not legal to use protected classes as predictors (Starr 2014), classifications that have differential impact on different protected classes, such as racial groups, that are not designed to have this impact, are legal (Tonry 1987). Tonry (1987) also argues that the ethical issues with using prediction in a policing context are less controversial than in other criminal justice settings because they are necessary for the cost-effective distribution of scarce resources, and their decisions will ultimately be reviewed by impartial judges before punishment is delivered. However, using predictions to identify individuals in the community for increased police scrutiny has not been subject to judicial review.

Predictions put into practice: a quasi-experimental evaluation of Chicago's predictive policing pilot

[Jessica Saunders, Priscillia Hunt and John S. Hollywood/Journal of Experimental Criminology] [Scihub mirror]

Chicago's "Heat List" predicts arrests, doesn't protect people or deter crime
[Cathy O'Neil/Mathbabe]