After learning that Amazon was pushing Rekognition, its facial recognition tool, for use in policing (a global phenomenon that is gaining momentum despite the material unsuitability of these tools in policing contexts), the ACLU of Northern California had a brainwave: they ran the faces of the members of the 115th Congress of the United States through Rekognition, matching them against a collection of public arrest photos. Read the rest
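That kind of test is straightforward to approximate with Rekognition's own API through boto3. A minimal sketch, assuming AWS credentials are configured and a face collection has already been populated with mugshot photos; the collection name, file path, region, and threshold below are placeholder assumptions, not the ACLU's actual setup:

```python
# Minimal sketch of a Rekognition face search in the spirit of the ACLU test.
# Assumes a collection named "mugshots" already indexed via index_faces();
# all names, paths, and thresholds are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

def search_mugshots(headshot_path, collection_id="mugshots", threshold=80):
    """Search an indexed mugshot collection for faces matching a headshot."""
    with open(headshot_path, "rb") as f:
        image_bytes = f.read()
    response = rekognition.search_faces_by_image(
        CollectionId=collection_id,
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=threshold,  # similarity cutoff; 80 is a placeholder
        MaxFaces=5,
    )
    # Each "match" is the tool asserting this face resembles someone in the
    # arrest-photo collection at the given similarity score.
    return [
        (match["Face"].get("ExternalImageId"), match["Similarity"])
        for match in response["FaceMatches"]
    ]

# Example: print(search_mugshots("congress_member.jpg"))
```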
In 2009, JP Morgan Chase's "special ops" guy was an ex-Secret Service agent called Peter Cavicchia III, who retained Palantir to spy on everyone in the company to find "insider threats"; he even got the bank to invest in Palantir. Read the rest
Palantir Technologies is a data-mining firm that loves it some predictive policing: computer-aided sorcery that uses data models to try and predict where crimes may occur and who's got a reasonable chance of committing them.
For predictive policing to work well, the predictive model needs to be well fed with data on criminals, their first-, second- and third-degree acquaintances, their social media accounts, and crime statistics for the area where the model is meant to see crimes before they happen. It sounds like shit right out of Minority Report, because it kinda is, just without the spooky kids in a swimming pool and with a hell of a lot less accuracy.
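Under the hood, the unglamorous baseline for "seeing crimes before they happen" is hot-spot mapping: bin past incident data into grid cells and rank the cells. A toy sketch of that idea follows; the cell size, data format, and function names are illustrative assumptions, not any vendor's actual model (PredPol's real algorithm is a fancier self-exciting point process built on the same kind of historical data):

```python
# Toy illustration of the hot-spot idea behind predictive policing: bin
# historical incidents into grid cells, rank cells by count, and "predict"
# that tomorrow's crime lands in yesterday's busiest cells. Cell size and
# input format are assumptions for illustration only.
from collections import Counter

CELL_SIZE = 0.005  # roughly a few city blocks, in degrees; arbitrary choice

def to_cell(lat, lon, cell_size=CELL_SIZE):
    """Snap a coordinate to a grid cell."""
    return (round(lat / cell_size), round(lon / cell_size))

def predicted_hotspots(past_incidents, top_k=20):
    """past_incidents: iterable of (lat, lon) from historical arrest reports.
    Returns the top_k grid cells ranked by historical incident count."""
    counts = Counter(to_cell(lat, lon) for lat, lon in past_incidents)
    return [cell for cell, _ in counts.most_common(top_k)]
```

The catch is already visible in the input: the "incidents" these systems are fed are mostly arrest records, so the model largely predicts where police have already been looking, not where crime actually happens.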
Accurate or not, the notion of predictive policing raises a number of civil rights and privacy concerns. The ACLU isn't down with it: stopping someone without reasonable suspicion violates the Fourth Amendment, and in their eyes computer-aided guesses don't cut it when it comes to justifying a stop-and-frisk. China's been using it to snoop on its citizens and has been sending suspected radicals and political dissidents off for re-education, just in case they decide to protest the ruling party's status quo. It's creepy shit.
Anyway, back to Palantir.
Did I mention that it was started up by Peter Thiel with seed money from the CIA? No? How about the fact that they've been running an off-the-books program with the New Orleans Police so secretive that the city's own government didn't have a clue it was going on? Read the rest
Researchers from the Human Rights Data Analysis Group (previously) reimplemented the algorithm behind Predpol, the predictive policing system that police departments around America have spent a fortune on in order to find out where to send their patrols, fed it Oakland's 2010 arrest data, then asked it to predict where crime would occur in 2011. Read the rest
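The experiment amounts to a backtest: flag cells from one year's arrests, then see how many of the next year's recorded incidents fall inside them. A sketch of that procedure, reusing the toy to_cell()/predicted_hotspots() helpers sketched above; the real HRDAG work reimplemented PredPol's published point-process algorithm, not this baseline, and the variable names here are assumptions:

```python
# Sketch of the backtest: fit on 2010 arrest locations, then check how many
# of 2011's recorded incidents fall inside the flagged cells. Relies on the
# toy to_cell()/predicted_hotspots() helpers from the sketch above; data
# loading and variable names are assumed for illustration.
def hit_rate(train_incidents, test_incidents, top_k=20):
    """Fraction of test-year incidents that land in cells flagged from the
    training year."""
    flagged = set(predicted_hotspots(train_incidents, top_k=top_k))
    hits = sum(1 for lat, lon in test_incidents if to_cell(lat, lon) in flagged)
    return hits / max(len(test_incidents), 1)

# e.g. hit_rate(oakland_arrests_2010, oakland_arrests_2011)
```

Note what a high hit rate actually measures: agreement with recorded incidents, i.e. arrests, so a model trained on over-policed neighborhoods will happily "validate" itself by sending patrols right back to them.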
In Chicago, the "Heat List" system is used to direct policing resources, based on data-mining of social media to identify potential gang members; the model tells the cops where to go and who to arrest, and is supposed to reduce both violent crime and the likelihood that the people on the list will themselves be killed. But peer-reviewed analysis (Scihub mirror) of the program shows that while being on the Heat List increases your chances of being harassed and arrested by Chicago PD, it does not improve crime rates. Read the rest
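The evaluation behind that finding boils down to comparing two groups on two outcomes: how often people on the list get arrested, and how often they end up shot, versus people not on the list. A toy sketch of that comparison; the record fields and groups are placeholders, and the actual peer-reviewed study used statistically matched controls and real CPD data, not this:

```python
# Toy illustration of the kind of comparison the Heat List evaluation makes:
# contrast arrest rates with victimization rates for listed people vs. a
# comparison group. Record fields and groups are placeholders only.
def outcome_rates(people):
    """people: list of dicts like {"arrested": bool, "shot": bool}."""
    n = max(len(people), 1)
    return {
        "arrest_rate": sum(p["arrested"] for p in people) / n,
        "victimization_rate": sum(p["shot"] for p in people) / n,
    }

def compare_groups(heat_list, comparison):
    """If the list 'works', victimization should drop for listed people; if it
    only changes police behavior, arrests rise and victimization doesn't move."""
    return {
        "heat_list": outcome_rates(heat_list),
        "comparison": outcome_rates(comparison),
    }
```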