In 2012, Chicago PD collaborated with the RAND Corporation and the Illinois Institute of Technology to automatically generate "risk scores" for people they arrested, which were supposed to predict the likelihood that the person would be a "party to violence" in the future (the program was called "TRAP," for Targeted Repeat-Offender Apprehension Program, seemingly without a shred of irony). — Read the rest
"Predictive policing" is the idea that you can feed crime stats to a machine-learning system and it will produce a model that can predict crime. It is garbage.
Last October, I published a list of cities that appeared to have contracted with Predpol, a "predictive policing" company, based on research provided to me by an anonymous source who used clever methods to uncover the customer list.
Predpol (previously) is a "predictive policing" company that sells police forces predictive analytics tools that take in police data about crimes and arrests and spit out guesses about where the police should go to find future crimes.
Palantir Technologies is a data-mining firm that loves it some predictive policing: computer-aided sorcery that uses data models to try and predict where crimes may occur and who's got a reasonable chance of committing them.
For predictive policing to work well, the predictive model being built needs to be well fed with data on criminals, their first-, second- and third-degree acquaintances, their social media accounts, and crime statistics for the area where the model is meant to spot crimes before they happen. — Read the rest
UT Austin sociologist Sarah Brayne spent 2.5 years conducting field research with the LAPD as they rolled out Predpol, a software tool that is supposed to direct police to places where crime is likely to occur, but which has been shown to send cops out to overpolice brown and poor people at the expense of actual crimefighting.
Researchers from the Human Rights Data Analysis Group (previously) reimplemented the algorithm behind the Predpol predictive policing system that police departments around America have spent a fortune on in order to find out where to set their patrols, and fed it Oakland's 2010 arrest data, then asked it to predict where the crime would be in 2011.
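The post describes the experiment but not the code. As a rough illustration of why that test matters, here is a minimal, hypothetical sketch of the kind of grid-based, self-exciting model that systems like Predpol are built on — this is not HRDAG's actual reimplementation, and the grid size, decay and boost constants, and the synthetic "arrest log" are all invented for illustration:

```python
import numpy as np

GRID = 50      # hypothetical city grid: GRID x GRID cells
DECAY = 0.9    # per-day decay of each cell's risk score (invented constant)
BOOST = 1.0    # how much one recorded arrest raises a cell's score (invented)

def predict_hotspots(events, days, top_k=2):
    """events: iterable of (day, x, y) arrest records from the training year.
    Returns the top_k grid cells the model would send patrols to next."""
    risk = np.zeros((GRID, GRID))
    by_day = {}
    for d, x, y in events:
        by_day.setdefault(d, []).append((x, y))
    for day in range(days):
        risk *= DECAY                        # yesterday's risk fades
        for x, y in by_day.get(day, []):
            risk[x, y] += BOOST              # each recorded arrest "excites" its cell
    flat = np.argsort(risk, axis=None)[::-1][:top_k]
    return [divmod(int(i), GRID) for i in flat]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patrolled = [(10, 10), (40, 5)]          # hypothetical over-patrolled cells
    # Synthetic "arrest log": records cluster where officers already patrol.
    events = [(d, *patrolled[rng.integers(2)]) for d in range(365) for _ in range(3)]
    print(predict_hotspots(events, days=365))  # -> the same two cells, again
```

Trained on arrest records, a model like this mostly hands back the places that were most heavily policed already, which is the feedback loop the HRDAG study set out to document.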
In Chicago, the "Heat List" system is used to direct policing resources, based on data-mining of social media to identify potential gang-members; the model tells the cops where to go and who to arrest, and is supposed to reduce both violent crime and the likelihood that suspects themselves will be killed — but peer-reviewed analysis (Scihub mirror) of the program shows that while being on the Heat List increases your chances of being harassed and arrested by Chicago PD, it does not improve crime rates.
George Zimmerman profiled Trayvon Martin by making assumptions about his appearance and clothing, assigning them negative meanings intended to criminalize Martin and then justify Zimmerman's killing of him while claiming self-defense. Remember, Zimmerman not only had a gun, but also clearly revealed his motivations and assumptions in the 911 call. — Read the rest
We're getting closer and closer to Minority Report every day, aren't we? No, we don't mean philosophical discussions about free will vs. determinism or the encroachment on civil liberties or the ethics of living in a surveillance state. No, we're just talking about really, really good tech. — Read the rest
I recently re-watched the Spielberg-Tom Cruise big screen adaptation of Minority Report. It still holds up as a great sci-fi film about the dangers of the surveillance state, but it also loses something in the way it strays from the source material. — Read the rest
The International Consortium of Investigative Journalists has published five leaked Chinese intelligence memos — a lengthy "telegram" and four shorter "bulletins" — from 2017, which detail the plans to enact a program of mass incarceration for members of predominantly Muslim ethnic minorities (especially Uyghurs) in China's Xinjiang province.
Princeton computer scientist Arvind Narayanan (previously) has posted slides and notes from a recent MIT talk on "How to recognize AI snake oil" in which he divides AI applications into three (nonexhaustive) categories and rates how difficult they are, and thus whether you should believe vendors who claim that their machine learning models can perform as advertised.
New Orleans is festooned with police cameras, the legacy of a secret partnership with the surveillance contractor Palantir, which used New Orleans as a covert laboratory for predictive policing products.
The idea that you can detect lies by analyzing "microexpressions" has absorbed billions in spending by police forces and security services, despite the fact that it's junk science that performs worse than a coin-toss.
The ever-useful Gartner Hype Cycle identifies an inflection point in the life of any new technology: the "Peak of Inflated Expectations," attained just before the sharp dropoff into the "Trough of Disillusionment." I've lived through the hype-cycles of several kinds of technology, and one iron-clad correlate of the "Peak of Inflated Expectations" is the "Peak of Huckster Snakeoil Salesmen": the moment at which con-artists just add a tech buzzword to some crooked scam and head out into the market to net a fortune before everyone gets wise to the idea that the shiny new hypefodder isn't a magic bullet.
Pete Warden writes convincingly about computer scientists' focus on improving machine learning algorithms, to the exclusion of improving the training data that the algorithms interpret, and how that focus has slowed the progress of machine learning.
Matt Taibbi is one of the best political writers working in the USA today, someone who can use the small, novelistic details of individuals' lives to illuminate the vast, systemic problems that poison our lives and shame our honor; his 2014 book The Divide conducts a wide-ranging inquiry into the impunity of corporate criminals and the Kafkaesque injustices visited on the poor people they victimize; in I Can't Breathe: A Killing on Bay Street, Taibbi narrows his focus to the police murder of Eric Garner, a Staten Island fixture and father, and the system that put murderers in uniform in his path.
Chelsea Manning spent seven years in federal prison for blowing the whistle on illegal actions by the US in Iraq and around the world; while imprisoned, she transitioned her gender and changed her name, and, on her release, found herself unpersoned, unable to identify herself to the satisfaction of the state, despite being one of the most famous people in America and despite the state's unquenchable thirst for our personal data (and hers especially).
In the wake of the Trump election — a triumph of fake news — both Google and Facebook have announced that they will take countermeasures to exclude "fake news" from their services: downranking it in Facebook's case, and cutting it off from ad payments in Google's.