Chicago PD's predictive policing tool has been shut down after 8 years of catastrophically bad results

In 2012, Chicago PD collaborated with the RAND Corporation and the Illinois Institute of Technology to automatically generate "risk scores" for people they arrested, which were supposed to predict the likelihood that the person would be a "party to violence" in the future (this program was called "TRAP" — "Targeted Repeat-Offender Apprehension Program" — seemingly without a shred of irony). — Read the rest

New Orleans Police used predictive policing without telling the city's elected officials

Palantir Technologies is a data-mining firm that loves it some predictive policing: computer-aided sorcery that uses data models to try to predict where crimes may occur and who's got a reasonable chance of committing them.

For predictive policing to work well, the predictive model being built needs to be well fed with data on criminals; their first-, second-, and third-degree acquaintances; their social media accounts; and crime statistics for the area where the model is meant to predict crimes before they happen. — Read the rest

Predictive policing predicts police harassment, not crime

In Chicago, the "Heat List" system is used to direct policing resources, based on data-mining of social media to identify potential gang-members; the model tells the cops where to go and who to arrest, and is supposed to reduce both violent crime and the likelihood that suspects themselves will be killed — but peer-reviewed analysis (Scihub mirror) of the program shows that while being on the Heat List increases your chances of being harassed and arrested by Chicago PD, it does not improve crime rates.

Leaked documents detail China's plan for mass arrests and concentration-camp internment of Uyghurs and other ethnic minorities in Xinjiang

The International Consortium of Investigative Journalists has published five leaked Chinese intelligence memos — a lengthy "telegram" and four shorter "bulletins" — from 2017, which detail the plans to enact a program of mass incarceration for members of predominantly Muslim ethnic minorities (especially Uyghurs) in China's Xinjiang province.

How to recognize AI snake oil

Princeton computer scientist Arvind Narayanan (previously) has posted slides and notes from a recent MIT talk on "How to recognize AI snake oil" in which he divides AI applications into three (nonexhaustive) categories and rates how difficult they are, and thus whether you should believe vendors who claim that their machine learning models can perform as advertised.

Babysitter vetting and voice-analysis: Have we reached peak AI snakeoil?

The ever-useful Gartner Hype Cycle identifies an inflection point in the life of any new technology: the "Peak of Inflated Expectations," attained just before the sharp dropoff into the "Trough of Disillusionment"; I've lived through the hype-cycles of several kinds of technology, and one iron-clad correlate of the "Peak of Inflated Expectations" is the "Peak of Huckster Snakeoil Salesmen": the moment at which con-artists just add a tech buzzword to some crooked scam and head out into the market to net a fortune before everyone gets wise to the idea that the shiny new hypefodder isn't a magic bullet.

I Can't Breathe: Matt Taibbi's scorching book on the murder of Eric Garner and the system that let the killers get away with it

Matt Taibbi is one of the best political writers working in the USA today, someone who can use the small, novelistic details of individuals' lives to illuminate the vast, systemic problems that poison our lives and shame our honor; his 2014 book The Divide conducts a wide-ranging inquiry into the impunity of corporate criminals and the Kafkaesque injustices visited on the poor people they victimize; in I Can't Breathe: A Killing on Bay Street, Taibbi narrows his focus to the police murder of Eric Garner, a Staten Island fixture and father, and the system that put murderers in uniform in his path.

Chelsea Manning: we're spied on all the time, and the state still can't figure out who we are

Chelsea Manning spent seven years in federal prison for blowing the whistle on illegal actions by the US in Iraq and around the world; while imprisoned, she transitioned her gender and changed her name, and, on her release, found herself unpersoned, unable to identify herself to the satisfaction of the state, despite being one of the most famous people in America and despite the state's unquenchable thirst for our personal data (and hers especially).