"predictive policing"

Chicago PD's predictive policing tool has been shut down after 8 years of catastrophically bad results

In 2012, Chicago PD collaborated with the RAND Corporation and the Illinois Institute of Technology to automatically generate "risk scores" for people they arrested, which were supposed to predict the likelihood that the person would be a "party to violence" in the future (the program was called "TRAP" -- "Targeted Repeat-Offender Apprehension Program" -- seemingly without a shred of irony). Now, the program has been shut down, and the City of Chicago's Office of the Inspector General has published a damning report on its eight-year reign, revealing the ways in which it discriminated against the people ensnared in it without reducing violent crime. Read the rest

Leaked documents document China's plan for mass arrests and concentration-camp internment of Uyghurs and other ethnic minorities in Xinjiang

The International Consortium of Investigative Journalists has published five leaked Chinese intelligence memos -- a lengthy "telegram" and four shorter "bulletins" -- from 2017, which detail the plans to enact a program of mass incarceration for members of predominantly Muslim ethnic minorities (especially Uyghurs) in China's Xinjiang province. Read the rest

How to recognize AI snake oil

Princeton computer scientist Arvind Narayanan (previously) has posted slides and notes from a recent MIT talk on "How to recognize AI snake oil" in which he divides AI applications into three (nonexhaustive) categories and rates how difficult they are, and thus whether you should believe vendors who claim that their machine learning models can perform as advertised. Read the rest

Mysterious New Orleans "anti-crime" camera emblazoned with NOPD logos outside surveillance contractor's house is disavowed by NOPD

New Orleans is festooned with police cameras, the legacy of a secret partnership with the surveillance contractor Palantir, which used New Orleans as a covert laboratory for predictive policing products. Read the rest

Police cameras to be augmented with junk-science "microexpression" AI lie-detectors

The idea that you can detect lies by analyzing "microexpressions" has absorbed billions in spending by police forces and security services, despite the fact that it's junk science that performs worse than a coin-toss. Read the rest

Beyond GIGO: how "predictive policing" launders racism, corruption and bias to make them seem empirical

"Predictive policing" is the idea that you can feed crime stats to a machine-learning system and it will produce a model that can predict crime. It is garbage. Read the rest
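The laundering mechanism is easy to see in a toy simulation (invented numbers, not any vendor's actual model): if patrols are sent wherever past *recorded* crime was highest, and crime only gets recorded where officers are present, then an initial recording bias never washes out, no matter how long the system runs.

```python
# Toy sketch of the predictive-policing feedback loop (illustrative only):
# two districts with identical true crime rates, where patrols follow past
# recorded crime and recording a crime requires an officer to be present.
def run_feedback_loop(rounds=10, true_rate=0.5, seed_records=(6, 4)):
    records = list(seed_records)  # historical recorded crimes per district
    for _ in range(rounds):
        total = sum(records)
        patrols = [r / total for r in records]  # patrol share follows records
        # expected new records: true crime rate * chance an officer sees it
        new = [true_rate * p * 100 for p in patrols]
        records = [r + n for r, n in zip(records, new)]
    total = sum(records)
    return [r / total for r in records]  # final share of recorded crime

shares = run_feedback_loop()
```

Both districts have the same true crime rate, yet the district that started with a slight recording edge keeps a 60/40 share of "predicted" crime forever -- the model confirms its own inputs and presents the result as empirical.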

Chasing down that list of potential Predpol customers reveals dozens of cities that have secretly experimented with "predictive policing"

Last October, I published a list of cities that appeared to have contracted with Predpol, a "predictive policing" company, based on research provided to me by an anonymous source who used clever methods to uncover the customer list. Read the rest

Babysitter vetting and voice-analysis: Have we reached peak AI snakeoil?

The ever-useful Gartner Hype Cycle identifies an inflection point in the life of any new technology: the "Peak of Inflated Expectations," attained just before the sharp drop-off into the "Trough of Disillusionment"; I've lived through the hype-cycles of several kinds of technology and one iron-clad correlate of the "Peak of Inflated Expectations" is the "Peak of Huckster Snakeoil Salesmen": the moment at which con-artists just add a tech buzzword to some crooked scam and head out into the market to net a fortune before everyone gets wise to the idea that the shiny new hypefodder isn't a magic bullet. Read the rest

Is this the full list of US cities that have bought or considered Predpol's predictive policing services?

Predpol (previously) is a "predictive policing" company that sells police forces predictive analytics tools that take in police data about crimes and arrests and spit out guesses about where the police should go to find future crimes. Read the rest

Garbage In, Garbage Out: machine learning has not repealed the iron law of computer science

Pete Warden writes convincingly about computer scientists' focus on improving machine learning algorithms, to the exclusion of improving the training data that the algorithms interpret, and how that focus has slowed the progress of machine learning. Read the rest
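Warden's point is easy to demonstrate with a toy example (invented here, not taken from his post): a high-capacity model such as a 1-nearest-neighbour classifier faithfully memorizes every labeling mistake in its training set, so its accuracy is capped by data quality, not by algorithmic cleverness.

```python
# Toy illustration (invented example): a 1-nearest-neighbour "model"
# memorizes its training set, so every mislabeled training point becomes
# a guaranteed prediction error -- garbage in, garbage out.
def nn_predict(train_x, train_y, x):
    # copy the label of the closest training point
    best = min(range(len(train_x)), key=lambda i: abs(train_x[i] - x))
    return train_y[best]

xs          = [0, 1, 2, 3, 8, 9, 10, 11]
true_labels = [0, 0, 0, 0, 1, 1, 1, 1]   # ground truth
noisy       = [0, 0, 1, 0, 1, 1, 0, 1]   # two labels flipped in collection

clean_errors = sum(nn_predict(xs, true_labels, x) != t
                   for x, t in zip(xs, true_labels))
noisy_errors = sum(nn_predict(xs, noisy, x) != t
                   for x, t in zip(xs, true_labels))
```

With clean labels the model makes zero errors; with two flipped labels it makes exactly two, and no amount of algorithmic improvement can win that accuracy back.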

New Orleans Police used predictive policing without telling the city's elected officials

Palantir Technologies is a data-mining firm that loves it some predictive policing: computer-aided sorcery that uses data models to try and predict where crimes may occur and who's got a reasonable chance of committing them.

For predictive policing to work well, the model needs to be well fed with data: on criminals, on their first-, second-, and third-degree acquaintances, on their social media accounts, and on crime statistics for the area where the model is meant to be seeing crimes before they may possibly happen. It sounds like shit right out of Minority Report, because it kinda is -- just without the spooky kids in a swimming pool, and with a hell of a lot less accuracy.
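The "acquaintances" part of that data diet is just a breadth-first search over a contact graph -- here's a minimal sketch with invented names and a made-up graph, not Palantir's schema, showing the kind of net these systems cast:

```python
from collections import deque

# Toy sketch (invented data): given a contact graph, find everyone within
# three "hops" of a person of interest -- the first-, second-, and
# third-degree acquaintance net a predictive-policing system sweeps up.
def acquaintances_within(graph, start, max_degree=3):
    seen = {start: 0}
    queue = deque([start])
    while queue:
        person = queue.popleft()
        if seen[person] == max_degree:
            continue  # don't expand beyond the degree limit
        for contact in graph.get(person, ()):
            if contact not in seen:
                seen[contact] = seen[person] + 1
                queue.append(contact)
    seen.pop(start)
    return seen  # person -> degree of separation

contacts = {
    "alice": ["bob"],
    "bob": ["carol"],
    "carol": ["dave"],
    "dave": ["erin"],
}
```

Run on the toy graph, `acquaintances_within(contacts, "alice")` sweeps in bob, carol, and dave; erin, four hops out, escapes the net. Knowing one person of interest puts three strangers in the database.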

Accurate or not, the notion of predictive policing raises a number of civil rights and privacy concerns. The ACLU isn't down with it: stopping someone without reasonable suspicion violates the Fourth Amendment, and in their eyes computer-aided guesses don't cut it when it comes to justifying a stop-and-frisk. China's been using it to snoop on its citizens, sending suspected radicals and political dissidents to re-education camps just in case they decided to protest the ruling party's status quo. It's creepy shit.

Anyway, back to Palantir.

Did I mention that it was started up by Peter Thiel with money seeded by the CIA? No? How about the fact that they've been running an off-the-books program with the New Orleans Police so secretive that the city's own government didn't have a clue that it was going on? Read the rest

I Can't Breathe: Matt Taibbi's scorching book on the murder of Eric Garner and the system that let the killers get away with it

Matt Taibbi is one of the best political writers working in the USA today, someone who can use the small, novelistic details of individuals' lives to illuminate the vast, systemic problems that poison our lives and shame our honor; his 2014 book The Divide conducts a wide-ranging inquiry into the impunity of corporate criminals and the Kafkaesque injustices visited on the poor people they victimize; in I Can't Breathe: A Killing on Bay Street, Taibbi narrows his focus to the police murder of Eric Garner, a Staten Island fixture and father, and the system that put murderers in uniform in his path.

Chelsea Manning: we're spied on all the time, and the state still can't figure out who we are

Chelsea Manning spent seven years in federal prison for blowing the whistle on illegal actions by the US in Iraq and around the world; while imprisoned, she transitioned her gender and changed her name, and, on her release, found herself unpersoned, unable to identify herself to the satisfaction of the state, despite being one of the most famous people in America and despite the state's unquenchable thirst for our personal data (and hers especially). Read the rest

Case study of LAPD and Palantir's predictive policing tool: same corruption; new, empirical respectability

UT Austin sociologist Sarah Brayne spent 2.5 years conducting field research with the LAPD as they rolled out Predpol, a software tool that is supposed to direct police to places where crime is likely to occur, but which has been shown to send cops out to overpolice brown and poor people at the expense of actual crimefighting. Read the rest

Google and Facebook's "fake news" ban is a welcome nail in the coffin of "software objectivity"

In the wake of the Trump election -- a triumph of fake news -- both Google and Facebook have announced that they will take countermeasures to exclude "fake news" from their services, downranking them in the case of Facebook and cutting them off from ad payments in Google's case. Read the rest

Though crime happens everywhere, predictive policing tools send cops to poor/black neighborhoods

Researchers from the Human Rights Data Analysis Group (previously) reimplemented the algorithm behind the Predpol predictive policing system that police departments around America have spent a fortune on in order to find out where to set their patrols, fed it Oakland's 2010 arrest data, and asked it to predict where the crime would be in 2011. Read the rest
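The flaw HRDAG exposed is a sampling-bias problem, and it fits in a few lines (invented numbers, not their actual reimplementation): when arrests can only be recorded where officers already patrol, a model trained on arrest counts mistakes the patrol map for the crime map.

```python
# Toy illustration (invented numbers): true incident rates are identical
# across four neighbourhoods, but arrests are only recorded where officers
# patrol, so a "predictor" trained on arrest counts sends every future
# patrol back to the same place -- and records nothing at all elsewhere.
true_incidents = {"A": 100, "B": 100, "C": 100, "D": 100}
patrol_share   = {"A": 0.7, "B": 0.2, "C": 0.1, "D": 0.0}

# recorded arrests track policing intensity, not underlying crime
arrests = {n: round(true_incidents[n] * patrol_share[n])
           for n in true_incidents}

predicted_hotspot = max(arrests, key=arrests.get)  # "send patrols here"
```

Neighbourhood A becomes the predicted hotspot while D, with exactly the same true incident count, generates zero arrests and so never appears in the data at all.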

Rules for trusting "black boxes" in algorithmic control systems

Tim O'Reilly writes about the reality that more and more of our lives -- including whether you end up seeing this very sentence! -- are in the hands of "black boxes": algorithmic decision-makers whose inner workings are a secret from the people they affect. Read the rest

