UK's unaccountable crowdsourced blacklist to be crosslinked to facial recognition system

Facewatch is a public-private system that shopkeepers and the police use to keep track of "persons of interest," a list that can include anyone a shopkeeper dislikes and registers with the system.

Gun enthusiasts show up at Pokémon finals, police catch 'em all

Kevin Norton and James Stumbo were arrested this weekend near the Pokémon World Championship after showing up with a 12-gauge shotgun and an AR-15 they boasted about on social media.

Why the DHS's pre-crime biometric profiling is doomed to fail, and will doom passengers with its failures

In The Atlantic, Alexander Furnas debunks the DHS's proposal for a "precrime" screening system that will attempt to predict which passengers are likely to commit crimes, and single those people out for additional screening. FAST (Future Attribute Screening Technology) "will remotely monitor physiological and behavioral cues, like elevated heart rate, eye movement, body temperature, facial patterns, and body language, and analyze these cues algorithmically for statistical aberrance in an attempt to identify people with nefarious intentions." They'll build the biometric "bad intentions" profile by asking experimental subjects to carry out bad deeds and monitoring their vital signs. It's a mess, scientifically, and it will falsely accuse millions of innocent people of planning terrorist attacks.

First, predictive software of this kind is undermined by a simple statistical problem known as the false-positive paradox. Any system designed to spot terrorists before they commit an act of terrorism is, necessarily, looking for a needle in a haystack. As the adage would suggest, it turns out that this is an incredibly difficult thing to do. Here is why: let's assume for a moment that 1 in 1,000,000 people is a terrorist about to commit a crime. Terrorists are actually probably much, much rarer, or we would have a whole lot more acts of terrorism, given the daily throughput of the global transportation system. Now let's imagine the FAST algorithm correctly classifies 99.99 percent of observations -- an incredibly high rate of accuracy for any big data-based predictive model. Even with this unbelievable level of accuracy, the system would still falsely accuse 99 people of being terrorists for every one terrorist it finds.
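The arithmetic behind that claim can be sketched in a few lines. This is just a worked version of the article's hypothetical numbers (a 1-in-1,000,000 base rate and 99.99 percent accuracy), not real FAST performance data:

```python
# False-positive paradox, using the article's assumed numbers.
population = 1_000_000   # travelers screened
terrorists = 1           # assumed base rate: 1 in 1,000,000
accuracy = 0.9999        # assumed: system is right 99.99% of the time

innocents = population - terrorists

true_positives = terrorists * accuracy          # ~1 real terrorist flagged
false_positives = innocents * (1 - accuracy)    # ~100 innocents flagged

precision = true_positives / (true_positives + false_positives)

print(f"Innocent people falsely flagged: {false_positives:.0f}")
print(f"Chance a flagged person is actually a terrorist: {precision:.2%}")
```

Even at this implausibly high accuracy, roughly 100 innocent travelers are flagged for every real terrorist, so a person singled out by the system is almost certainly innocent. At realistic base rates and accuracies, the ratio gets far worse.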
