Hitchens and Orwell

The Humanist's Anthony Lock on why Christopher Hitchens' overwhelming hatred of totalitarianism puts him squarely in Orwell's tradition of leftist apostasy: "Try as best as you can, he challenged us, to not allow one belief to squander clear thinking about another ... [or] a kind of worship whereby anything deemed negative against the topic or person, even the act of criticizing, is illicit. This is totalitarian, he warned: a control over one’s head and what can be said, creating corrosive preconceptions." Read the rest

Why the DHS's pre-crime biometric profiling is doomed to fail, and will doom passengers with its failures

In The Atlantic, Alexander Furnas debunks the DHS's proposal for a "precrime" screening system that will attempt to predict which passengers are likely to commit crimes, and single those people out for additional screening. FAST (Future Attribute Screening Technology) "will remotely monitor physiological and behavioral cues, like elevated heart rate, eye movement, body temperature, facial patterns, and body language, and analyze these cues algorithmically for statistical aberrance in an attempt to identify people with nefarious intentions." They'll build the biometric "bad intentions" profile by asking experimental subjects to carry out bad deeds and monitoring their vital signs. It's a mess, scientifically, and it will falsely accuse millions of innocent people of planning terrorist attacks.

First, predictive software of this kind is undermined by a simple statistical problem known as the false-positive paradox. Any system designed to spot terrorists before they commit an act of terrorism is, necessarily, looking for a needle in a haystack. As the adage would suggest, it turns out that this is an incredibly difficult thing to do. Here is why: let's assume for a moment that 1 in 1,000,000 people is a terrorist about to commit a crime. Terrorists are actually probably much, much rarer, or we would have a whole lot more acts of terrorism, given the daily throughput of the global transportation system. Now let's imagine the FAST algorithm correctly classifies 99.99 percent of observations -- an incredibly high rate of accuracy for any big data-based predictive model. Even with this unbelievable level of accuracy, the system would still falsely accuse 99 people of being terrorists for every one terrorist it finds.
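The arithmetic behind that claim is easy to check yourself. A minimal Python sketch, using the passage's own assumptions (a base rate of 1 terrorist per 1,000,000 travelers, and a screener that classifies 99.99 percent of observations correctly):

```python
# False-positive paradox: a very accurate screener, applied to a
# population where the target condition is extremely rare, still
# flags far more innocent people than actual targets.

base_rate = 1 / 1_000_000       # assumed: 1 terrorist per million travelers
accuracy = 0.9999               # assumed: 99.99% of observations classified correctly
false_positive_rate = 1 - accuracy

population = 1_000_000
terrorists = population * base_rate        # 1 actual terrorist in the sample
innocents = population - terrorists        # 999,999 innocent travelers

true_positives = terrorists * accuracy               # ~1 real terrorist flagged
false_positives = innocents * false_positive_rate    # ~100 innocents flagged

print(f"terrorists correctly flagged: {true_positives:.2f}")
print(f"innocents falsely flagged:    {false_positives:.0f}")
```

Roughly 100 innocent travelers are flagged for every real terrorist caught, which is why the vast majority of people the system accuses will have done nothing wrong.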

Read the rest