In a shocking story on the German site Tagesschau (Google translate), Lena Kampf, Jacob Appelbaum and John Goetz report on the rules used by the NSA to decide who is a "target" for surveillance.
Since the start of the Snowden story in 2013, the NSA has stressed that while it may intercept nearly every Internet user's communications, it only "targets" a small fraction of those, whose traffic patterns reveal some basis for suspicion. Targets of NSA surveillance don't have their data flushed from the NSA's databases on a rolling 48-hour or 30-day basis, but are instead retained indefinitely.
The authors of the Tagesschau story have seen the "deep packet inspection" rules used to determine who is considered to be a legitimate target for deep surveillance, and the results are bizarre.
According to the story, the NSA targets anyone who searches for online articles about Tails — like this one that we published in April, or this article for teens that I wrote in May — or Tor (The Onion Router, which we've been posting about since 2004). Anyone who is determined to be using Tor is also targeted for long-term surveillance and retention.
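To make the mechanism concrete: deep-packet-inspection rules of this kind work by pattern-matching keywords in observed traffic, such as search queries, and tagging the matching user for retention. Here is a minimal, purely illustrative sketch in Python; the rule names and patterns are invented for this example and are not the NSA's actual selectors.

```python
import re

# Toy model of a DPI "fingerprint" rule set: each rule is a name plus a
# pattern to match against an observed search query. These rule names and
# regexes are hypothetical, for illustration only.
FINGERPRINTS = {
    "anonymizer/tor": re.compile(r"\btor\b|onion router", re.IGNORECASE),
    "anonymizer/tails": re.compile(r"\btails\b", re.IGNORECASE),
}

def match_fingerprints(query: str) -> list[str]:
    """Return the names of all fingerprint rules the query triggers."""
    return [name for name, pattern in FINGERPRINTS.items()
            if pattern.search(query)]

print(match_fingerprints("how to install tails on a usb stick"))
# → ['anonymizer/tails']
```

The unsettling point the article makes is visible even in this toy: a single innocuous search string is enough to trip a rule and mark the searcher.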
Tor and Tails have been part of the mainstream discussion of online security, surveillance and privacy for years. It's nothing short of bizarre to place people under suspicion for searching for these terms.
More importantly, this shows that the NSA uses "targeted surveillance" in a way that beggars common sense. It's a dead certainty that people who heard the NSA's reassurances about "targeting" its surveillance on people who were doing something suspicious didn't understand that the NSA meant people who'd looked up technical details about systems that are routinely discussed on the front page of every newspaper in the world.
But it's not the first time the NSA has deployed specialized, highly counterintuitive wordsmithing to play games with the public, the law and its oversight. Examples range from James Clapper's insistence that he didn't lie to Congress about spying on Americans, because he was merely intercepting all their data, not looking at it all; to the internal word-games on display in the original Prism leak, in which the NSA claimed to have "direct access" to servers from Google, Yahoo, Microsoft, Apple, etc., even though this "direct access" was a process by which the FBI would use secret warrants to request information from Internet giants without revealing that the data was destined for the NSA.
I have known that this story was coming for some time now, having learned about its broad contours under embargo from a trusted source. Since then, I've discussed it in confidence with some of the technical experts who have worked on the full set of Snowden docs, and they were as shocked as I was.
One expert suggested that the NSA's intention here was to separate the sheep from the goats — to split the entire population of the Internet into "people who have the technical know-how to be private" and "people who don't" and then capture all the communications from the first group.
Another expert said that s/he believed this leak may have come from a second source, not Edward Snowden, as s/he had not seen this material in the original Snowden docs, and had seen other revelations that also appeared to be independent of the Snowden materials. If that's true, it's big news, as Snowden was the first person ever to leak docs from the NSA. The existence of a potential second source means that Snowden may have inspired some of his former colleagues to take a long, hard look at the agency's cavalier attitude to the law and decency.
Update: Bruce Schneier also believes there is a second leaker.
Update 2: Appelbaum and others have posted an excellent English-language article expanding on this at Das Erste.