Meet the spooky tech companies getting rich by making NSA surveillance possible

Wildly profitable companies like Neustar, Subsentio, and Yaana do the feds' dirty work for them, slurping huge amounts of unconstitutionally requisitioned data out of telcos' and ISPs' data centers in response to secret, sealed FISA warrants. Some of these companies are publicly traded, too, making them a perfect addition to the Gulag Wealth Fund.


Why the FBI's plan to require weak security in all American technology is a terrible, terrible idea

Bruce Schneier's editorial on CALEA-II is right on. In case you missed it, CALEA-II is the FBI's proposal to require all American computers, mobile devices, operating systems, email programs, browsers, etc., to have weak security so that the FBI can eavesdrop on them (as a side note, a CALEA-II rule would almost certainly require a ban on free/open source software, since code that can be modified is code that can have the FBI backdoors removed).

The FBI believes it can have it both ways: that it can open systems to its eavesdropping, but keep them secure from anyone else's eavesdropping. That's just not possible. It's impossible to build a communications system that allows the FBI surreptitious access but doesn't allow similar access by others. When it comes to security, we have two options: We can build our systems to be as secure as possible from eavesdropping, or we can deliberately weaken their security. We have to choose one or the other.

This is an old debate, and one we've been through many times. The NSA even has a name for it: the equities issue. In the 1980s, the equities debate was about export control of cryptography. The government deliberately weakened U.S. cryptography products because it didn't want foreign groups to have access to secure systems. Two things resulted: fewer Internet products with cryptography, to the insecurity of everybody, and a vibrant foreign security industry based on the unofficial slogan "Don't buy the U.S. stuff -- it's lousy."

In 1994, the Communications Assistance for Law Enforcement Act mandated that U.S. companies build eavesdropping capabilities into phone switches. These were sold internationally; some countries liked having the ability to spy on their citizens. Of course, so did criminals, and there were public scandals in Greece (2005) and Italy (2006) as a result.

In 2012, we learned that every phone switch sold to the Department of Defense had security vulnerabilities in its surveillance system. And just this May, we learned that Chinese hackers breached Google's system for providing surveillance data for the FBI.

The Problems with CALEA-II

Computer scientists to FBI: don't require all our devices to have backdoors for spies

In an urgent, important blog post, computer scientist and security expert Ed Felten lays out the case against rules requiring manufacturers to put wiretapping backdoors in their communications tools. Since 1994, manufacturers of telephone switching equipment have had to follow a US law called CALEA, which says that phone switches must have a deliberate backdoor that cops can use to secretly listen in on phone calls without having to physically attach anything to them. This has already been a huge security problem: through much of the 1990s, AT&T's CALEA controls ran through a Solaris machine that was thoroughly compromised by hackers, meaning that criminals could listen in on any call; and around the time of the 2004 Athens Olympics, spies used the CALEA backdoors on the Greek phone company's switches to listen in on the highest levels of government.

But now, thanks to the widespread adoption of cryptographically secured messaging services, law enforcement is finding that its CALEA backdoors are of declining utility -- it doesn't matter if you can intercept someone else's phone calls or network traffic if the data you've captured is unbreakably scrambled. In response, the FBI has floated the idea of "CALEA II": a mandate to put wiretapping capabilities in computers, phones, and software.

As Felten points out, this is a terrible idea. If your phone is designed to secretly record you or stream video, location data, and messages to an adverse party, and to stop you from discovering that it's doing this, it puts you at huge risk when that facility is hijacked by criminals. It doesn't matter if you trust the government not to abuse this power (though, for the record, I don't -- especially since anything mandated by the US government would also be present in devices used in China, Belarus, and Iran) -- deliberately weakening device security makes you vulnerable to everyone, including the worst criminals:

Our report argues that mandating a virtual wiretap port in endpoint systems is harmful. The port makes it easier for attackers to capture the very same data that law enforcement wants. Intruders want to capture everything that happens on a compromised computer. They will be happy to see a built-in tool for capturing and extracting large amounts of audio, video, and text traffic. Better yet (for the intruder), the capability will be stealthy by design, making it difficult for the user to tell that anything is amiss.

Beyond this, the mandate would make it harder for users to understand, monitor, and fix their own systems—which is bad for security. If a system’s design is too simple or its operation too transparent or too easy to monitor, then wiretaps will be evident. So a wiretappability mandate will push providers toward complex, obfuscated designs that are harder to secure and raise the total cost of building and operating the system.

Finally, our report argues that it will not be possible to block non-compliant implementations. Many of today’s communication tools are open source, and there is no way to hide a capability within an open source code base, nor to prevent people from simply removing or disabling an undesired feature. Even closed source systems are routinely modified by users—as with jailbreaking of phones—and users will find ways to disable features they don’t want. Criminals will want to disable these features. Ordinary users will also want to disable them, to mitigate their security risks.

Felten's remarks summarize a report [PDF] signed by 20 distinguished computer scientists criticizing the FBI's proposal. It's an important read -- maybe the most important thing you'll read all month. If you can't trust your devices, you face enormous danger.

CALEA II: Risks of wiretap modifications to endpoints

Brochures from the companies that sell malware to governments

Ars Technica has a small gallery of the latest Wikileaks dump, consisting of brochures from companies that sell malicious software to governments for use in spying on their citizens. I spoke at length with one of the sources for these and we agreed that it was freakishly weird and scary -- I've spent the past two months in a bit of a paranoid stupor as a result. On the other hand, I have seen enough product brochures to know that companies often stretch the truth when they're pimping their products, and I wouldn't expect truth-in-advertising ethics from Vichy nerds who specialize in violating the UN Declaration of Human Rights.

One product marketed by HackingTeam is the Remote Control System, malware that infects computers and smartphones in order to enable covert surveillance. The company says that its trojan can intercept encrypted communication, including Skype voice calls. It prominently advertises the fact that the malware can be installed remotely, claims that it can scale up to monitor "hundreds of thousands of targets," and says it can be deployed to Apple, Android, Symbian, and BlackBerry mobile devices.

Gallery: how the surveillance industry markets spyware to governments