In a live Q&A conducted on Twitter yesterday, Edward Snowden answered questions about the state of security and freedom in the world. While the whole interview is interesting, I was most struck by his answer to a question on how states should react to the news of widespread snooping:
We need to work together to agree on a reasonable international norm for the limitations on spying. Nobody should be hacking critical-to-life infrastructure like hospitals and power stations, and it’s fair to say that can be recognized in international law.
Additionally, we need to recognize that national laws are not going to solve the problem of indiscriminate surveillance. A prohibition in Burundi isn’t going to stop the spies in Greenland. We need a global forum, and global funding, committed to the development of security standards that enforce our right to privacy not through law, but through science and technology. The easiest way to ensure a country’s communications are secure is to secure them world-wide, and that means better standards, better crypto, and better research.
I agree that technology is a crucial and necessary part of the answer, but I think it's incomplete on its own. The right way to think about this is Lessig's idea that regulation is accomplished by norms, laws, technology and markets. It's important that our technology be as good as it can be in order to keep people secure. That's why it's so disastrous that the W3C has paved the way for DRM in all browsers: because it's illegal to report defects in DRM, browsers -- which are in every pocket and on every desktop, and thus very valuable to attackers -- will have long-lived vulnerabilities that snoops and crooks can exploit.
But as we see from the DRM example, there's also a legal question: if it's illegal to make technology secure, then technology will be less secure. And there's a normative question: if people believe it's reasonable and proportionate to design computers that treat their owners as untrusted hostiles, then that's the kind of computer we'll get. If there's widespread consensus that problems shouldn't be solved by designing computers to attack their owners, then we'll get less of that.
Finally, there are markets: if there's a normative consensus that privacy is worth doing something about, no laws preventing pro-privacy technology, and technology capable of supplying good privacy, then companies will manufacture and sell those technologies, in products that default to privacy.
All four of these mechanisms -- markets, norms, technology and law -- strengthen one another. None is sufficient on its own, but each makes it easier to attain the rest.