Tim Bray's Trusting Browser Code explores the political and technical problems with trusting your browser, especially when you're using it to do sensitive things like encrypt and decrypt your email. In an ideal world, you wouldn't have to trust Google or any other "intermediary" service to resist warrants forcing it to turn over your sensitive communications, because it would be technically impossible for anyone to peek into the mail without your permission. But as Bray points out, the complexity and relative opacity of JavaScript make this kind of certainty difficult to attain.
Bray misses a crucial political problem, though: the DMCA. Under US law (and similar laws all over the world), telling people about vulnerabilities in DRM is illegal, meaning that a bug in your browser's DRM that leaves your email vulnerable to spying might be illegal to report, and thus might never be fixed. Now that the World Wide Web Consortium and all the major browser vendors (even Mozilla) have capitulated on adding DRM to the Web, this is the most significant political problem in trusting your browser.
Countermeasure: Verifiable code. JavaScript, by its nature, requires that the source code of an app be downloaded before it's executed. Suppose that we had a particular version of an app that had been carefully audited and everyone was convinced was acceptably secure and back-door-free. Suppose we had a way to apply a digital signature to that version and then, whenever someone wanted to run it, a way to check the signature to make sure we were about to run the right version. This smells like a pretty interesting solution to me.
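To make the shape of that concrete, here's a minimal sketch of verify-before-execute using the browser's Web Crypto API. It assumes the audited release ships alongside a detached signature, and that the verifying public key has already been pinned somewhere the page can't tamper with; every name, URL, and key value below is hypothetical.

```js
// A minimal sketch of verify-before-execute. Assumptions: the audited
// release is published with a detached signature, and the verifying
// public key is already pinned. All names and values are hypothetical.

const PINNED_KEY_JWK = { kty: "EC", crv: "P-256", x: "…", y: "…" }; // hypothetical pinned key

async function runVerifiedScript(scriptUrl, signatureUrl) {
  // Fetch the exact bytes we intend to execute, plus their signature.
  const [source, signature] = await Promise.all([
    fetch(scriptUrl).then(r => r.arrayBuffer()),
    fetch(signatureUrl).then(r => r.arrayBuffer()),
  ]);

  // Import the pinned key and verify the signature over those bytes.
  const key = await crypto.subtle.importKey(
    "jwk", PINNED_KEY_JWK,
    { name: "ECDSA", namedCurve: "P-256" },
    false, ["verify"]
  );
  const ok = await crypto.subtle.verify(
    { name: "ECDSA", hash: "SHA-256" }, key, signature, source
  );
  if (!ok) throw new Error("Signature check failed; refusing to run " + scriptUrl);

  // Only now hand the verified bytes to the engine.
  const el = document.createElement("script");
  el.src = URL.createObjectURL(new Blob([source], { type: "text/javascript" }));
  document.head.appendChild(el);
}
```

Of course, this only relocates the trust problem: something has to run the check before the page's own code does, which is why proposals along these lines tend to want the verification done by the browser itself or an extension, not by the very JavaScript being verified.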
So I looked around to see if anyone was doing this. Well, sort of. I include the first thing I discovered mostly for amusement value: Signed Scripts in Mozilla, prominently labeled as probably not updated since 1998, and with references to "Netscape Object Signing" and something called SignTool. All it proves is that at some previous point in history, somebody thought this was a good idea.
People still do, in some contexts: Over at Firefox Marketplace there's a writeup on Packaged apps, where all the resources come out of a zipfile. It says: "privileged and certified apps are digitally signed to enable the use of privileged and certified APIs. Privileged apps are signed as part of the Marketplace review process, while certified apps are signed by device manufacturers or operators."
I wondered if there were something similar on the Chrome side. I dug around in the docs and, sure enough, a Chrome extension is signed with the developer's private key. (I couldn't find anything similar for Chrome apps, as opposed to extensions, but maybe I was just looking in the wrong place.)
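That signature does real work in Chrome's model, incidentally: the extension's identity is derived from the developer's public key, so an update is only accepted if it's signed with the same key. A rough sketch of the ID derivation as commonly documented (SHA-256 over the DER-encoded public key, first 16 bytes, hex digits remapped to the letters a-p):

```js
// Sketch of Chrome's extension-ID derivation as commonly documented:
// hash the DER-encoded public key, keep the first 16 bytes of the
// digest, and write them in hex using the letters a-p for 0-f.
async function extensionIdFromKey(derPublicKeyBytes) {
  const digest = await crypto.subtle.digest("SHA-256", derPublicKeyBytes);
  const hex = [...new Uint8Array(digest).slice(0, 16)]
    .map(b => b.toString(16).padStart(2, "0"))
    .join("");
  return hex.replace(/[0-9a-f]/g, c => "abcdefghijklmnop"[parseInt(c, 16)]);
}
```

Pinning identity to the key means the browser can check that version N+1 came from whoever shipped version N, without anyone having to trust a name registry.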
So this notion of code signing is not radical in the slightest. Is anyone working on making it accessible for arbitrary chunks of application JavaScript?
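One hash-based relative of the idea is the W3C's Subresource Integrity work, which lets a page pin the digest of the exact script it's willing to run; the browser then refuses to execute a file whose bytes don't match. A sketch of computing such a pin (the URL is hypothetical, and in practice you'd do this at build time):

```js
// Compute an SRI-style "sha384-..." pin for a script (sketch only).
async function integrityFor(url) {
  const bytes = await fetch(url).then(r => r.arrayBuffer());
  const digest = await crypto.subtle.digest("SHA-384", bytes);
  const b64 = btoa(String.fromCharCode(...new Uint8Array(digest)));
  return "sha384-" + b64;
}

// The resulting value rides along in the markup, e.g.:
//   <script src="app.js" integrity="sha384-…" crossorigin="anonymous"></script>
```

A digest isn't a signature (it binds you to one fixed file rather than to a signer, so every update means a new pin), but it gets at the same guarantee Bray is after: the code you audited is the code you run.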
Trusting Browser Code [Tim Bray/Ongoing]
(Image: uncle sam wants your privacy, Jeff Shuler, CC-BY)