How can you trust your browser?

Tim Bray's Trusting Browser Code explores the political and technical problems with trusting your browser, especially when you're using it to do sensitive things like encrypt and decrypt your email. In an ideal world, you wouldn't have to trust Google or any other "intermediary" service to resist warrants forcing it to turn over your sensitive communications, because it would be technically impossible for anyone to peek into the mail without your permission. But as Bray points out, the complexity and relative opacity of JavaScript make this kind of surety difficult to attain.

Bray misses a crucial political problem, though: the DMCA. Under US law (and similar laws all over the world), telling people about vulnerabilities in DRM is illegal, meaning that a bug in your browser that makes your email vulnerable to spying might be illegal to report, and thus might never be fixed. Now that the World Wide Web Consortium and all the major browser vendors (even including Mozilla) have capitulated on adding DRM to the Web, this is the most significant political problem in the world of trusting your browser.

Countermeasure: Verifiable code · JavaScript, by its nature, requires that the source code of an app be downloaded before it's executed. Suppose that we had a particular version of an app that had been carefully audited and everyone was convinced was acceptably secure and back-door-free. Suppose we had a way to apply a digital signature to that version and then, whenever someone wanted to run it, a way to check the signature to make sure we were about to run the right version. This smells like a pretty interesting solution to me.
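
Bray doesn't spell out the mechanics, but here's a minimal sketch of what that check might look like in browser JavaScript using the Web Crypto API: fetch the bundle and a detached signature, verify them against an auditor's public key pinned out of band, and only then execute. The URLs, the key, and the loader itself are placeholders for illustration, not an existing scheme.

    // Minimal sketch (hypothetical URLs and key): verify a detached signature
    // over an app bundle with a pinned auditor key before running the code.

    const APP_URL = "https://example.com/app.js";      // the audited bundle (placeholder)
    const SIG_URL = "https://example.com/app.js.sig";  // detached signature (placeholder)
    const PINNED_AUDITOR_KEY_JWK = { /* auditor's public key, distributed out of band */ };

    async function loadVerifiedApp() {
      // Fetch the code and its signature as raw bytes.
      const [codeBuf, sigBuf] = await Promise.all([
        fetch(APP_URL).then(r => r.arrayBuffer()),
        fetch(SIG_URL).then(r => r.arrayBuffer()),
      ]);

      // Import the pinned public key, usable only for signature verification.
      const key = await crypto.subtle.importKey(
        "jwk",
        PINNED_AUDITOR_KEY_JWK,
        { name: "RSASSA-PKCS1-v1_5", hash: "SHA-256" },
        false,
        ["verify"]
      );

      // Refuse to run anything that isn't byte-for-byte the audited, signed version.
      const ok = await crypto.subtle.verify("RSASSA-PKCS1-v1_5", key, sigBuf, codeBuf);
      if (!ok) {
        throw new Error("signature check failed: this is not the audited version");
      }

      // Only now inject the verified code into the page.
      const url = URL.createObjectURL(new Blob([codeBuf], { type: "text/javascript" }));
      const script = document.createElement("script");
      script.src = url;
      document.head.appendChild(script);
    }

    loadVerifiedApp();

The obvious catch is that this sketch just moves the problem down a level: the loader doing the verifying is itself JavaScript that arrived over the same untrusted channel, which is part of why the idea mostly survives today inside signed packages and extensions, as below.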

So I looked around to see if anyone was doing this. Well, sort of. I include the first thing I discovered mostly for amusement value: Signed Scripts in Mozilla, prominently labeled as probably not updated since 1998, and with references to "Netscape Object Signing" and something called SignTool. All it proves is that at some previous point in history, somebody thought this was a good idea.

People still do, in some contexts: Over at Firefox Marketplace there's a writeup on Packaged apps, where all the resources come out of a zipfile. It says: "privileged and certified apps are digitally signed to enable the use of privileged and certified APIs. Privileged apps are signed as part of the Marketplace review process, while certified apps are signed by device manufacturers or operators."

I wondered if there were something similar on the Chrome side. I dug around in the docs and, sure enough, a Chrome extension is signed with the developer's private key. (I couldn't find anything similar for Chrome apps as opposed to extensions, but maybe I was just looking in the wrong place.)
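
For a concrete flavor of how that signing binds an extension to its developer: Chrome derives an extension's ID from a hash of the developer's public key, so an "update" signed with a different key simply isn't the same extension. A rough sketch of that derivation in Node-style JavaScript (the key file path is a placeholder, and this illustrates the scheme rather than reproducing Chrome's actual code):

    // Rough illustration (not Chrome's source): derive a Chrome-style extension ID
    // from a DER-encoded developer public key. The key file path is a placeholder.
    const crypto = require("crypto");
    const fs = require("fs");

    function extensionIdFromPublicKey(publicKeyDer) {
      // Chrome hashes the public key and keeps the first 128 bits...
      const hash = crypto.createHash("sha256").update(publicKeyDer).digest();
      // ...then maps each hex digit onto the letters a-p to form the ID.
      let id = "";
      for (const byte of hash.subarray(0, 16)) {
        id += String.fromCharCode(97 + (byte >> 4));    // high nibble -> a..p
        id += String.fromCharCode(97 + (byte & 0x0f));  // low nibble  -> a..p
      }
      return id;
    }

    console.log(extensionIdFromPublicKey(fs.readFileSync("developer_key.der")));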

So this notion of code signing is not radical in the slightest; is anyone working on making it accessible for arbitrary chunks of application JavaScript?

Trusting Browser Code [Tim Bray/Ongoing]

(Image: uncle sam wants your privacy, Jeff Shuler, CC-BY)