The FBI wants backdoors in all your crypto, and UK Prime Minister David Cameron made backdoors an election promise, but as Stanford lawyer/computer scientist Jonathan Mayer writes, there's no way to effectively backdoor modern platforms without abolishing the whole idea of computers as we know them, replacing them with an imaginary and totalitarian computing ecosystem that does not exist and probably never will.
Mayer gives the example of how stopping Android users from using crypto would require abolishing third-party app stores, rolling back the state of the art in Web-based apps, adding a kill-switch to the platform that lets Google delete your apps and their associated data, and preventing jailbreaking at all costs.
He mentions that the same is true for iOS, though that's not exactly right -- it's a felony to jailbreak many iOS devices (iPhones, but not iPads, are temporarily exempted thanks to a Copyright Office ruling that expires this year), and it's a felony to run a third-party iOS app store or to supply iOS jailbreaking tools. DRM-locked ecosystems are already designed to prevent their owners from running code of their choosing. So it's conceptually much easier to imagine a government simply telling all those companies -- Sony, Nintendo, Apple, Nest, John Deere, etc -- that the law requires them to approve only backdoored apps, then helping them with their existing project of vigorously prosecuting jailbreak tool-makers, and thereby getting a much more airtight seal around users' ability to use good crypto.
One option: require Google to police its app store for strong cryptography. Another option: mandate a notice-and-takedown system, where the government is responsible for spotting secure apps, and Google has a grace period to remove them. Either alternative would, of course, be entirely unacceptable to the technology sector—the DMCA’s notice-and-takedown system is widely reviled, and present federal law (CDA 230) disfavors intermediary liability.
This hypothetical is already beyond the realm of political feasibility, but keep going. Assume the federal government sticks Google with intermediary liability. How will Google (or the government) distinguish between apps that have strong cryptography and apps that have backdoored cryptography?
There isn’t a good solution. Auditing app installation bundles, or even requiring developers to hand over source code, would not be sufficient. Apps can trivially download and incorporate new code. Auditing running apps would add even more complexity. And, at any rate, both static and dynamic analysis are unsolved challenges—just look at how much trouble Google has had identifying malware and knockoff apps.
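Mayer's point about downloaded code can be made concrete with a toy sketch. This is Python rather than Android, and the names are illustrative, but the mechanism is the same one that defeats bundle audits: the code that actually runs need never appear in the package the auditor inspected.

```python
import importlib.util

def load_dynamic_module(name: str, source: str):
    """Create a fresh module and execute newly obtained source inside it.

    In a real app, `source` would arrive over the network at runtime --
    none of it is present in the install bundle an auditor would review.
    """
    spec = importlib.util.spec_from_loader(name, loader=None)
    module = importlib.util.module_from_spec(spec)
    exec(source, module.__dict__)  # code the audit never saw
    return module

# Stand-in for downloaded code; a real app might fetch this over HTTPS.
payload = "def encrypt(msg): return msg[::-1]  # toy stand-in for a cipher"
crypto = load_dynamic_module("crypto_plugin", payload)
```

The shipped bundle here contains only a generic loader; whether the app ends up doing strong encryption depends entirely on bytes it fetches later.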
Continue with the hypothetical, though. Imagine that Google could successfully banish secure encryption apps from the official Google Play store. What about apps that are loaded from another app store? The government could feasibly regulate some competitors, like the Amazon Appstore. How, though, would it reach international, free, open source app repositories like F-Droid or Fossdroid? What about apps that a user directly downloads and installs (“sideloads”) from a developer’s website?
The only solution is an app kill switch. (Google’s euphemism is “Remote Application Removal.”) Whenever the government discovers a strong encryption app, it would compel Google to nuke the app from Android phones worldwide. That level of government intrusion—reaching into personal devices to remove security software—certainly would not be well received. It raises serious Fourth Amendment issues, since it could be construed as a search of the device or a seizure of device functionality and app data. What’s more, the collateral damage would be extensive; innocent users of the app would lose their data.
Designing an effective app kill switch also isn’t so easy. The concept is feasible for app store downloads, since those apps are tagged with a consistent identifier. But a naïve kill switch design is trivial to circumvent with a sideloaded app. The developer could easily generate a random application identifier for each download.
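The per-download identifier trick is simple enough to sketch. Assume (hypothetically) a developer's download server stamps a fresh package name into each copy it serves; then no two installs share an identifier a kill switch could key on. This is a Python illustration, not a real build pipeline.

```python
import secrets

def randomized_package_name(base: str = "app") -> str:
    """Generate a unique Android-style package name for one download.

    A kill switch keyed on a fixed identifier has nothing stable to match:
    every copy served carries a different, random name.
    """
    return f"com.{base}.u{secrets.token_hex(8)}"

# Two downloads of the "same" app get unrelated identifiers:
a, b = randomized_package_name(), randomized_package_name()
```

Blocking one reported identifier then removes exactly one install, leaving every other copy untouched.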
Google would have to build a much more sophisticated kill switch, scanning apps for prohibited traits. Think antivirus, but for detecting and removing apps that the user wants. That’s yet another unsolved technical challenge, yet another objectionable intrusion into personal devices, and yet another practice with constitutional vulnerability.
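A toy example shows why trait scanning in the antivirus mold is so brittle. The sketch below (purely illustrative; real detection and evasion are far messier) flags app bytes containing a well-known crypto constant, the opening bytes of the AES S-box, and then evades that check with a one-byte XOR mask that the app would undo in memory at runtime.

```python
# First eight bytes of the AES S-box, a constant many naive scanners
# might treat as a signature of encryption code.
AES_SBOX_PREFIX = bytes([0x63, 0x7C, 0x77, 0x7B, 0xF2, 0x6B, 0x6F, 0xC5])

def looks_like_crypto(app_bytes: bytes) -> bool:
    """Naive trait scanner: flag anything containing the known constant."""
    return AES_SBOX_PREFIX in app_bytes

honest_app = b"...code..." + AES_SBOX_PREFIX + b"...more code..."

# Evasion: ship the table XOR-masked and unmask it in memory at runtime,
# so the on-disk bytes never match the signature.
masked = bytes(x ^ 0x5A for x in AES_SBOX_PREFIX)
evasive_app = b"...code..." + masked + b"...more code..."
```

The scanner catches the honest app and misses the evasive one, which still performs exactly the same encryption once running. Escalating to catch the mask just restarts the arms race, which is Mayer's point about this being an unsolved challenge.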
You Can’t Backdoor a Platform [Jonathan Mayer]