An extended version of this piece was originally published in December 2015 on the Association for Computing Machinery's Huffington Post blog. It has been excerpted and updated slightly here to speak to the recent news around Apple and the FBI's request for backdoor access.
We are not here to debate whether such access is useful from a policy perspective, i.e. whether it would work to stop bad guys. That is a critical conversation that raises many questions, but we leave it to others. We are here to review the technical realities, and to explore the impact and potential danger of such proposals from this perspective.
The ability to secure Internet technologies — to ensure that the right people gain access to the right things, and the wrong people don't — is what makes online banking and commerce possible, and is what allowed the Internet to become an unprecedented driver of economic and social change. This point is not up for debate. What we now need to understand is that the call to provide law enforcement (or anyone) exceptional access to communications and content poses a grave threat to the sustainability and future of the Internet: it is simply not possible to give the good guys the access they want without letting the bad guys in. There's nothing new or novel in this statement. Experts have been saying the same thing for 20 years. But while the message is old, with the integration of Internet technologies into nearly all aspects of life, the stakes are higher than they've ever been.
Machines don't know a bad guy from a good guy. Machines respond as they've been programmed to respond. Programming them (with new software, or otherwise) to open up to third parties cannot be guaranteed to limit access to only those intended: it limits access to anyone who is able to make a request in a way that the machine responds to. In the case of Apple, the FBI is requesting new software that would enable them to crack an iPhone user's password and bypass the security measures in place to prevent such intrusion. Were Apple to build such software, it would create a "backdoor" into one criminal's iPhone, and into any other iPhone on which the requested software could be made to run. Apple is being asked to give the FBI — and anyone else who obtains the software in question — a ticket to exceptional access.
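The point about machines can be made concrete with a small sketch. The code below is hypothetical and deliberately simplified (nothing here reflects Apple's actual design): an unlock routine can only check the inputs it receives, so once an override credential exists, the routine grants access to anyone who presents it, with no way to judge who is asking or why.

```python
import hashlib
import hmac
from typing import Optional

# Hypothetical stored value; stands in for a passcode-derived key.
STORED_DIGEST = hashlib.sha256(b"correct-passcode").digest()

def unlock(passcode: bytes, override_token: Optional[bytes] = None) -> bool:
    """Grant access on a correct passcode, or on a valid override token.

    The routine has no notion of WHO is asking: any holder of the token,
    whether investigator, leaker, or thief, gets the same answer.
    """
    if hmac.compare_digest(hashlib.sha256(passcode).digest(), STORED_DIGEST):
        return True
    # The backdoor path: once it exists, it works identically for every
    # requester who can present the token.
    if override_token is not None and hmac.compare_digest(
        override_token, b"master-override"
    ):
        return True
    return False
```

The token check is written in exactly the same style as the legitimate passcode check because, to the machine, the two are the same kind of thing.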
The risks are not theoretical: we know of no case where adding extraordinary access capabilities to a system has not resulted in weakened security.
Take the case of the Communications Assistance for Law Enforcement Act (CALEA), a 1994 law designed to make it easier for US law enforcement to tap phone conversations. Under this law, telephone companies had to design their systems to allow wiretapping — adding a vector for extraordinary access (similar in kind to what's being requested of Apple). It was due to CALEA-mandated wiretapping capabilities that, in 2012, all of the Department of Defense's phone switches were reported to be vulnerable. Similar capabilities, built to comply with CALEA-like laws, were exploited to eavesdrop on the phone conversations of Greece's Prime Minister and those of at least 100 other dignitaries and politicians, some of them US diplomats. It was these same mechanisms that were used to illegally tap the phone conversations of at least 5,000 people in Italy. In the Greek case, it's unclear who did it. In the Italian case, the crime appears to have been authorized by a high-ranking official at SISMI, the Italian military intelligence agency. From our perspective it doesn't matter — if the means for extraordinary access hadn't been there, these crimes almost certainly wouldn't have happened.
The fact that backdoors create technical vulnerabilities is not the only issue. In a global world, in which multinational companies like Apple deploy hardware, applications, and communications services to markets everywhere, how do we determine whose law enforcement and government should be allowed to use this exceptional access, and for what purpose? Who are the "good guys," and according to whom? Should the Chinese, Canadian, US, and South Sudanese governments all be granted access under the same terms? Whose agendas and policies do we favor, and what does consensus look like? Who governs and audits such decisions, and how can they be implemented in an industry reliant on innovation and speed? And, finally, how do we program millions of machines to respect the huge and dynamic complexity of such decisions, assuming such a process is even possible?
Combine the technical realities with these procedural questions, and you see a recipe for potential security disaster. Imagine if it weren't phone switches that were vulnerable via exceptional access capabilities, but the computers that run critical national infrastructure, the databases that store medical records, the intellectual property of major US economic interests, the engines of the global financial industry. Closer to home, imagine the frighteningly plausible scenario of a bad actor obtaining the FBI-requested iPhone-cracking software, and using it not to catch criminals but to access national secrets, intellectual property, or personal information from high-ranking officials and businesspeople. Now recognize that misuse only has to happen once to cause unspeakable harm to national economic and security interests.
None of this means that the job of tracking and apprehending terrorists and other wrongdoers on a global scale is easy. Or that the frustration felt by those tasked with keeping populations safe isn't very real. However, the palpable immediacy of these problems does not mean that extraordinary access is a workable idea. Put another way — however much it might appear like exceptional access is a silver bullet, it is not. Instead such a path would weaken our collective security.
 In using the term "exceptional access" we take our lead from the authors of 2015's definitive and highly-recommended Keys Under Doormats paper, who in turn followed the lead of the "1996 US National Academy of Sciences CRISIS report in using the phrase 'exceptional access' to mean that 'the situation is not one that was included within the intended bounds of the original transaction.'"
 In an order from a judge, the FBI requested Apple create "limited" custom software that would work to provide access only to the iPhone in question. Many knowledgeable people have affirmed that this is, theoretically, possible, using a per-phone identifier that would ensure the software executed solely on the one device. Apple claims that it is not able to guarantee such software is limited in such a way (a reasonable claim, given the complexity of what's being requested). The key point is that Apple is being asked to create a system that, even if limited, could be used again with only slight modifications, that sets a precedent under a troubling law, and that announces to the world, in the US and beyond, a means of backdoor access.
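To see why "slight modifications" is the crux, consider a sketch of what a per-device gate might look like. Everything here is invented for illustration (the identifier name, the values, the function names); real iPhones expose a unique hardware identifier (the ECID) that signed firmware could check, but this is not Apple's code.

```python
# Invented target value, baked in when the tool is built.
TARGET_DEVICE_ID = "ECID-EXAMPLE-0001"

def read_device_id() -> str:
    # Stand-in for reading the unique hardware identifier from the chip.
    return "ECID-EXAMPLE-0001"

def may_disable_passcode_limits() -> bool:
    """The 'limited' tool refuses to weaken any device but its target.

    Re-targeting it, however, means changing one constant and re-signing:
    the dangerous engineering work exists regardless of this check.
    """
    return read_device_id() == TARGET_DEVICE_ID
```

The gate itself is trivial; the capability it guards is not, which is why the existence of the tool, not the scope of any one order, is what matters.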
Meredith Whittaker and Ben Laurie are co-founders of Simply Secure, an organization that, among many other things, focuses on improving the design of secure technologies.