The speech was very well received — it got a standing ovation — and has attracted a lot of discussion since.
Jonke Suhr has done me the service of transcribing the talk, which will facilitate translating it into other languages as well as making it accessible to people who struggle with video. Many thanks, Jonke!
This is also available as an MP3 and a downloadable video.
I've included an edited version below:
So, as you might imagine, I'm here to talk to you about dieting advice. If you ever want to go on a diet, the first thing you should really do is throw away all your Oreos.
It's not that you don't want to lose weight when you raid your Oreo stash in the middle of the night. It's just that the net present value of tomorrow's weight loss is hyperbolically discounted in favor of the carbohydrate rush of tonight's Oreos. If you're serious about not eating a bag of Oreos, your best bet is to not have a bag of Oreos to eat. Not because you're weak-willed. Because you're a grown-up. And once you become a grown-up, you start to understand that there will be tired and desperate moments in your future, and the most strong-willed thing you can do is use the willpower that you have now, when you're strong, at your best moment, to be the best that you can be later, when you're at your weakest moment.
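The "hyperbolic discounting" in play here has a standard form, V = A / (1 + kD): a reward A arriving after a delay D is worth less the longer the wait. A minimal sketch, with made-up numbers (nothing here comes from the talk), shows the tell-tale preference reversal: a week in advance the diet's distant payoff wins, but at midnight, with the Oreos at zero delay, the snack wins.

```python
# Hyperbolic discounting: a reward's present value shrinks as 1 / (1 + k * delay).
# The rewards and k below are illustrative only.

def discounted_value(reward: float, delay_days: float, k: float = 0.5) -> float:
    """Hyperbolically discounted present value of `reward` arriving after `delay_days`."""
    return reward / (1 + k * delay_days)

OREOS = 10.0        # small reward, available tonight
WEIGHT_LOSS = 50.0  # large reward, 30 days of dieting away

# Judged a week ahead of time, the distant diet payoff still looks bigger...
print(discounted_value(OREOS, 7), discounted_value(WEIGHT_LOSS, 37))   # ~2.2 vs ~2.6

# ...but judged at midnight, with the Oreos at zero delay, the snack wins.
print(discounted_value(OREOS, 0), discounted_value(WEIGHT_LOSS, 30))   # 10.0 vs ~3.1
```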
And this has a name: It's called a Ulysses pact. Ulysses was going into Siren-infested waters. When you go into Siren-infested waters, you put wax in your ears so that you can't hear what the Sirens are singing, because otherwise you'll jump into the sea and drown. But Ulysses wanted to hear the Sirens. And so he came up with a compromise: He had his sailors tie him to the mast, so that when he heard the call of the Sirens, even though he would beg and gibber and ask them to untie him, so that he could jump into the sea, he would be bound to the mast and he would be able to sail through the infested waters.
This is a thing that economists talk about all the time, it's a really critical part of how you build things that work well and fail well. Now, building a Web that is decentralized is a hard thing to do, and the reason that the web ceases to be decentralized periodically is because it's very tempting to centralize things. There are lots of short term gains to be had from centralizing things and you want to be the best version of yourself, you want to protect your present best from your future worst.
The reason that the Web is closed today is that people just like you, the kind of people who went to Doug Engelbart's demo in 1968, the kind of people who went to the first Hackers conference, people just like you, made compromises that seemed like the right compromise to make at the time. And then they made another compromise. Little compromises, one after another.
And as humans, our sensory apparatus is really only capable of distinguishing relative differences, not absolute ones. And so when you make a little compromise, the next compromise that you make, you don't compare it to the way you were when you were fresh and idealistic. You compare it to your current, "stained" state. And a little bit more stained hardly makes any difference. One compromise after another, and before you know it, you're suing to make APIs copyrightable, or you're signing your name to a patent on one-click purchasing, or you're filing the headers off of a GPL'd library and hoping no one looks too hard at your binaries. Or you're putting a backdoor in your code for the NSA.
And the thing is: I am not better than the people who made those compromises. And you are not better than the people who made those compromises. The people who made those compromises discounted the future costs in favor of the present benefits of some course of action, because it's easy to understand present benefits and it's hard to remember future costs.
You're not weak if you eat a bag of Oreos in the middle of the night. You're not weak if you save all of your friends' mortgages by making a compromise when your business runs out of runway. You're just human, and you're experiencing that hyperbolic discounting of future costs because of that immediate reward in the here and now. If you want to make sure that you don't eat a bag of Oreos in the middle of the night, make it more expensive to eat Oreos. Make it so that you have to get dressed and find your keys and figure out where the all-night grocery store is and drive there and buy a bag of Oreos. And that's how you help yourself in the future, in that moment when you know what's coming down the road.
The answer to not getting pressure from your bosses, your stakeholders, your investors, or your members to do the wrong thing later, when times are hard, is to take options off the table right now. This is a time-honored tradition in all kinds of economic realms. Union negotiators, before they go into a tough negotiation, will say: "I will resign as your negotiator before I give up your pension." And then they sit down across the table from the other side, and the other side says: "It's pensions or nothing." And the union negotiator says: "I hear what you're saying. I am not empowered to trade away the pensions. I have to quit. They have to go elect a new negotiator, because I was elected contingent on not bargaining away the pensions. The pensions are off the table."
Brewster has talked about this in the context of code: he suggested that we could build distributed technologies using the kinds of JavaScript libraries that are found in things like Google Docs and Google Mail, because no matter how much pressure is put on browser vendors, or on technology companies in general, the likelihood that they will disable Google Docs or Google Mail is very, very low. And so we can take Google Docs hostage and use it as an inhuman shield for our own projects.
The GPL does this. Once you put your code under the GPL, it's locked open: it's irrevocably licensed for openness, and no one can shut it down in the future by adding restrictive terms to the license. The reason the GPL works so well, the reason it became such a force for locking things open, is that it became indispensable. Companies that wanted to charge admission for commodity components like operating systems or file editors or compilers found themselves confronted with the reality that there's a huge difference between even a small price and no price at all, or no monetary price. Eventually it just became absurd to think that you would instantiate a hundred million virtual machines for an eleventh of a second and negotiate a license and pay a royalty for each one of them.
And at that point, GPL code became the only code that people used in cloud applications in any great volume, unless they actually were the company that published the operating system that wasn't GPL'd. Communities coalesced around the idea of making free and open alternatives to these components: GNU/Linux, Open- and LibreOffice, git. And those projects benefited from a whole bunch of different motives, not always the purest ones. Sometimes it was programmers who really believed ethically in the project and funded their own work. Sometimes talent was tight and companies wanted to attract programmers, and the way they got them to come through the door was by saying: "We'll give you some of your time to work on an ethical project and contribute code to it."
Sometimes companies got tactical benefits by zeroing out the margins on their biggest competitor's major revenue stream. So if you want to fight with Microsoft, just make Office free. And sometimes companies wanted to use but not sell commodity components. Maybe you want to run a cloud service but you don't want to be in the operating system business, so you put a bunch of programmers on making Linux better for your business, without ever caring about getting money from the operating system. Instead you get it from the people who hire you to run their cloud.
Every one of those entities, regardless of how they got into this situation of contributing to open projects, eventually faced hard times, because hard times are a fact of life. And systems that work well but fail badly are doomed to die in flames. The GPL is designed to fail well. It makes it impossible to hyperbolically discount the future costs of doing the wrong thing to gain an immediate benefit. When your investor or your acquisition suitor or your boss says "Screw your ethics, hippie, we need to make payroll," you can just pull out the GPL and say: "Do you have any idea how badly we will be destroyed if we violate copyright law by violating the GPL?"
It's why Microsoft was right to be freaked out about the GPL during the Free and Open Source wars. Microsoft's coders were nerds like us: they fell in love with computers first and became Microsoft employees second. They had benefited from freedom and openness, they had cat'ed out BASIC programs, they had viewed source, and they had an instinct toward openness. Combine that with the expedience of being able to use FLOSS (not having to call a lawyer before you could be an engineer), and with the rational calculus that if they made FLOSS, they could keep using the code they had made at Microsoft after they eventually left, and you had Microsoft's coders and Microsoft working toward different goals. And the way they expressed that was in how they used and licensed their code.
This worked so well that, for a long time, nobody even knew whether the GPL was enforceable, because nobody wanted to take the risk of suing and setting a bad precedent. It took years and years for us to find out in which jurisdictions we could enforce the GPL.
That brings me to another kind of computer regulation, something that has been bubbling along under the surface for a long time, at least since the Open Source wars, and that's the use of Digital Rights Management (DRM) or Digital Restrictions Management, as some people call it. This is the technology that tries to control how you use your computer. The idea is that you have software on the computer that the user can't override. If there is remote policy set on that computer that the user objects to, the computer rejects the user's instruction in favor of the remote policy. It doesn't work very well. It's very hard to stop people who are sitting in front of a computer from figuring out how it works and changing how it works. We don't keep safes in bank robbers' living rooms, not even really good ones.
But we have a law that protects it: the Digital Millennium Copyright Act (DMCA). It's been around since 1998 and it has lots of global equivalents, like Article 6 of the EUCD in Europe, implemented all across the EU member states. In New Zealand they tried to pass a version of the DMCA and there were uprisings and protests in the streets; they actually had to take the law off the books because it was so unpopular. And then the Christchurch earthquake hit, and a member of parliament reintroduced it as a rider to the emergency relief bill to dig people out of the rubble. In Canada it's Bill C-11 from 2011. And what it does is make it a felony to tamper with those locks, a felony punishable by a $500,000 fine and five years in jail for a first offense. It makes it a felony to do security auditing of those locks and to publish information about the flaws that are present in them or in the systems they protect.
This started off as a way to make sure that people who bought DVDs in India didn't ship them to America. But it is a bad idea whose time has come. It has metastasized into every corner of our world, because if you put just enough DRM around a product that you can invoke the law, then you can use other code, sitting behind the DRM, to control how the user uses that product and to extract more money. GM uses it to make sure that you can't get diagnostics out of the car without a tool that they license to you, and that license comes with a term that says you have to buy parts from GM. So all GM repair shops that can access your diagnostic information have to buy their parts from GM and pay monopoly rents.
We see it in insulin pumps, we see it in thermostats, and we see it in the "Internet of Things rectal thermometer," which debuted at CES this year, which means we now have DRM-restricted works in our asses. And it's come to the web. It's been lurking in the corners of the web for a long time, but now it's being standardized at the World Wide Web Consortium (W3C) as something called Encrypted Media Extensions (EME). The idea of EME is that there is conduct that users want to engage in that no legislature in the world has banned, like PVR'ing their Netflix videos, but there are companies that would prefer that conduct not be allowed. By wrapping the video with just enough DRM to invoke the DMCA, you can convert your commercial preference not to have PVRs (which are no more and no less legal than the VCR was when, in 1984, the Supreme Court said you can record video off your TV) into something with the force of law, whose enforcement you can outsource to national governments.
What that means is that if you want to do interoperability without permission, if you want to do adversarial interoperability, if you want to add a feature that the manufacturer or the value chain doesn't want, if you want to encapsulate Gopher inside of the Web to launch a web browser with content from the first day, if you want to add an abstraction layer that lets you interoperate between two different video products so that you can shop between them and find out which one has the better deal, then that conduct, which has never been banned by a legislature, becomes radioactively illegal.
It also means that if you want to implement something that users can modify, you will find yourself at the sharp end of the law, because user modifiability for the core components of the system is antithetical to its goal of controlling user conduct. If there's a bit the user can toggle that says "turn DRM off now," the entire system ceases to work. But the worst part of all is that it makes browsers into no-go zones for security disclosures about vulnerabilities in the browser, because if you know about a vulnerability, you could use it to weaken EME. But you could also use it to attack the user in other ways.
Adding DRM to browsers, standardizing DRM as an open standards organization, that's a compromise. It's a little compromise, because after all there's already DRM in the world, and it's a compromise that's rational if you believe that DRM is inevitable. If you think that the choice is between DRM that's fragmented or DRM that we get a say in, that we get to nudge into a better position, then it's the right decision to make. You get to stick around and do something to make it less screwed up later, as opposed to being self-marginalized by refusing to participate at all.
But if DRM is inevitable (and I refuse to believe that it is), it's because, individually, all across the world, people who started out with the best of intentions made a million tiny compromises that took us to the point where DRM became inevitable, where the computers that are woven into our lives, with increasing intimacy and urgency, are designed to control us instead of being controlled by us. And the reason those compromises were made is that each one of us thought that we were alone and that no one would have our back, that if we refused to make the compromise, the next person down the road would, and that eventually this would end up being implemented anyway. So why not be the one who makes the compromise now?
They were good people, those who made those compromises. They were people who were no worse than you and probably better than me. They were acting unselfishly. They were trying to preserve the jobs and livelihoods and projects of people that they cared about. People who believed that others would not back their play, that doing the right thing would be self-limiting. When we're alone, and when we believe we're alone, we're weak.
It's not unusual to abuse standards bodies to attain some commercial goal. The normal practice is to get standards bodies to incorporate your patents into a standard, to ensure that if someone implements your standard, you get a nickel every time it ships. And that's a great way to make rent off of something that becomes very popular. But the W3C was not arm-twisted into adding patents to its standards. That's because the W3C has the very best patent policy of any standards body in the world. When you come to the W3C to make a standard for the web, you promise not to use your patents against people who implement that standard. And the W3C was able to make that policy at a moment in which it was ascendant, in which people were clamoring to join it, in the first moments of the Web, when they were fresh.
The night they went on a diet, they were able to throw away all the Oreos in the house. They were where you are now, starting a project that people around the world were getting excited about, that was showing up on the front page of the New York Times. Now that policy has become the ironclad signifier of the W3C. What's the W3C? It's the open standards body that's so open, that you don't get to assert patents if you join it. And it remains intact.
How will we keep the DMCA from colonizing the locked-open Web? How will we keep DRM from affecting all of us? By promising to have each other's backs. By promising that by participating in the Open Web, we take the DMCA off the table. We take silencing security researchers off the table, we take blocking new entrants to the market off the table, now, when we are fresh, when we are insurgent, before we have turned from the pirates that we started out as into the admirals that some of us will become. We take that option off the table.
The EFF has proposed a version of this at the W3C and at other bodies, where we say: to be a member, you have to promise not to use the DMCA to aggress against those who report security vulnerabilities in W3C standards, or against people who make interoperable implementations of W3C standards. We've also proposed this to the FDA: as a condition of getting approval for medical implants, we've asked them to make companies promise, in a binding way, never to use the DMCA to aggress against security researchers. We've taken it to the FCC, and we're taking it elsewhere. If you want to sign an open letter to the W3C endorsing this, email me: cory@eff.org
But we can go further than that, because Ulysses pacts are fantastically useful tools for locking stuff open. It's not just the paper you sign when you start your job that takes a little bit of money out of your bank account every month for your 401(k), although that works, too. The U.S. Constitution is a Ulysses pact. It understands that lawmakers will be corrupted, and it establishes a principled basis for repealing the laws that are inconsistent with the founding principles, as well as a process for revising those principles as need be.
A society of laws is a lot harder to make work than a society of code or a society of people. If all you need to do is find someone who's smart and kind and ask them to make all your decisions for you, you will spend a lot less time in meetings and a lot more time writing code. You won't have to wrangle and flame or talk to lawyers. But it fails badly. We are all of us a mix of short-sighted and long-term, depending on the moment, our optimism, our urgency, our blood-sugar levels…
We must give each other moral support. Literal moral support, to uphold the morals of the Decentralized Web, by agreeing now on what an open internet is and locking it open. When we do that, if we create binding agreements to take certain kinds of conduct off the table for anything that interoperates with or is part of what we're building today, then our wise leaders tomorrow will never be pressured to make those compromises, because if the compromise can't be made, there is no point in leaning on them to make it.
We must set agreements and principles that allow us to resist the song of the Sirens in the future moments of desperation. And I want to propose two key principles, as foundational as life, liberty, and the pursuit of happiness or the First Amendment:
1) When a computer receives conflicting instructions from its owner and from a remote party, the owner always wins.
Systems should always be designed so that their owners can override remote instructions, and should never be designed so that remote instructions can be executed if the owner objects to them. Once you create the capacity for remote parties to override the owners of computers, you set the stage for terrible things to come. Any time there is a power imbalance, expect the landlord, the teacher, or the parent of the queer kid to use that capacity to enforce the imbalance, remotely controlling the device of the person they have power over.
You will create security risks, because as soon as you have a mechanism, hidden from the user, for running code on the user's computer, anyone who hijacks that mechanism, either by presenting a secret warrant or by exploiting a vulnerability in the system, will be running in a privileged mode that is designed not to be interdicted by the user.
If you want to make sure that people show up at the door of the Distributed Web asking for backdoors, to the end of time, just build in an update mechanism that the user can't stop. If you want to stop those backdoor requests from coming in, build in binary transparency, so that any time an update ships to one user that's materially different from the ones everyone else gets, everybody gets notified and your business never sells another product. Your board of directors will never pressure you to go along with the NSA or the Chinese secret police and add a backdoor, if doing so will immediately shut down your business.
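To make "binary transparency" concrete: the vendor publishes the hash of every build it ever ships to a public, append-only log, so an update that went to only one user stands out instead of installing silently. Below is a minimal sketch of that check under stated assumptions; the log contents, hash value, and file name are hypothetical, and a real system would fetch and verify the log rather than hard-code it.

```python
# Minimal sketch of a binary-transparency check: an update is acceptable only if its
# hash appears in the vendor's public, append-only log of all shipped releases.
# The hash, log contents, and file name below are hypothetical, for illustration only.
import hashlib
import os

# In a real deployment this set would be fetched from (and verified against) a public log.
PUBLISHED_RELEASE_HASHES = {
    "3f79bb7b435b05321651daefd374cdc681dc06faa65e374e38337b88ca046dea",
}

def sha256_of_file(path: str) -> str:
    """Hash the update file in chunks so large binaries don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def update_is_transparent(path: str) -> bool:
    """True only if this exact binary was published to the transparency log.

    A special build shipped to a single user would not appear in the log,
    so it gets flagged instead of silently installed.
    """
    return sha256_of_file(path) in PUBLISHED_RELEASE_HASHES

if __name__ == "__main__":
    update_path = "update.bin"  # hypothetical file name
    if os.path.exists(update_path):
        if update_is_transparent(update_path):
            print("Update matches a published release.")
        else:
            print("Update is not in the transparency log; refusing to install.")
```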
Throw away the Oreos now.
Let's also talk about the Computer Fraud and Abuse Act. This is the act that says that if you exceed your authorization on someone else's computer (where that authorization can be defined simply by the terms of service you click through on your way into using a common service), you commit a felony and can go to jail. Let's throw that away, because it's being used routinely to shut down people who discover security vulnerabilities in systems.
2) Disclosing true facts about the security of systems that we rely upon should never, ever be illegal.
We can have normative ways and persuasive ways of stopping people from disclosing recklessly, we can pay them bug bounties, we can have codes of conduct. But we must never, ever give corporations or the state the legal power to silence people who know true things about the systems we entrust our lives, safety, and privacy to.
These are the foundational principles: computers obey their owners, and true facts about risks to users are always legal to talk about. And I charge you to be hardliners on these principles, to be called fanatics. If they are not calling you puritans for these principles, you are not pushing hard enough. If you computerize the world and you don't safeguard the users of computers from coercive control, history will not remember you as the heroes of progress, but as the blind handmaidens of future tyranny.
This internet, this distributed internet that we are building, the Redecentralization of the Internet, if it ever succeeds, will someday fail, because everything fails, because overwhelmingly, things are impermanent. What it gives rise to next is a function of what we make today. There's a parable about this:
The state of Roman metallurgy in the era of chariots determined the wheelbase of a Roman chariot, which determined the width of the Roman road, which determined the width of the contemporary road (because contemporary roads were built atop the ruins of the Roman roads), which determined the wheelbase of cars, which determined the widest size you could have for a container that can move from a ship to a truck to a train, which determined the size of a train car, which determined the maximum size of the Space Shuttle's disposable rockets.
Roman metallurgy prefigured the size of the Space Shuttle's rockets.
This is not entirely true; there are historians who will gladly explain the ways in which it's not true. But it is a parable about what happens when empires fall. Empires always fall. If you build a glorious empire, a good empire, an empire we can all be proud to live in, it will someday fall. You cannot lock it open forever. The best you can hope for is to wedge it open until it falls, and to leave behind the materials, the infrastructure, that the people who reboot the civilization that comes after ours will use to make a better world.
A legacy of technology, norms, and skills that embrace fairness, freedom, openness, and transparency is a commitment to care about your shared destiny with every person alive today and all the people who will live in the future.
Cory Doctorow: "How Stupid Laws and Benevolent Dictators can Ruin the Decentralized Web, too"
[Transcript by Jonke Suhr]