War on General Purpose Computers is the difference between utopia and dystopia

My Wired op-ed, How Laws Restricting Tech Actually Expose Us to Greater Harm, warns that we've learned the wrong lesson from the DRM wars: we've legitimized the idea that we can and should design computers to disobey their owners and hide their operations from them in order to solve our problems (and that we should protect this design decision by making it a felony to disclose flaws in devices, lest these flaws be used to jailbreak them).

This was bad enough when it was DVD players that refused to skip ads, but once we extend this model to our autonomous vehicles, medical implants, and home-automation systems, we set up a situation where spies, crooks, cops and snoops can do unlimited harm to us, and where we're not allowed to do anything to protect ourselves.

If those million-eyed, fast-moving, deep-seated computers are designed to obey their owners; if the policy regulating those computers encourages disclosure of flaws, even if they can be exploited by spies, criminals, and cops; if we're allowed to know how they're configured and permitted to reconfigure them without being overridden by a distant party—then we may enter a science fictional world of unparalleled leisure and excitement.

But if the world's governments continue to insist that wiretapping capacity must be built into every computer; if the state of California continues to insist that cell phones have kill switches allowing remote instructions to be executed on your phone that you can't countermand or even know about; if the entertainment industry continues to insist that the general-purpose computer must be neutered so you can't use it to watch TV the wrong way; if the World Wide Web Consortium continues to infect the core standards of the web itself to allow remote control over your computer against your wishes—then we are in deep, deep trouble.

The Internet isn't just the world's most perfect video-on-demand service. It's not simply a better way to get pornography. It's not merely a tool for planning terrorist attacks. Those are only use cases for the net; what the net is, is the nervous system of the 21st century. It's time we started acting like it.

How Laws Restricting Tech Actually Expose Us to Greater Harm

(Image: Matt Dorfman)

Notable Replies

  1. Making the case to us doesn't really help, because Boing Boing has limited clout. What I suspect needs to happen is that the EFF (or somebody like them) needs to put together a kind of counter-ALEC to lobby Congress. Since Citizens United, nothing matters except lobbying dollars in the US. If a trade group got on board and could leverage the lobbying dollars of companies like Google and Microsoft and whoever else stands to gain from unfettered computing, there would be a voice to counter the MPAA and their ilk.

  2. We need a combined approach, swarming the problem from multiple sides: the lawyer types to argue with the policymakers, the PR types to inform the masses, and the tech types to go the civil-disobedience route.

    Most important, from my position, we need tools that make it easy to reverse-engineer large, opaque binary blobs. Disassemblers with facilities for visualising large chunks of code, so you can quickly grasp the structure of its execution flow and find the critical conditional-jump instruction to replace with a no-operation or an unconditional jump. Or find the places where a call to a "friendly" subroutine can be injected.
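    A minimal sketch of that kind of patch, on hypothetical toy bytes (in practice you'd locate the target offset with a disassembler first; the bytes here are made up for illustration):

    ```python
    # Replace an x86 conditional jump (JZ rel8, opcode 0x74) with NOPs
    # (0x90) so the branch can never be taken.
    blob = bytearray(b"\x55\x48\x89\xe5\x74\x0a\xb8\x01\x00\x00\x00\xc3")
    offset = 4                       # found beforehand with a disassembler
    assert blob[offset] == 0x74      # sanity check: this really is a JZ
    blob[offset:offset + 2] = b"\x90\x90"  # opcode + displacement -> NOP NOP
    print(blob.hex())
    ```

    The same two-byte overwrite turned into `\xeb` + displacement would force the jump instead of suppressing it.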

    Because possible/impossible matters even more than legal/illegal; the legal is only a subset of the possible, since the impossible cannot be done even where it is legal. Tools that make the impossible possible are therefore more important than legal tools that make the illegal legal; not that the latter are unimportant, though!

    The tech is ours. We just have to actually realize it, and take it back. By the force of our brains, if necessary.

    Remember the companion image to the one in the article:

  3. j127 says:

    If people want control over their technology, they should be fighting for root access on all devices by default. I don't want to risk bricking my phone or tablet to get root access.

    It should be easy for me to install software that gives me fine grained privacy controls over apps so that I don't have to share my entire web browsing history and online life with a company just to read their content.

    App stores are a scheme to limit software distribution. That trend also needs to disappear.

    P.S., why does Boing Boing BBS need access to my Twitter private messages for me to post on the board via Twitter login?

  4. That feeling of rush of power, that's hard to beat! :smiley:

    On the contrary, I consider understanding the blob to be of higher importance. High-end adversaries will be able to forge the certs, which weakens their value for security (though they can still hold up somewhat against lower-end threats; but those will likely find other ways in, most often by exploiting a social-engineering vulnerability in the user).

    Knowing by its signature that a piece of bad code comes from Sony is of little use compared with the ability to detect and analyze the code's bad behavior. If something interesting in the code suggests it's worth doing, attribution can be handled by alternative mechanisms, e.g. by comparing hashes of files obtained from a known source (ideally a CD, or a download from a known-good server, made by many other people).
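    That hash comparison needs nothing beyond the standard library; a sketch (the file contents here are a made-up stand-in for a downloaded blob):

    ```python
    import hashlib
    import os
    import tempfile

    def sha256_of(path, chunk=1 << 16):
        """Stream a file through SHA-256 so large blobs don't fill memory."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    # Demo with a throwaway file standing in for the suspect binary:
    with tempfile.NamedTemporaryFile(delete=False) as f:
        f.write(b"firmware bytes")
        path = f.name
    digest = sha256_of(path)
    os.unlink(path)
    # Compare `digest` against one published by an independent source
    # (pressed media, or several unrelated mirrors) before trusting the file.
    ```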

    At the same time, understanding the code and the ability to alter it give you more power over your own machine. If the code contains something wrong, whether an intentional exploit or a restriction you are not comfortable with (the latter I consider much more important for us all), you first have to find it. For that you need to visualize the code somehow (a small screen full of tiny instructions takes too long to reveal the structure, though it is vital in the final phases of the attack). Then, once you've localized the code path you want to force or eliminate, you can do so. However, that destroys the signature, so you need either your own mechanism for local signing or a way to disable enforcement of signatures.

    An advanced add-on for the visualisation system could be binary comparison. Decompile two blobs into instruction streams and highlight the sequences whose meaning differs (we're dealing with compiler optimizations here, which can cause syntactic but not semantic differences). That would let you see whether an update is trying to sneak in something stinky, without having to re-analyze the code already reviewed in the previous version.
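    A crude sketch of that comparison, diffing hypothetical instruction listings with Python's difflib (a real tool would first normalize addresses and register allocation, for exactly the optimization reasons above, before diffing):

    ```python
    import difflib

    # Made-up listings of the same routine from two firmware versions.
    old = ["push rbp", "mov rbp, rsp", "cmp eax, 0", "je done", "call log_event"]
    new = ["push rbp", "mov rbp, rsp", "cmp eax, 0", "je done", "call phone_home"]

    # unified_diff yields only the hunks that changed, with context lines,
    # so the reviewer's attention goes straight to the new behavior.
    diff = list(difflib.unified_diff(old, new, "v1", "v2", lineterm=""))
    print("\n".join(diff))
    ```

    Anything surfacing as a `+` line in such a diff is new code that wasn't in the version already vetted.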

    The problem here is that good verification can also be used for usage restrictions if it's integrated too tightly into the system. I would prefer the ability to boot the device from a trusted system (we're back in verification territory, but that can be handled via hashes) and then do the checks. That gives you the ability to run whatever you want while keeping track of the changes made, whether by the owner, by an adversary, or by errors.

    TPM is good as long as we, the possessors of the machines (regardless of what the lawyers say about "ownership", to avoid loopholes with rent-only arrangements), have full control over the module. Which, I am afraid, may not always be the case, given the rich and powerful interests pushing the other way. A nice compromise could be an open TPM module that looks like an original one but allows access to its firmware and protected storage through a connector on its back (and allows verification of the data via the same connector, which could be e.g. JTAG). By making it look like the original, we would preserve user accessibility even for software that wants to deny it to us, by making that software think it is interacting with the kind of chip that is not supposed to let us in.

    Various code-signing mechanisms are also already abused in the field of firmware updates, making jailbreaks somewhat annoying and, in some cases, too difficult to bother with. :frowning:

  5. 100% agree, and I think that's a nuance the original article from @doctorow contained. I wish I had a genuinely new perspective to add instead of just regurgitating.

    Anyway, static analysis tools are lacking, and if you figure out a way of deterministically outputting a graph of what an app will do that a CS master's grad can understand, I guarantee you will be a billionaire. (Please do it, I seriously need a toolchain like this. PLEASE!)
