Using distributed code signatures to make it much harder to order secret backdoors

Cothority is a new software project that uses "multi-party cryptographic signatures" to make it dramatically harder for governments to order companies to ship secret, targeted backdoors for their products as innocuous-looking software updates.

It's a form of "software transparency," whereby the existence of any given update becomes much harder to keep secret. Cothority would change the way companies like Apple sign their code — rather than a single signature from Apple validating code before a device is willing to install it, the Cothority system requires that Apple's signature be accompanied by a quorum of signatures from third parties in multiple jurisdictions, each attesting that Apple asked them to sign the update.
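
Here's a minimal sketch of the verification rule a device would enforce under such a scheme, written in Go (the language Cothority itself is implemented in) but using none of Cothority's actual APIs: the vendor's signature alone no longer suffices, and a threshold of valid witness cosignatures must also check out. The real protocol aggregates the witnesses' signatures into one compact collective signature rather than verifying thousands individually; plain Ed25519 keeps the logic visible.

```go
// Hypothetical sketch only: not Cothority's real API. A device accepts
// an update iff the vendor signed it AND at least `threshold` distinct
// witnesses from a publicly known roster cosigned it.
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"fmt"
)

// witnessSig pairs a witness's position in the roster with that
// witness's signature over the update binary (illustrative structure).
type witnessSig struct {
	index int
	sig   []byte
}

// acceptUpdate enforces both checks: vendor signature plus quorum.
func acceptUpdate(update []byte, vendorKey ed25519.PublicKey, vendorSig []byte,
	roster []ed25519.PublicKey, cosigs []witnessSig, threshold int) bool {
	// The vendor's own signature is still required, exactly as today.
	if !ed25519.Verify(vendorKey, update, vendorSig) {
		return false
	}
	// Count valid signatures from distinct roster members.
	seen := make(map[int]bool)
	valid := 0
	for _, ws := range cosigs {
		if ws.index < 0 || ws.index >= len(roster) || seen[ws.index] {
			continue // out of range, or witness already counted
		}
		if ed25519.Verify(roster[ws.index], update, ws.sig) {
			seen[ws.index] = true
			valid++
		}
	}
	return valid >= threshold // e.g. 4,000 of 8,000
}

func main() {
	update := []byte("ios-update.img")
	vendorPub, vendorPriv, _ := ed25519.GenerateKey(rand.Reader)

	// Three witnesses with a 2-of-3 quorum, standing in for 4,000 of 8,000.
	const n, t = 3, 2
	roster := make([]ed25519.PublicKey, n)
	privs := make([]ed25519.PrivateKey, n)
	for i := range roster {
		roster[i], privs[i], _ = ed25519.GenerateKey(rand.Reader)
	}

	vendorSig := ed25519.Sign(vendorPriv, update)
	cosigs := []witnessSig{
		{index: 0, sig: ed25519.Sign(privs[0], update)},
		{index: 2, sig: ed25519.Sign(privs[2], update)},
	}
	fmt.Println(acceptUpdate(update, vendorPub, vendorSig, roster, cosigs, t)) // true
}
```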

If there were, say, 8,000 potential signatories, of whom 4,000 were required to sign an update before a device trusted it, then a government that tried to pressure Apple into keeping the existence of a targeted backdoor secret would also have to get 4,000 other people, organizations, or companies to keep that secret. If any of those entities leaked the fact that they were asked to sign an update that never appeared for most users, it would be apparent that Apple had targeted an update at a small number of users — itself a strong indicator that they'd made a backdoor.
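
The detection side is just as simple to picture. Assuming, hypothetically, that each witness publishes a log of every update hash it was asked to cosign, anyone can diff that log against the vendor's public release list; a cosigned hash that never shipped to the general public is the tell-tale of a targeted update. A toy version of that check, with invented names throughout:

```go
// Toy transparency check: flag update hashes that witnesses were asked
// to cosign but that never appeared among the vendor's public releases.
// All names here are invented for illustration.
package main

import "fmt"

// suspiciousUpdates returns every cosigned hash that is absent from
// the public release catalog.
func suspiciousUpdates(witnessLog []string, publicReleases map[string]bool) []string {
	var flagged []string
	for _, h := range witnessLog {
		if !publicReleases[h] {
			flagged = append(flagged, h)
		}
	}
	return flagged
}

func main() {
	witnessLog := []string{"hash-A", "hash-B", "hash-C"}
	publicReleases := map[string]bool{"hash-A": true, "hash-C": true}
	// hash-B was cosigned but never publicly released: strong evidence
	// of a targeted, and likely backdoored, update.
	fmt.Println(suspiciousUpdates(witnessLog, publicReleases)) // [hash-B]
}
```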

It's a bit of game theory. As I've written before, the point of this kind of thing is to keep governments from even trying to put secret pressure on tech companies, because the system is set up so that the secret immediately gets out. Economists call it a "Ulysses pact," named for Ulysses' tactic of lashing himself to the mast when his ship passed by the sirens, so their songs couldn't lure him into jumping overboard — by (literally) tying his hands, he entered into a situation knowing that certain actions were off the table. This is used all the time in negotiating — for example, a union negotiator may say, "I'll resign before I accept rollbacks on pensions." The management rep can bluster all they want about rollbacks, but the negotiator can say, "Sorry, if it's rollbacks or nothing, then I have to quit and you'll have to wait until a new negotiator is chosen. I literally can't sign a deal with rollbacks in it."

Using Cothority means trading short bursts of inconvenience (having to muster a quorum every time you want to ship an update) against the long-term, terrible pain of fighting a state-level actor that tries to use secret orders to force you to do something that, if it became public, could wound or even kill your business.

Companies in general are pretty bad at weighing short-term convenience against long-term pain, and public companies (focused on quarterly earnings statements) are even worse. That failing is behind the climate crisis, pollution, bad labor practices, Dieselgate, and so many of our other heartaches. Nevertheless, the existence of a tool changes the facts on the ground: once tools like Cothority exist, the decision not to use them becomes, in effect, a sign saying "We're open for business when it comes to secret wiretap orders."

Note that Cothority would do nothing in the current Apple v FBI mess. In that case, Apple is being ordered, in open court, to produce a signed update to help backdoor a device; software transparency exposes secret updates, not public, court-ordered ones.

Not everyone agrees that Cothority will solve the problem of government-ordered backdoors, though. "Ultimately it's a hurdle that our legal system could still abuse," Jonathan Zdziarski, an iOS forensics expert, told Ars. "There are plenty of cases where false witness has been given in the real world. Even worse, the idea would give a false sense of security to people that the system was not rigged, when indeed it can most certainly still be rigged, so it reinforces a system that could in fact be broken."

Bryan Ford, one of the researchers behind Cothority, acknowledged as much. Cothority does not make it impossible for a powerful adversary to compel Apple, or another software maker, to issue a backdoored software update in secret, Ford said, but it does make it much more difficult. A nation-state attacker could, in theory, bribe thousands of witnesses, or coerce them into signing a targeted software update in secret. Or such an attacker could hack those witnesses' computers and issue fake signatures in their names.

Given the declining half-life of secrets, though, it seems likely that any such coercion, bribery, or hacking would eventually come to light—defeating the point of doing so in the first place.

Ford also pointed out that Cothority can't defend against a "bug door" slipped into iOS by, say, an undercover NSA employee working for Apple. Nor can it prevent the government from coercing Apple into backdooring all iOS devices: an update that ships to every user looks just like a normal release, so the witnesses would have nothing suspicious to report.

Cothority to Apple: Let's make secret backdoors impossible
[JM Porup/Ars Technica]

Apple, FBI, and Software Transparency
[Bryan Ford/Freedom to Tinker]

Keeping Authorities "Honest or Bust" with Decentralized Witness Cosigning
[Ewa Syta, Iulia Tamas, Dylan Visher, David Isaac Wolinsky, Linus Gasser, Nicolas Gailly, and Bryan Ford/37th IEEE Symposium on Security and Privacy]