Apple's letter explaining why it won't give the FBI a backdoor to the iPhone

Apple has published a letter explaining why it's not complying with the FBI's request to develop a unique version of iOS that would make it trivially easy for the FBI to unlock an iPhone. The FBI has a terrible record of abusing its power, and Apple doesn't want to enable that abuse:

[T]he order would set a legal precedent that would expand the powers of the government and we simply don't know where that would lead us. Should the government be allowed to order us to create other capabilities for surveillance purposes, such as recording conversations or location tracking? This would set a very dangerous precedent.

Law enforcement agents around the country have already said they have hundreds of iPhones they want Apple to unlock if the FBI wins this case. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks. Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals. As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks.

Again, we strongly believe the only way to guarantee that such a powerful tool isn't abused and doesn't fall into the wrong hands is to never create it.

Check out this post from James Comey, director of the FBI, which appeared on the Lawfare blog on Sunday:

We simply want the chance, with a search warrant, to try to guess the terrorist's passcode without the phone essentially self-destructing and without it taking a decade to guess correctly. That's it. We don't want to break anyone's encryption or set a master key loose on the land. I hope thoughtful people will take the time to understand that.

This is a joke. There are only a million possible combinations for a six-digit passcode. The FBI wants Apple to create a custom version of iOS that will allow the FBI to enter passcode guesses electronically, as fast as the iPhone can process them. The FBI would get the unlock code in less than 24 hours, and they know it. There is no "try."
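That math is easy to check. iOS performs roughly 80 milliseconds of key derivation per passcode attempt on the device (the figure Apple's iOS Security Guide cites; treat the exact number as an assumption), so a back-of-the-envelope sketch:

```python
# Worst-case brute-force time for a 6-digit numeric passcode,
# assuming ~80 ms of on-device key derivation per attempt.
SECONDS_PER_GUESS = 0.08  # assumed per-guess cost

combinations = 10 ** 6  # six digits, each 0-9
worst_case_hours = combinations * SECONDS_PER_GUESS / 3600

print(f"worst case: {worst_case_hours:.1f} hours")  # ~22 hours
```

Even the worst case comes in under a day, which is why "try to guess" is such an understatement.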

This article from The Intercept shows you how to set up your phone to use an 11-digit passcode, which would "take up to 253 years, and on average 127 years, to crack."
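Assuming the same ~80 milliseconds of on-device key derivation per passcode attempt (a commonly cited figure from Apple's iOS Security Guide, not something stated in the article), the same arithmetic reproduces The Intercept's numbers:

```python
# Brute-force time for an 11-digit numeric passcode,
# assuming ~80 ms of key derivation per guess.
SECONDS_PER_GUESS = 0.08  # assumed per-guess cost
SECONDS_PER_YEAR = 365 * 24 * 3600

combinations = 10 ** 11  # eleven digits, each 0-9
worst_case_years = combinations * SECONDS_PER_GUESS / SECONDS_PER_YEAR
average_years = worst_case_years / 2  # on average the right code turns up halfway through

print(f"worst case ~{worst_case_years:.1f} years, average ~{average_years:.1f} years")
```

That works out to roughly 253 years worst case and 127 years on average, matching the article's figures.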

From The Intercept:

Here are a few final tips to make this long-passcode thing work better:

  • Within the "Touch ID & Passcode" settings screen, make sure to turn on the Erase Data setting to erase all data on your iPhone after 10 failed passcode attempts.
  • Make sure you don't forget your passcode, or you'll lose access to all of the data on your iPhone.
  • Don't use Touch ID to unlock your phone. Your attacker doesn't need to guess your passcode if she can push your finger onto the home button to unlock it instead. (At least one court has ruled that while the police cannot compel you to disclose your passcode, they can compel you to use your fingerprint to unlock your smartphone.)
  • Don't use iCloud backups. Your attacker doesn't need to guess your passcode if she can get a copy of all the same data from Apple's server, where it's no longer protected by your passcode.
  • Do make local backups to your computer using iTunes, especially if you are worried about forgetting your iPhone passcode. You can encrypt the backups, too.

If you choose a strong passcode, the FBI shouldn't be able to unlock your encrypted phone, even if it installs a backdoored version of iOS on it. Not unless it has hundreds of years to spare.

Image: iPhone Fiasco