Apple will scan iPhones for known sexually explicit images of children and for messages to children that contain sexually explicit language. The move pleases child protection groups, but the privacy ramifications alarm everyone else, and it undermines the company's efforts to present itself as a privacy-conscious alternative to Google and Facebook. The Electronic Frontier Foundation argues that the scanning cannot be accomplished without backdoors that governments and hackers will inevitably exploit.
Apple is planning to build a backdoor into its data storage system and its messaging system.
Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly scoped backdoor is still a backdoor.
To say that we are disappointed by Apple's plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple's compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company's leadership in privacy and security.
The specific database of image fingerprints used here cannot be used for anything else, but once the policy precedent is set, courts and politicians will demand broader use on principle and the technology will follow. Steven Murdoch, a Professor of Security Engineering at UCL, writes that British ISPs were made to do this a few years ago for child-abuse images, and now pictures of counterfeit watches are up for scanning: a perfect example of the kind of trivial bullshit surveillance that all such noble intents collapse into.