Apple pauses plan to scan iPhones for child abuse images

Apple announced that it planned to scan images on iPhones to check whether they matched the signatures of known child abuse images. Though iCloud Mail (like many cloud storage services) is already subject to such scanning, the fact that the scanning would happen on users' own phones alarmed users and privacy groups. Even if the initial implementation was limited, it was clear the policy had been made under political pressure and would open the door to more on-device surveillance. Apple seemed contemptuous of the backlash at first, circulating an internal memo that quoted an activist partner describing it as "the screeching voices of a minority." But with its own employees joining the "screeching" and the company's hard-earned reputation for user privacy burning to the ground, it has, for now, paused its plans.
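For the unfamiliar, "matching the signatures of known images" means comparing a fingerprint of each photo against a database of fingerprints of known material, rather than having software judge image content directly. Apple's proposed system used a perceptual hash (NeuralHash) with cryptographic blinding so the device could not read the database; the sketch below is a deliberately simplified illustration of the matching idea only, using an ordinary cryptographic hash and a hypothetical known_signatures set standing in for the real database.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known images. (A real system
# uses perceptual hashes, which also match re-encoded or resized copies;
# a cryptographic hash like SHA-256 only matches byte-identical files.)
known_signatures = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def signature(path: Path) -> str:
    """Return a hex fingerprint of the file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan(photos_dir: Path) -> list[Path]:
    """Return the photos whose fingerprints appear in the known set."""
    return [p for p in sorted(photos_dir.glob("*.jpg"))
            if signature(p) in known_signatures]

if __name__ == "__main__":
    for match in scan(Path("photos")):
        print(f"match: {match}")
```

The controversy was not the matching technique itself, which services have long run server-side, but that this check would run on the user's own device.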

From Apple's statement:

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."