Ever, an "unlimited photo storage app," secretly fed its users' photos to a face-recognition system pitched to military customers (UPDATED)

Update: I've been emailed twice by Ever PR person Doug Aley, who incorrectly claimed that Ever's signup notice informed users that their data was going to be used to train an AI that would be marketed for military applications. It's true that during the signup process, users are asked whether they want to "use" facial recognition (that is, to label their images), but not whether they consent to having their images used to train that system, and especially not for commercial military applications.

Ever is an app that promises that you can "capture your memories" with unlimited photo storage, with sample albums featuring sentimental photos of grandparents and their grandkids; but Ever's parent company has another product, Ever AI, a facial recognition system pitched at military agencies to conduct population-scale surveillance. Though Ever's users' photos were used to train Ever AI, Ever AI's sales material omits this fact — and the only way for Ever users to discover that their photos have become AI training data is to plough through a 2,500-word "privacy policy."


Ever AI launched after the company realized that photo hosting "wasn't going to be a venture-scale business," and the switch to Ever AI brought in $16 million in venture capital.

Ever says that it has not deceived its users because the facial recognition disclosure is in its privacy policy; the company also told NBC that it would revise the policy to make the facial recognition applications clearer, while insisting that it was already clear enough. NBC spoke to multiple Ever users who had no idea their photos had been used to train a facial recognition system, and who strenuously objected to this use.


Despite targeting Ever AI at the military, the company has only managed to sign private customers to date, including the makers of Pepper, the creepy "customer service robot."

Cities across America are debating bans on the use of facial recognition for law enforcement, recognizing that these systems are both inaccurate and unconstitutional: they subject whole populations to surveillance without particularized suspicion, in defiance of the presumption of innocence.

The Ever story is a parable about the reason that companies should be held to account for their terms of service. Companies like smart-lock vendor Latch say that they never plan on using the surveillance data their terms of service give them the right to collect, but these companies are venture backed and if they don't return hockey-stick growth, they will either lose their capital or have their founders forced out and replaced with execs who are under orders to use any means necessary to secure the returns the investors were betting on. If the company has the right on paper to destroy your life by selling off your private data, you should assume that someday, someone will be running that company who will do just that.

Terms of service are overbroad because companies' investors want to maintain the flexibility to abuse them later. It's similar to investors' insistence that companies file overbroad, bullshitty software patents, which the founders insist are only ornamental flourishes to please their backers (or defensive weapons to protect them against patent trolls). But the real reason investors like shitty patents is that there is a vibrant market for low-quality patents that can be weaponized by patent trolls, and since most venture-backed companies fail, these patents represent a way to recoup the likely losses from tech investments.

NBC News spoke to seven Ever users, and most said they were unaware their photos were being used to develop face-recognition technology.

Sarah Puchinsky-Roxey, 22, from Lemoore, California, used an expletive when told by phone of the company's facial recognition business. "I was not aware of any facial recognition in the Ever app," Roxey, a photographer, later emailed, noting that she had used the app for several years. "Which is kind of creepy since I have pictures of both my children on there as well as friends that have never consented to this type of thing."

She said that she found the company's practices to be "invasive" and has now deleted the app.

Millions of people uploaded photos to the Ever app. Then the company used them to develop facial recognition tools. [Olivia Solon/NBC]


(via JWZ)