Amazon secretly planned to use facial recognition and Ring doorbells to create neighborhood "watch lists"

Ring is Amazon's surveillance doorbell division, and a big part of its sales strategy involves terrifying people about the possibility of crime, partnering with police who help terrorize Ring owners, and providing police with warrantless, permanent, shareable access to surveillance doorbell footage (something the company has repeatedly lied about). Hundreds of police departments have now partnered with Ring, acting as buzz-marketing teams for the company in exchange for freebies and access.

From the earliest days, it's been rumored that Ring's strategy included facial recognition (Amazon has a giant facial recognition division called "Rekognition"). Amazon denied this even as they advertised for and hired a head of facial recognition research for Ring.

Now, a leak reported by The Intercept reveals that Amazon once had a secret plan to use Ring cameras and facial recognition to automatically compile a "watch list" of neighborhood undesirables whose presence triggers alerts to Ring owners. The blacklists would be distributed through Amazon/Ring's "Neighbors" app, which is currently a dumpster fire of racist white people sharing alarmed messages about brown people their surveillance doorbells recorded in their neighborhoods.

It's not entirely clear how the "watch lists" would be compiled, but the leaked documents describe using AI to identify "suspicious activity" — something machine learning systems cannot actually do, though many vendors claim otherwise.

According to the Ring documents reviewed by The Intercept, which have not been previously reported, the company planned a string of potentially invasive new surveillance features for its product line, of which the facial recognition-based watch-list system is one part.

In addition to the facial watch lists, Ring has also worked on a so-called suspicious activity prompt feature that would alert users via in-app phone notification when a "suspicious" individual appears in their property's video feeds. In one document, this feature is illustrated with a mockup of a screen in the Neighbors app, showing a shabbily dressed man walking past a Ring owner's garage-mounted camera. "Suspicious Activity Suspected," warns the app. "This person appears to be acting suspicious. We suggest alerting your neighbors." The app then offers a large "Notify Neighbors" button. How exactly "suspicious" would be defined is left a mystery.

A third potentially invasive feature referenced in the Ring documents is the addition of a "proactive suspect matching" feature, described in a manner that strongly suggests the ability to automatically identify people suspected of criminal behavior — again, whether by police, Ring customers, or both is unclear — based on algorithmically monitored home surveillance footage. Ring is already very much in the business of providing — with a degree of customer consent — valuable, extrajudicial information to police through its police portal. A "proactive" approach to information sharing could mean flagging someone who happens to cross into a Ring video camera's frame based on some cross-referenced list of "suspects," however defined. Paired with the reference to a facial recognition watch list and Ring's generally cozy relationship with local police departments across the country, it's easy to imagine a system in which individuals are arbitrarily profiled, tracked, and silently reported upon based on a system owned and operated solely by Amazon, without legal recourse or any semblance of due process. Here, says Mohammad Tajsar, an attorney with the ACLU of Southern California, "Ring appears to be contemplating a future where police departments can commandeer the technology of private consumers to match 'suspect' profiles of individuals captured by private cameras with those cops have identified as suspect — in fact, exponentially expanding their surveillance capabilities without spending a dime."

Amazon's Ring Planned Neighborhood "Watch Lists" Built on Facial Recognition [Sam Biddle/The Intercept]

(Image: Cryteria, CC BY)