Amazon has been quietly selling its facial recognition system to US police forces, marketing it for bodycam use

Amazon bills Rekognition, its image-recognition product, as a "deep learning-based image and video analysis" service; it markets that service to US police forces for analyzing security camera footage, including feeds from police officers' bodycams.

The marketing materials circulated to law enforcement touted Rekognition's ability to identify up to 100 individuals in a single photo. Amazon bound the cities it pitched with nondisclosure agreements, and cities have cited these NDAs in denying public records requests for details about their plans to use Rekognition.
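For context on what that advertised capability looks like in practice, here is a minimal sketch of calling the public Rekognition API through the boto3 SDK to detect faces in a single image. The bucket and file names are hypothetical placeholders, and this illustrates only the published DetectFaces operation, not any police deployment.

```python
import boto3

# Minimal sketch of the public Rekognition DetectFaces call via boto3.
# The bucket and object names are hypothetical placeholders; AWS
# credentials are assumed to be configured in the environment.
rekognition = boto3.client("rekognition", region_name="us-west-2")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-camera-stills", "Name": "frame-0001.jpg"}},
    Attributes=["DEFAULT"],
)

# Rekognition returns one entry per detected face (up to 100 per image).
print(f"Faces detected: {len(response['FaceDetails'])}")
```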


Amazon's law enforcement material suggests that its tool could be used to identify "persons of interest": not wanted criminals or even rehabilitated felons, but (for example) protesters and activists whom police intelligence units have decided to target for continuous scrutiny.
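To make that "persons of interest" scenario concrete: Rekognition's documented SearchFacesByImage operation matches a probe photo against a previously indexed collection of faces. The sketch below, with a hypothetical collection name and threshold, shows how such a watchlist-style lookup would be wired up with boto3; it illustrates the published API, not Amazon's or any agency's actual configuration.

```python
import boto3

# Hypothetical sketch: match a probe image against a pre-indexed face collection.
# Collection, bucket, and object names are placeholders, not real deployments.
rekognition = boto3.client("rekognition", region_name="us-west-2")

response = rekognition.search_faces_by_image(
    CollectionId="example-persons-of-interest",  # collection built earlier with IndexFaces
    Image={"S3Object": {"Bucket": "example-camera-stills", "Name": "frame-0002.jpg"}},
    MaxFaces=5,               # return at most five candidate matches
    FaceMatchThreshold=80,    # minimum similarity score to report
)

for match in response["FaceMatches"]:
    face_id = match["Face"]["FaceId"]
    similarity = match["Similarity"]
    print(f"Candidate match {face_id}: similarity {similarity:.1f}%")
```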

Amazon built its law-enforcement marketing through a partnership with the city of Orlando, Florida, where it ran a "proof of concept trial." It then cultivated a network of city procurement officials and encouraged its existing customers to help it pitch the service to other cities.

Amazon's public list of municipal and law-enforcement surveillance customers includes Orlando and the Washington County Sheriff's Office in Oregon.


With Rekognition, a government can now build a system to automate the identification and tracking of anyone. If police body cameras, for example, were outfitted with facial recognition, devices intended for officer transparency and accountability would further transform into surveillance machines aimed at the public. With this technology, police would be able to determine who attends protests. ICE could seek to continuously monitor immigrants as they embark on new lives. Cities might routinely track their own residents, whether they have reason to suspect criminal activity or not. As with other surveillance technologies, these systems are certain to be disproportionately aimed at minority communities.

Because of Rekognition's capacity for abuse, we asked Washington County and Orlando for any records showing that their communities had been provided an opportunity to discuss the service before its acquisition. We also asked them about rules governing how the powerful surveillance system could be used and ensuring rights would be protected. Neither locality identified such records. In fact, Washington County began using Rekognition even as employees raised questions internally. In one email, a Washington County employee expressed the concern that the "ACLU might consider this the government getting in bed with big data." That employee's prediction was correct.

People should be free to walk down the street without being watched by the government. By automating mass surveillance, facial recognition systems like Rekognition threaten this freedom, posing a particular threat to communities already unjustly targeted in the current political climate. Once powerful surveillance systems like these are built and deployed, the harm will be extremely difficult to undo.


Amazon Teams Up With Law Enforcement to Deploy Dangerous New Facial Recognition Technology
[Matt Cagle and Nicole Ozer/ACLU]


(Image: Cryteria, CC-BY)