Amazon shareholders to vote on proposal to stop selling racially biased facial surveillance software to governments

"BIG DEAL," says the ACLU's Matt Cagle about this story. "Amazon shareholders will vote on whether the Board must reconsider company sales of face surveillance to governments. The SEC rejected Amazon's attempt to prevent this proposal from moving forward."

Amazon's facial recognition technology is branded 'Rekognition.'

The fact that Amazon shareholders now get to vote on this *is* a big deal. Remember also that Amazon is one of the companies identified as providing services to Trump's "lock up all the brown people in cages" Department of Homeland Security, ICE, and the Border Patrol — as we understand it, AWS helps run the databases agents use to identify detained men, women, and children.

From an article posted a few days earlier, on April 4, when the SEC said it would not allow Amazon executives to block the proposal:

The SEC's ruling comes amidst mounting criticism of the Amazon technology, "Rekognition," as racially biased. Only yesterday (4/3), the New York Times reported that at least 25 prominent artificial-intelligence researchers, including experts at Google, Facebook and Microsoft, have signed a letter calling on Amazon to stop selling its facial-recognition technology to law enforcement agencies because it is biased against women and people of color.

The Times said the letter "reflects growing concern in academia and the tech industry that bias in facial-recognition technology is a systemic problem. Some researchers — and even some companies — are arguing the technology cannot be properly controlled without government regulation."

The two shareholder resolutions, which were filed with Amazon in December, focus on the business risks the company faces from sales of Rekognition. One resolution asks Amazon to halt sales of Rekognition to governments unless the board "concludes the technology does not pose actual or potential civil and human rights risk;" the other requests that the board commission an independent study of the extent to which Rekognition may "endanger, threaten, or violate" privacy or civil rights.

Read the rest: