The Safe Face Pledge launched last month as a "pledge to mitigate abuse of facial analysis technology," with four themes: "Show Value for Human Life, Dignity, and Rights"; "Address Harmful Bias"; "Facilitate Transparency"; and "Embed Commitments into Business Practices" (SAFE).
The full pledge is inspirational and comprehensive, covering bias, secret and discriminatory state surveillance, risks to human life, law enforcement abuse, auditing customer compliance, communicating how the systems work, and making your legal documents (from vendor contracts to terms of service) reflect your values.
The pledge's announcement describes how the UK's notoriously inaccurate police facial recognition systems are more likely to falsely match black people to criminal suspects than people of other ethnic or racial backgrounds.
That reminded me of something that EFF executive director Cindy Cohn described on a panel last month: Cindy pointed out that there's a danger in centering the critique of facial recognition on racial bias, because this bias is merely the result of the systems not being trained on enough images of racialized people. When a Chinese state facial recognition system ran into this problem, the Chinese government simply bought the driver's license database from an African client state and used it as training data. That eliminated the bias in the algorithm's false-positive rate by massively invading the privacy of millions of African people, and now the system is even better at tracking black people.
Commitment One: Show Value for Human Life, Dignity, and Rights
Signatories of the Safe Face Pledge agree to:
1. Do not contribute to applications that risk human life
By acknowledging that decisions that foreseeably increase the risk to human life are too dangerous for artificial intelligence, and by refraining from selling or providing facial analysis technologies to locate or identify targets in operations where lethal force may be used or is contemplated.
2. Do not facilitate secret and discriminatory government surveillance
By acknowledging the right of the public to understand whether and how facial analysis technologies are used by the government. By refraining from knowingly selling to the government any products and services that are not subject to public scrutiny, inspection, and oversight.
3. Mitigate law enforcement abuse
By acknowledging the right of the public to control whether and how facial analysis technologies are used in local, state or federal law enforcement including immigration agencies. By refraining from selling to law enforcement any products and services for purposes of enforcing criminal law unless a governing legislative body has explicitly and publicly considered all potential harms and authorized use of the technology through statute or ordinance.
4. Ensure your rules are being followed
By acknowledging a responsibility over how facial analysis technology is used and that ignoring a customer’s implementation or use of the technology can risk harming community members. By adopting internal “know your customer” policies and procedures to ensure as best you can that your products are not being used for secret government surveillance.
(via Beyond the Beyond)