The young journalists at YR Media (formerly Youth Radio) were curious about "what artificial intelligence means for race, art, and the apocalypse." So they asked the opinion of a few experts, including tech journalist Alexis Madrigal, engineer Deb Raji of New York University's AI Now Institute, artist/programmer Sam Lavigne, and AI ethicist Rachel Thomas. You can read (and listen to) bits from the lively conversation at the YR Media feature "In the Black Mirror." Here's an excerpt:
Read the rest
RACE + BIAS
Deb Raji: There was a study released where we evaluated the commercial facial recognition systems that were deployed. And we said, "How well does this system work for different intersectional demographics?" So, how well does it work for darker skinned women versus lighter skinned women versus darker skinned men and lighter skinned men? And it turns out that there was a 30 percent performance gap between lighter skinned men and darker skinned women, which is insane. For reference, usually you don't deploy a system that's performing at less than 95 percent accuracy.
Rachel Thomas: Another example of bias comes from some software that's used in many U.S. courtrooms. It gives people a rating of how likely they are to commit another crime. And it was found that this software has twice as high a false positive rate for black defendants as for white defendants. So that means it was predicting that people were high risk even though they were not later rearrested. And this is something that's really impacting people's lives, because it was being used in sentencing decisions and bail decisions.
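For readers curious what "twice as high a false positive rate" actually measures: a false positive here is a defendant rated high risk who was not later rearrested, and the rate is computed separately for each group. Here's a minimal sketch in Python with made-up illustrative data (the function name and the sample numbers are ours, not from the actual courtroom software or the study):

```python
# Hypothetical sketch of a per-group false positive rate,
# the metric behind the disparity Thomas describes.
# A false positive = predicted "high risk" but not rearrested.

def false_positive_rate(records):
    """records: list of (predicted_high_risk, rearrested) booleans."""
    false_positives = sum(1 for pred, actual in records if pred and not actual)
    not_rearrested = sum(1 for _, actual in records if not actual)
    return false_positives / not_rearrested if not_rearrested else 0.0

# Illustrative made-up data, NOT the real dataset:
group_a = [(True, False), (True, False), (False, False), (True, True)]
group_b = [(True, False), (False, False), (False, False), (True, True)]

print(false_positive_rate(group_a))  # 2 of 3 non-rearrested flagged high risk
print(false_positive_rate(group_b))  # 1 of 3 flagged high risk
```

A gap like this can appear even when the two groups see the same overall accuracy, which is why auditors break the error rates out by group rather than reporting a single number.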