Teenagers interview A.I. experts about the future of thinking machines

The young journalists at YR Media (formerly Youth Radio) were curious about "what artificial intelligence means for race, art, and the apocalypse." So they asked the opinion of a few experts, including tech journalist Alexis Madrigal, engineer Deb Raji of New York University's AI Now Institute, artist/programmer Sam Lavigne, and AI ethicist Rachel Thomas. You can read (and listen to) bits from the lively conversation at the YR Media feature "In the Black Mirror." Here's an excerpt:


Deb Raji: There was a study released where we evaluated the commercial facial recognition systems that were deployed. And we said, "How well does this system work for different intersectional demographics?" So, how well does it work for darker skinned women versus lighter skinned women versus darker skinned men and lighter skinned men? And it turned out that there was a 30 percent performance gap between lighter skinned men and darker skinned women, which is insane. For reference, usually you don't deploy a system that's performing at less than 95 percent accuracy.
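The kind of audit Raji describes boils down to computing accuracy separately for each intersectional subgroup and looking at the gap between the best- and worst-served groups. A minimal sketch of that calculation, using hypothetical subgroup names and toy data rather than anything from the actual study:

```python
# Hypothetical subgroup-accuracy audit for a classifier's predictions.
# The subgroup labels and sample records are illustrative, not real study data.
from collections import defaultdict

def subgroup_accuracy(records):
    """records: list of (subgroup, predicted_label, true_label) tuples.
    Returns a dict mapping each subgroup to its accuracy."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, pred, truth in records:
        total[group] += 1
        if pred == truth:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

records = [
    ("lighter skinned men", "male", "male"),
    ("lighter skinned men", "male", "male"),
    ("darker skinned women", "male", "female"),   # misclassification
    ("darker skinned women", "female", "female"),
]
acc = subgroup_accuracy(records)
# Performance gap between best- and worst-served subgroups:
gap = max(acc.values()) - min(acc.values())
```

On this toy data the classifier is perfect for one subgroup and 50 percent accurate for the other, giving a 50-point gap; the real audit did the same comparison at scale across all four intersectional groups.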

Rachel Thomas: Another example of bias comes from some software that's used in many U.S. courtrooms. It gives people a rating of how likely they are to commit another crime. And it was found that this software has twice as high a false positive rate for black defendants as for white defendants. So that means it was predicting that people were high risk even though they were not later rearrested. And this is something that really impacts people's lives, because it was being used in sentencing decisions and bail decisions.
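The disparity Thomas describes is a difference in false positive rates: among people who were *not* rearrested, what fraction did the software flag as high risk, per group? A minimal sketch with entirely hypothetical outcomes (not real data from the courtroom tool):

```python
def false_positive_rate(pairs):
    """pairs: list of (predicted_high_risk, rearrested) booleans.
    FPR = fraction flagged high-risk among people who were NOT rearrested."""
    negatives = [pair for pair in pairs if not pair[1]]
    if not negatives:
        return 0.0
    false_pos = sum(1 for pred, _ in negatives if pred)
    return false_pos / len(negatives)

# Hypothetical outcomes for two defendant groups (illustrative only):
group_a = [(True, False), (True, False), (False, False), (False, True)]
group_b = [(True, False), (False, False), (False, False), (True, True)]

fpr_a = false_positive_rate(group_a)
fpr_b = false_positive_rate(group_b)
ratio = fpr_a / fpr_b  # a ratio of 2.0 would match "twice as high"
```

Note that a high false positive rate concentrated in one group means the harm (being wrongly rated high risk) falls disproportionately on that group, even if overall accuracy looks acceptable.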

Read the rest

Second wave Algorithmic Accountability: from "What should algorithms do?" to "Should we use an algorithm?"

For ten years, activists and theorists have been developing a critique of "algorithms" (a subject that has gone through numerous renamings over the same period, e.g. "filter bubbles"). The early critiques focused on the ways these systems can misfire with dreadful (or sometimes humorous) consequences, from discrimination in which employment and financial ads get served, to the "dark patterns" that "maximized engagement" with services that occupied your attention but didn't bring you pleasure. Read the rest