In 2015, a black software developer named Jacky Alciné revealed that the image classifier used by Google Photos was labeling black people as "gorillas."
Google apologized profusely and set to work on the bug. Two years later, Google has simply erased gorillas (and, it seems, chimps and monkeys) from the lexicon of labels its image classifier will apply or let users search for, rendering these animals unsearchable and, in some sense, invisible to the AI that powers Google Photos' image search.
The capability to classify images as containing gorillas remains in some Google products, such as the Cloud Vision API.
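For the curious, here's a minimal sketch of what querying the Cloud Vision API for labels looks like, using Google's `google-cloud-vision` Python client library (the file path is a placeholder, and you'd need credentials configured):

```python
# Minimal sketch: ask the Cloud Vision API which labels it sees in a photo.
# Assumes the google-cloud-vision library is installed and application-default
# credentials are set up; "photo.jpg" is a placeholder path.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Request label detection; the API returns ranked labels with confidence scores.
response = client.label_detection(image=image)

for label in response.label_annotations:
    print(f"{label.description}: {label.score:.2f}")
```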
A Google spokesperson confirmed that "gorilla" was censored from searches and image tags after the 2015 incident, and that "chimp," "chimpanzee," and "monkey" are also blocked today. "Image labeling technology is still early and unfortunately it's nowhere near perfect," the spokesperson wrote in an email, highlighting a feature of Google Photos that allows users to report mistakes.

Google's caution around images of gorillas illustrates a shortcoming of existing machine-learning technology. With enough data and computing power, software can be trained to categorize images or transcribe speech to a high level of accuracy. But it can't easily go beyond the experience of that training. And even the very best algorithms lack the ability to use common sense, or abstract concepts, to refine their interpretation of the world as humans do.
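To make the nature of the workaround concrete (this is purely an illustration; Google's actual implementation is not public, and the function and label names here are hypothetical), suppressing a blocklist of terms after classification, rather than fixing the model itself, might look something like this:

```python
# Illustrative sketch only: suppress specific labels post hoc instead of
# retraining the underlying model. Label names and the classifier output
# format are hypothetical; this is not Google's actual code.
BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

def visible_labels(raw_labels: list[str]) -> list[str]:
    """Drop any blocklisted label before it reaches search or tagging."""
    return [label for label in raw_labels if label.lower() not in BLOCKED_LABELS]

# Whatever the model predicts, the blocked terms never surface to the user.
print(visible_labels(["Gorilla", "Mammal", "Wildlife"]))  # ['Mammal', 'Wildlife']
```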
When It Comes to Gorillas, Google Photos Remains Blind [Tom Simonite/Wired]