Facebook is not legally responsible for bad speech by its users — Section 230 of the US Communications Decency Act makes libel and other forms of prohibited speech the responsibility of the users themselves, not of those who provide the forums where users communicate — but the company takes voluntary steps to keep its service from becoming a hostile environment, paying 4,500 moderators to delete material it deems unacceptable.
But Facebook has refused to articulate the rules for speech on the service; moderators make choices according to secret criteria, deleting material and suspending Facebook users' accounts without the users being told what the line is and how they crossed it.
Now a set of internal training documents obtained by ProPublica offers the first look at what Facebook tells its moderators to censor and what to leave intact. The overarching principle is that speech attacking "protected categories" (race, sex, gender identity, religious affiliation, national origin, ethnicity, sexual orientation and serious disability/disease) is banned, but speech that attacks only subsets of these groups is permitted.
So you may deny the Holocaust, but only if you do so without engaging in anti-Semitism.
In practice, that's a hard line to walk. But plenty of other lines are easier to trace: Facebook allows its users to make blanket condemnations of "black children" (because children are not a protected category, and "black children" are a "subset" of all black people), while condemning "white men" is banned, because both race and sex are protected categories.
The problem lies not in the desire to protect people from racist attacks, but in the policy's failure to take power into account — a failure to distinguish bigotry (disliking someone based on their race, sex, etc.) from systemic racism (one group using its power to oppress another). Systemic racism is a form of bigotry, but not all bigotry is part of systemic racism.
Facebook users who don't mince words in criticizing racism and police killings of racial minorities say that their posts are often taken down. Two years ago, Stacey Patton, a journalism professor at historically black Morgan State University in Baltimore, posed a provocative question on her Facebook page. She asked why "it's not a crime when White freelance vigilantes and agents of 'the state' are serial killers of unarmed Black people, but when Black people kill each other then we are 'animals' or 'criminals.'"
Although it doesn't appear to violate Facebook's policies against hate speech, her post was immediately removed, and her account was disabled for three days. Facebook didn't tell her why. "My posts get deleted about once a month," said Patton, who often writes about racial issues. She said she also is frequently put in Facebook "jail" — locked out of her account for a period of time after a posting that breaks the rules.
"It's such emotional violence," Patton said. "Particularly as a black person, we're always having these discussions about mass incarceration, and then here's this fiber-optic space where you can express yourself. Then you say something that some anonymous person doesn't like and then you're in 'jail.'"
Didi Delgado, whose post stating that "white people are racist" was deleted, has been banned from Facebook so often that she has set up an account on another service called Patreon, where she posts the content that Facebook suppressed. In May, she deplored the increasingly common Facebook censorship of black activists in an article for Medium titled "Mark Zuckerberg Hates Black People."
Facebook also locked out Leslie Mac, a Michigan resident who runs a service called SafetyPinBox where subscribers contribute financially to "the fight for black liberation," according to her site. Her offense was writing a post stating "White folks. When racism happens in public — YOUR SILENCE IS VIOLENCE."
Facebook's Secret Censorship Rules Protect White Men from Hate Speech But Not Black Children
[Julia Angwin and Hannes Grassegger/ProPublica]