Moderating the internet is traumatic

Catherine Buni and Soraya Chemaly report on The Secret Rules of The Internet: the mass-scale moderation practiced by social networks.

Mora-Blanco's team — 10 people in total — was dubbed The SQUAD (Safety, Quality, and User Advocacy Department). They worked in teams of four to six, some doing day shifts and some night, reviewing videos around the clock. Their job? To protect YouTube's fledgling brand by scrubbing the site of offensive or malicious content…

"Oh, God," she said.

Mora-Blanco won't describe what she saw that morning. For everyone's sake, she says, she won't conjure the staggeringly violent images which, she recalls, involved a toddler and a dimly lit hotel room.

That YouTube's moderation system is a failure is part of internet lore. But it's still amazing to see how much moderation YouTube does, how completely horrifying the removed content is, and how much professional consideration can go into individual decisions.

Moderation is a selective pressure that favors whatever abusiveness is compatible with the limits of what's permitted: typically sexism, racism, and general bigotry, wedded to whatever structural qualities of the platform can be exploited to amplify and apply it. It's a culture problem that looks like a technology problem, but technical solutions often have unexpected results. And many companies hardly bother to try.