Google and YouTube executives ignored warnings on toxic video content, and now we're all paying the price

The epic battle inside YouTube over content moderation and recommendation algorithms is detailed brilliantly in a damning new report from Mark Bergen at Bloomberg News.

Bottom line: YouTube failed to moderate extremely harmful content on its platform, and we are all paying the price.

"Scores of people inside YouTube and Google, its owner, raised concerns about the mass of false, incendiary and toxic content that the world's largest video site surfaced and spread… Each time they got the same basic response: Don't rock the boat."

Staff say executives sacrificed everything for engagement, allowing conspiracies and disinformation campaigns to spread while killing employee proposals that would have changed the recommendation technology behind these toxic videos.

Excerpt:

The conundrum isn't just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive "library," generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube's problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.

Wojcicki and her deputies know this. In recent years, scores of people inside YouTube and Google, its owner, raised concerns about the mass of false, incendiary and toxic content that the world's largest video site surfaced and spread. One employee wanted to flag troubling videos, which fell just short of the hate speech rules, and stop recommending them to viewers. Another wanted to track these videos in a spreadsheet to chart their popularity. A third, fretful of the spread of "alt-right" video bloggers, created an internal vertical that showed just how popular they were. Each time they got the same basic response: Don't rock the boat.

The company spent years chasing one business goal above others: "Engagement," a measure of the views, time spent and interactions with online videos. Conversations with over twenty people who work at, or recently left, YouTube reveal a corporate leadership unable or unwilling to act on these internal alarms for fear of throttling engagement.

Wojcicki would "never put her fingers on the scale," said one person who worked for her. "Her view was, 'My job is to run the company, not deal with this.'" This person, like others who spoke to Bloomberg News, asked not to be identified because of a worry of retaliation.

YouTube turned down Bloomberg News' requests to speak to Wojcicki, other executives, management at Google and the board of Alphabet Inc., its parent company. Last week, Neal Mohan, its chief product officer, told The New York Times that the company has "made great strides" in addressing its issues with recommendation and radical content.

And don't miss the wonderful illustration by Graham Roumieu.

YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant [bloomberg.com]