The epic battle inside YouTube over content moderation and recommendation algorithms is detailed brilliantly in a damning new report from Mark Bergen at Bloomberg News.
New story: I spent weeks talking to folks who've worked at YouTube (and Google) about how the company has wrestled with recommendations, conspiracy theories and radicalism. https://t.co/FHmpHPyaz3
— Mark Bergen (@mhbergen) April 2, 2019
Bottom line: YouTube failed to moderate extremely harmful content on its platform, and we are all paying the price.
"Scores of people inside YouTube and Google, its owner, raised concerns about the mass of false, incendiary and toxic content that the world's largest video site surfaced and spread… Each time they got the same basic response: Don't rock the boat."
Staff say executives sacrificed everything for engagement, allowing conspiracies and disinfo campaigns to spread and killing proposals from workers that would have changed the technology that recommends these toxic videos.
The conundrum isn't just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive "library," generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube's problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.
Wojcicki and her deputies know this. In recent years, scores of people inside YouTube and Google, its owner, raised concerns about the mass of false, incendiary and toxic content that the world's largest video site surfaced and spread. One employee wanted to flag troubling videos, which fell just short of the hate speech rules, and stop recommending them to viewers. Another wanted to track these videos in a spreadsheet to chart their popularity. A third, fretful of the spread of "alt-right" video bloggers, created an internal vertical that showed just how popular they were. Each time they got the same basic response: Don't rock the boat.
The company spent years chasing one business goal above others: "Engagement," a measure of the views, time spent and interactions with online videos. Conversations with over twenty people who work at, or recently left, YouTube reveal a corporate leadership unable or unwilling to act on these internal alarms for fear of throttling engagement.
Wojcicki would "never put her fingers on the scale," said one person who worked for her. "Her view was, 'My job is to run the company, not deal with this.'" This person, like others who spoke to Bloomberg News, asked not to be identified because of a worry of retaliation.
YouTube turned down Bloomberg News' requests to speak to Wojcicki, other executives, management at Google and the board of Alphabet Inc., its parent company. Last week, Neal Mohan, its chief product officer, told The New York Times that the company has "made great strides" in addressing its issues with recommendation and radical content.
And don't miss the wonderful illustration by Graham Roumieu.
YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant [bloomberg.com]
Data from @MoonshotCVE shows less than 20 channels spreading anti-vax theories reached over 170 million people. @micahsch, one of YT's earliest employees: "YouTube should never have allowed dangerous conspiracy theories to become such a dominant part of the platform's culture."
— Mark Bergen (@mhbergen) April 2, 2019
One of the most telling parts, for me, is Project Bean, the effort that didn't come to pass. Some involved diagnosed it as due to benign neglect; others cited a paralysis and fear of big changes. YouTube's business is complicated! pic.twitter.com/OsH73yZL6b
— Mark Bergen (@mhbergen) April 2, 2019
YouTube has since shifted its model (the OKRs, even!) to "responsible growth." New figures here on the millions that see its information panels on conspiracies and take satisfaction surveys. But as a good @kevinroose interview shows, this change is hard to grok https://t.co/ADA2bovRzt
— Mark Bergen (@mhbergen) April 2, 2019