Frances Haugen, a former manager at Facebook, appeared on 60 Minutes to blow the whistle on the extent and intentionality of the company's malign and amoral practices: "Facebook, over and over again, has shown it chooses profit over safety."
She said Facebook prematurely turned off safeguards designed to thwart misinformation and rabble-rousing after Joe Biden defeated Donald Trump last year, alleging that this contributed to the deadly Jan. 6 invasion of the U.S. Capitol.
Post-election, the company dissolved a unit on civic integrity where she had been working, which Haugen said was the moment she realized "I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous."
At issue are algorithms that govern what shows up on users' news feeds, and how they favor hateful content. Haugen said a 2018 change to the content flow contributed to more divisiveness and ill will in a network ostensibly created to bring people closer together.
Haugen will testify before Congress this week. Her appearances have the company rattled enough that it tried to pre-empt the 60 Minutes report with a memo written by top Facebook executive Nick Clegg.
In the memo, Clegg defended the company's record: "Of course, everyone has a rogue uncle or an old school classmate who holds strong or extreme views we disagree with – that's life – and the change meant you are more likely to come across their posts too. Even so, we've developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%."