Facebook studied how it polarizes users, then ignored the research

• "Our algorithms exploit the human brain's attraction to divisiveness."

• "64% of all extremist group joins are due to our recommendation tools"

• GOP operative turned Facebook policy VP Joel Kaplan, who threw a party for Brett Kavanaugh upon his Supreme Court confirmation, killed any action on Facebook's internal findings, the WSJ reports

Mark Zuckerberg and other top executives at Facebook shelved damning internal research into the social media platform's polarizing effects, hampering efforts to apply its conclusions to products and minimize harm, the Wall Street Journal reported on Tuesday.

Facebook's own internal research found that most people who joined extremist groups did so as a result of Facebook's recommendation algorithms.

The company shelved the research and pressed on, making money and radicalizing Americans.

The Wall Street Journal report by Deepa Seetharaman and Jeff Horwitz is based on company sources and internal documents. One internal Facebook presentation slide from 2018 laid out the issue like this: "Our algorithms exploit the human brain's attraction to divisiveness."

"If left unchecked," it warned, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."

Excerpt:

That presentation went to the heart of a question dogging Facebook almost since its founding: Does its platform aggravate polarization and tribal behavior?

The answer it found, in some cases, was yes.

Facebook had kicked off an internal effort to understand how its platform shaped user behavior and how the company might address potential harms. Chief Executive Mark Zuckerberg had in public and private expressed concern about "sensationalism and polarization."

But in the end, Facebook's interest was fleeting. Mr. Zuckerberg and other senior executives largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to Facebook products.

Facebook policy chief Joel Kaplan, who played a central role in vetting proposed changes, argued at the time that efforts to make conversations on the platform more civil were "paternalistic," said people familiar with his comments.

Read more:
Facebook Executives Shut Down Efforts to Make the Site Less Divisive

[via techmeme.com]