There's a lot of Facebook news out there right now, but I found this particularly fascinating. As revealed in the recent disclosures made by Facebook Whistleblower Frances Haugen to the Securities and Exchange Commission, the company did, in fact, experiment with getting rid of their recommendation algorithms — as recently as 2018. Here's Alex Kantrowitz explaining what happened in Big Technology:
In February 2018, a Facebook researcher all but shut off the News Feed ranking algorithm for .05% of Facebook users. "What happens if we delete ranked News Feed?" they asked in an internal report summing up the experiment. Their findings: Without a News Feed algorithm, engagement on Facebook drops significantly, people hide 50% more posts, content from Facebook Groups rises to the top, and — surprisingly — Facebook makes even more money from users scrolling through the News Feed.
Turning off the News Feed ranking algorithm, the researcher found, led to a worse experience almost across the board. People spent more time scrolling through the News Feed searching for interesting stuff, and saw more advertisements as they went (hence the revenue spike). They hid 50% more posts, indicating they weren't thrilled with what they were seeing. They saw more Groups content, because Groups is one of the few places on Facebook that remains vibrant. And they saw double the amount of posts from public pages they don't follow, often because friends commented on those pages. …
Meaningful Social Interactions — the back and forth comments between friends that Facebook optimizes for — also dropped by 20%. And given how angry some of these exchanges made people, that might not be a bad thing.
This certainly supports Haugen's argument that the algorithm itself is the problem. But otherwise, there's not exactly a clear takeaway. What it is, though, is a weirdly fascinating look inside the company, and how their incessant chase for "growth" and "engagement" leads to entirely different problems with emotionally manipulative algorithms. It's worth noting, too, that while they may have made more money from ads in this limited experiment, that probably wouldn't have been a sustainable business model — eventually, the lack of meaningful user-driven action would drive down the value of the CPM anyway (if Facebook reported honest data, that is). Then, Facebook ads would end up being as much of a crapshoot as anything else. Although, given the extremely limited engagement that I receive on FB or Instagram, this sounds like it would end in another six-of-one scenario. Bad actors would certainly have figured out a way to exploit an unfiltered chronological Facebook timeline anyway.
Notably lacking from the article is whether or not my mom, who talks to me like every Facebook post she sees is a temporally linear piece of gospel to which everyone else has been exposed, was part of this user experiment.
There's a lot of Facebook news out there right now. I haven't had time to sift through much of it. But if you want to go deeper, here's a Google Doc compiling pretty much every important article about the Facebook papers; Casey Newton, as usual, has done some great reporting as well.
Facebook Removed The News Feed Algorithm In An Experiment. Then It Gave Up. [Alex Kantrowitz / Big Technology]
Image: Public Domain via PixaHive