Researchers at New York University found that right-wing misinformation on Facebook was much more likely to generate "engagement" there—likes, shares and comments—than left-wing misinformation.
The results provide evidence that right-wing sources of misinformation are some of the most engaging content creators on Facebook, said Laura Edelson, a researcher at NYU's Cybersecurity for Democracy initiative. "My takeaway is that, one way or another, far-right misinformation sources are able to engage on Facebook with their audiences much, much more than any other category," Edelson said. "That's probably pretty dangerous on a system that uses engagement to determine what content to promote."
There are many ways to interpret this conclusion; the researchers "steer clear" of "trying to explain why right-wing misinformation is so highly engaging".
Some are interpreting it as a simple reminder that right-wingers are Facebook addicts being served highly optimized grievances, which is true enough.
Others see it as a story about a pro-conservative bias at Facebook-the-company, which I don't think is warranted by this particular research, though it's certainly reasonable to ask whether Facebook is algorithmically (or directly, as in one recent report) favoring right-wing misinformation.
Another way of looking at it is that left-leaning audiences are more resistant to rage bait, pandering, "grifting", whatever terminology we want to use for that kind of media. Yet another is to see it as a statistical phenomenon: if reality has a liberal bias, perhaps unreality has a conservative one.
One thing I'd be wary of is the reliability of the companies lurking in the background of this research, which present themselves as impartial judges of "journalistic integrity". They are tech startups, their intended customer is the advertising industry, and they have perverse incentives: the "adblocker" model applied to fake news.