Russell Holly, managing editor for commerce at CNET, was so excited about scoring an advance screening of Denis Villeneuve's movie adaptation of Frank Herbert's Dune that he went on Facebook and posted a clip from David Lynch's 1984 version — the crazy knife fight between Sting and Kyle MacLachlan. Holly's colleague, Jason Perlow (another Dune fan), commented on Holly's post by quoting Sting's classic line: "I -WILL- kill you."
"I even put it in quotes so that there was no question I was quoting the film," wrote Perlow in his essay. Here's what happened next:
I thought nothing of it. I went about the rest of my evening. About an hour later, I was notified by Facebook that I was suspended for three days due to violating Community Standards.
I was shocked. Suspended for quoting a film? Without even using any obscenities? This seems… extreme.
Obviously, I had no intention of killing Russell Holly, envious as I was that he got to see this film months before anyone else. I am also not in the practice of murdering my editorial colleagues with poisoned daggers, as anyone at ZDNet will tell you.
This miffed Perlow, especially since Facebook seems reluctant to do much about covidiots and antivaxxers, whose lies really can lead to death. In fact, says Perlow, Facebook actively promotes COVID and vaccine misinformation:
Quoting movies doesn't hurt or result in the death of anyone. But do you know what does? Spreading misinformation about vaccines and COVID-19. That absolutely will kill people.
On July 20, the internet news watchdog NewsGuard presented a report to the World Health Organization. The report's conclusion: Not only has Facebook failed to be proactive in the removal of misinformation about vaccines and COVID-19, but the social platform is actively enabling and accelerating its spread.
How so? Many high-volume and extremely popular Pages on Facebook representing "Red"-classified news websites (those failing to meet NewsGuard's basic standards of credibility and transparency) are spreading false information and are outright medically and scientifically inaccurate about COVID-19, vaccines, masks, 5G, and other health-related topics.
Many of these pages have tens of thousands of followers. When these pages are "liked" by Facebook users, other Pages that publish misinformation about these topics are recommended by Facebook's algorithm, sending users down a never-ending rabbit hole of meme-fueled hoaxes and conspiracies.
The more you click, the more Facebook recommends similar pages.
It's understandable why Facebook would want to take down a message saying "I will kill you." One day it will probably have an algorithm that can distinguish between a movie quote posed as a mock death threat and an actual death threat, but for now it is choosing to play it safe. And I can see why Facebook wouldn't do much to stop antivaxxers: Facebook's audience is increasingly made up of people who like this kind of misinformation. In other words, nothing to see here, folks. Move along.