Facebook supposedly bans militias, but they're there all the same, reports Wired's Tess Owen—and using the platform to coordinate their activities.
"Join Your Local Militia or III% Patriot Group," a post urged the more than 650 members of a Facebook group called the Free American Army. Accompanied by the logo for the Three Percenters militia network and an image of a man in tactical gear holding a long rifle, the post continued: "Now more than ever. Support the American militia page."
Other content and messaging in the group are similar. And despite the fact that Facebook bans paramilitary organizing and deemed the Three Percenters an "armed militia group" on its 2021 Dangerous Individuals and Organizations List, the post and group remained up until WIRED contacted Meta for comment about its existence.
Free American Army is just one of around 200 similar Facebook groups and profiles, most of which are still live, that anti-government and far-right extremists are using to coordinate local militia activity around the country.
When companies talk vaguely about the challenges of content moderation, I'm always reminded of how pre-Musk Twitter would hem and haw and shrug its shoulders at the impossibility of moderating racial slurs and whatnot, yet could make tweets containing copyright-infringing strings vanish in a heartbeat. It's true that content moderation is hard, and harder still at scale, but it's also true that if something is on someone's website, it's there because they've made a choice that allows it to remain. If Facebook wanted to rid itself of this material, it would. How obviously worthless or bad the content is doesn't matter; all that matters is how much of it there is. See also: reviews on retail sites.
Previously: Facebook pushing AI-generated images of starving, drowning, bruised and mutilated children into users' feeds