The platforms control our public discourse, and who they disconnect is arbitrary and capricious

Look, I'm as delighted as you are to see Alex Jones' ability to spread hatred curtailed — because in a world where all the important speech takes place online, and where online speech is owned by four or five companies, being kicked off of Big Tech's services is likely to be an extinction-level event.

And yeah, it's cute to see him wander from platform to platform, looking for a home, while "Conservatives" wake up and discover that 40 years of Ronald Reagan antitrust-lite policies have given a handful of shareholder-driven tech companies control over public discourse (I call it "reaganfreude").

But as David Greene — civil liberties director at the Electronic Frontier Foundation — writes in the Washington Post, the big picture here is terrible.

Because the first victims of the platforms' willingness to censor unpopular speech weren't Alex Jones: they were trans activists, dissidents in autocracies, women fleeing abusers, Black Lives Matter organizers, and other people who faced reprisals for their real-world speech.

The platforms' version of policing bad speech is sloppy, capricious and arbitrary. People get censored for discussing terrorist atrocities, while actual videos of terrorist atrocities stay up. Millions of accounts are disconnected for being bots, with no recourse for actual activists who are caught like dolphins in that big ole tuna net. Real protests are delisted for being "inauthentic," while Nazis organize in the open.

Greene has a very, very modest proposal for how the platforms should conduct censorship, based on the widely accepted "Santa Clara Principles on Transparency and Accountability in Content Moderation":

1. The companies should publish up-to-date stats on which posts and accounts they've shut down;

2. The companies should notify you when your post or account is flagged or removed;

3. You should have a right to appeal takedowns, and the rules should be evenhandedly enforced.

These are, as I say, modest goals. They're a lot more likely to produce good takedowns and healthy online forums than disconnecting people by the millions using algorithms, or picking them off one by one only when the public outcry gets loud enough.

David Kaye, the United Nations special rapporteur on the promotion and protection of the right to freedom of opinion and expression, recommended in a recent report that private companies — and governments — should as a routine matter consider the impact that content moderation policies have on human rights. He also recommended that governments not pressure private companies to implement policies that interfere with people's right to free expression online.

The power that these platforms have over the online public sphere should worry all of us, no matter whether we agree or disagree with a given content decision. A decision by any one of them has a huge effect. Even worse, if other companies move in lock step, a speaker may effectively be forced offline.

Transparency in these companies' content-moderation decisions is essential. We must demand that they apply their rules consistently and provide clear, accessible avenues for meaningful appeal.

Alex Jones is far from the only person tech companies are silencing [David Greene/Washington Post]

(Image: Simone Giertz)