An important, elegant thought experiment on content moderation regulation

Kate Klonick (previously) logged into Twitter to find that her trending topics were: "Clarence Thomas," "#MakeADogsDay," "Adam Neumann," and "#Lynching" (if you're reading this in the future: Thomas is the subject of a new documentary, and Trump had just provoked controversy by characterizing the impeachment proceedings against him as a "lynching"). Read the rest

The platforms suck at content moderation and demanding they do more won't make them better at it -- but there ARE concrete ways to improve moderation

Concentration in the tech sector has left us with just a few gigantic online platforms, which have turned into playgrounds for some of the worst people on earth: Nazis, misogynists, grifters, ultranationalists, trolls, genocidal mobs and more. The platforms are so big, their moderation policies are so screwed up, and their "engagement" algorithms are so relentlessly tuned to maximize pageviews, that many of us are forced to choose between having a social life with the people we care about and being tormented by awful people. Even if you opt out of social media, you can't opt out of being terrorized by psychopathic trolls who have been poisoned by Alex Jones and the like. Read the rest

The platforms control our public discourse, and who they disconnect is arbitrary and capricious

Look, I'm as delighted as you are to see Alex Jones' ability to spread hatred curtailed -- because in a world where all the important speech takes place online, and where online speech is controlled by four or five companies, being kicked off of Big Tech's services is likely to be an extinction-level event. Read the rest