A successful no-platforming means we can talk about Alex Jones again

Zeynep Tufekci (previously) says that Big Tech's "engagement maximization" algorithms meant that any time you talked about Alex Jones critically, the algorithms would start relentlessly recommending that you watch some Alex Jones videos, because his videos were so well designed to suck up our attention, and thus to please the algorithms.


The combination of attention-maximization algorithms and Jones's perfect playing of them meant that any talk about Jones ended up driving support to him.

But now that Jones has been banished from the super-concentrated Big Tech platforms, it's once again possible to talk about him without causing algorithms to barf up his autoplaying videos.


Tufekci's argument is different from "don't feed the trolls," the idea that merely discussing attention-seeking monsters is a mistake because it gives them the attention they thrive on (and need in order to reach their audiences). Rather, Tufekci is saying that the platforms' attention-maximization systems made that rule of thumb literally true: simply mentioning the troll prompted the autosuggesting, autoplaying recommendation algorithms to spread his videos further, all in the service of keeping you on the site longer.


The tech platforms have arbitrary power to decide what to amplify, and thus what to bury, and they can banish whomever they wish. There is nothing aside from backlash to stop them from deplatforming, say, tech critics or politicians who call for closing tax loopholes for massive corporations. Without due process or accountability, a frustrated public is left to appeal to a few powerful referees and cross its fingers.

This is complicated stuff. We're dealing with three ideas that are structurally in tension: that hate speech, harassment, false accusations, and baseless conspiracy theories (like antivaccination claims) cause real harm; that free speech is a crucial value; and that it's necessary to deal with algorithmic amplification and attention-gamers.

Legislators, courts, users, and the platforms themselves have to be involved. There are some precedents we could use from older technologies. Some updated version of the fairness doctrine, which required radio and television stations to devote time to issues of public importance and seek out a multiplicity of views, could be revived for the digital age. We could come up with a kind of Fair Credit Reporting Act that gives users a right to challenge a platform banishment. There could be antitrust actions against centralized platforms (along with user protections), or upstarts could offer alternatives (with better business models). As with most social problems, we have to accept that there is no single, perfect solution, no avoiding trade-offs, and also that inaction is a decision too.

'He Who Must Not Be Named': What Alex Jones and Voldemort Have in Common [Zeynep Tufekci/Wired]