Zeynep Tufekci was researching Trump videos on YouTube back in 2016 when she noticed something odd: YouTube began recommending and autoplaying increasingly extreme right-wing stuff -- like white-supremacist Holocaust-denial videos.
So she ran an interesting experiment: She set up another YouTube account and began watching videos for the main Democratic presidential contenders, Hillary Clinton and Bernie Sanders. The result? As Tufekci writes in the New York Times:
Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content that was more and more extreme than the mainstream political fare I had started with.
Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.
It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.
This is an incredibly interesting and subtle point: The problem with YouTube's recommender algorithms might be that they overdistill your preferences. Since they're aiming for "engagement" -- a word I am beginning to loathe with an unsettling level of emotion -- these algorithms are constantly trying to create an epic sense of drama and newness. Read the rest
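The escalation dynamic Tufekci describes can be sketched as a toy feedback loop. This is not YouTube's actual system -- the model, the one-dimensional "intensity" axis, and every parameter below are invented purely for illustration of how a recommender that always "ups the stakes" drifts a user toward the extreme end of a spectrum:

```python
# Toy sketch (NOT YouTube's real algorithm): content is modeled as a single
# number on a 0-1 "intensity" axis, and the recommender always suggests
# something slightly more intense than the user's recent viewing average.

def recommend(history, escalation=0.1):
    """Suggest an item a bit more intense than the recent average."""
    recent = history[-5:]                # look at the last few views
    avg = sum(recent) / len(recent)
    return min(1.0, avg + escalation)    # always "up the stakes"

def simulate(start=0.2, steps=15):
    """Watch whatever gets recommended; return the intensity trajectory."""
    history = [start]
    for _ in range(steps):
        history.append(recommend(history))
    return history

trajectory = simulate()
print([round(x, 2) for x in trajectory])
```

Even with a mild starting preference (0.2), the trajectory only moves in one direction: each accepted recommendation raises the average, which raises the next recommendation. That's the "never hard core enough" loop in miniature.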