Media Matters has found that TikTok's algorithm treats interest in transphobic videos as a signal that right-wing, "GQP"-style videos will generate even more views, a dynamic that incidentally helps radicalize viewers.
TikTok has long been scrutinized for its dangerous algorithm, viral misinformation, and hateful video recommendations, and this new research demonstrates how quickly the company's recommendation algorithm can radicalize a user's "For You" page (FYP).
Transphobia is deeply intertwined with other kinds of far-right extremism, and TikTok's algorithm only reinforces this connection. Our research suggests that transphobia can be a gateway prejudice, leading to further far-right radicalization.
To assess this phenomenon, Media Matters created a new TikTok account and engaged only with content we identified as transphobic. This included accounts that had posted multiple videos that degrade trans people, insist that there are "only two genders," or mock the trans experience. We coded roughly the first 450 videos fed to our FYP. Even though we interacted solely with transphobic content, we found that our FYP was increasingly populated with videos promoting various far-right views and talking points.