Back in 2017, I started writing about the "epistemological crisis" ("we're not living through a crisis about what is true, we're living through a crisis about how we know whether something is true. We're not disagreeing about facts, we're disagreeing about epistemology"); danah boyd picked up on that theme later that year, making the connection between "media literacy" education and the crisis ("If we're not careful, 'media literacy' and 'critical thinking' will simply be deployed as an assertion of authority over epistemology").
I've been developing the idea since, connecting it to inequality and oligarchy; and so has boyd: her latest shows how the epistemological crisis gives rise to far-right conspiracies through "agnotology" ("the strategic and purposeful production of ignorance").
But what's most profound is how this production of ignorance is now being done en masse. Teenagers aren't radicalized only by extremist sites on the web; it can start with a simple YouTube query. Perhaps you're a college student trying to learn a concept like "social justice" that you've heard in a classroom. The first result you encounter is from PragerU, a conservative organization committed to undoing so-called "leftist" ideas taught at universities. You watch the beautifully produced video, which promotes many of the tenets of media literacy: Ask hard questions. Follow the money. The video offers a biased and slightly conspiratorial take on what "social justice" is, suggesting that it's not real but a manufactured attempt to suppress you. Afterward, you watch more videos of this kind from professors and other apparent experts. All of this changes how you think about the term when you encounter it in your reading. You ask your professor a question raised by one of the YouTube influencers. She reacts in horror and silences you. The videos all told you to expect this. So now you want to learn more. You go deeper into a world of people who are actively anti-"social justice warriors." You're introduced to anti-feminism and racial realism. How far does the rabbit hole go?
YouTube is the primary search engine for people under 25; it's where high school and college students go to do research. The Digital Public Library of America (DPLA) works with many phenomenal partners who are curating their archives and making them available. Yet how much of that work is available on YouTube? Most of DPLA's partners want the content on their own sites; they want to be destinations that people visit. Much of that material is visual and textual, but are there explainers about it on YouTube? How many scientific articles have video explainers associated with them?
Herein lies the problem. One of the best ways to seed agnotology is to make sure that doubtful and conspiratorial content is easier to reach than scientific material, and then to make sure that whatever scientific information is available is undermined. One tactic is to exploit "data voids": areas within a search ecosystem where there's little or no relevant content, which media manipulators purposefully fill. Breaking news is one example; another is to co-opt a term that has been left behind, like "social justice." But let me offer you another. Some terms are strategically created to achieve epistemological fragmentation. In the 1990s, Frank Luntz was the king of this, with terms like "partial-birth abortion," "climate change," and "death tax." Every week, he coordinated congressional staffers, telling them to focus on the term of the week and push it through the news media, all to create a drumbeat.
Agnotology and Epistemological Fragmentation [danah boyd/Points]