"Knowledge overconfidence is associated with anti-consensus views on controversial scientific issues," runs the title of a recent study published in Science, which highlights the consequences of claiming to know something you don't. Misrepresenting what you know, it transpires, could be a form of knowledge overconfidence.
"Public attitudes that are in opposition to scientific consensus can be disastrous and include rejection of vaccines and opposition to climate change mitigation policies. Five studies examine the interrelationships between opposition to expert consensus on controversial scientific issues, how much people actually know about these issues, and how much they think they know. Across seven critical issues that enjoy substantial scientific consensus, as well as attitudes toward COVID-19 vaccines and mitigation measures like mask wearing and social distancing, results indicate that those with the highest levels of opposition have the lowest levels of objective knowledge but the highest levels of subjective knowledge. Implications for scientists, policymakers, and science communicators are discussed."
Akin to confirmation bias, denials and disavowals of scientific studies and "willed innocence," to quote James Baldwin, sanctify knowledge overconfidence as a conservative ideological position with devastating consequences for social life.
Sociologist Avery Gordon names this deeply theorized claim to knowledge, a willful indifference and ignorance, where opinion substitutes for analysis, argument, and evidence, as a form of anti-intellectualism. Anti-intellectualism is "an agreement not to think" and "has almost always been a form of conservatism, with a particularly long and honored tradition in the United States… [it] ultimately involves refusing to theorize, refusing to reflect critically, and refusing to see the operating general assumptions in society…."
This post might be a dizzying and off-balanced solipsistic veneration of Sisyphus. The unruly rock that refuses to stay put on top of the mountain is a metaphor and seed for new research – what happens when people who do not believe in scientific studies encounter a study that scientifically explains why they deny science?
It turns out that conceding that you don't know something is a form of intellectual humility. Read more in the Behavioral Scientist article, "The benefits of admitting when you don't know."
"Our hypothesis was that adopting a growth mindset could help. Growth mindset is the belief that intelligence is something that can change over time. In contrast, fixed mindset is the belief that intelligence is permanent, something people are 'born with.'"
I am always suspicious of discussing genetic permanence, immutable behavior as a cultural trait, and other binaries that can pathologize differences and weaponize public policy against specific targeted groups of people in particular geographies. I also have some friendly "growth mindset" doubts about the binary assumptions as a premise for measurement and conclusion.
Back to back with Sisyphus. The person with a "fixed mindset" is not open to learning or changing one's mind, and believes intelligence is permanent: Is it even possible to have a dialogue or discussion with someone who has already reached conclusions about everything? To be clear: there is no negotiating with a fascist. Is that a fixed mindset? Is that why binaries can be a trap?
Again, from the Behavioral Scientist: "Of course, there's a lot about intellectual humility that we don't yet understand. But the burgeoning empirical research suggests that intellectual humility can benefit learning and perhaps bridge ideological gaps. We all, not just school-age children, might be a bit better off by learning to say 'I disagree with myself' every now and then."
In addition to cultivating a critical practice of self-disagreement, it might also be helpful to ask: where do I get my information from? What are my biases? What are the limitations of my analysis? What elements of history do I not know about so I can ignore them without trying? What do I not allow myself to consider as possible or accurate?
Anti-intellectualism, willed ignorance, and knowledge overconfidence are portrayed with searing and sometimes scary accuracy in the "Jordan Klepper Fingers the Pulse" series.