How motivated skepticism strengthens incorrect beliefs

This is part two in my series on The Backfire Effect. This installment focuses on motivated reasoning, specifically something called motivated skepticism. It also features interviews with the scientists who coined the term "backfire effect" and who have since extended their original research beyond politics and into health issues.

By now you've likely heard of confirmation bias. As a citizen of the internet, you feel the influence of this cognitive tendency constantly, and its allure is pervasive.

In short, when you have a hunch that you might already understand something, but don't know for sure, you tend to go searching for information that will confirm your suspicions.

When you find that inevitable confirmation, satisfied you were correct all along, you stop searching. In some circles, the mental signal to end exploration once you feel your position has sufficient external support is referred to by the wonderfully wordy name the "makes-sense stopping rule": once you believe you've made sense of something, you go about your business satisfied that you need not continue your efforts. In other words, just feeling correct is enough to stop your pursuit of new knowledge. We basically had to invent science to stop ourselves from trying to solve problems by thinking this way.

Download | iTunes | Stitcher | RSS | SoundCloud

This episode is sponsored by The Great Courses Plus. Get unlimited access to a huge library of The Great Courses lecture series on many fascinating subjects. Start FOR FREE with Your Deceptive Mind, taught by neurologist Steven Novella. Learn about how your mind makes sense of the world by lying to itself and others. Click here for a FREE TRIAL.

There is no better way to create a website than with Squarespace. Creating your website with Squarespace is a simple, intuitive process. You can add and arrange your content and features with the click of a mouse. Squarespace makes adding a domain to your site simple; if you sign up for a year, you'll receive a custom domain free for a year. Start your free trial today at Squarespace.com and enter offer code SOSMART to get 10% off your first purchase.

Support the show directly by becoming a patron! Get episodes one day early and ad-free. Head over to the YANSS Patreon Page for more details.

You could, instead, try to disconfirm your assumptions and start your investigations by attempting to debunk your own beliefs, but most of the time you don't take this approach. That's not your default method of exploring the natural world or defending your ideological stances.

For instance, if you believe that vaccines cause autism and go searching for data that backs up that hypothesis, with the power of search engines you are guaranteed to find it. That's true of just about everything anyone has ever believed, whether it's that the moon landing was a hoax, that the Denver airport is a portal to Hell, or that there is a fern that produces small animals that eat grass and deliver their nutrients back into the plant via an umbilical vine.

We even reason through a confirmation bias when searching our memories. In one study, subjects read a story about a woman named Jane. In it, she exhibited some behaviors that could be interpreted as introverted and some that seemed more extroverted. Several days later, psychologists divided those same subjects into two groups. They told one group that Jane was thinking about applying for a job as a real estate agent and asked if they thought she was suited to the work. Most people said she would be great at it, and when asked why, those subjects recalled all the extroverted behavior from their memories, citing those parts of the narrative as evidence for their belief. The scientists then said that Jane was also considering a job as a librarian. The subjects groused upon hearing this, saying that Jane was too outgoing for that kind of environment.

For the other group, the order was flipped. The scientists first asked if Jane should take a job as a librarian. Just like the other group, most of the subjects said "yes!" right away, taking an affirmative position by default. When asked why they felt that way, they too searched their memories for confirmation that their hunches were correct and cited all the times they remembered Jane acting shy. When the scientists asked this second group if Jane should go for a real estate job instead, they were adamantly opposed to the idea, saying Jane was obviously too reserved for a career like that.

Confirmation bias is an active, goal-oriented, effortful process. When tasked with defending your position, even if you just took it, even if you could have taken another, you tend to search for proof, pushing past a threatening, attitude-inconsistent thicket to cherry-pick the fruit of validation.

There is another process, though, that is just as pernicious but runs in the background, passive, waiting to come online when challenging information is unavoidable, when it arrives in your mind uninvited. This psychological backup plan for protecting your beliefs is called motivated skepticism.

Political scientists Brendan Nyhan and Jason Reifler saw the power of motivated skepticism when they confronted anti-vaxxers with a variety of facts aimed at debunking myths about a connection between the childhood MMR vaccine and autism. In this episode of the You Are Not So Smart podcast, they explain how they succeeded in softening those subjects' belief in the misconceptions, yet those same people later reported being even less likely to vaccinate their children than subjects who received no debunking information at all. The corrections backfired.

As I've written before, "when your deepest convictions are challenged by contradictory evidence, your beliefs get stronger." In this episode of the You Are Not So Smart podcast, the second in a series on The Backfire Effect, we explore how motivated skepticism fuels this bizarre phenomenon, in which correcting misinformation can cause people to become even more certain of their incorrect beliefs. (This is a link to part one in the series.)

This episode's cookie is espresso dark chocolate sent in by Sarah Hendrickson.

Links and Sources

• The Makes-Sense Stopping Rule: Perkins, D. N., Farady, M., & Bushey, B. (1991). In Voss, J. F., Perkins, D. N., & Segal, J. W. (Eds.), Informal reasoning and education. Hillsdale, NJ: L. Erlbaum Associates.

• Jane Confirmation Bias Study: Snyder, M., & Cantor, N. (1979). Testing hypotheses about other people: The use of historical knowledge. Journal of Experimental Social Psychology, 15(4), 330–342.

• Vaccine Corrections Study: Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133(4).


Previous Episodes

Part One of this Series

Boing Boing Podcasts

Cookie Recipes

The Backfire Effect

Effective Messages in Vaccine Promotion: A Randomized Trial

Study: You Can't Change an Anti-Vaxxer's Mind

Vaccine Opponents Can Be Immune to Education

Brendan Nyhan on Twitter

Brendan Nyhan's Website

Jason Reifler's Twitter

Jason Reifler's Website

Music in this episode donated by: Mogwai