This is the first of three You Are Not So Smart episodes about the "backfire effect." In it, I interview a team of neuroscientists who put people in a brain scanner, challenged their beliefs, some political and some not, with counter-evidence, and then compared which brain regions lit up for which beliefs. The crazy takeaway was that for political beliefs, but not for others, people seemed to react as if their very bodies were being threatened by the challenging evidence.
We don’t treat all of our beliefs the same.
If you learn that the Great Wall of China isn’t the only man-made object visible from space, and that, in fact, it’s actually very difficult to see the Wall compared to other landmarks, you update your model of reality without much fuss. Some misconceptions we give up readily, replacing them with better information when alerted to our ignorance.
For other constructs, though, for your most cherished beliefs about things like climate change or vaccines or Republicans, instead of changing your mind in the face of challenging evidence or compelling counterarguments, you resist. Not only do you fight belief change for some things and not others, but if you successfully deflect such attacks, your challenged beliefs then grow stronger.
The research shows that when a strong-yet-erroneous belief is challenged, yes, you might experience some temporary weakening of your convictions, some softening of your certainty, but most people rebound and not only reassert their original belief at its original strength, but go beyond that and dig in their heels, deepening their resolve over the long run.