Jonah Lehrer takes to The New Yorker to discuss Thinking, Fast and Slow, the latest book from Daniel Kahneman, a psychologist who's won the Nobel Prize in economics. Lehrer examines Kahneman's contention that smart people are no less prone to cognitive bias than anyone else, but are inclined to believe that they are immune to error. Kahneman himself admits that he makes systematic cognitive errors all the time, even though he has devoted his career to studying them.
This has particularly grim implications for a society that thinks it is a meritocracy but is really an oligarchy, because the competitively educated people at the top believe (incorrectly) that they don't need to have their intuitions reviewed by lesser mortals.
And here's the upsetting punch line: intelligence seems to make things worse. The scientists gave the students four measures of "cognitive sophistication." As they report in the paper, all four of the measures showed positive correlations, "indicating that more cognitively sophisticated participants showed larger bias blind spots." This trend held for many of the specific biases, indicating that smarter people (at least as measured by S.A.T. scores) and those more likely to engage in deliberation were slightly more vulnerable to common mental mistakes. Education also isn't a savior; as Kahneman and Shane Frederick first noted many years ago, more than fifty per cent of students at Harvard, Princeton, and M.I.T. gave the incorrect answer to the bat-and-ball question.
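(The excerpt doesn't spell out the bat-and-ball question; the standard version, from Shane Frederick's Cognitive Reflection Test, goes: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball; how much does the ball cost? The answer that springs to mind is ten cents, but then the bat would cost $1.10 and the pair $1.20. Writing the ball's price as x gives x + (x + 1.00) = 1.10, so 2x = 0.10 and the ball costs five cents.)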
What explains this result? One provocative hypothesis is that the bias blind spot arises because of a mismatch between how we evaluate others and how we evaluate ourselves. When considering the irrational choices of a stranger, for instance, we are forced to rely on behavioral information; we see their biases from the outside, which allows us to glimpse their systematic thinking errors. However, when assessing our own bad choices, we tend to engage in elaborate introspection. We scrutinize our motivations and search for relevant reasons; we lament our mistakes to therapists and ruminate on the beliefs that led us astray.
The problem with this introspective approach is that the driving forces behind biases—the root causes of our irrationality—are largely unconscious, which means they remain invisible to self-analysis and impermeable to intelligence. In fact, introspection can actually compound the error, blinding us to those primal processes responsible for many of our everyday failings. We spin eloquent stories, but these stories miss the point. The more we attempt to know ourselves, the less we actually understand.