Steven Pinker: "Tech rots your brain" hysteria is stupid


Supposedly, the use of technology is making it harder for human beings to focus and get things done. As somebody whose focusing and self-management skills have been drastically improved in recent years, thanks to technology (Shout-out to the nice folks at Daylite! Thanks, Twitter, for improving my focus and on-the-spot analysis during conference lectures! DEVONthink PRO, you rock at helping me manage information and think big-picture!), I was already pretty skeptical about this assertion.

Seems that I have no less an authority than Steven Pinker backing up my skepticism.

For a reality check today, take the state of science, which demands high levels of brainwork and is measured by clear benchmarks of discovery. These days scientists are never far from their e-mail, rarely touch paper and cannot lecture without PowerPoint. If electronic media were hazardous to intelligence, the quality of science would be plummeting. Yet discoveries are multiplying like fruit flies, and progress is dizzying. Other activities in the life of the mind, like philosophy, history and cultural criticism, are likewise flourishing, as anyone who has lost a morning of work to the Web site Arts & Letters Daily can attest.

Critics of new media sometimes use science itself to press their case, citing research that shows how "experience can change the brain." But cognitive neuroscientists roll their eyes at such talk. Yes, every time we learn a fact or skill the wiring of the brain changes; it's not as if the information is stored in the pancreas. But the existence of neural plasticity does not mean the brain is a blob of clay pounded into shape by experience.

Experience does not revamp the basic information-processing capacities of the brain. Speed-reading programs have long claimed to do just that, but the verdict was rendered by Woody Allen after he read "War and Peace" in one sitting: "It was about Russia." Genuine multitasking, too, has been exposed as a myth, not just by laboratory studies but by the familiar sight of an S.U.V. undulating between lanes as the driver cuts deals on his cellphone.

New York Times: Mind over Mass Media

Image courtesy Flickr user Hljod.Huskona, via CC.


  1. Technology is like most other things: it’s not inherently good or bad, it’s how you use it. If you lack the willpower not to flip between Facebook and Twitter and Flickr when you’re supposed to be working, then yeah, technology is probably making you dumber because you can’t concentrate. But that’s your fault, not technology’s.

  2. “Yet discoveries are multiplying like fruit flies, and progress is dizzying”

    Sure, in recent years we’ve had ADSL, Wifi, LCD and plasma monitors, 3G and 4G mobile phones, faster PCs and so on, but I’m still waiting for a cure for cancer and cheap clean energy.

    Will the 2K have another Einstein, Newton, Galileo, Leonardo da Vinci… or will the best scientist/engineer of the century be remembered only for having developed something like holographic TV?

    1. If you honestly think that science has only produced shiny new tech, then you’re woefully ignorant of the state of science today. Particularly with biological science, the advances being made are truly remarkable, and many of them are only possible with the new communication technology.

      Curing cancer is not a singular goal. To really “cure” it requires a nearly complete understanding of human physiology at the molecular level. The amount we’ve learned is incredible, and is in no way diminished by not having figured out one of the most challenging problems humans have ever attempted to solve.

      Read some Science or Nature, and prepare to get your head blow’d.

      Additionally, for all the hype that “tech rots your brain” studies have produced, none of them come close to saying what people are reporting on. Trying to publish something like “social media can cause permanent damage to concentration/focus” would get rejected faster than you could tweet.

      1. Plato proclaimed that the written word would weaken people’s memories, making them stupid and unable to think for themselves. Only oratory was the pure form of communication.

        Technically, he was right. I’ve read the Odyssey and the Iliad, and I can’t recite a single stanza from either of them. I depend on the written word to remember it for me, and make up for it in library volume. I only speak a single language, and have to use merely adequate babelfish translations as a crutch to communicate with like-minded people on the other side of the world. Hell, I can’t even remember what I’m planning on doing tomorrow without checking my email calendar.

        So, yeah, from Plato’s point of view I’m probably pretty dumb. Somehow I don’t really mind it.

        1. I think of it as externalizing brain functions. Why waste time memorizing a narrative when you can easily download a copy of the book, the movie based on the book, the remake of the movie based on the book, and the book based on the remake of the movie based on the book? Then go argue with the world about which version sucks the most. You don’t even have to wear pants!

          More seriously though, I think this issue is a great illustration of both how concepts of intelligence are actually products of culture, and how technology affects the cultural production of subjectivities. What would Plato do on the internet? What would the internet do to Plato?

    2. “Will the 2K have another Einstein, Newton, Galileo, Leonardo da Vinci…”

      Probably not, and tech is responsible. With so many people having access to so much information, it’s becoming increasingly rare for a huge advance to depend on a single person. Instead, many people tend to be involved, each discovering a little piece of the puzzle. This is not a change for the worse as far as science is concerned, but it does make it harder to pick a great scientist for the history books.

  3. “These days scientists are never far from their e-mail, rarely touch paper and cannot lecture without PowerPoint. If electronic media were hazardous to intelligence, the quality of science would be plummeting. Yet discoveries are multiplying like fruit flies, and progress is dizzying.”

    But would there be even MORE progress and discoveries if there were no internet distractions? We’d need a control group of scientists who go without the internet, except for e-mail to communicate. Only then could such a statement be made.

    -a random scientist’s two cents

  4. My mother, who was a scientist, always used to complain about people at her job using computers for complex mathematical problems and how it weakened their abilities. It was several decades later that I realized that she was a math savant and had no idea that everybody couldn’t just solve complex problems in their heads.

  5. When Gutenberg came out with his printing press, churches denounced it as the work of the devil since it meant that people could read the Bible for themselves instead of depending on priests for religious teaching. Plus ça change…

  6. After every technological advance, you can expect two things to happen:

    1) One group, almost always made of the more established “elders,” decry it. Usually they claim that it has some negative influence. “It will make you lazy/stupid/immoral/etc.”

    2) Users of the new technology do not, as a group, become lazy/stupid/immoral/etc. As always, individual results may vary, technology-induced or not.

    Plato proclaimed that the written word would weaken people’s memories, making them stupid and unable to think for themselves. Only oratory was the pure form of communication.

    Similar claims were made about the de-latinization of the bible, the printing press, the telegraph, the telephone, computers, cell phones, the internet, and so on, often naysayed by the most respected writers, thinkers, and philosophers of the time.

    I believe this happens for two reasons:
    1) The old often fear or misunderstand the new and
    2) new technology almost always decentralises knowledge and power, threatening the security and hegemony of “the wise.”

    1. I agree with you; however, the printed word DID in fact weaken our memory. Certainly a small price to pay for such a large leap forward, but we should also be aware of potential setbacks brought about by recent technological advancements.

  7. NPR was talking with Clay Shirky just yesterday, I think, and asserting that people have trouble with “deep reading” because we can’t concentrate anymore after so much websurfing. Does anyone really have trouble reading novels or books? I still read a few pages or a chapter every day in my downtime (yeah, you know what I’m talking about, pretend you don’t read in there), and I don’t feel less able to enjoy it or retain it.

    Hell, by that line of thinking, high school and college should destroy concentration. At least now I’m not reading a chapter of a textbook, two short stories during a workshop, then five poems and a couple chapters of different novels that I have to read for lit classes … and trying to keep them straight when I move to the next chapters or discuss that poet in class the next day.

  8. Tech is a hazard for the new generation, who are drowning in oceans of contaminated information and have developed attention spans of about two seconds. Deep thought is history. No use writing more in this comment. It’s over the limit already.

    1. That’s an interesting comment for an article that says the science supports the exact opposite.

  9. In the UK, Baroness Greenfield is making a bit of an “Andrew Wakefield” name for herself by going around the world telling people how Facebook rots the mind. The science says no, but still she goes on.

  10. Maybe Pinker’s quote was taken out of context, but there are a lot of rhetorical and logical fallacies packed into some pretty short paragraphs.

  11. This is poorly done. He even quotes that science has shown that the brain is being rewired, or de-wired, by use of certain technology, and then argues against it with non-scientific arguments, even trashing the science itself.

    Somehow his opinion is more valuable than facts. Interesting. Ironic, actually – perhaps his brain is stuck in virtual reality :)

    1. No, he points out that the brain re-wiring is what it’s supposed to do; that’s how memory and experience works. So studies that show how Xboxes and Twitter and txtspk all ‘rewire’ the brain are showing something trivial and totally expected, and only a befuddled journalist whose brain has been rewired by nicotine and alcohol would think that there was anything remarkable about that.

    1. I wouldn’t trust Zimbardo too much. He has a habit of putting a veneer of science and authority behind what are really just his own opinions. You know the Stanford Prison Experiment that he is famous for? He never published his results in a peer-reviewed journal. He said he wanted to reach the masses so he would publish in unreviewed books instead. That’s the scientific equivalent of claiming you can do 100 pull-ups, but not when anyone is watching.

      If you read his journal articles, he also has a habit of trash-talking other theorists that disagree with him. It’s weird. He has charisma and he’s famous, but in my opinion, he is a pretty poor scientist.

  12. My brain is a mass of clay being shaped by the poundings of the gorillas of experience.

  13. “Thanks, Twitter, for improving my focus and on-the-spot analysis during conference lectures!”

    Maggie, can you please elaborate on how exactly Twitter enhanced your focusing abilities? Was it the exercise of writing succinct posts, or the habit of reading pithy thoughts? Either way, it sounds less like an innate benefit of using Twitter and more like a result of your taking the initiative to improve your focus, and relating those improvements to your interaction with Twitter. But maybe you can explain better.

    1. Absolutely.

      I always type notes as I’m listening to lectures, but I found that, just typing and listening, I wasn’t doing much thinking about what questions I had, what points could be expanded into bigger stories, or how what was being said in one lecture connected to another.

      When I type notes and periodically tweet, though, the tweets allow me to do that kind of on-the-spot synthesizing of the information (without taking so much time that I miss important notes). Plus, it keeps me focused on what’s actually being said, rather than allowing me to drift off into zen note-taking state. Which can happen.

  14. It seems you feel our work is not a benefit to the public.
    Replicants are like any other machines.
    They’re either a benefit or a hazard.
    If they are a benefit, it’s not my problem.

  15. Yes, and ADD and ADHD diagnoses have been multiplying like fruit flies too. So laser-like focus is not the only effect of new technologies. But more importantly, I will shamelessly make the case for the cultivation of something beyond what you might call “computational mind.” It is rooted in the mind’s capacity to be self-aware. And distraction is its enemy. It is what the Dalai Lama, Eckhart Tolle, and many others point to. There is a difference between knowledge and wisdom. It really doesn’t matter how sophisticated the technology is. An ape in the control room is still an ape.

  16. OK, tech technically does not rot your brain, but it does provide a lovely source of procrastination, which rots your brain.

    Why, I was just playing Mario Galaxy 2 whilst reading “The Blank Slate”. Of course, it was the same page over and over again, which was the case before I had started playing Wii.

  17. I’d expect better from Pinker than a patronizing op-ed piece without very much actual consideration of philosophy and refutation of studies. Pinker’s in the tribe where efficiency and technology trump all other qualities, and it seems he threw a little tantrum on the NYTimes because he’s just so dang tired of being asked a legitimate question by hysterical people. I get being annoyed with the hysteria, but I think he owes the question a little more intellectual consideration. The way I interpret his answer to the question of “Hey, should we be thinking critically about information collection and consumption and how that might be changing human culture and consciousness?” is
    “Get over it, technology is unquestionably awesome and unstoppable, so stop bugging me you old square.” And I’m just not impressed with that answer.

    Books like “Shop Class as Soulcraft” and “The Shallows” provide very considered responses to the question of “Does technology rot your brain”. The wise answer to which is no, but it’s worth taking the time to consider what *is* being changed and what it’s being changed *to*. Pinker’s response is just…shallow.

  18. and this line from his piece:

    “It’s not as if habits of deep reflection, thorough research and rigorous reasoning ever came naturally to people. They must be acquired in special institutions, which we call universities, and maintained with constant upkeep, which we call analysis, criticism and debate.”

    is total malarkey as well as being patronizing. Deep reflection is NOT maintained by universities. Does he think that deep reflection never came out of a culture without Harvard? The quality of reflection is maintained by people who choose to be that way, and is not conferred by a scholarly degree. His academic bias is showing, and he’s confusing smarts with wisdom and reflection.

  19. The joke goes: War and Peace “pertains” to Russia. I think my brain remembered that.

  20. Oh the brutal irony! On “Arts & Letters” front page:

    “Information society? More accurate to call it the interruption society. It pulverizes attention, the scarcest of all resources, and stuffs the mind with trivia…”

  21. Yes, technology is rotting our brains!! And the zombies are eating them, right?

    Well, you try living without the internet, let me know how that one goes.
