Obedience and fear: What makes people hurt other people?


36 Responses to “Obedience and fear: What makes people hurt other people?”

  1. Marc Mielke says:

    I’ve always wondered whether any of the subjects behaved really inappropriately, because far from thinking I’d be one of the few who would walk out, I’m a bit afraid I’d get giggly and say things like “Shock-Shock-Shocka Khan!”, “Zzzzzap!” or “buzz buzz buzz!” before giving the fake zap.

  2. Shane Selman says:

    Check out the Radio Lab segment on this – I think it is called just “Bad”.  Milgram’s original experiments were much more involved than is commonly discussed, and his conclusions much more nuanced.  It was not a simple response to authority that got people to continue; it was (in part) the idea that what they were doing mattered – that the research was important, and that they were sacrificing something for the greater good by continuing.

    The 65% compliance rate was a baseline, and compliance rates pretty much went down from there depending on who was telling them to continue and how they were told.  Someone in street clothes telling them to continue produced only a 15-20% compliance rate, and authority figures who disagreed about whether to continue produced an even lower one.

    Each time a participant showed resistance or concern, they were given one of 4 scripted prompts.  “Please continue” and “The experiment must continue” had the best compliance rates, while “You have no choice” as a prompt had a 0% compliance rate across all of the tests.  Let me repeat that – 0% across all tests.

    The actual conclusions were not that we blindly respond to authority (we don’t), but that we will do seemingly terrible and irrational things of our own accord, if we can be convinced that it is the right thing to do, or is in service to a greater good.

    • ChicagoD says:

      “You have no choice” as a prompt had a 0% compliance rate across all of the tests.
      That may be the single most interesting thing I’ve ever heard about these tests. Thanks for posting it.

      • jimmoffet says:

        Yeah, that’s really the crucial finding of the whole experiment. It completely contradicts the lay wisdom that has grown up around the Milgram experiments. It actually demonstrates that simply ordering someone to do something they don’t agree with is a completely ineffective tactic, even when the order comes from someone with a comparatively high level of perceived authority.

    • Ito Kagehisa says:

      Very interesting comments, Shane, but I disagree with your characterization of Milgram’s conclusion.  You said:

      The actual conclusions were not that we blindly respond to authority (we don’t), but that we will do seemingly terrible and irrational things of our own accord, if we can be convinced that it is the right thing to do, or is in service to a greater good.

      I don’t recall Milgram saying that in any of his papers.  He said that many of us will commit atrocities if we can be persuaded that we are being ordered to do so by a “legitimate authority figure” – not that we believe this explicitly serves any greater good.  The key seems to be that we believe the person we are obeying has some sort of right to command us, within our framework of authority.  The uniform of a clipboard and white lab coat convinces a startling number of people.

      http://www.typinganimal.net/wp/stanley-milgrams-experiment/ is another brief look at the subject.

      • Tynam says:

        This is the most important point; Milgram discovered primarily that we do terrible things out of social pressure rather than because we’ve been ideologically convinced.  An authority figure is an important source of pressure but not the only one. 

        (Experimentation also showed that the compliance rate went up greatly when peer pressure was added, for example.  Experiments 17 and 18 in the initial Milgram set are particularly interesting in this regard.)

      • jimmoffet says:

        It’s important to note that all levels of perceived authority had a 100% failure rate with the most extreme prompt (when subjects were told that they had no choice).

        This contradicts the most common cultural reference to the experiments, which is usually trotted out to justify the “I was just following orders” defense. When you feel that the choice is being taken away from you, you are least likely to acquiesce. Conversely, it follows that when you do comply, the decision is your own. This is a fairly profound finding when considered in light of guilt or innocence, and Milgram has often come up in war crimes trials (the fact that it’s easy to mischaracterize leads to its being used by both sides…).

    • GinaPerry says:

      It’s true the results are not nearly as straightforward as they’ve been presented. And when you talk to people who took part and read Milgram’s unpublished notes – which I’ve done – you get quite a different view of what was going on in that lab. I think the more interesting question is why we accept stories of experiments like these so unquestioningly. It’s clear from my research that (a) many subjects saw through the hoax, and (b) many subjects offered to swap places with the Learner or tried more covert methods, such as ‘cheating’ by emphasising the correct answers to the learner when they read the word pairs. None of these issues gets a mention in Milgram’s published accounts of the research. Here’s a link to my radio documentary about the research.

      http://www.abc.net.au/radionational/programs/radioeye/beyond-the-shock-machine/3183356 
      I’m also author of the book ‘Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments.’

  3. unit_1421 says:

    I’d want to see all the follow-up questions that were asked of the subjects once they were told the truth. I’d also like to see the demographics of those selected for the study. Were people selected based on suppositions that they’d be the most likely to comply? I’ve always found Milgram’s study to be highly suspect.

    • Tynam says:

      Later experiments have independently confirmed results similar to Milgram’s several times.  Even modern replications (modified to meet much tighter present-day ethical rules) have produced comparable results.

      But you’re right; follow-up on the experimental subjects is particularly interesting.

  4. PJDK says:

    Three things have always bothered me about the Milgram experiments.

    The first is that, from the test subject’s point of view, the person being shocked is a volunteer (they even meet beforehand, if I recall), which has got to make that person primarily responsible for their own well-being.  Yes, they are shouting “make it stop”, but if they *really* wanted it to stop they could just get up and walk out.  This leads on to point two.

    If the test subject is meant to conclude that the other volunteer can’t really walk out, and that they themselves are being tricked into torturing someone to death, what is the correct moral/rational response meant to be?  This puts you at the mercy of probably the most deranged psychopathic mass murderer in history – grabbing the chair, swinging for the researchers’ heads, and running for it seems reasonable.

    Finally, everyone who carried on the experiment thinking “this isn’t very pleasant but it’s for an important scientific experiment and I’m sure this authority figure wouldn’t let anything really bad happen” was fundamentally correct.  That was the situation, and they correctly analysed it as such.

    • A Viescas says:

      A note: if you watch the videos, they also shout “let me out.” So yes, the subjects are led to believe that the shocked people are “volunteers” who gave consent, but also that they can’t get out by themselves. (Point two is therefore somewhat naive.)

      If something really does “bother you” about the Milgram experiments, I suggest you do some more research to see whether the study itself might answer your question, or possibly the commentary on the research. It’s only, like, the most studied and referenced social psych experiment ever.

      • PJDK says:

        But that’s what I mean.  If the subjects are led to believe that the people they are shocking – people they initially took to be volunteers – are in fact trapped and entirely at the mercy of the experimenters, that puts a hugely different spin on things.

        If the person receiving the shocks can’t escape of their own volition, why should the person delivering the shocks believe they have any option but compliance either?  The person receiving the shocks has far more reason to call a halt to the whole thing, and yet hasn’t.

        • A Viescas says:

          But as the “shocker” is the one pressing the button, calling the whole thing to a halt is pretty simple. The question is whether you do or not.

          • PJDK says:

            The shocker is only doing something wrong if the person being shocked is being coerced.  Experiments where a volunteer experiences pain are entirely legitimate; ones where someone inflicts pain maybe less so, but there’s no reason for our shocker to know that.

            So the shocker only has a moral choice to make if they become convinced the shockee is under coercion.  In that circumstance the shocker is under implied coercion too, since this is not what they originally agreed to.  This is backed up by the quote in the article: “I didn’t know what was going to happen to me if I stopped. He just—he had no emotion. I was afraid of him.”

          • A Viescas says:

            Okay, so… what’s your problem with the experiments again? Because all this sounds like “experiment conducted as designed.”

  5. chaopoiesis says:

    :s/shock/like/

  6. chuckwaugh says:

    Shermer did NOT “replicate” the experiment as he claims.  If he had, it would be ethically corrupt in this day and age. Posing it as a ‘reality TV show’ is very different from Milgram’s presentation.

    Shermer gets a failing grade for this and I hope SA’s readership makes that clear. 

  7. Don says:

    The Milgram experiments are discussed in some depth in this book:  http://home.cc.umanitoba.ca/~altemey/

    Not all of Milgram’s results were discouraging.  If the subject is paired with another person and they share the task of supposedly delivering shocks, and that other person resists the command to deliver a shock, the subject is much less likely to deliver the shock as well.  So resistance to authority may make it easier for others to summon the courage to do the same.

  8. kongjie says:

    I worked on the Milgram papers, currently housed at Yale University. Part of my job was to listen to and catalogue many of the audio recordings. If I recall correctly (it was about 20 years ago), the person being “shocked,” in at least some of the protocols, wasn’t played by a live actor – it was in fact a recording that was re-used. There were some alternate protocols in which a live actor produced the appeals for the shocker to stop. Again, if memory serves, there was one in which a father was asked to shock his own son (or perhaps vice versa). For me, though, the most disturbing ones were those in which the shocker had no problem going to higher and higher shocks, even after the victim stopped responding at all. Complete with inappropriate laughter, in some instances.

    • Cowicide says:

      Complete with inappropriate laughter, in some instances.

      Could it be that they knew it was fake?

      • kongjie says:

        There was no indication in the ones I listened to that they thought it was fake. Some said things like “He’s not answering,” and would giggle when told to continue with the experiment. I believe I recall giggling in response to the screams, too. You have to keep in mind that this was a very different time from today. Sure, there was “Candid Camera” on television, but the level of elaborate pranking that we have on the Internet and TV today didn’t exist then. Many of the sessions I listened to featured what sounded to me like working-class people, and from the start they showed a certain amount of deference to the researchers.

        • Cowicide says:

          I recall giggling in response to the screams

          Yes, but did the fake screams sound funny?

          • kongjie says:

            Yes, a couple of the screams sounded a little funny. But that was to me, listening in 1994 and knowing what was going on; it might have sounded different to someone with their finger on a button in the 1960s.

        • GinaPerry says:

          I agree many people were deferential. Most of the subjects I interviewed, despite living in and near New Haven, had never been inside Yale’s grounds before. Milgram’s follow-up questionnaire and unpublished papers show that in fact a significant number of subjects saw through the hoax. Candid Camera was the most popular TV show in the US around that time, and some went into quite a bit of detail about how they worked it out. For example, if a subject ‘cheated’ by pressing a lower-voltage lever instead of the higher one, the learner’s cry would escalate as if they had been given the higher voltage. Many commented on how the experimenter’s imperviousness to the Learner’s cries was proof that there was something fishy going on.

          • kongjie says:

            Thanks for filling me in on that. My cataloguing focus didn’t extend to the post-”Obedience” research and so I am in the dark about it.

      • Ito Kagehisa says:

        The laughter thing is interesting; there’s a lot of study on that, actually.  In fact I’ve read several interviews (one very recent) with people involved in the original Milgram experiments, in which they discussed a growing hysteria as the experiment progressed.  There are lots of reasons to believe that many people laugh under extreme psychological duress; in fact I’ve seen it, though I’ve not experienced it myself.

        Milgram said, in response to claims that his subjects knew it was fake, “Orne’s suggestion that the subjects only feigned sweating, trembling, and stuttering to please the experimenter is pathetically detached from reality, equivalent to the statement that hemophiliacs bleed to keep their physicians busy.”

        One of the reasons these experiments aren’t repeated much any more is that they have profound, sometimes permanent effects on the “shockers”, which is an ethical violation of the first order – a violation of the Hippocratic Oath and all that stuff.

    • GinaPerry says:

      The father-son variation you listened to at Yale was a variation Milgram kept secret and never published. It was condition 24, the Relationship condition. Instead of using an actor and one subject, Milgram recruited pairs of friends and family members. When the pair arrived at the lab, Milgram took one into another room and ‘coached’ him to scream on cue as his father/brother/best friend conducted the test. It’s not clear why Milgram kept it secret: likely because it was such an ethically indefensible thing to do to people, and also because it got such abysmal rates of ‘obedience’. I put that word in quotation marks because I don’t think that’s what he measured at all. I’ve written a chapter on this secret variation in my book:
      http://www.gina-perry.com

  9. acerplatanoides says:

    People are animals. Look it up!

  10. hbgvfcdxsz hbgvfcdxsz says:

    There’s a new book out about the experiments, “Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments”:

    “She investigates Milgram’s character and motivations, revealing him to be perhaps a little too invested in the results going the way they did. She uncovers some evidence contrary to Milgram’s reported findings that he, unsurprisingly, never published. She tracks down previous participants to ask a particularly difficult question – were the psychological effects of the experiment really as ‘negligible’ as Milgram claimed? And she casts some doubt as to whether the experiment really was measuring ‘obedience to authority’.”
    TL;DR ~ More than a third of the test subjects knew the actor wasn’t getting zapped

  11. whoknew says:

    I second the recommendation of the Radio Lab episode.
    http://www.radiolab.org/2012/jan/09/
    One of the most fascinating podcasts I’ve ever heard.  I have no expertise to evaluate the conclusion, but the scholar they had on the show also highlighted that the study really seems to show that the most powerful motivator is belief in a cultural or societal imperative, such as “the good of the community depends on you delivering this evil.”  It was somehow uplifting to think that the opposite occurred (i.e., people walked out of the test) when they were told they absolutely had to do it based only on the authority of the administrator of the test.
