In 1961, Yale University psychology professor Stanley Milgram placed an advertisement in the New Haven Register. “We will pay you $4 for one hour of your time,” it read, asking for “500 New Haven men to help us complete a scientific study of memory and learning.”
Only part of that was true. Over the next two years, hundreds of people showed up at Milgram’s lab for a learning and memory study that quickly turned into something else entirely. Under the watch of the experimenter, the volunteer—dubbed “the teacher”—would read out strings of words to his partner, “the learner,” who was hooked up to an electric-shock machine in the other room. Each time the learner made a mistake in repeating the words, the teacher was to deliver a shock of increasing intensity, starting at 15 volts (labeled “slight shock” on the machine) and going all the way up to 450 volts (“Danger: severe shock”). Some people, horrified at what they were being asked to do, stopped the experiment early, defying their supervisor’s urging to go on; others continued up to 450 volts, even as the learner pled for mercy, yelled a warning about his heart condition—and then fell alarmingly silent. In the most well-known variation of the experiment, a full 65 percent of people went all the way.
Until they emerged from the lab, the participants didn’t know that the shocks weren’t real, that the cries of pain were pre-recorded, and that the learner—railroad auditor Jim McDonough—was in on the whole thing, sitting alive and unharmed in the next room. They were also unaware that they had just been used to prove the claim that would soon make Milgram famous: that ordinary people, under the direction of an authority figure, would obey just about any order they were given, even to torture. It’s a phenomenon that’s been used to explain atrocities from the Holocaust to the Vietnam War’s My Lai massacre to the abuse of prisoners at Abu Ghraib. “To a remarkable degree,” Peter Baker wrote in Pacific Standard in 2013, “Milgram’s early research has come to serve as a kind of all-purpose lightning rod for discussions about the human heart of darkness.”
In some ways, though, Milgram’s study is also—as promised—a study of memory, if not the one he pretended it was.
More than five decades after it was first published in the Journal of Abnormal and Social Psychology in 1963, it’s earned a place as one of the most famous experiments of the 20th century. Milgram’s research has spawned countless spinoff studies among psychologists, sociologists, and historians, even as it’s leapt from academia into the realm of pop culture. It’s inspired songs by Peter Gabriel (lyrics: “We do what we’re told/We do what we’re told/Told to do”) and Dar Williams (“When I knew it was wrong, I played it just like a game/I pressed the buzzer”); a number of books whose titles make puns out of the word “shocking”; a controversial French documentary disguised as a game show; episodes of Law and Order and Bones; a made-for-TV movie with William Shatner; a jewelry collection (bizarrely) from the company Enfants Perdus; and most recently, the biopic The Experimenter, starring Peter Sarsgaard as the title character—and this list is by no means exhaustive.
But as with human memory, the study—even published, archived, enshrined in psychology textbooks—is malleable. And in the past few years, a new wave of researchers has dedicated itself to reshaping it, arguing that Milgram’s lessons on human obedience are, in fact, misremembered—that his work doesn’t prove what he claimed it does.
The problem is, no one can really agree on what it proves instead.
* * *
To mark the 50th anniversary of the experiments’ publication (or, technically, the 51st), the Journal of Social Issues released a themed edition in September 2014 dedicated to all things Milgram. “There is a compelling and timely case for reexamining Milgram’s legacy,” the editors wrote in the introduction, noting that they were in good company: In 1964, the year after the experiments were published, fewer than 10 published studies referenced Milgram’s work; in 2012, that number was more than 60.
It’s a trend that surely would have pleased Milgram, who crafted his work with an audience in mind from the beginning. “Milgram was a fantastic dramaturg. His studies are fantastic little pieces of theater. They’re beautifully scripted,” said Stephen Reicher, a professor of psychology at the University of St. Andrews and a co-editor of the Journal of Social Issues’ special edition. Capitalizing on the fame his 1963 publication earned him, Milgram went on to publish a book on his experiments in 1974 and a documentary, Obedience, with footage from the original experiments.
But for a man determined to leave a lasting legacy, Milgram also made it remarkably easy for people to pick it apart. The Yale University archives contain boxes upon boxes of papers, videos, and audio recordings, an entire career carefully documented for posterity. Though Milgram’s widow, Alexandra, donated the materials after his death in 1984, they remained largely untouched for years, until Yale’s library staff began to digitize all the materials in the early 2000s. Able to easily access troves of material for the first time, the researchers came flocking.
“There’s a lot of dirty laundry in those archives,” said Arthur Miller, a professor emeritus of psychology at Miami University and another co-editor of the Journal of Social Issues. “Critics of Milgram seem to want to—and do—find material in these archives that makes Milgram look bad or unethical or, in some cases, a liar.”
One of the most vocal of those critics is Australian author and psychologist Gina Perry, who documented her experience tracking down Milgram’s research participants in her 2013 book Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments. Her project began as an effort to write about the experiments from the perspective of the participants—but when she went back through the archives to confirm some of their stories, she said, she found some glaring issues with Milgram’s data. Among her accusations: that the supervisors went off script in their prods to the teachers, that some of the volunteers were aware that the setup was a hoax, and that others weren’t debriefed on the whole thing until months later. “My main issue is that methodologically, there have been so many problems with Milgram’s research that we have to start re-examining the textbook descriptions of the research,” she said.
But many psychologists argue that even with methodological holes and moral lapses, the basic finding of Milgram’s work, the rate of obedience, still holds up. Because of the ethical challenge of reproducing the study, the idea survived for decades on a mix of good faith and partial replications—one study had participants administer their shocks in a virtual-reality system, for example—until 2007, when ABC collaborated with Santa Clara University psychologist Jerry Burger to replicate Milgram’s experiment for an episode of the TV show Basic Instincts titled “The Science of Evil,” pegged to Abu Ghraib.
Burger’s way around an ethical breach: In the most well-known experiment, he found, 80 percent of the participants who reached a 150-volt shock continued all the way to the end. “So what I said we could do is take people up to the 150-volt point, see how they reacted, and end the study right there,” he said. The rest of the setup was nearly identical to Milgram’s lab of the early 1960s (with one notable exception: “Milgram had a gray lab coat and I couldn’t find a gray, so I got a light blue”).
At the end of the experiment, Burger was left with an obedience rate around the same as the one Milgram had recorded—proving, he said, not only that Milgram’s numbers had been accurate, but that his work was as relevant as ever. “[The results] didn’t surprise me,” he said, “but for years I had heard from my students and from other people, ‘Well, that was back in the 60s, and somehow now we’re more aware of the problems of blind obedience, and people have changed.’”
In recent years, though, much of the attention has focused less on supporting or discrediting Milgram’s statistics, and more on rethinking his conclusions. With a paper published earlier this month in the British Journal of Social Psychology, Matthew Hollander, a sociology Ph.D. candidate at the University of Wisconsin, is among the most recent to question Milgram’s notion of obedience. After analyzing the conversation patterns from audio recordings of 117 study participants, Hollander found that Milgram’s original classification of his subjects—either obedient or disobedient—failed to capture the true dynamics of the situation. Rather, he argued, people in both categories tried several different forms of protest—those who successfully ended the experiment early were simply better at resisting than the ones who continued shocking.
“Research subjects may say things like ‘I can’t do this anymore’ or ‘I’m not going to do this anymore,’” he said, even those who went all the way to 450 volts. “I understand those practices to be a way of trying to stop the experiment in a relatively aggressive, direct, and explicit way.”
It’s a far cry from Milgram’s idea that the capacity for evil lies dormant in everyone, ready to be awakened with the right set of circumstances. The ability to disobey toxic orders, Hollander said, is a skill that can be taught like any other—all a person needs to learn is what to say and how to say it.
* * *
In some ways, the conclusions Milgram drew were as much a product of their time as they were a product of his research. At the time he began his studies, the trial of Adolf Eichmann, one of the major architects of the Holocaust, was already in full swing. In 1963, the same year that Milgram published his studies, writer Hannah Arendt coined the phrase “the banality of evil” to describe Eichmann in her book on the trial, Eichmann in Jerusalem.
Milgram, who was born in New York City in 1933 to Jewish immigrant parents, came to view his studies as a validation of Arendt’s idea—but the Holocaust had been at the forefront of his mind for years before either of them published their work. “I should have been born into the German-speaking Jewish community of Prague in 1922 and died in a gas chamber some 20 years later,” he wrote in a letter to a friend in 1958. “How I came to be born in the Bronx Hospital, I’ll never quite understand.”
And in the introduction of his 1963 paper, he invoked the Nazis within the first few paragraphs: “Obedience, as a determinant of behavior, is of particular relevance to our time,” he wrote. “Gas chambers were built, death camps were guarded; daily quotas of corpses were produced … These inhumane policies may have originated in the mind of a single person, but they could only be carried out on a massive scale if a very large number of persons obeyed orders.”
Though the term didn’t exist at the time, Milgram was a proponent of what today’s social psychologists call situationism: the idea that people’s behavior is determined largely by what’s happening around them. “They’re not psychopaths, and they’re not hostile, and they’re not aggressive or deranged. They’re just people, like you and me,” Miller said. “If you put us in certain situations, we’re more likely to be racist or sexist, or we may lie, or we may cheat. There are studies that show this, thousands and thousands of studies that document the many unsavory aspects of most people.”
But taken to its logical extreme, situationism “has an exonerating effect,” he said. “In the minds of a lot of people, it tends to excuse the bad behavior … it’s not the person’s fault for doing the bad thing, it’s the situation they were put in.” Milgram’s studies were famous because their implications were also devastating: If the Nazis were just following orders, then he had proved that anyone at all could be a Nazi. If the guards at Abu Ghraib were just following orders, then anyone was capable of torture.
The latter, Reicher said, is part of why interest in Milgram’s work has seen a resurgence in recent years. “If you look at acts of human atrocity, they’ve hardly diminished over time,” he said, and news of the abuse at Abu Ghraib was surfacing around the same time that Yale’s archival material was digitized, a perfect storm of encouragement for scholars to turn their attention once again to the question of what causes evil.
He and his colleague Alex Haslam, the third co-editor of the Journal of Social Issues’ Milgram edition and a professor of psychology at the University of Queensland, have come up with a different answer. “The notion that we somehow automatically obey authority, that we are somehow programmed, doesn’t account for the variability [in rates of obedience] across conditions,” he said; in some iterations of Milgram’s study, the rate of compliance was close to 100 percent, while in others it was closer to zero. “We need an account that can explain the variability—when we obey, when we don’t.”
“We argue that the answer to that question is a matter of identification,” he continued. “Do they identify more with the cause of science, and listen to the experimenter as a legitimate representative of science, or do they identify more with the learner as an ordinary person? … You’re torn between these different voices. Who do you listen to?”
The question, he conceded, applies as much to the study of Milgram today as it does to what went on in his lab. “Trying to get a consensus among academics is like herding cats,” Reicher said, but “if there is a consensus, it’s that we need a new explanation. I think nearly everybody accepts the fact that Milgram discovered a remarkable phenomenon, but he didn’t provide a very compelling explanation of that phenomenon.”
What he provided instead was a difficult and deeply uncomfortable set of questions—and his research, flawed as it is, endures not because it clarifies the causes of human atrocities, but because it confuses more than it answers.
Or, as Miller put it: “The whole thing exists in terms of its controversy, how it’s excited some and infuriated others. People have tried to knock it down, and it always comes up standing.”