As a teenager, I never thought of myself as someone who had a problem with authority. I may not have liked what I was being told to do, and I may have complained, but it was not in my nature to say no. I had my first crisis of authority when I was 16. I was learning to drive, and I’d already failed my driving test—twice. After several more weeks of practicing and diligently studying the driver’s manual, I was taking my third and final test. If I failed that, I’d have to apply for a learner’s permit all over again and endure embarrassing months of being the only person my age without a license. So the pressure was on. With the examiner, a police officer, in the passenger’s seat and sweat on my brow, I carefully completed the entire course—and I thought I did well.
At the very end, the officer told me to pull over at a certain spot and park the car. And I had a moment of complete panic: the spot he’d indicated was just a few feet from a stop sign, and I remembered from the driver’s manual that it was illegal to park so close. Was this one last test? If I obeyed, I thought, I could be failed for breaking the law. So I hesitated and said, “Isn’t that too close to the stop sign?” The officer became furious and started berating me for my arrogance, reminding me that the manual said, “…unless directed otherwise by a uniformed officer of the law.” Tugging at his sleeve, he ranted, “What does this look like, my pajamas?” He went on and on until I was about ready to shrivel up and die, but in the end, he passed me anyway.
Strictly speaking, my dilemma was not a matter of whether to obey an authority, but rather which authority to obey—the police officer or the written law. Nevertheless, the fundamental conflict was between doing what I thought was right and doing what I was told. As stressful as I found that experience (I still cringe at the thought, more than 30 years later), it pales in comparison to a series of infamous experiments performed in the early 1960s. The experiments’ goal was to determine just how far people will stray from their ethical comfort zone in order to obey an authority.
Do As I Say
Stanley Milgram—the same psychologist whose research led to the six degrees of separation notion—was teaching at Yale University several years before he began working on social networks at Harvard. At that time, the world was still coming to grips with the trial of Adolf Eichmann, who was convicted in 1961 of crimes against humanity for his role in the Holocaust; his defense had been that he was “just following orders.” Meanwhile, troops were being sent to Vietnam and public anxiety was high about whether, or to what extent, soldiers might again commit atrocities simply because someone in authority told them to do so. Milgram designed a series of psychological experiments to shed some light on this question; he later described these experiments in his book Obedience to Authority.
Milgram placed ads in a newspaper offering volunteers $4.50 for an hour of their time if they came to Yale to participate in an experiment about learning and memory. When a volunteer arrived at the scheduled time, they were met by an experimenter and a second volunteer. The experimenter explained that they were conducting a test of how physical punishment (in the form of electric shock) affected one’s ability to learn. One volunteer would be chosen randomly to be the “teacher,” and the other would be the “learner.” The learner was strapped into a chair with electrodes attached to their wrist; the teacher was seated in front of a console with a series of switches that controlled a shock generator. The switches were labeled with voltages ranging from 15 volts to 450 volts, along with a descriptive term for each (“Slight Shock” at 15v up through “Danger: Severe Shock” at 420v and “XXX” at the last two settings).
The teacher was given a sample shock of 45 volts (mild discomfort) so they’d know what the learner would feel. Then they were told to read a series of word pairs to the learner; the learner had to memorize which word came second in each pair. Next, the teacher was to read the first word in one of the pairs and wait for the learner to respond with the second word. If the learner got it right, nothing happened; if the learner made a mistake, the teacher was instructed to throw a switch to deliver a 15v shock. With each successive mistake, the teacher was to increase the voltage of the shock by 15v.
As the shocks became more and more severe, the learner would first groan, then complain loudly, then scream in agony, demand to be released, and eventually, stop responding altogether. When a teacher expressed reservations about continuing (as they nearly always did at one point or another), the experimenter insisted that the experiment must be completed and urged the teacher to keep going—assuring them that the shock was merely painful, not harmful. If the teacher became even more distressed, the experimenter reiterated that he would take full responsibility for the outcome and that the teacher would not get in trouble. By the end of the hour-long experiment, a great many of the volunteers were reduced to tears, and virtually all showed signs of considerable distress.
Only after the experiment was finished—when the teacher had administered the maximum possible shock, or flatly refused to comply—would the volunteer be told that the shocks weren’t real; the “learner” was an actor and the “teacher,” who was not chosen randomly after all, was the real test subject.
The Shocking Truth
The initial experiment showed something Milgram wasn’t expecting: despite their reservations, anxiety, and protests, 65% of the subjects administered the maximum possible shock of 450v, and none of them dropped out before 300v. Milgram then performed numerous variations on the experiment with hundreds of test subjects. These further experiments confirmed his original findings and also revealed some interesting details. For example, the subjects were less likely to deliver strong shocks when they were nearer to the “learner” or when the “experimenter” was not physically in the room (giving instructions by telephone). Subjects were more likely to deliver strong shocks if an actor posing as a second volunteer provided encouragement; conversely, if other “volunteers” disobeyed, so would the test subject. Perhaps most tellingly, though, when the subjects were given the freedom to choose shocks of any intensity, they nearly always chose the lowest settings (though a couple of people did administer the strongest shocks).
Although the “learners” weren’t shocked, other psychologists were—not because of Milgram’s results, but because of his methodology. His peers criticized him harshly for years afterward for causing such severe stress to his test subjects. Nevertheless, other researchers around the world have conducted comparable tests in the decades since, consistently obtaining results similar to Milgram’s. But while academics debated the ethics of traumatizing volunteers, the public latched onto the troubling implications of Milgram’s findings. Basically, in a huge majority of cases, humans will obey an authority figure (though often under duress)—even if they firmly believe that doing so means causing someone else serious harm.
And yet, even though this principle has become common knowledge, most people (whether in a corporate environment, the military, or anywhere else) still believe that “I’m just doing my job” or “I’m just following orders” are valid excuses for inexcusable behavior. I have to wonder how many injuries, deaths, acts of terrorism, and even wars might have been prevented had their perpetrators found the courage to put their own humanity above the voice of authority.
Note: This is an updated version of an article that originally appeared on Interesting Thing of the Day on May 19, 2005.