The Milgram Shock Experiment, conducted by Stanley Milgram in the 1960s, tested obedience to authority. Participants were instructed to administer increasingly severe electric shocks to another person, who was actually an actor, each time that person answered a question incorrectly. Despite hearing the actor’s screams, most participants continued administering shocks, demonstrating the powerful influence of authority figures on behavior.
Key Takeaways
- Aim: The experiment studied obedience to authority, exploring whether individuals would obey instructions to harm another person because an authority figure told them to.
- Method: Participants were instructed by an authority figure to administer increasingly intense electric shocks to a “learner” (an actor), who feigned pain and distress.
- Results: A significant percentage of participants obeyed the authority figure and administered the maximum level of shocks, despite their apparent discomfort.
- Conclusion: The study demonstrated that ordinary people are surprisingly likely to obey authority figures, even when those orders conflict with their own moral beliefs.
- Ethics: Milgram’s use of deception raised serious questions about the ethics of psychological research. Participants believed they were causing real harm, leading to potential emotional distress.
Milgram’s Experiment (1963)
Background
When Stanley Milgram first conceived his experiments, he had intended to compare levels of obedience in the United States with those in Germany.
Given the aftermath of Nazi atrocities, Milgram, like many at the time, suspected that something unique in German culture might underlie higher levels of submission to authority.
As a baseline, he began by testing American participants at Yale, fully expecting they would prove significantly less obedient than the Germans he planned to study.
To his surprise, however, the U.S. results were already so unexpectedly high that Milgram ultimately abandoned his planned cross-cultural comparison.
Far from confirming any nation-specific tendency, these findings instead suggested a more universal human vulnerability to authority pressures.
Aim
The study was designed to measure how far participants would go in obeying an authority figure who instructed them to perform acts that conflicted with their personal conscience.
Specifically, it aimed to quantify the level of shock participants were willing to administer to another person under the guise of a learning experiment when instructed to do so by an authority figure.
Milgram also investigated the conditions under which people obey or disobey authority and the psychological mechanisms (reasons) behind obedience and disobedience.
“Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?” (Milgram, 1974).
Sample
- Size: The study involved 40 male participants aged between 20 and 50 years.
- Method: Participants were recruited through newspaper advertisements and direct mail solicitation.
All subjects believed they were voluntarily participating in a study on memory and learning at Yale University. This method is known as volunteer or self-selecting sampling.
- Demographics: Participants were drawn from New Haven and surrounding communities.
The sample included a wide range of occupations, including postal clerks, high school teachers, salesmen, engineers, and laborers.
Participants ranged in educational level from those who had not finished elementary school to those with doctorate and other professional degrees.
- Compensation: Participants were paid $4.50 for their participation in the experiment.
However, they were told that the payment was simply for coming to the laboratory, regardless of what happened after they arrived.
Procedure
The procedure involved pairing participants with a confederate (Mr. Wallace), assigning roles through a rigged draw. The participant (always the teacher) was instructed to administer electric shocks to the confederate (learner) for incorrect answers to a memory task.
- The participant and the confederate drew slips from a hat to determine their roles.
- The drawing was rigged so that both slips contained the word “Teacher.”
- The ‘true’ participant was always first to choose.
- This ensured that the naive subject (real participant) was always assigned the role of teacher, while the confederate was always the learner.
Before ‘drawing lots’ to decide who became the teacher and who became the learner, Milgram told the participants about the effects of punishment on learning:
We know very little about the effects of punishment on learning, because almost no scientific studies have been conducted on human beings.
We don’t know how much punishment is best for learning, or whether it is beneficial to learning at all; we also don’t know how much difference it makes who is giving the punishment.
So in this study, we are bringing together people from different occupations to test this out; we want to know what effect different people have on each other as teachers and learners.
The learner (Mr. Wallace) was taken into a room and strapped into an electric chair apparatus.
The teacher (real participant) and the experimenter (a confederate called ‘Mr. Williams’) went into a separate room next door that contained an electric shock generator.

The ‘Learning Task’
The teacher (real participant) was given a preliminary series of 10 words to read to the learner (confederate), with 7 predetermined wrong answers, reaching 105 volts.
After the practice round, the teacher was given a second list of word pairs and told to repeat the procedure until all word pairs were learned correctly.
The participant (teacher) read this second list of word pairs to the learner, then read one word from each pair and provided four possible options for the matching word.
The learner had to indicate which word had been originally paired with the first word by pressing one of four switches.
This task served as the pretext for administering shocks, allowing the experimenters to study obedience to authority in a controlled setting.
Each incorrect answer resulted in a shock, while a correct answer moved the process to the next word.
Fake Shock Generator
The shocks in Stanley Milgram’s obedience experiments were not real. The “learners” were actors who were part of the experiment and did not actually receive any shocks.
However, the “teachers” (the real participants of the study) believed the shocks were real, which was crucial for the experiment to measure obedience to authority figures even when it involved causing harm to others.
The participant was given a mild electric shock of 45v to the wrist to convince them that the shocks were genuine. Milgram watched through a one-way mirror.
- The device consisted of 30 lever switches or buttons.
- Each switch was clearly labeled with a voltage level.
- The voltage range spanned from 15 volts to 450 volts.
- The voltage increased by 15-volt increments between each switch.
- When a switch was pressed, a red light would illuminate, an electric buzzing sound was emitted, and a blue light labeled “voltage energizer” would light up.
- The voltage levels were labeled from “Slight Shock” to “XXX”.

Learner (confederate)
The learner (Mr. Wallace) was a confederate (stooge) who pretended to be a real participant. He was 47 years old, mild-mannered, Irish-American, and was an accountant in real life.
The learner was taken into a separate room and strapped into an electric chair apparatus. He had electrodes attached to his wrist with paste (to avoid blistering).
The experimenter explained that the straps were to prevent excessive movement.

The learner’s responses were predetermined, with a schedule of approximately three wrong answers to one correct answer. This standardized protocol was used for all 40 participants.
The learner continued to provide answers (mostly incorrect) up until the 300-volt shock, making noises of pain each time he received a shock.
When the 300-volt shock was administered, the learner pounded on the room’s wall, which the participant could hear.
- At 75 volts, the learner emitted the first audible grunts of pain.
- By 120 volts, the learner’s vocal protests became louder and more insistent.
- At 150 volts, the learner began to complain specifically about a heart condition.
- Around 300 volts, the learner pounded on the wall, indicating significant distress but ceasing verbal responses.
- After approximately 330 volts, the learner became silent, creating the impression that they were either unconscious or unresponsive.
It is crucial to remember that these were simulated responses; no actual electric shocks were delivered.
The purpose of this staged distress was to measure the participant’s willingness to obey authority even when they perceived they were causing harm.
Experimenter (confederate authority figure)
There was also an “experimenter” played by an actor: a 31-year-old male school biology teacher called ‘Mr. Williams’, who wore a gray technician’s lab coat (a symbol of authority and scientific status) and had a stern manner.
The experimenter (Mr. Williams) instructed the teacher (real participant) to administer an electric shock each time the learner made a mistake, increasing the shock level by 15 volts with each error.
He advised the participants to allow 5–10 seconds before treating no response as a wrong answer, and to increase the shock level one step each time the learner failed to respond correctly. The participant gave the shock by pressing a button on the shock generator.
Additionally, the experimenter told the teacher to announce the voltage level before administering each shock.
When the teacher refused to administer a shock, the experimenter (Mr. Williams) delivered a series of standardized prods to ensure they continued.
There were four standardized, scripted prods; if one was not obeyed, or the participant insisted on stopping, the experimenter read out the next prod, and so on.
The tone was firm but not impolite. The prods were repeated if the participant showed reluctance to continue.
- Prod 1: Please continue / please go on.
- Prod 2: The experiment requires you to continue.
- Prod 3: It is absolutely essential that you continue.
- Prod 4: You have no other choice but to continue.
These prods were to be used in order, and begun afresh for each new attempt at defiance (Milgram, 1974, p. 21).
If the final prod was unsuccessful or the participant stopped reading words, the study ended.
There were special prods if physical injury was asked about:
- Although the shocks may be painful, there is no permanent tissue damage, so please go on.
- Whether the learner likes it or not, you must go on until he has learned all the word pairs correctly. So please go on.
Results
Before conducting his experiment, Milgram surveyed psychiatrists, who predicted that only 0.1% of participants would administer the maximum shock, and that most would refuse to continue somewhere between 150 and 300 volts, once they heard the learner’s protests and heart complaints.
However, the actual results starkly contradicted these expert forecasts:
- 65% (two-thirds) of participants (i.e., teachers) continued to the highest level of 450 volts.
- 100% of the participants continued at least to 300 volts.
- A total of 14 “defiant” participants stopped before reaching the highest levels (5 at 300v, 4 at 315v, 2 at 330v, and 1 each at 345v, 360v, and 375v).
Milgram did more than one experiment; he carried out at least 18 variations, each time manipulating features of the situation (e.g., proximity to the learner, setting prestige, presence of peers) to see how they influenced obedience.
Signs of Tension and Resistance
While these results suggest a remarkably high overall obedience rate, they do not imply that participants found it easy or comfortable to follow orders:
- Every participant at some point questioned the procedure, and many openly pleaded to stop or offered to return their payment rather than continue hurting the learner.
- 14 out of 40 individuals exhibited nervous laughter or smiling—often interpreted as a response to intense psychological conflict. One participant famously experienced a seizure due to severe stress.
- Observed behaviors included sweating, trembling, stuttering, biting lips, digging fingernails into flesh, and uncontrollable laughter episodes, all signs of acute internal conflict between the authority figure’s instructions and the participants’ personal conscience.
In interviews following the sessions, participants frequently described feeling tormented by what they believed they were doing, yet felt unable to disobey the experimenter’s prods.
This illustrates Milgram’s core point: the conflict between adhering to an authority’s commands and upholding one’s own moral standards can trigger extreme anxiety.
Post-Experimental Ratings
During debriefing, subjects were asked to rate the pain of the final shocks on a 14-point scale:
- The modal response was 14 (“Extremely painful”).
- The mean rating was 13.42, reinforcing the idea that participants perceived they were administering severe harm to the learner.
These findings highlight why Milgram’s study caused such widespread controversy.
Many participants were visibly distressed and desperate to stop, yet the powerful situational force of an authority figure kept most administering shocks right through the learner’s protests and eventual silence.
Conclusion
- People appear to be more obedient to authority figures than we might expect. Ordinary individuals are likely to follow orders given by an authority figure, even to the extent of potentially causing harm to an innocent human being.
- When people are given orders to act destructively they will experience high levels of stress and anxiety.
- People are willing to harm someone if responsibility is taken away and passed on to someone else.
Situational factors affected obedience:
The individual explanation attributes obedience to personal traits, while the situational explanation more realistically recognizes that environmental factors strongly influenced participants’ behavior.
Some aspects of the situation that may have influenced their behavior include the formality of the location, the behavior of the experimenter, and the fact that it was an experiment for which they had volunteered and been paid.
- Institutional authority: The experiment’s association with Yale University lent it significant credibility and legitimacy.
- Authoritative uniform: The experimenter wore a gray technician’s lab coat portraying authority and scientific status.
- Buffers from the consequences: The physical separation from the learner reduced the emotional impact of the participants’ actions.
- Divided responsibility: The presence of the experimenter allowed participants to feel they were not solely responsible for their actions.
- Gradual nature of the task: The incremental increase in shock intensity made it harder for participants to determine a clear point to refuse.
- Limited time for reflection: The rapid progression of events gave participants little opportunity to carefully consider their actions.
- Contractual obligation: Having agreed to participate, subjects felt a commitment to see the experiment through.
People tend to obey orders from other people if they recognize their authority as morally right and/or legally based.
This response to legitimate authority is learned in a variety of situations, for example in the family, school, and workplace.
Milgram summed up in the article “The Perils of Obedience” (Milgram 1974), writing:
“The legal and philosophic aspects of obedience are of enormous import, but they say very little about how most people behave in concrete situations.
I set up a simple experiment at Yale University to test how much pain an ordinary citizen would inflict on another person simply because he was ordered to by an experimental scientist.
Stark authority was pitted against the subjects’ [participants’] strongest moral imperatives against hurting others, and, with the subjects’ [participants’] ears ringing with the screams of the victims, authority won more often than not.
The extreme willingness of adults to go to almost any lengths on the command of an authority constitutes the chief finding of the study and the fact most urgently demanding explanation.”
Milgram’s Agency Theory
Milgram (1974) explained the behavior of his participants by suggesting that people have two states of behavior when they are in a social situation:
- The autonomous state – people direct their own actions, and they take responsibility for the results of those actions.
- The agentic state – people allow others to direct their actions and then pass off the responsibility for the consequences to the person giving the orders. In other words, they act as agents for another person’s will.
Milgram suggested that two things must be in place for a person to enter the agentic state:
- The person giving the orders is perceived as being qualified to direct other people’s behavior. That is, they are seen as legitimate.
- The person being ordered about is able to believe that the authority will accept responsibility for what happens.
According to Milgram, when in this agentic state, the participant in the obedience studies “defines himself in a social situation in a manner that renders him open to regulation by a person of higher status.
In this condition the individual no longer views himself as responsible for his own actions but defines himself as an instrument for carrying out the wishes of others” (Milgram, 1974, p. 134).
Agency theory says that people will obey an authority when they believe that the authority will take responsibility for the consequences of their actions. This is supported by some aspects of Milgram’s evidence.
For example, when participants were reminded that they had responsibility for their own actions, almost none of them were prepared to obey.
In contrast, many participants who were refusing to go on continued when the experimenter said that he would take responsibility.
According to Milgram (1974, p. 188):
“The behavior revealed in the experiments reported here is normal human behavior but revealed under conditions that show with particular clarity the danger to human survival inherent in our make-up.
And what is it we have seen? Not aggression, for there is no anger, vindictiveness, or hatred in those who shocked the victim….
Something far more dangerous is revealed: the capacity for man to abandon his humanity, indeed, the inevitability that he does so, as he merges his unique personality into larger institutional structures.”
Milgram Experiment Variations
The Milgram experiment was carried out many times; Milgram (1965) varied the basic procedure (changed the IV) each time. By doing this, Milgram could identify which factors affected obedience (the DV).
Obedience was measured by how many participants shocked to the maximum 450 volts (65% in the original study).
Stanley Milgram conducted as many as 23 variations (also called conditions or experiments) of his original obedience study.
In total, 636 participants were tested across the 18 variation studies conducted between 1961 and 1962 at Yale University.
Uniform (20% obedience)
In the original baseline study, the experimenter wore a gray lab coat to symbolize his authority (a kind of uniform).
The lab coat worn by the experimenter in the original study served as a crucial symbol of scientific authority that increased obedience.
The lab coat conveyed expertise and legitimacy, making participants see the experimenter as more credible and trustworthy.
Milgram carried out a variation in which the experimenter was called away because of a phone call right at the start of the procedure.
The role of the experimenter was then taken over by an ‘ordinary member of the public’ (a confederate) in everyday clothes rather than a lab coat. The obedience level dropped to 20%.
Change of Location: The Mountain View Facility Study (1963, unpublished) (47.5% obedience)
Milgram conducted this variation in a set of offices in a rundown building, claiming it was associated with “Research Associates of Bridgeport” rather than Yale.
The lab’s ordinary appearance was designed to test whether Yale’s prestige encouraged obedience. Participants were led to believe that the experiment was being run by a private research firm.
In this non-university setting, obedience rates dropped to 47.5% compared to 65% in the original Yale experiments. This suggests that the status of location affects obedience.
Private research firms are viewed as less prestigious than certain universities, which affects behavior. It is easier under these conditions to abandon the belief in the experimenter’s essential decency.
The impressive university setting reinforced the experimenter’s authority and conveyed an implicit approval of the research.
Milgram filmed this variation for his documentary Obedience, but did not publish the results in his academic papers.
The study only came to wider light when archival materials, including his notes, films, and data, were studied by later researchers like Perry (2013) in the decades after Milgram’s death.
Two Teacher Condition (92.5% obedience)
When participants could instruct an assistant (confederate) to press the switches, 92.5% shocked to the maximum of 450 volts.
Allowing the participant to instruct an assistant to press the shock switches diffused personal responsibility and likely reduced perceptions of causing direct harm.
By attributing the actions to the assistant rather than themselves, participants could more easily justify shocking to the maximum 450 volts, reflected in the 92.5% obedience rate.
When there is less personal responsibility, obedience increases. This relates to Milgram’s Agency Theory.
Touch Proximity Condition (30% obedience)
The teacher had to force the learner’s hand down onto a shock plate when the learner refused to participate after 150 volts. Obedience fell to 30%.

Forcing the learner’s hand onto the shock plate after 150 volts physically connected the teacher to the consequences of their actions. This direct tactile feedback increased the teacher’s personal responsibility.
No longer shielded from the learner’s reactions, the proximity enabled participants to more clearly perceive the harm they were causing, reducing obedience to 30%.
Physical distance and indirect actions in the original setup made it easier to rationalize obeying the experimenter.
The participant is no longer buffered/protected from seeing the consequences of their actions.
Social Support Condition (10% obedience)
Here, the real participant was one of three teachers in the room; the other two were confederates. All three ostensibly administered shocks in turn.
Early in the session, both confederates refused to continue at certain voltage levels (150 volts and 210 volts).
With these confederates modeling disobedience, the real participant had clear social support to defy the experimenter.
Seeing peers refuse caused obedience to plummet, with only 10% of participants continuing to the end.
Contradictory Authorities (0% obedience)
Another variation included two experimenters issuing conflicting commands in the same room.
At a key moment, one experimenter would insist the participant continue, while the other would say to stop or express serious doubt.
Faced with contradictory orders from equally legitimate authorities, participants uniformly refused to proceed; 0% continued to the end.
Because the authority figures no longer presented a single, coherent directive, participants found it easier to disobey.
Absent Experimenter Condition (20.5% obedience)
It is easier to resist the orders from an authority figure if they are not close by. When the experimenter instructed and prompted the teacher by telephone from another room, obedience fell to 20.5%.
Many participants cheated by skipping shocks or giving a lower voltage than ordered by the experimenter. The proximity of authority figures affects obedience.
The physical absence of the authority figure enabled participants to act more freely on their own moral inclinations rather than the experimenter’s commands. This highlighted the role of an authority’s direct presence in influencing behavior.
Critical Evaluation
Inaccurate description of the prod methodology:
A key reason the obedience studies fascinate people is that Milgram (1974) presented them as a scientific experiment, contrasting himself as an “empirically grounded scientist” with philosophers.
He claimed he systematically varied factors to alter obedience rates.
However, recent scholarship using archival records shows Milgram’s account of standardizing the procedure was misleading.
For example, he published a list of standardized prods the experimenter used when participants questioned continuing. Milgram said these were delivered uniformly in a firm but polite tone (Gibson, 2013; Perry, 2013; Russell, 2010).
Perry’s (2013) archival research revealed another discrepancy between Milgram’s published account and the actual events. Milgram claimed standardized prods were used when participants resisted, but Perry’s audiotape analysis showed the experimenter often improvised more coercive prods beyond the supposed script.
This off-script prodding varied between experiments and participants, and was especially prevalent with female participants where no gender obedience difference was found – suggesting the improvisation influenced results.
Gibson (2013) and Russell (2009) corroborated the experimenter’s departures from the supposedly fixed prods: analyzing audiotapes, Gibson found considerable variation from the published protocol, with prods often combined or modified across trials rather than used verbatim as published.
The point is not that Milgram did poor science, but that the archival materials reveal the limitations of the textbook account of his “standardized” procedure.
The qualitative data like participant feedback, Milgram’s notes, and researchers’ actions provide a fuller, messier picture than the obedience studies’ “official” story.
For psychology students, this shows how scientific reporting can polish findings in a way that strays from the less tidy reality.
Russell speculated the improvisation aimed to achieve outcomes the experimenter believed Milgram wanted. Milgram seemed to tacitly approve of the deviations by not correcting them when observing.
This raises significant issues around experimenter bias influencing results, lack of standardization compromising validity, and ethical problems with Milgram misrepresenting procedures.
Milgram’s experiment lacked external validity:
The Milgram studies were conducted in laboratory-type conditions, and we must ask if this tells us much about real-life situations.
We obey in a variety of real-life situations that are far more subtle than instructions to give people electric shocks, and it would be interesting to see what factors operate in everyday obedience.
The sort of situation Milgram investigated would be more suited to a military context.
Orne and Holland (1968) accused Milgram’s study of lacking ‘experimental realism,’ i.e., participants might not have believed the experimental set-up they found themselves in and may have known the learner wasn’t receiving electric shocks.
“It’s more truthful to say that only half of the people who undertook the experiment fully believed it was real, and of those two-thirds disobeyed the experimenter,” observes Perry (2013, p. 139).
Milgram’s sample was biased:
- The participants in Milgram’s study were all male. Do the findings transfer to females?
- Milgram’s study cannot be seen as representative of the American population as his sample was self-selected. This is because they became participants only by electing to respond to a newspaper advertisement (selecting themselves).
- They may also have a typical “volunteer personality” – not all the newspaper readers responded so perhaps it takes this personality type to do so.
Yet a total of 636 participants were tested in 18 separate experiments across the New Haven area, which was seen as being reasonably representative of a typical American town.
Milgram’s findings have been replicated in a variety of cultures and most lead to the same conclusions as Milgram’s original study and in some cases see higher obedience rates.
However, Smith and Bond (1998) point out that with the exception of Jordan (Shanab & Yahya, 1978), the majority of these studies have been conducted in industrialized Western cultures, and we should be cautious before we conclude that a universal trait of social behavior has been identified.
Selective reporting of experimental findings:
Perry (2013) found Milgram omitted findings from some obedience experiments he conducted, reporting only results supporting his conclusions.
A key omission was the Relationship condition (conducted in 1962 but unpublished), where participant pairs were relatives or close acquaintances.
When the learner protested being shocked, most teachers disobeyed, contradicting Milgram’s emphasis on obedience to authority.
Perry argued Milgram likely did not publish this 85% disobedience rate because it undermined his narrative and would be difficult to defend ethically since the teacher and learner knew each other closely.
Milgram’s selective reporting biased interpretations of his findings.
His failure to publish all his experiments raises issues around researchers’ ethical obligation to completely and responsibly report their results, not just those fitting their expectations.
Unreported analysis of participants’ skepticism and its impact on their behavior:
Perry (2013) found archival evidence that many participants expressed doubt about the experiment’s setup, impacting their behavior. This supports Orne and Holland’s (1968) criticism that Milgram overlooked participants’ perceptions.
Incongruities like apparent danger, but an unconcerned experimenter likely cued participants that no real harm would occur. Trust in Yale’s ethics reinforced this.
Yet Milgram did not publish his assistant’s analysis showing participant skepticism correlated with disobedience rates and varied by condition.
Obedient participants were more skeptical that the learner was harmed. This selective reporting biased interpretations. Additional unreported findings further challenge Milgram’s conclusions.
This highlights issues around thoroughly and responsibly reporting all results, not just those fitting expectations.
It shows how archival evidence makes Milgram’s study a contentious classic with questionable methods and conclusions.
Comparisons to the Holocaust (Nuanced Views)
Milgram himself drew an early parallel between his laboratory findings and the behavior of Nazis during the Holocaust, famously asking whether individuals like Adolf Eichmann were “just following orders.”
However, many historians and psychologists argue that using Milgram’s study as a direct explanation of atrocities such as the Holocaust oversimplifies the historical reality (Waller, 2002; Nicholson, 2015).
In Milgram’s setup, participants were explicitly told there would be no lasting harm to the learner, and the situation was fairly brief, with an authority figure visibly supervising the session.
In contrast, real-world perpetrators in genocide or war crimes:
- Were keenly aware they were killing or harming actual people, sometimes motivated by racist or hateful ideologies.
- Operated over prolonged periods in an environment saturated with propaganda, sustained brutality, and complex group dynamics.
- Often faced severe consequences for disobedience, alongside intense social pressures to conform.
These differences have led some scholars to caution that while Milgram’s experiment sheds light on situational pressures fostering obedience, it does not fully capture the ideological, cultural, and institutional factors that shape large-scale atrocities (Waller, 2002; Perry, 2013).
Thus, while Milgram’s work illuminates how ordinary people can harm others under orders, historians stress that events like the Holocaust involve more complex factors than simple obedience.
Key Takeaway: Milgram’s findings are an important lens for examining obedience but should be viewed as only one piece of the puzzle when interpreting real-world war crimes and genocide, where factors like hatred, dehumanization, and protracted coercion play significant roles (Nicholson, 2015).
Engaged Followership (Social Identification) Theory
Traditional interpretations of Milgram’s findings portray participants as blindly obeying an authority figure.
However, more recent scholarship by Alexander Haslam and Stephen Reicher proposes an engaged followership model (Haslam & Reicher, 2017).
The engaged followership model underscores the role of group identity and belief in the leader’s cause, suggesting that Milgram’s participants were not entirely passive or coerced; rather, they were often motivated by a perceived positive aim.
- Participants continued administering shocks not just because they were ordered to, but because they believed they were contributing to a worthy scientific cause, improving knowledge or helping the researcher achieve important results.
- Archival analyses suggest Milgram’s prods like “The experiment requires you to continue” were actually more effective than the direct command “You have no other choice, you must go on.” People often resisted outright directives but were more cooperative when appealed to in terms of shared goals and scientific value.
This reinterpretation implies that obedience in Milgram’s studies is partly about identification with the experimenter’s mission.
Participants may have felt they were collaborating with the authority rather than simply caving in to orders.
In real-life settings, people similarly tend to comply with leaders whose goals they find legitimate and meaningful.
Engaged followership thus enriches our understanding of the study, shifting the narrative from “people will obey any authority blindly” to “people will obey when they strongly identify with the authority’s purpose or values.”
Personality vs. Situation
Milgram famously emphasized situational factors, like the authority figure’s presence, prestigious setting, and structured procedure, as key drivers of obedience.
Indeed, the high compliance rates from his baseline study underscore how powerful these contextual cues can be.
Yet:
- About 35% of participants did defy the experimenter, raising questions about who resists and why (Blass, 1999).
- Follow-up interviews revealed that some individuals drew on personal moral convictions or strong discomfort to exit the study early, while those who obeyed often justified their actions by saying they were “doing their job” or “helping science” (Milgram, 1974).
- Research points to modest correlations between obedience and personality measures such as authoritarianism or empathy, indicating that people higher in authoritarian traits may be more inclined to follow harmful orders (Blass, 1999).
Thus, while situation often outweighs individual differences, personality does not become irrelevant.
People vary in their threshold for refusing unjust directives, and traits like empathy or moral conviction can be deciding factors.
Modern perspectives therefore highlight the interaction between external pressures and internal predispositions, suggesting that some individuals will more readily resist authority in morally charged scenarios.
Recognizing both situational forces and personal traits offers a more nuanced understanding of Milgram’s results.
Although the experimental setup was powerful enough to make most people continue, personal factors sometimes tipped the balance in favor of disobedience.
Ethical Issues
What are the potential ethical concerns associated with Milgram’s research on obedience?
It is important to remember that formal ethical guidelines for psychological research were not yet in place at the time Milgram conducted his studies in the early 1960s.
Consequently, judging his methods by today’s criteria means we are applying rules and protections that did not exist during his era.
While this does not excuse the potential harm some participants experienced, it does help us understand why Milgram’s study unfolded with fewer participant protections than we would deem acceptable now.
Consent
Baumrind (1964) criticized the ethics of Milgram’s research because participants were prevented from giving their fully informed consent to take part in the study.
Participants assumed the experiment was benign and expected to be treated with dignity.
As a result of studies like Milgram’s, the APA and BPS now require researchers to give participants more information before they agree to take part in a study.
Deception
The participants actually believed they were shocking a real person and were unaware the learner was a confederate of Milgram’s.
However, Milgram argued that “illusion is used when necessary in order to set the stage for the revelation of certain difficult-to-get-at-truths.”
Milgram also interviewed participants afterward to find out the effect of the deception. Apparently, 83.7% said that they were “glad to be in the experiment,” and 1.3% said that they wished they had not been involved.
Protection of participants
Participants were exposed to extremely stressful situations that had the potential to cause psychological harm. Many of the participants were visibly distressed (Baumrind, 1964).
Signs of tension included trembling, sweating, stuttering, laughing nervously, biting lips and digging fingernails into palms of hands. Three participants had uncontrollable seizures, and many pleaded to be allowed to stop the experiment.
Milgram described a businessman reduced to a “twitching stuttering wreck” (1963, p. 377).
In his defense, Milgram argued that these effects were only short-term. Once the participants were debriefed (and could see the confederate was OK), their stress levels decreased.
“At no point,” Milgram (1964) stated, “were subjects exposed to danger and at no point did they run the risk of injurious effects resulting from participation” (p. 849).
To defend himself against criticisms about the ethics of his obedience research, Milgram cited follow-up survey data showing that 84% of participants said they were glad they had taken part in the study.
Milgram used this to claim that the study caused no serious or lasting harm, since most participants retrospectively did not regret their involvement.
Yet archival accounts show many participants endured lasting distress, even trauma, refuting Milgram’s insistence the study caused only fleeting “excitement.” By not debriefing all, Milgram misled participants about the true risks involved (Perry, 2013).
Debrief
However, Milgram stated that he fully debriefed all his participants straight after the experiment and disclosed its true nature.
Participants were assured that their behavior was common, and Milgram also followed the sample up a year later, reporting no signs of long-term psychological harm.
The majority of the participants (83.7%) said that they were pleased that they had participated, and 74% said they had learned something of personal importance.
Perry’s (2013) archival research found Milgram misrepresented debriefing – around 600 participants were not properly debriefed soon after the study, contrary to his claims.
Many only learned no real shocks occurred when reading a mailed study report months later, which some may have not received.
Milgram likely misreported debriefing details to protect his credibility and enable future obedience research.
This raises issues around properly informing and debriefing participants that connect to APA ethics codes developed partly in response to Milgram’s study.
Right to Withdrawal
The British Psychological Society (BPS) states that researchers should make it plain to participants that they are free to withdraw at any time (regardless of payment).
When participants expressed doubts, the experimenter assured them that all was well. Trusting Yale scientists, many took the experimenter at his word that “no permanent tissue damage” would occur, and continued administering shocks despite their reservations.
Did Milgram give participants an opportunity to withdraw? The experimenter gave four verbal prods which mostly discouraged withdrawal from the experiment:
- Please continue.
- The experiment requires that you continue.
- It is absolutely essential that you continue.
- You have no other choice, you must go on.
Milgram argued that the prods were justified because the study was about obedience, so orders were necessary.
He also pointed out that although withdrawal was made difficult, it remained possible: 35% of participants chose to withdraw.
Milgram’s procedures generated intense debate regarding the rights and well-being of research participants.
Evolution of Ethical Standards
While not a “contribution to psychology” in the traditional sense, Milgram’s obedience experiments sparked significant debate about the ethics of psychological research.
Institutional Review Boards (IRBs) and codes of ethical conduct (such as the American Psychological Association’s current standards) only emerged later, partly because of controversies sparked by Milgram’s work.
Alongside other prominent studies of the time (such as the Stanford Prison Experiment), Milgram’s work helped highlight the need for:
- Informed Consent and Clear Disclosure: Modern guidelines require that participants be given a clear understanding of a study’s nature and potential risks, along with the right to withdraw at any stage.
- Debriefing Protocols: Where deception is used, researchers must fully inform participants afterward about the true purpose of the research, ensuring no lasting harm or confusion.
- Independent Oversight: IRBs (or equivalent ethics committees) now review proposed studies, assessing risk levels and verifying adherence to ethical standards.
These measures evolved partly due to the outcry following Milgram’s findings, where many participants believed they had inflicted serious harm and showed signs of considerable distress.
Replications
Direct replications have not been possible due to current ethical standards.
However, several researchers have conducted partial replications and variations that aim to reproduce some aspects of Milgram’s methods ethically.
Across decades, populations, and study designs, partial replications reinforce Milgram’s central conclusion: situational pressures can powerfully compel ordinary individuals to engage in actions that conflict with their personal values.
Burger (2009)
One important replication was conducted by Jerry Burger in 2009.
Burger’s partial replication included several safeguards to protect participant welfare, such as screening out high-risk individuals, repeatedly reminding participants they could withdraw, and stopping at the 150-volt shock level.
This was the point where Milgram’s participants first heard the learner’s protests.
Given that 79% of Milgram’s participants who crossed 150 volts went on to the maximum 450, Burger argued that stopping at 150 volts still offered a valid estimate of overall obedience.
He found that 70% of participants continued to 150 volts in his study, compared to 82.5% in Milgram’s comparable condition.
Thomas Blass (1999)
Another major examination was conducted by Thomas Blass.
He analyzed replication studies from 1963 to 1985 to see if public awareness of Milgram’s research had changed obedience rates.
Blass found no significant drop over time, suggesting that growing familiarity with the original experiments did not reduce compliance, undercutting the idea of “enlightenment effects.”
Blass’s work also confirmed that obedience generally remained robust across different eras and participant groups.
French “Game of Death” and Polish Replications
More recent studies continue to highlight obedience’s persistence:
- French Reality TV (2010): A program titled The Game of Death replicated the essence of Milgram’s study as a high-stakes “game show.” Remarkably, 80% of participants delivered what they believed were potentially fatal shocks under the influence of a television host and a cheering audience.
- Polish Study (2015–2017): Researchers in Poland tested participants with lower-voltage shocks yet a similar structure, finding 90% still willing to comply with the experimenter’s directives to administer the highest level (Doliński et al., 2017).
These modern replications confirm that, even today, a strikingly large proportion of individuals will obey an authority figure under controlled conditions, albeit with updated ethical safeguards.
Gender, Situational Factors, and Personality
Many replications also consider gender differences.
Milgram found similar obedience rates in men and women, and most subsequent studies corroborate this.
Nevertheless, Kilham and Mann (1974) reported somewhat lower obedience in female participants, showing that results can vary with methodology and cultural context.
Situational factors remain profoundly influential. Burger (2009) observed that a defiant peer lowers, but does not eliminate, obedience, backing Milgram’s claim that social support significantly reduces compliance.
Replications also highlight how stepwise escalations in demands can push participants further than they might initially expect.
Regarding personality, while traits like high empathy or desire for control may delay compliance, they do not necessarily prevent participants from eventually following through.
Authoritarian tendencies (Elms, 2009) sometimes show a modest correlation with obedience, consistent with Milgram’s original ideas.
References
Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram’s “Behavioral study of obedience.” American Psychologist, 19(6), 421.
Blass, T. (1999). The Milgram paradigm after 35 years: Some things we now know about obedience to authority. Journal of Applied Social Psychology, 29(5), 955–978.
Brannigan, A., Nicholson, I., & Cherry, F. (2015). Introduction to the special issue: Unplugging the Milgram machine. Theory & Psychology, 25(5), 551-563.
Burger, J. M. (2009). Replicating Milgram: Would people still obey today? American Psychologist, 64, 1–11.
Doliński, D., Grzyb, T., Folwarczny, M., Grzybała, P., Krzyszycha, K., Martynowska, K., & Trojanowski, J. (2017). Would you deliver an electric shock in 2015? Obedience in the experimental paradigm developed by Stanley Milgram in the 50 years following the original studies. Social Psychological and Personality Science, 8(8), 927-933.
Elms, A. C. (2009). Obedience lite. American Psychologist, 64(1), 32–36.
Gibson, S. (2013). Milgram’s obedience experiments: A rhetorical analysis. British Journal of Social Psychology, 52, 290–309.
Gibson, S. (2017). Developing psychology’s archival sensibilities: Revisiting Milgram’s ‘obedience’ experiments. Qualitative Psychology, 4(1), 73.
Griggs, R. A., Blyler, J., & Jackson, S. L. (2020). Using research ethics as a springboard for teaching Milgram’s obedience study as a contentious classic. Scholarship of Teaching and Learning in Psychology, 6(4), 350.
Haslam, S. A., & Reicher, S. D. (2017). Rethinking the psychology of tyranny: The BBC prison study, social identity, and the power of situated focus. APS Observer.
Haslam, S. A., & Reicher, S. D. (2018). A truth that does not always speak its name: How Hollander and Turowetz’s findings confirm and extend the engaged followership analysis of harm-doing in the Milgram paradigm. British Journal of Social Psychology, 57, 292–300.
Haslam, S. A., Reicher, S. D., & Birney, M. E. (2016). Questioning authority: New perspectives on Milgram’s ‘obedience’ research and its implications for intergroup relations. Current Opinion in Psychology, 11, 6–9.
Haslam, S. A., Reicher, S. D., Birney, M. E., Millard, K., & McDonald, R. (2015). ‘Happy to have been of service’: The Yale archive as a window into the engaged followership of participants in Milgram’s ‘obedience’ experiment. British Journal of Social Psychology, 54, 55–83.
Kaplan, D. E. (1996). The Stanley Milgram papers: A case study on appraisal of and access to confidential data files. American Archivist, 59, 288–297.
Kaposi, D. (2022). The second wave of critical engagement with Stanley Milgram’s ‘obedience to authority’ experiments: What did we learn? Social and Personality Psychology Compass, 16(6), e12667.
Kilham, W., & Mann, L. (1974). Level of destructive obedience as a function of transmitter and executant roles in the Milgram obedience paradigm. Journal of Personality and Social Psychology, 29(5), 696–702.
Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67, 371-378.
Milgram, S. (1964). Issues in the study of obedience: A reply to Baumrind. American Psychologist, 19, 848–852.
Milgram, S. (1965). Some conditions of obedience and disobedience to authority. Human Relations, 18(1), 57-76.
Milgram, S. (1974). Obedience to authority: An experimental view. Harpercollins.
Miller, A. G. (2009). Reflections on “Replicating Milgram” (Burger, 2009). American Psychologist, 64(1), 20–27.
Nicholson, I. (2011). “Torture at Yale”: Experimental subjects, laboratory torment and the “rehabilitation” of Milgram’s “obedience to authority”. Theory & Psychology, 21, 737–761.
Nicholson, I. (2015). The normalization of torment: Producing and managing anguish in Milgram’s “obedience” laboratory. Theory & Psychology, 25, 639–656.
Orne, M. T., & Holland, C. H. (1968). On the ecological validity of laboratory deceptions. International Journal of Psychiatry, 6(4), 282-293.
Perry, G. (2013). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. New York, NY: The New Press.
Reicher, S., Haslam, A., & Miller, A. (Eds.). (2014). Milgram at 50: Exploring the enduring relevance of psychology’s most famous studies [Special issue]. Journal of Social Issues, 70(3), 393–602
Russell, N. (2014). Stanley Milgram’s obedience to authority “relationship condition”: Some methodological and theoretical implications. Social Sciences, 3, 194–214.
Shanab, M. E., & Yahya, K. A. (1978). A cross-cultural study of obedience. Bulletin of the Psychonomic Society.
Smith, P. B., & Bond, M. H. (1998). Social psychology across cultures (2nd Edition). Prentice Hall.
Waller, J. (2002). Becoming Evil: How Ordinary People Commit Genocide and Mass Killing. Oxford University Press.
Learning Check
Which is true regarding the Milgram obedience study?
- The aim was to see how obedient people would be in a situation where following orders would mean causing harm to another person.
- Participants were under the impression they were part of a learning and memory experiment.
- The “learners” in the study were actual participants who volunteered to be shocked as part of the experiment.
- The “learner” was an actor who was in on the experiment and never actually received any real shocks.
- Although the participant could not see the “learner”, he was able to hear him clearly through the wall.
- The study was directly influenced by Milgram’s observations of obedience patterns in post-war Europe.
- The experiment was designed to understand the psychological mechanisms behind war crimes committed during World War II.
- The Milgram study was universally accepted in the psychological community, and no ethical concerns were raised about its methodology.
- When Milgram’s experiment was repeated in a rundown office building in Bridgeport, the percentage of the participants who fully complied with the commands of the experimenter remained unchanged.
- The experimenter (authority figure) delivered verbal prods to encourage the teacher to continue, such as ‘Please continue’ or ‘Please go on’.
- Over 80% of participants went on to deliver the maximum level of shock.
- Milgram sent participants questionnaires after the study to assess the effects and found that most felt no remorse or guilt, so it was ethical.
- The aftermath of the study led to stricter ethical guidelines in psychological research.
- The study emphasized the role of situational factors over personality traits in determining obedience.
Answers: Items 3, 8, 9, and 11 are the false statements.
Short Answer Questions
- Briefly explain the results of the original Milgram experiments. What did these results prove?
- List one scenario showing how an authority figure can abuse obedience principles.
- List one scenario showing how an individual could use these principles to defend their peers.
- In a hospital, you are very likely to obey a nurse. However, if you meet her outside the hospital, for example in a shop, you are much less likely to obey. Using your knowledge of how people resist pressure to obey, explain why you are less likely to obey the nurse outside the hospital.
- Describe the shock instructions the participant (teacher) was told to follow when the victim (learner) gave an incorrect answer.
- State the lowest voltage shock that was labeled on the shock generator.
- What would likely happen if Milgram’s experiment included a condition in which the participant (teacher) had to give a high-level electric shock for the first wrong answer?
Group Activity
Gather in groups of three or four to discuss answers to the short answer questions above.
- For question 2, review the different scenarios you each came up with, then brainstorm how these situations could be flipped.
- For question 2, discuss how an authority figure could instead empower those below them in the examples your groupmates provide.
- For question 3, discuss how a peer could do harm by using the obedience principles in the scenarios your groupmates provide.
Essay Topic
- What’s the most important lesson of Milgram’s Obedience Experiments? Fully explain and defend your answer.
- Milgram selectively edited his film of the obedience experiments to emphasize obedient behavior and minimize footage of disobedience. What are the ethical implications of a researcher selectively presenting findings in a way that fits their expected conclusions?