Obedience to Authority
Obedience to authority describes our tendency to please authority figures. We may place too much emphasis on that goal and, consciously or subconsciously, subordinate the goal of acting ethically.
1. Does the claim that an excessive desire to please authority may cause people to act unethically ring true to you?
2. Can you think of a situation where you deferred to authority and later regretted it? Perhaps because you facilitated a stupid decision that you could have stopped? Perhaps because you facilitated an unethical decision that you could have stopped?
3. Which is scarier—that Joe might not have the courage to stand up to a superior requesting unethical action because Joe doesn’t want to lose his job, or that Joe might not even see the ethical issue because he is so intent upon pleasing the boss?
4. Does Bud Krogh’s explanation for how he went off the ethical rails sound plausible to you?
5. How can people guard against suspending their own ethical judgment in order to unduly defer to authority?
6. Following is a description from Prof. Jesse Prinz of Stanley Milgram’s famous experiment on obedience to authority. Read the description and then tell the class how you think that you would have acted had you been one of the subjects of the experiment.
“Subjects in this experiment were instructed to ask another volunteer, located in an adjacent room, a series of questions. Each time the second volunteer failed to answer a question correctly, the subject asking the questions was asked to administer an electric shock using a dial with increasing voltages. Unbeknownst to the subject the second volunteer was really a stooge working with the experimenter, and the voltage dial was a harmless prop. The stooges were instructed to make errors so that the subjects would have to administer shocks. At preplanned stages, the stooges would express pain, voice concerns about safety, make sounds of agony, pound on the wall, or, ultimately, stop making any noise at all. If a subject conveyed reluctance to continue increasing the voltage, the experimenter would reply that it was crucial for the experiment to continue. The experiment ended if and when a subject persistently refused to continue.”
Franz Stangl was born in Austria in 1908. From a working-class family, Stangl trained as a master weaver. Unsatisfied in his career, at the age of 23, he applied to become a police officer. In 1936, despite his position in law enforcement, he joined the ranks of the then-illegal Nazi Party. When Germany invaded Austria, and subsequently annexed it in March 1938, he became a Gestapo agent. In 1940, under the order of Nazi leaders, Stangl was appointed head of security at Hartheim Castle. At the time, Hartheim was one of the secret killing centers used by the authorities to administer “mercy deaths” to sick and disabled persons. A special unit within the German administration, codenamed T4, carried out this so-called “euthanasia” program. T4 employed doctors, nurses, lawyers, and police officers, among others, at killing centers in Germany and Austria. In all, historians estimate that the staff at Hartheim killed 18,269 people by August 1941.
After a brief stint in Berlin, Stangl transferred to German-occupied Poland in the spring of 1942. Nazi authorities appointed Stangl the first commandant of the killing center at Sobibór. By September 1942, having distinguished himself as an effective organizer, Stangl was transferred to what would become the most horrible of these death camps, Treblinka. While there, he managed and perfected a system of mass murder, using psychological techniques to first deceive, then terrify and subdue his victims before they entered the gas chambers. In less than 18 months, under Stangl’s supervision, between 870,000 and 925,000 Jews were killed at Treblinka.
After the war, Franz Stangl and his family emigrated to Brazil where he lived and worked under his own name for decades. He was extradited to West Germany in 1967 and tried for his role in the murder of 900,000 men, women, and children during the Holocaust. During his trial, Stangl claimed that he was doing his duty and was not a murderer. Stangl defended himself by making three main claims. First, that he did not get to choose his postings, and that disobeying an order would put himself and his family at risk. Second, that once in a position, it was his nature to do an excellent job (he became known as the best commandant in Poland). And third, that he never personally murdered anyone. He saw himself as an administrator. Stangl claimed that his dedication to his work was not about ideology or hatred of Jews.
On October 22, 1970, the court found Stangl guilty of crimes against humanity and sentenced him to the maximum penalty, life in prison. During an interview while in prison, he stated, “My conscience is clear about what I did, myself. …I have never intentionally hurt anyone, myself. …But I was there. …So yes, in reality I share the guilt.” He continued, “My guilt…is that I am still here. That is my guilt.” On June 28, 1971, less than a day after this interview, Stangl died of heart failure in prison.
1. How did obedience to authority affect Franz Stangl’s perception of his responsibility? Explain. What other factors, biases, or pressures may have affected his perception?
2. Based on Stangl’s description of guilt while in prison, do you think he believed his previous claims in court? Why or why not?
3. What might have helped Stangl at the time to see his actions for what they were? Do you think this would have led Stangl to act differently? Why or why not?
4. Can you think of other historical examples in which obedience to authority may have played a significant role in the actions of individuals? Explain.
5. What do you think the moral responsibility of an individual is within a bureaucracy? Explain.
6. Does one’s position in a hierarchy affect one’s moral responsibility? Why or why not?
Into that Darkness: An Examination of Conscience
Significant Cases: Franz Stangl – Simon Wiesenthal Archiv
The Roots of Evil
The Holocaust and the Revival of Psychological History
This video introduces the behavioral ethics bias known as obedience to authority. Obedience to authority describes our tendency to please authority figures. We may place too much emphasis on that goal and, consciously or subconsciously, subordinate the goal of acting ethically. We all need to monitor ourselves to ensure that we are not unduly suspending our own independent ethical judgment in order to please our superiors. If students are not aware of this vulnerability, they cannot guard against it. Many white-collar criminals trace their downfall to an excessive obedience to authority. Many successful students are “pleasers,” so they can understand how strong the motive to please authority can be.
The “Milgram experiment” offers a glimpse into the effects of obedience to authority. Psychologist Stanley Milgram studied whether Americans might be as obedient to authority as Germans seemed to be under Hitler. The question was whether subjects would deliver apparently painful electric shocks to another person who had missed a question—in what was framed as a test of whether negative reinforcement through electric shocks improves memory—just because someone in a white lab coat told them to do so. Although people predicted before the experiment that very few American subjects would show excessive obedience to authority, in actuality, as Professor Francesca Gino writes:
“All of Milgram’s participants—who were well-adjusted, well-intentioned people—delivered electric shocks to victims who seemingly were in great pain, complaining of heart problems, or even apparently unconscious. Over 60 percent of participants delivered the maximum shock.”
Perhaps this should not have been too surprising. The pleasure centers of our brains light up when we please authority. We are trained from childhood to please authority figures—parents, teachers, and police officers.
Law and order are generally good things, so some level of obedience to authority is healthy. But if people go too far and suspend their own independent ethical judgment, either consciously or unconsciously, they are dropping the ball.
Employers, we argue, pay employees for their brains, their education and training, and their judgment. Employers are short-changed if employees do not use their best strategic judgment, their best operational judgment, and their best moral judgment, because errors in any of the three areas can be quite costly.
The case study on this page, “Stangl & the Holocaust,” explores an extreme example of obedience to authority, in which Nazi officer Franz Stangl, who was responsible for the killing of nearly one million Jews, claimed he was simply following orders. For a related case study that examines the dangers of conformity bias during the Holocaust, read “Reserve Police Battalion 101.”
Terms defined in our ethics glossary that are related to the video and case studies include: conformity bias, obedience to authority, and role morality.
Behavioral ethics draws upon behavioral psychology, cognitive science, evolutionary biology, and related disciplines to determine how and why people make the ethical and unethical decisions that they do. Much behavioral ethics research addresses the question of why good people do bad things. Many behavioral ethics concepts are explored in detail in Concepts Unwrapped, as well as in the video case study In It to Win: The Jack Abramoff Story. Anyone who watches all (or even a good part) of these videos will have a solid introduction to behavioral ethics.
Ariely, Dan. 2012. The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves. New York: HarperCollins Publishers.
Bazerman, Max H., and Ann E. Tenbrunsel. 2011. Blind Spots: Why We Fail to Do What’s Right and What to Do about It. Princeton, NJ: Princeton University Press.
De Cremer, David (Editor). 2009. Psychological Perspectives on Ethical Behavior and Decision Making. Charlotte, NC: Information Age Publishing.
De Cremer, David, and Ann E. Tenbrunsel (Editors). 2012. Behavioral Business Ethics: Shaping an Emerging Field. New York: Routledge.
DeSteno, David, and Piercarlo Valdesolo. 2011. Out of Character: The Surprising Truths about the Liar, Cheat, Sinner (and Saint) Lurking in All of Us. New York: Crown Publishers.
Dienhart, John William, Dennis J. Moberg, and Ronald F. Duska (Editors). 2001. The Next Phase of Business Ethics: Integrating Psychology and Ethics. Bingley, UK: Emerald Group Publishing.
Gino, Francesca. 2013. Sidetracked: Why Our Decisions Get Derailed, and How We Can Stick to the Plan. Boston: Harvard Business Review Press.
Heffernan, Margaret. 2011. Willful Blindness: Why We Ignore the Obvious at Our Peril. New York: Walker Publishing Company.
Matousek, Mark. 2012. Ethical Wisdom: The Search for a Moral Life. New York: Anchor Books.
Mayhew, Brian W., and Pamela R. Murphy. 2014. “The Impact of Authority on Reporting Behavior, Rationalization and Affect.” Contemporary Accounting Research 31 (2): 420-443.
Messick, David M., and Ann E. Tenbrunsel (Editors). 1996. Codes of Conduct: Behavioral Research into Business Ethics. New York: Russell Sage Foundation.
Milgram, Stanley. 1974. Obedience to Authority: An Experimental View. New York: Harper & Row.
Rhode, Deborah L. (Editor). 2006. Moral Leadership: The Theory and Practice of Power, Judgment, and Policy. San Francisco, CA: Jossey-Bass.
Werhane, Patricia H., Laura Pincus Hartman, Crina Archer, Elaine E. Englehardt, and Michael S. Pritchard. 2013. Obstacles to Ethical Decision-Making: Mental Models, Milgram and the Problem of Obedience. Cambridge, UK: Cambridge University Press.
For resources on teaching behavioral ethics, an article written by Ethics Unwrapped authors Minette Drumwright, Robert Prentice, and Cara Biasucci introduces key concepts in behavioral ethics and approaches to effective ethics instruction—including sample classroom assignments. The article, published in the Decision Sciences Journal of Innovative Education, may be downloaded here: “Behavioral Ethics and Teaching Ethical Decision Making.”
A detailed article by Robert Prentice with extensive resources for teaching behavioral ethics, published in Journal of Legal Studies Education, may be downloaded here: “Teaching Behavioral Ethics.”
An article by Robert Prentice discussing how behavioral ethics can improve the ethicality of human decision-making, published in the Notre Dame Journal of Law, Ethics & Public Policy, may be downloaded here: “Behavioral Ethics: Can It Help Lawyers (And Others) Be their Best Selves?”
A dated but still serviceable introductory article about teaching behavioral ethics can be accessed through Google Scholar by searching: Prentice, Robert A. 2004. “Teaching Ethics, Heuristics, and Biases.” Journal of Business Ethics Education 1 (1): 57-74.
Transcript of Narration
Written and Narrated by
Robert Prentice, J.D.
Business, Government & Society Department
McCombs School of Business
The University of Texas at Austin
“When we are young, we naturally wish to please our parents, our teachers, and ministers and rabbis. Even as adults, we desire to please authority figures, such as our boss at work. If obedience to authority causes us to ignore our own ethical standards, however, big trouble can result.
When people in organizations make decisions, they are often much more concerned about the acceptability of the decision to the people to whom they are accountable than they are about the content of the decision itself. Studies show that an underling who is pressured by a CFO to cook the books is much more likely to act improperly than an employee who is not so pressured. Another recent study showed that CFOs are more likely to illicitly manage earnings when it profits their CEOs than when it profits themselves. In other words, they act unethically primarily to please their bosses, not to put money in their own pockets.
A study of nurses by Hofling and Brotzman found that when members of one group were asked whether they would follow a physician’s instructions to give a patient an injection of an obviously excessive dose of a drug that was not even on the hospital’s approved list, almost all the nurses said that they would not do so. But when a second group of nurses were actually given such instructions, virtually every one of them was prepared to do so before they were stopped by the experimenters. Most of us do not realize how much our desire to please our superiors and our consequent tendency to defer to authority will cloud our ethical judgment when the time comes to make decisions.
Private e-mails sent by stock analysts during the dot-com boom often indicated that the analysts wished they had the courage to stand up to their superiors and “call them like they saw them.” But usually these analysts failed to do so. Instead, they continued to knuckle under to supervisory pressure to hype questionable stocks so that their firms could gain investment banking business.
More concerning than people consciously acting unethically in order to stay in their boss’s good graces is the fact that sometimes employees are so intent upon pleasing their superiors that they do not even notice the ethical aspects of a decision. Egil “Bud” Krogh, who became infamous as head of the “Plumbers Unit” operating out of President Nixon’s White House, was instructed to oversee a break-in at the office of the psychiatrist of Daniel Ellsberg, who had leaked the Pentagon Papers to the New York Times, embarrassing the Nixon Administration. Krogh later explained that he was so intent upon pleasing his superiors who were, after all, among the most powerful people in the world, that he never even activated his own ethical sense to judge the morality of what he was trying to accomplish. He did not see the ethical dimensions of his situation until it was too late.
Bud Krogh’s experience should be a warning to us all. While it is usually a fine thing for us to please our superiors, we must keep a lookout for ethical issues and we must never defer so completely to our bosses that we check our own ethical standards at the door.”