As with our last blog post, we write to recommend that those readers who teach ethics check out a new textbook. Neil Malhotra and Ken Shotts of the Stanford Graduate School of Business have written the relatively brief (149 pages of text) Leading with Values: Strategies for Making Ethical Decisions in Business and Life (Cambridge University Press, 2022).
This is a fine new book that we at Ethics Unwrapped particularly like because we, of course, emphasize behavioral ethics—the psychology of moral decision making. While the second half of Leading with Values focuses on normative ethics (the study of what makes an action ethical or unethical), the first half emphasizes descriptive ethics, a term that largely overlaps with behavioral ethics because it studies how people actually make ethical choices, which is chiefly a psychological process.
We often refer to behavioral ethics as the study of why good people do bad things, and Malhotra and Shotts similarly stress that “unethical behavior is often not due to the malice of evil people.” Indeed, in Chapters 2-4, the authors show that ethical decision-making is often shaped by psychological factors that we frequently feature in our Ethics Unwrapped videos. Because their discussion is contained in three fairly brief chapters, it is hardly exhaustive, but it serves as a solid partial introduction to behavioral ethics.
Chapter 2, “Follow Your Gut?” examines the impact, for good or ill, that people’s intuitions often have on their moral decision-making. The authors draw on Nobel Prize winner Daniel Kahneman’s point that our brains’ “System 1” automatically and intuitively makes most of our decisions (including the moral ones), while the more deliberate, cognitive “System 2” is used far less often. The cognitive shortcuts (“heuristics”) that System 1 takes often get it right, but they can also lead to poor decisions. Malhotra and Shotts speak of moral intuitions, which we emphasize are often guided by our moral emotions, such as guilt, shame, anger, disgust, compassion, and empathy.
In Chapter 3, “Self-Deception and Rationalization,” Malhotra and Shotts emphasize, as we often do, that doing bad things while thinking of ourselves as good people creates cognitive dissonance. The recommended way to resolve that dissonance is to stop doing bad things; instead, our minds often find ways to convince us that we can do the bad things and still be good folks—a core lesson of behavioral ethics.
The authors introduce the term ethical defensiveness, which seems roughly comparable to Albert Bandura’s notion of moral disengagement. Both focus on all the ways that good people can err morally. In fleshing out their conception of ethical defensiveness, the authors touch on several key concepts of behavioral ethics:
- The overconfidence bias is the tendency all of us have, even convicted felons, to think of ourselves as more moral than those around us. This overconfidence can lead us to make moral missteps not because we intend to but because we are not sufficiently thoughtful and reflective when making important moral choices.
- The in-group/out-group bias is the tendency people have to favor people in their in-group and disfavor people in an out-group, including by judging the former group’s actions as more moral than comparable actions by out-group members.
- The confirmation bias is the tendency we have to seek out information that supports views we already hold or that are favorable to us, which can lead us to draw unfounded conclusions about the morality of our actions.
- Making advantageous comparisons is a form of rationalization. Rationalizations are a key method humans use to give themselves permission not to live up to their own moral standards, and making advantageous comparisons is one of them. Vikas Anand and colleagues call this “selective social comparison.” When you find yourself thinking, “I know I shouldn’t do this, but others do things that are even worse,” you are engaging in selective social comparison.
- Moral credentialing is another term for moral licensing. When people have done something good, they give themselves a pat on the back (which the authors term a moral credential) and feel that they have a surplus on the mental moral scoreboard where they compare their actual behavior with their vision of themselves as good people. With such a surplus in hand, they may give themselves permission, just this once, not to live up to their normal moral standards.
In Chapter 4, “The Power of the Situation,” Malhotra and Shotts sensibly introduce readers to the fact that people’s moral decisions and actions are powerfully affected by situational factors, such as their desire to please authority figures and to conform their behavior to that of those around them.
Importantly, they highlight the fundamental attribution error: the tendency we have to assume that other people do bad things because they are bad people, while we do bad things because of the situation we find ourselves in. We conclude that we’re not bad people because, although we are doing bad things, we have no choice in the matter—the boss made us do it, for example.
Although the book’s introduction to behavioral ethics is a bit haphazard, its discussion is interesting and helpful. The authors fill the book with discussions of ethical challenges where the companies mostly got it right (Intel and conflict minerals) and where things went massively wrong (Andrew Wakefield’s fraudulent research linking the MMR vaccine to autism, the University of North Carolina cheating scandal, Theranos, Varsity Blues, and the LIBOR manipulation scandal). They also describe activities they do in class (often surveys of student opinions) showing how the students’ decisions embody the very aspects of behavioral ethics the authors are teaching.
This is a good book that we can recommend without even mentioning its second half, which focuses on normative ethics.
Vikas Anand et al., “Business as Usual: The Acceptance and Perpetuation of Corruption in Organizations,” Academy of Management Perspectives 18(2): 39-53 (2004).
Albert Bandura, Moral Disengagement: How People Do Harm and Live with Themselves (2016).
Cara Biasucci & Robert Prentice, Behavioral Ethics in Practice: Why We Sometimes Make the Wrong Decisions (2022).
John Carreyrou, Bad Blood: Secrets and Lies in a Silicon Valley Startup (2018).
Brian Deer, The Doctor Who Fooled the World: Science, Deception, and the War on Vaccines (2020).
Daniel Kahneman, Thinking, Fast and Slow (2011).
Neil Malhotra & Ken Shotts, Leading with Values: Strategies for Making Ethical Decisions in Business and Life (2022).
Behavioral Ethics Introduction: https://ethicsunwrapped.utexas.edu/video/intro-to-behavioral-ethics
Cognitive Dissonance: https://ethicsunwrapped.utexas.edu/video/cognitive-dissonance
Confirmation Bias: https://ethicsunwrapped.utexas.edu/glossary/confirmation-bias
Conformity Bias: https://ethicsunwrapped.utexas.edu/video/conformity-bias
Fundamental Attribution Error: https://ethicsunwrapped.utexas.edu/glossary/fundamental-attribution-error
In-group/Out-Group Bias: https://ethicsunwrapped.utexas.edu/glossary/in-groupout-group
Moral Emotions: https://ethicsunwrapped.utexas.edu/video/moral-emotions
Moral Equilibrium: https://ethicsunwrapped.utexas.edu/video/moral-equilibrium
Obedience to Authority: https://ethicsunwrapped.utexas.edu/video/obedience-to-authority
Overconfidence Bias: https://ethicsunwrapped.utexas.edu/video/overconfidence-bias