Moral Emotions

Moral emotions are the feelings and intuitions that play a major role in most of our ethical decision making and actions.

Discussion Questions

1. Not all scientists agree that emotions play as large a role in moral judgments as this video suggests. What do you think? Why?

2. Can you think of a time when you were a victim of “moral dumbfounding,” jumping to a moral conclusion that you could not logically defend? Explain.

3. Which of the self-conscious emotions (guilt, shame, or embarrassment) do you think is the most important? Why?

4. How would you describe the difference between sympathy and empathy?

5. Can you think of a scenario in which the disgust emotion led someone astray when they made a moral judgment? Feel free to use yourself as the example.

6. Many people believe that empathy is the most influential of all the moral emotions. If it is, which emotion do you believe is the second most influential?

Wells Fargo and Moral Emotions

In a settlement with regulators, Wells Fargo Bank admitted that it had created as many as two million accounts for customers without their permission.


Teaching Notes

The Moral Emotions video in our Concepts Unwrapped series is an important one because people make most of their moral judgments and action decisions intuitively (System 1) rather than through deliberate cognitive effort (System 2).

Because it seems to us that we make our moral decisions rationally, it can be very difficult for people to grasp how large a role emotion actually plays. One way to make a little progress in convincing people of the role that emotions play in their decision-making is to explore the notion of moral dumbfounding. The work of Joshua Greene (Moral Tribes, 2013), Jonathan Haidt (The Righteous Mind, 2012), and others helps.

One approach is to present students with factual scenarios that trigger their disgust emotion but do not involve harm to any victim. The triggering of their disgust will tend to lead them to conclude that the action described is immoral, but they will falter when asked to come up with a logical reason why that is the case. Here are two unpleasant but effective examples borrowed from the work (and vivid imaginations) of others:

  • Tom, a 16-year-old, was left home alone by his parents as they visited relatives out of town. He went to the local grocery store, bought some lotion, and took it home and masturbated with it. Did Tom act immorally?
  • Rex and Sarah were brother and sister, both in their late 20s. They had always been close. One evening after they watched a movie in Rex’s apartment, they decided to have sexual relations, reasoning that it would make their relationship even closer and more special. They took all necessary precautions. They never chose to have sex again. Did they act immorally?

Another way to illustrate moral dumbfounding is to ask half the class for their opinion on one of the following “trolleyology” scenarios and the other half to opine on the other.

  • Denise is standing beside a switching lever near the tracks when she sees an out-of-control trolley. The conductor has fainted and the trolley is headed toward five people walking on the track; the banks are so steep that they will not be able to get off the track in time. The track has a side track leading off to the left, and Denise can flip the switch and turn the trolley onto it. There is, however, one person on the left-hand track. Denise can turn the trolley, killing the one, or she can refrain from flipping the switch, letting the five die. Is it morally permissible for Denise to flip the switch, turning the trolley onto the side track?
  • Frank is on a footbridge over the trolley tracks. He knows trolleys and can see that the one approaching the bridge is out of control, its conductor passed out. On the track under the bridge there are five people; the banks are so steep that they will not be able to get off the track in time. Frank knows that the only way to stop an out-of-control trolley is to drop a very heavy weight into its path. The only available, sufficiently heavy weight is a large person who is also watching the trolley from the footbridge. Frank can shove the large person onto the track in the path of the trolley, resulting in his death, or he can refrain from doing this, letting the five die. Is it morally permissible for Frank to push the large person onto the tracks?

Typically, the two halves of the class will give very different answers even though the big-picture result (killing one person in order to save five) is the same. Students will have great difficulty explaining rationally why most people say it is moral for Denise to act but immoral for Frank to act. The best explanation is, indeed, an emotional one.

The field of “trolleyology” has gotten a little crazy, but Thomas Cathcart’s The Trolley Problem (2013) and David Edmonds’ Would You Kill the Fat Man? (2014) are two helpful and accessible books on the topic.

Additional Resources

The latest resource from Ethics Unwrapped is a book, Behavioral Ethics in Practice: Why We Sometimes Make the Wrong Decisions, written by Cara Biasucci and Robert Prentice. This accessible book is amply footnoted with behavioral ethics studies and associated research. It also includes suggestions at the end of each chapter for related Ethics Unwrapped videos and case studies. Some instructors use this resource to educate themselves, while others use it in lieu of (or in addition to) a textbook.

Cara Biasucci also recently wrote a chapter on integrating Ethics Unwrapped in higher education, which can be found in the latest edition of Teaching Ethics: Instructional Models, Methods and Modalities for University Studies. The chapter includes examples of how Ethics Unwrapped is used at various universities.

The most recent article written by Cara Biasucci and Robert Prentice describes the basics of behavioral ethics and introduces Ethics Unwrapped videos and supporting materials along with teaching examples. It also includes data on the efficacy of Ethics Unwrapped for improving ethics pedagogy across disciplines. Published in Journal of Business Law and Ethics Pedagogy (Vol. 1, August 2018), it can be downloaded here: “Teaching Behavioral Ethics (Using “Ethics Unwrapped” Videos and Educational Materials).”

An article written by Ethics Unwrapped authors Minette Drumwright, Robert Prentice, and Cara Biasucci introduces key concepts in behavioral ethics and approaches to effective ethics instruction—including sample classroom assignments. Published in the Decision Sciences Journal of Innovative Education, it can be downloaded here: “Behavioral Ethics and Teaching Ethical Decision Making.”

A detailed article written by Robert Prentice, with extensive resources for teaching behavioral ethics, was published in Journal of Legal Studies Education and can be downloaded here: “Teaching Behavioral Ethics.”

Another article by Robert Prentice, discussing how behavioral ethics can improve the ethicality of human decision-making, was published in the Notre Dame Journal of Law, Ethics & Public Policy. It can be downloaded here: “Behavioral Ethics: Can It Help Lawyers (and Others) Be Their Best Selves?”

A dated (but still serviceable) introductory article about teaching behavioral ethics can be accessed through Google Scholar by searching: Prentice, Robert A. 2004. “Teaching Ethics, Heuristics, and Biases.” Journal of Business Ethics Education 1 (1): 57-74.

Transcript of Narration

Written and Narrated by

Robert Prentice, J.D.
Business, Government & Society Department
McCombs School of Business
The University of Texas at Austin

It seems to us that our moral judgments, such as “It was wrong for Paul to cheat on his wife,” and our moral action decisions, such as “I am going to help that homeless person,” are based on reason. However, most of our moral judgments are actually based on emotions or even mere intuitions. When we feel that we are reasoning to a moral conclusion, often all we are doing is rationalizing a judgment or decision that our brains have already made instinctively.

Now, this shouldn’t be surprising.  Some 90% of all of our brain’s decisions are made automatically and intuitively. Why should moral decisions be any different?  Many scientists believe that emotions have evolved in part to encourage us to obey society’s moral rules so that we can effectively live together in groups.

For example, self-conscious emotions such as guilt, shame, and embarrassment motivate people to follow society’s moral norms.  Studies show that people with the most acute sense of guilt tend to be among the most moral and cooperative citizens.

People are also motivated to do the right thing because they know that they would face other-condemning emotions such as contempt, anger, and disgust if they did not do so. For example, when Paul’s friends learn that he cheated on his wife, they will likely feel anger and he will feel shame.  His friends may punish him for this wrong.

Other-praising emotions such as gratitude and moral elevation, which people sometimes feel when they see others do the right thing, can stimulate people to act prosocially. Studies show that people will be more generous and helpful themselves after watching others be generous and helpful.

There are other-suffering emotions, also, such as sympathy, compassion, and empathy. These emotions often encourage people to help others in need.  Some experts believe that empathy is the most important moral emotion.  Primatologist Frans De Waal writes that “human morality is firmly anchored in the social emotions, with empathy at its core.”

Professor Mark Godsey, co-founder of the Ohio Innocence Project, argues that racism in any form is a type of dehumanization. People are often capable of dehumanizing others, concluding that they are not deserving of moral treatment. For example, colonial Americans dehumanized Africans during slavery, and the Nazis dehumanized Jews during WWII. But we can thwart dehumanization with empathy. By consciously taking the perspective of others, we recognize their humanity and can change our behavior.

So, moral emotions generally point people toward doing the right thing and away from doing the wrong thing, but remember these caveats:

First: our emotions are far from infallible.  For example, the emotion of disgust often causes us to condemn the thing that disgusts us in moral terms.  But there may be no rational moral basis to do so.  If we make a moral judgment emotionally, often we cannot rationally defend our choice, which is a concept called “moral dumbfounding.”

Second: although moral emotions urge us in the right direction, we often use rationalizations to deceive ourselves. We often overcome our potential guilt, shame and embarrassment and manage to do the wrong thing anyway, like Paul did when he cheated on his wife. We use psychological tricks to be able to view our immoral acts as not so bad after all.

Third, and last: our emotional reactions tend to beat our logical thoughts to the punch. Practicing mindfulness can improve our response. With diligence and practice, we can at least sometimes override our automatic emotional judgments with thoughtful cognitive calculation.
