One of America’s foremost legal scholars, Harvard Law School’s Cass Sunstein, has written a small, provocative book titled Liars: Falsehoods and Free Speech in an Age of Deception (2021). Because Sunstein is founder and director of the Program on Behavioral Economics and Public Policy at Harvard Law School, currently serves as Chair of the WHO’s technical advisory group on Behavioural Insights and Sciences for Health, and was the “nudge minister” of the Obama Administration, it is not surprising that the book holds some interest for a website, like Ethics Unwrapped, that specializes in behavioral ethics.
The Supreme Court grants substantial legal protection to false statements. Sunstein believes that it perhaps grants too much protection, but he certainly recognizes the dangers of allowing governments to punish speech that they claim is false. He urges a bigger role for private entities, such as Facebook (for which Sunstein has consulted on such issues), in regulating false speech.
The purpose of this blog post is not to analyze all of Sunstein’s legal and policy arguments, or even to summarize them. Rather, we seek only to mine Liars for insights relevant to behavioral ethics.
For example, Chapter 3 (“Ethics”) addresses the ethicality of lying. Some falsehoods are obviously immoral (especially if they are intentionally false and cause damage), while other lies are ethically mandatory (e.g., the famous scenario in which Nazi soldiers ask whether you are hiding a Jewish family).
Sunstein explores the two primary methods of discerning the ethicality of lying in a particular circumstance. First, there is utilitarianism (see our video here: https://ethicsunwrapped.utexas.edu/glossary/utilitarianism). Falsehoods often injure individuals in a variety of ways, and telling lies generally undermines the level of trust in a society, which can cause a cascade of unfortunate effects. So, utilitarianism generally condemns lying, though the Nazi scenario is an example where it would clearly endorse the lie.
Second, Kantianism, or deontology (see our video here: https://ethicsunwrapped.utexas.edu/glossary/deontology), holds that lying is always wrong (or nearly so). Immanuel Kant wrote that “By a lie a man throws away and, as it were, annihilates his dignity as a man.” Kant said roughly the same thing about masturbation, which, we suspect, is not nearly as unethical as lying because it causes no harm.
Sunstein argues that our moral intuition, fueled by our moral emotions (see our video here: https://ethicsunwrapped.utexas.edu/video/moral-emotions) such as guilt and shame, generally takes a Kantian view of lying. These emotions lead most of us to tell the truth most of the time, thankfully. However, Sunstein believes, as do we, that utilitarianism is the better approach to deciding the ethicality of a particular lie. Fortunately, both approaches will generally lead to the same result—a conclusion that lying is usually wrong.
The book’s sixth chapter (“Falsehoods Fly”) contains the bulk of Sunstein’s behavioral insights, including:
- “Truth bias.” Most people, most of the time, believe what they are told by other people, a “truth bias” that can make them particularly susceptible to lies. Indeed, people are more likely to misremember as true a statement that they have been explicitly told is false than to misremember as false a statement they have been explicitly told is true.
- “Meta-cognitive myopia.” The idea here is that humans are more attuned to “primary information” (e.g., a post on Facebook says that a famous athlete is an adulterer) than to “meta-information,” such as whether that information is true. Sunstein cites a study by Pantazi and colleagues that asked subjects to decide an appropriate sentence for a defendant. Whether the subjects were laypeople or actual judges, they failed to adequately discount information that they were explicitly told was false. Sunstein emphasizes that “[n]egative information in particular puts a kind of stamp on the human mind, and removing it is not easy.”
- Sunstein also emphasizes a study of rumor “cascades” on Twitter by Vosoughi and co-authors. They found that “falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information.” Why? Perhaps because false statements tend to be more surprising, more unusual, and therefore more interesting. Whatever the reason, Sunstein warns: “If falsehoods are especially likely to spread, and if people are biased in the direction of thinking that what they hear is true, then the risk that people will believe falsehoods increases dramatically.”
- “Backfire effect.” Exacerbating all this is the “backfire effect”: in some situations, providing truthful information to counteract a rapidly spreading lie does not debunk it but backfires, causing the listener to believe the lie even more. A study by Nyhan and Reifler found that politically knowledgeable people who held a favorable view of Sarah Palin during her vice-presidential run became more, not less, likely to believe her claim that the Obamacare plan created “death panels” after being presented with information that the claim was false. Sunstein worries: “The general lesson is both straightforward and disturbing. People who know a great deal, and who distrust a particular messenger, might well be impervious to factual corrections, even if what they believe turns out to be false. The effect of corrections might be to fortify their initial beliefs.”
- “Conformity bias.” The conformity bias (see our video here: https://ethicsunwrapped.utexas.edu/video/conformity-bias) is the tendency people have to take their cues as to how to act and what to believe from those around them, particularly members of their “in-group” (see our video on the in-group/out-group bias here: https://ethicsunwrapped.utexas.edu/glossary/in-groupout-group). Relevant studies, such as Solomon Asch’s famous line-judgment experiments, show that people often believe, or at least pretend to believe, obviously false statements out of a desire to belong to and go along with their group, whether it be MAGA fanatics or Antifa members. As Asch concluded several decades ago, and as we can readily observe in our current political discourse: “That we have found the tendency to conformity in our society so strong that reasonably intelligent and well-meaning young people are willing to call white black is a matter of concern.”
- “Social cascades.” The conformity bias’s impact can be exacerbated by social cascades that amplify the pressure to go along. Sam may believe a particular proposition primarily because her friend Ted believes it. Ted may believe it primarily because his friend Alfredo believes it. Alice, seeing that her friends Sam, Ted, and Alfredo all believe the proposition, may espouse it as well simply because so many members of her in-group do, though with little foundation (see the sketch after this list). Sunstein warns: “At that stage, a large number of people eventually appear to support a certain belief or course of action simply because others (appear to) do so. Widespread support for falsehoods can arise in this way.”
- “Group polarization” occurs, says Sunstein, relying on numerous studies, “when group discussion leads group members to a more extreme point in line with their tendencies before they started to talk.” If a group of supporters of Trump’s Big Lie gather in Washington, D.C. on January 6 and start discussing the fanciful “Big Steal,” they are likely to whip each other into a frenzy and come away more convinced than ever that the biggest lie in our political world today is true.
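For readers who want to see the cascade mechanics spelled out, here is a minimal, hypothetical simulation written by us in the spirit of the informational-cascade work of Bikhchandani and colleagues (cited in the sources below). It is not from Sunstein’s book, and every function name and parameter in it is our own invention: each simulated agent gets a weak private hint that a claim is false, but conforms once enough earlier agents appear to believe it.

```python
import random

def cascade(n_agents=20, p_correct=0.6, threshold=2, seed=0):
    """Toy belief cascade about a claim that is actually FALSE.

    Each agent privately doubts the claim with probability
    `p_correct`, but ignores that private signal and conforms once
    earlier believers outnumber earlier doubters by `threshold`
    (or vice versa). Purely illustrative.
    """
    random.seed(seed)
    beliefs = []  # True = agent believes the (false) claim
    for _ in range(n_agents):
        believers = sum(beliefs)
        doubters = len(beliefs) - believers
        if believers - doubters >= threshold:
            beliefs.append(True)    # conform: the crowd seems convinced
        elif doubters - believers >= threshold:
            beliefs.append(False)   # conform: the crowd seems skeptical
        else:
            # No clear crowd signal: rely on the private signal, which
            # correctly doubts the false claim with probability p_correct.
            beliefs.append(random.random() > p_correct)
    return beliefs

# A few runs: whenever a couple of unlucky early signals line up,
# everyone downstream conforms and the falsehood wins unanimously.
for seed in range(5):
    beliefs = cascade(seed=seed)
    print(f"seed {seed}: {sum(beliefs)} of {len(beliefs)} believe the falsehood")
```

The toy model’s point is Sunstein’s point: once a few early adopters happen to endorse the falsehood, everyone downstream has an apparently rational reason to follow, and near-unanimity can emerge on almost no evidence.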
If we are to be ethical people, we must stand for truth over falsehood. We must be aware of influences such as the truth bias, the conformity bias, and group polarization, and guard against them. We must do our best to ensure that what we believe rests on facts and rational arguments, not on what makes us feel good or helps us fit in with our peers. The truth is sometimes inconvenient and even painful. It is still the truth.
Sources:
Dominic Abrams et al., “Knowing What to Think by Knowing Who You Are: Self-categorization and the Nature of Norm Formation, Conformity, and Group Polarization,” British Journal of Social Psychology, 29(2): 97-119 (1990).
Solomon Asch, “Opinions and Social Pressure,” in Readings About the Social Animal 13 (Elliot Aronson ed., 1995).
Sushil Bikhchandani et al., “Learning from the Behavior of Others,” Journal of Economic Perspectives, 12(3): 151-170 (1998).
David G. Myers, “Decision-Induced Attitude Polarization,” Human Relations, 28(8): 699-714 (1975).
Brendan Nyhan & Jason Reifler, “When Corrections Fail: The Persistence of Political Misperceptions,” Political Behavior, 32: 303-330 (2010).
Myrto Pantazi et al., “Is Justice Blind or Myopic? An Examination of the Effects of Meta-Cognitive Myopia and Truth Bias on Mock Jurors and Judges,” Judgment & Decision Making, 15(2): 214-229 (2020).
Cass Sunstein, Liars: Falsehoods and Free Speech in an Age of Deception (2021).
Soroush Vosoughi et al., “The Spread of True and False News Online,” Science, 359(6380): 1146-1151 (2018).
Videos:
Conformity Bias: https://ethicsunwrapped.utexas.edu/video/conformity-bias
Deontology: https://ethicsunwrapped.utexas.edu/glossary/deontology
In-group/Out-group Bias: https://ethicsunwrapped.utexas.edu/glossary/in-groupout-group
Moral Emotions: https://ethicsunwrapped.utexas.edu/video/moral-emotions
Utilitarianism: https://ethicsunwrapped.utexas.edu/glossary/utilitarianism