It seems obvious that people should judge the ethicality of others’ actions in an objective and fair way. What is not so obvious is how difficult that often is. One reason objective judgment is so hard is our tendency to sort ourselves and others into groups, and then to judge and treat people differently depending upon whether we view them as part of our in-group or part of an out-group.
In our Ethics Unwrapped videos we highlight the conformity bias, which is the tendency of people to take their cues as to the proper way to think and act from those around them. We take cues about how to behave from members of our in-group, but often reject the cues coming from out-group members. Francesca Gino and her colleagues ran an experiment that gave student subjects an ability-based task to perform under time pressure. In one version of the experiment, the students had the opportunity to self-report their results in ways they believed could not be checked. Because they received modest rewards for reporting better performance, a significant amount of cheating occurred. When, in another iteration of the experiment, a member of the students’ in-group (a confederate of the experimenters wearing a sweatshirt from their university) obviously cheated without consequence, cheating went up. However, when a member of the students’ out-group (a confederate wearing a sweatshirt from a rival university) cheated, cheating among the subjects went down.
A recent study indicates that the in-group/out-group distinction also affects how we judge others’ actions. Wright, Dinsmore & Kellaris manipulated a scenario in which someone sold consumers a credit card carrying unfair terms, and subjects were asked to judge the ethicality of the action. There is already substantial experimental evidence that people tend to judge an out-group member who commits a questionable act more harshly than an in-group member who commits the same act. People are probably unaware that natural loyalties affect their judgments, but studies make clear that our ethical evaluations are shaped by in-group/out-group differences.
But Wright, Dinsmore & Kellaris were interested in knowing whether the identity of the victim would also have an impact upon ethical judgments. They learned that in-group members who scammed other in-group members were judged more harshly than in-group members who scammed out-group members. These in-group members had not only acted unethically, but had also violated the implicit duty of loyalty that we all tend to feel people owe to other members of their in-group. In the words of the authors: “When judging the ethicality of an act, it appears that people look beyond the nature of the act itself and implicitly consider their relationship to the parties involved: who is performing the act and who is being victimized.”
Perhaps most interesting of all, when subjects were asked what penalties they would impose upon people who sold this unfair product, their loyalty to the in-group reasserted itself: the subjects preferred a more lenient punishment for the in-group member despite the relatively greater severity of the transgression. According to the authors: “These results demonstrate that although participants judged in-group marketers transgressing against their in-group to be the most unethical amongst all conditions, the in-group bias reduced the harshness of the preferred punishment.”
Among the most interesting findings in this entire line of research is how little it takes for us to view someone as part of our in-group or of an out-group. Subjects have demonstrated obvious in-group favoritism toward another person after being told (usually falsely) that they shared a birthday month with that person, liked the same painting, or had made a similar mistake on a quiz. Gino writes: “Psychological closeness creates distance from one’s own moral compass, causing people to view unethical behavior as more morally appropriate and less wrong.”
In August, we will release a “Concepts Unwrapped” video highlighting the dangerous tendency people have to be overly obedient to authority, which may involve suspending their own independent ethical judgment. Studies show that if people perceive the authority figure as part of their in-group, their tendency to be unduly obedient is exacerbated.
Our tendency to use moral judgments to enforce our in-groups’ rules likely conferred an evolutionary advantage when we started living together in groups. Studies show that in-group affinities tend to have more impact on our ethical judgments than moral reasoning. Only if we are aware of these in-group/out-group differences and consciously guard against their influence can we hope to judge others’ actions more objectively and act more fairly toward them.
- Scott A. Wright, John B. Dinsmore & James J. Kellaris, “How Group Loyalties Shape Ethical Judgment and Punishment Preferences,” 30 Psychology and Marketing 203 (March 2013)
- Francesca Gino, Sidetracked (2013).
- Francesca Gino, Shahar Ayal & Dan Ariely, “Contagion and Differentiation in Unethical Behavior: The Effect of One Bad Apple on the Barrel,” 20 Psychological Science 393 (2009).
- Francesca Gino & Adam Galinsky, “Vicarious Dishonesty: When Psychological Closeness Creates Distance from One’s Moral Compass,” 119 Organizational Behavior and Human Decision Processes 15 (2012).