As frequent visitors to Ethics Unwrapped know, our primary focus is upon behavioral ethics—the science of moral decision-making. Behavioral ethics is based on research from such fields as brain science, evolutionary biology, child development, and many others. The most significant contributions come from the discipline of psychology, particularly behavioral psychology.
Unfortunately, the field of psychology has taken a few credibility hits lately that have an impact on behavioral ethics.
We recently read Rutger Bregman’s Humankind: A Hopeful History (2020), in which he argues that, in general, we humans are better beings than is typically assumed. He begins by contrasting the view of humans depicted in William Golding’s Lord of the Flies (1954), a fictional account of a group of young boys who are stranded on an uninhabited island and slowly descend into savagery, with a real such event that occurred in 1965 near the South Pacific island of Tonga. Though stranded for many months with little in the way of provisions, these boys banded together and took care of each other. Theirs “is a story of friendship and loyalty,” says Bregman. Reality, he concludes, is much more hopeful than fiction.
Bregman also attacks two classic psychological experiments, relaying substantial evidence that the experimenters improperly intervened in the experiments in an attempt to produce a result that fit with their pre-existing theories. Bregman takes aim at:
- Stanley Milgram’s famous experiment regarding obedience to authority—the tendency people have to defer to people in a position of authority [see our video at https://ethicsunwrapped.utexas.edu/video/obedience-to-authority]. The experiment supposedly resulted in 65% or so of a group of average Americans administering painful and perhaps even fatal electric shocks to subjects pleading for mercy just because a stranger in a gray lab coat told them to. Bregman relates evidence that barely half of Milgram’s subjects believed they were actually administering electric shocks, and that a majority of those who did believe it quit the experiment.
- Philip Zimbardo’s infamous 1971 “Stanford Prison Experiment” supposedly demonstrated that college students given the role of “guards” would abuse the authority bestowed on them over other students given the role of “prisoners.” In reality, Zimbardo and his assistants had to intervene in the experiment to produce this result, and many of the subjects did not take the experiment, or their roles in it, seriously.
That these two experiments were not all they’ve been made out to be doesn’t mean there is no such phenomenon as obedience to authority (Milgram’s experiment has been replicated repeatedly over the decades by other experimenters) or that people’s decisions are not influenced by the context in which they make them (Zimbardo’s key point, which is supported by mountains of evidence). But it does show that scientists can err just like the rest of us.
More immediately concerning are the recent allegations against Harvard’s Francesca Gino and Duke’s Dan Ariely. In his new book, Inside an Academic Scandal: A Story of Fraud and Betrayal (2025), Harvard Business School’s Max Bazerman—who co-authored with both Gino and Ariely—examines his own complicity (by being too trusting) in publishing and then not retracting an article that claimed that people were more likely to tell the truth if they signed a statement promising not to lie before filling out a form rather than after. Bazerman truly believed that this was a real phenomenon, even after some other researchers had difficulty replicating the result. He was horrified when he learned that there was substantial evidence that both Gino and Ariely had committed academic fraud in reporting the results of two different studies upon which the 2012 article relied. Nor was this their only offense; Gino had several. Both continue to deny wrongdoing in the face of overwhelming evidence.
For those of us, like Max Bazerman, who teach and write about behavioral ethics, this black eye for psychology research is distressing. Indeed, Bazerman goes into several other examples (mostly from Europe), making it clear that Gino and Ariely are not the only offenders, which makes the situation all the more upsetting.
However, it is also true that the story of Gino and Bazerman illustrates two major lessons of behavioral ethics: (a) good people often do bad things, and (b) often these ethical missteps are caused by social and organizational pressures, cognitive heuristics and biases, and situational factors that can make it difficult for all of us to get it right all the time. Bazerman’s book makes this point.
Why do academics engage in academic fraud? Wrongdoing is often caused by the self-serving bias, our tendency to gather, process, and even remember information in ways that support our self-interest. The more that is at stake, the more we are likely to view a little fudging of rules and norms as permissible. Bazerman quotes a Gino co-author, Bradford Tuckfield, who believes: “The common thread that unites all of these frauds is ambition. Each of them were talented enough to feed their families through honest work, but each wished for more – not just money, but power, influence, fame, and immortality.”
The conformity bias, our tendency to take cues as to how to act from those around us, also leads many people off the ethical rails. [https://ethicsunwrapped.utexas.edu/video/conformity-bias] Bazerman quotes philosopher Eric Schwitzgebel, who suggests that both Gino and an earlier Harvard academic fraudster, evolutionary biologist Marc Hauser (both of whom studied moral decision-making), might have been influenced by this bias:
If you study dishonesty, you might be struck by the thought that dishonesty is everywhere—and then if you are tempted to be dishonest you might think “well, everyone else is doing it.” I can easily imagine someone in Gino’s position thinking, probably most researchers have from time to time shifted around a few rows of data to make their results pop out better. Is it really so bad if I do it too? And then once the deed has been done, it probably becomes easier, for multiple reasons, to assume that such fraud is widespread and just part of the usual academic game (grounds for thinking this might include rationalization, positive self-illusion, and using oneself as a model for thinking about others).
Then there’s the matter of framing. How we decide every kind of question, including moral ones, depends on how we look at it: what is in our frame of reference when we decide. Diederik Stapel, a Tilburg University (Netherlands) academic who committed major academic fraud, later said that he had come to view academia as a business:
There are scarce resources, you need grants, you need money, there is competition. Normal people go to the edge to get that money. Science is of course about discovery, about digging to discover the truth. But it is also communication, persuasion, marketing. I am a salesman. I am on the road. People are on the road with their talk. With the same talk. It’s like a circus.
With this reframing, a search for the truth becomes a competition for money and the fast-and-loose rules of free enterprise may seem more applicable than those of academic integrity. A search for the truth is replaced by a search for profit in the form of grants and awards.
Several of the academics caught in recent scandals, Gino especially, often co-authored with assistant professors and graduate students. By cheating, they helped produce publishable papers that, at least until the fraud was uncovered, helped the young scholars’ careers. A study, conducted ironically enough by Gino and Ariely with Shahar Ayal, concluded that a phenomenon that we at Ethics Unwrapped call “altruistic cheating” enables cheaters to rationalize their actions as helping others and therefore as justifiable:
Individuals cheat more when others can benefit from their cheating and when the number of beneficiaries of wrongdoing increases. Our results indicate that people use moral flexibility to justify their self-interested actions when such actions benefit others in addition to the self. Namely, our findings suggest that when people’s dishonesty would benefit others, they are more likely to view dishonesty as morally acceptable and thus feel less guilty about benefiting from cheating.
It appears that the article in which this study was reported has not been retracted (at least not yet), and there is plenty of other evidence documenting the existence of this rationalization.
Situational factors such as time pressure and fatigue can also incline good people toward bad things. Hard-working academics may find themselves not only fully engaged in research work, but also teaching, undertaking major administrative posts, consulting with outside industry, and so forth, all while raising a family. Bazerman notes:
Earlier in the book, I documented Gino’s high level of career activity, which some viewed as overcommitment. The same was apparently true of Ariely. This makes the following conclusion of Celia Moore and Gino interesting:
The amount we have on our plate also affects the attention we can direct toward moral decisions. When our bandwidth is subsumed by other activities, our attention to potential moral implications may be lost.
Despite widespread speculation to the contrary, academics are human beings and subject to the same social and organizational pressures, cognitive heuristics and biases, and situational factors that cause others to make poor moral choices. This is embarrassing to the academy, humiliating in fact. But it should not be surprising. And it is a clarion call, which Bazerman takes up in the latter chapters of his book, for universities, academic journals, and, really, all research faculty to make changes that will discourage academic fraud, investigate allegations of cheating, and ensure adequate punishment.
Even if a concerted attack on academic fraud is launched, the minefield of social and organizational pressures, cognitive heuristics and biases, and various situational factors (like overcommitment) may lead to moral errors by well-meaning people. But teaching about behavioral ethics at least gives students advance notice of the pitfalls that they must monitor and do their best to navigate safely.
Sources:
Max Bazerman, Inside an Academic Scandal: A Story of Fraud and Betrayal (2025).
Max Bazerman, Complicit: How We Enable the Unethical and How to Stop (2022).
Yudhijit Bhattacharjee, “The Mind of a Con Man,” New York Times Magazine, April 26, 2013.
Rutger Bregman, Humankind: A Hopeful History (2020).
Daniel Engber, “The Business School Scandal that Just Keeps Getting Bigger,” The Atlantic, Nov. 19, 2024.
Andrew Gelman and Andrew King, “Social Science is Broken. Here’s How to Fix It,” Chronicle of Higher Education, Feb. 25, 2025.
Francesca Gino et al., “Self-serving Altruism? The Lure of Unethical Actions That Benefit Others,” Journal of Economic Behavior & Organization, Vol. 93, pp. 285-292 (2013).
Andrew King, “What Turns Some Scholars into Frauds?” Chronicle of Higher Education, Sept. 11, 2025.
Gideon Lewis-Kraus, “They Studied Dishonesty. Was Their Work a Lie?” The New Yorker, Sept. 30, 2023.
Ruth Leys, Anatomy of a Train Wreck: The Rise and Fall of Priming Research (2024).
Eric Schwitzgebel, https://schwitzsplinters.blogspot.com/search?q=Gino.
Lisa Shu, Nina Mazar, Francesca Gino, Dan Ariely, and Max Bazerman, “Signing at the Beginning Makes Ethics Salient and Decreases Dishonest Self-reports in Comparison to Signing at the End,” PNAS 109: 15197-15200 (2012) [Retracted in 2021].
Celia Moore & Francesca Gino, “Approach, Ability, Aftermath: A Psychological Process Framework of Unethical Behavior at Work,” Academy of Management Annals, Vol. 9: 235-289 (2015).
Shaked Shuster et al., “Proud to be Dishonest: Emotional Consequences of Altruistic Versus Egoistic Dishonesty,” Journal of Behavioral Decision Making, Vol. 37, p. e2386 (2024).
Bradford Tuckfield, “Making It (Up) in America,” The American Conservative, Sept. 13, 2023.
Philip Zimbardo, The Lucifer Effect: Understanding How Good People Turn Evil (2007).
Videos:
Conformity bias: https://ethicsunwrapped.utexas.edu/video/conformity-bias.
Framing: https://ethicsunwrapped.utexas.edu/video/framing.
Obedience to Authority: https://ethicsunwrapped.utexas.edu/video/obedience-to-authority.
Self-serving Bias: https://ethicsunwrapped.utexas.edu/video/self-serving-bias.
Blog Posts:
“Academic Dishonesty in Ethics Studies,” at https://ethicsunwrapped.utexas.edu/academic-dishonesty-in-ethics-studies.