People often puzzle over whether religious people act more ethically than less religious people.  The scientific evidence is certainly mixed.  There is strong evidence that religious people self-report being more ethical than non-religious people, but less evidence that their actions actually match their reports (Xygalatas, 2017).  For example, some studies indicate that “religiosity is not always a predictor of moral kindness” (Wrangham, 2019).  Other studies, by contrast, have found a positive correlation between moral behaviors and religiosity (Deckers et al., 2016).  On the other hand, terrorists tend to come from religious families (Galinsky & Schweitzer, 2015).  At the end of the day, “[a]s far as the empirical research is concerned, the jury is still out on the effects of religious rituals and practices on character improvement” (Miller, 2018).

Still, there is relevant psychological evidence.  The self-serving bias inclines most people to act in their perceived self-interest.  That bias should therefore prompt religious folks to act rightly, since religious people often believe that they will experience a very unsatisfactory sort of afterlife if they do not.  Logically, they would seem to be more motivated to do the right thing than people who do not share that religious conviction.  There is also scientific evidence that people are more likely to do the right thing if they are regularly reminded of their desire to act morally, and weekly religious services and habitual reading of religious texts can provide those reminders.  And, indeed, there is evidence that “religious reminders do have a documented effect on moral behavior” (Xygalatas, 2017), though the impact is similar whether the person being reminded is religious or nonreligious.

Still, even the most religious among us are only human, and therefore subject to the social and organizational pressures, cognitive heuristics and biases, and situational factors that can lead even good people to do bad things.

An illustration appears in J.M. Fenster’s new (and depressingly titled) book, Cheaters Always Win.  According to Fenster, two men’s Bible classes, one in Long Beach and one in Kansas City, decided to hold a contest to see which club could boost attendance more.  The goal was a worthy one: as attendance grew, God’s word would be spread to more and more people.  Both groups worked hard to build their flocks and had great success.  The classes were soon overflowing.

This was unalloyed good news until the Kansas City class sent a member to Long Beach to investigate the Long Beach class’s tactics.  He hired 20 detectives and soon claimed that Long Beach was exaggerating its reported attendance figures by a factor of two.

The Long Beach class responded by asserting that the Kansas City class was using attendance figures from two different locations, though the contest was for only one class, not two.

Kansas City countered by claiming that its membership drive had been so successful that its first location was at capacity and a second location was necessary.

Ultimately, it seemed clear that both Bible classes had cheated.  Very unattractive behavior.  And very human.

A viewing of our videos on framing could be helpful here.  Members of the two classes were clearly focused on winning a silly contest when they made their decisions, rather than on their original, higher purpose: to spread The Word to as many people as possible.  What people have in their frame of reference when they make decisions has everything to do with the choices they make.  When the higher purpose dropped out of these two classes’ frames of reference, they began making very poor choices indeed.

Our videos on the in-group/out-group bias are also relevant here.  The Long Beach Christians began looking at themselves as a group separate and apart from the Kansas City Christians, who similarly went tribal.  It’s bad enough when two groups of differently religious people—say Jews and Christians—begin looking at each other as enemies.  When two groups of exactly the same religion do the same thing, they have seriously gone off the rails.  Yet the psychological evidence shows how easily people can divide themselves into tribes, including religious groups, which “invariably have a very strong sense of in-group/out-group morality” (Tomasello, 2016).

The impact of this in-group bias in Long Beach and Kansas City was relatively trivial, but the bias itself is dangerous.  “Stripping morality from religion can also occur when a belief system is reduced to a simple group identity.  This kind of ‘us versus them’ mentality can corrupt and radicalize any religious community” (Akyol, 2017).

Our reference group should not be just our Bible class.  Or even our religion.  It should be all of humanity, or even, perhaps, all sentient beings.  The wider a net we can cast, the more moral our decision making will become.



Mustafa Akyol, “Does Religion Make People Moral?” New York Times, Nov. 28, 2017.

Thomas Deckers et al., “Homo Moralis: Personal Characteristics, Institutions, and Moral Decision-Making,” CESifo Working Paper No. 5800 (March 2016).

J.M. Fenster, Cheaters Always Win: The Story of America (2019).

Adam Galinsky & Maurice Schweitzer, Friend & Foe: When to Cooperate, When to Compete, and How to Succeed at Both (2015).

Christian B. Miller, The Character Gap: How Good Are We? (2018).

Michael Tomasello, A Natural History of Human Morality (2016).

Richard Wrangham, The Goodness Paradox: The Strange Relationship Between Virtue and Violence in Human Evolution (2019).

Dimitris Xygalatas, “Are Religious People More Moral?” The Conversation, Oct. 23, 2017.



Concepts Unwrapped:  Framing

Ethics Defined:  Framing

Ethics Defined:  In-group/Out-group