Written and Narrated by
Robert Prentice, J.D.
Business, Government & Society Department
McCombs School of Business
The University of Texas at Austin
“Being aware that an issue presents a moral dimension is step one in being your best self. Step two is moral decision making: having the ability to decide which is the right course of action once we have spotted the ethical issue. Sometimes this can be very difficult, as multiple options may seem morally defensible (or, perhaps, no options seem morally acceptable). Sometimes people face difficult ethical choices, and it is hard to fault them too much for making a good-faith choice that they think is right but turns out to be wrong. However, most white-collar crimes (over-billing, insider trading, paying bribes, fudging earnings numbers, hiding income from the IRS, and most other activities that lead people to end up doing the perp walk on the front page of the business section) do not present intractable ethical conundrums. They are obviously wrong. The problem is not that we haven’t read enough Kant or John Stuart Mill.
More commonly, the problem is that we are unaware of psychological, organizational, and social influences that can cause us to make less-than-optimal ethical choices. Our ethical decision making is often automatic and instinctive. It involves emotions, not reasoning. When we think that we are reasoning to an ethical conclusion, the evidence shows that we are typically just rationalizing a decision already made by the emotional parts of our brains.
Our brains’ intuitive system often gets it right, but not always. So we should never ignore our gut feelings when they tell us that we are about to do something wrong. But our intuition does not always choose the ethical path. An important reason the intuitive/emotional part of our brain errs is the self-serving bias, which often leads us to unconsciously make choices that seem unjustifiable to objective third-party observers.
As a simple example, a U.S. News & World Report survey asked some people: “If someone sues you and you win the case, should they pay your legal expenses?” Eighty-five percent of the respondents thought this would be fair. The magazine asked others: “If you sue someone and lose the case, should you pay their costs?” Now only 44% of respondents agreed, illustrating how easily our sense of fairness is influenced by self-interest. If we are not careful, we will not even notice how the self-serving bias shapes our ethical decisions. Authors Bronson and Merryman report that “if you’re a Red Sox fan, watching a Sox game, you’re using a different region of the brain to judge if a runner is safe than you would if you were watching a game between two teams you didn’t care about.” So, how can we combat the self-serving bias?
There is some experimental evidence that if we know about the self-serving bias, we can arm ourselves against it and minimize its effects. We must focus not just on being objective, but on doing what it takes to ensure that others see us as objective. We will naturally judge our own decisions with a sympathetic eye, but we know that others will not necessarily do so. So if we do what it takes to cause objective third parties to trust our judgments, we should go a long way toward overcoming the impact of the self-serving bias.
We should also pay especially close attention to our profession’s code of conduct and our employer’s code of ethics, because such standards are normally aimed primarily at minimizing conflicts of interest and their unconscious impact on our decision making. The self-serving bias is far from the only psychological or organizational factor that can cause us to make the wrong ethical choice, but it is certainly a big one!”