Ethics Unwrapped Blog

Confirmation Bias

Confirmation bias is the tendency of people’s minds to seek out information that supports the views they already hold. It also leads people to interpret evidence in ways that support their pre-existing beliefs, expectations, or hypotheses.

People easily accept new information that is consistent with their beliefs, but are skeptical of information that contradicts their beliefs. In one study, teachers were told that certain students were especially promising… even though the students were really chosen at random. Based on this false belief, teachers gave more praise and attention to the chosen students… who improved more because of the teachers’ expectations. In other words, the confirmation bias can create self-fulfilling prophecies.

For example, when physicians have an idea about a patient’s diagnosis, they may focus on evidence that supports their theory while they undervalue evidence that supports an equally plausible alternative diagnosis.

Likewise, police officers who accept stereotypes that link young black men to crime may gather and process clues in a one-sided way when investigating a crime with a black suspect. As Nobel-prize winner Daniel Kahneman warns, even scientists who commit to a theory tend to disregard inconsistent facts, concluding that the facts are wrong, not the theory.

So the confirmation bias can easily lead us to reach inaccurate, and even unethical, conclusions. It’s essential to recognize our vulnerability to confirmation bias and actively guard against it by being open to evidence that is not consistent with our beliefs and theories.

Cognitive Bias

People generally believe that they are mostly rational in their thinking, decisions, and actions. But even the smartest and best educated people often commit cognitive errors as they make financial, medical, personal and ethical decisions. These errors in thinking, also called cognitive bias, affect all people in virtually every situation.

For example, physicians must be aware of the overconfidence bias as they make diagnoses, because it could cause them to undervalue other doctors’ opinions. Likewise, physicians (and everyone else) must watch out for confirmation bias, which is the tendency people have to process new information in a way that is heavily influenced by their existing beliefs.

The anchoring effect is another bias in thinking whereby people’s initial focus on a particular fact or number means that they fail to properly adjust their judgments as new and different information arises. There is also the cognitive error of overgeneralization, which is the tendency to jump to a broad conclusion based on a single piece of evidence.

People are influenced in differing degrees by these (and many other) cognitive biases. Studies show that some errors in thinking can be moderated with education. For example, physicians can learn to recognize cognitive biases and so reduce their diagnostic mistakes.

But even with effort, none of us will escape cognitive errors altogether. Knowing that your brain is biased is critical to making it work better for you and everyone around you.

Tangible & Abstract

The bias of tangible and abstract describes the fact that people are influenced more by what is immediately observable than by factors that are hypothetical or distant, such as something that could happen in the future or is happening far away.

For example, people may make decisions about natural resources without adequately considering the impact those decisions may have on future generations, or on people in other countries.

In a famous example, the Pinto automobile flunked almost every routine safety test involving rear-end collisions, but Ford put it on the market anyway in 1971. The company was racing to get a small car on the market to challenge popular Japanese imports. Ford sold the car rather than delay its launch, because the consequences of delaying, like a stock price hit, employee layoffs, and a public relations crisis, were immediate and very tangible for Ford. The considerations against selling the car were much more removed and abstract. For example, any potential crash victims were, at that point, nameless and faceless. Their injuries, if they occurred at all, lay off in the future and would likely be someone else’s worry.

So, the principle of the tangible and abstract underscores how we can become blind to the negative consequences of our actions. Indeed, we make moral errors by discounting factors outside our immediate frame of reference.

Self-Serving Bias

The self-serving bias is the tendency people have to seek out information and use it in ways that advance their self-interest. In other words, people often unconsciously make decisions that serve themselves in ways that other people might view as indefensible or unethical.

Studies show that we can easily see how the self-serving bias affects others’ actions, but we have difficulty realizing how it affects our own.

For example, doctors tend to believe that they are immune from the influence of gifts they receive from pharmaceutical companies. But studies show those gifts have a significant effect on what medications doctors prescribe. One study found that 64% of doctors believed that the freebies they received from pharmaceutical companies influenced other doctors. However, only 16% of doctors thought it affected their own actions.

So, the self-serving bias often blinds us to the ways in which we are prejudiced in favor of ourselves. Indeed, it can cause even the most well-intentioned of us to completely overlook our own bad actions.

Role Morality

Role morality is the notion that people sometimes fail to live up to their own ethical standards because they see themselves as playing a certain role that excuses them from those standards.

For example, say a person views herself as a loyal employee of a company. In that role, she might act unethically to benefit her employer in ways that she would never do to help herself. To paraphrase researcher Keith Leavitt, the same person may make a completely different decision based on what hat – or occupational role – she may be wearing at the time, often without even realizing it.

In one study people were asked to judge the morality of a company selling a drug that caused unnecessary deaths when its competitors’ drugs did not. 97% of people concluded that it would be unethical to sell the drug. Then, the researchers placed different people into groups, and asked each group to assume the role of the company’s directors. Acting as directors, every one of the 57 groups decided to sell the drug. They framed the issue as a business decision in dollars-and-cents terms. They ignored the harmful impact their decision would have on others.

So, ethical behavior requires maintaining the same moral standards regardless of the roles we play at home, at work, or in society.


Rationalizations

Rationalizations are invented explanations that hide or deny true motivations, causes, or actions. They are the excuses people give themselves for not living up to their own ethical standards.

For example, most of us think of ourselves as honest people, yet studies show that most of us often lie a little or cheat a little. In order to maintain our self-image as good people, we unconsciously invent rationalizations to convince ourselves that what we did was not wrong, not really harmful, not our fault, and so on.

According to Vikas Anand and his colleagues, common rationalizations include: “I know I shouldn’t have done that, but my boss made me so I didn’t have any choice.”  Or, “Others have done worse.”  Or, “That guy deserved to get ripped off.”  Or, “If I hadn’t done it, someone else would have.”

Generally, rationalizations are most effective when they are not recognized as rationalizations. They are dangerous because people are very creative rationalizers and, indeed, often come to believe their own excuses. As psychologist Joshua Greene notes, “rationalization is the great enemy of moral progress.”

Ultimately, rationalizations dull our sense of responsibility for our wrongful actions. So, if we wish to truly be ethical people, we must carefully and consistently monitor our own rationalizations.

Overconfidence Bias

The overconfidence bias is the tendency people have to be more confident in their own abilities, such as driving, teaching, or spelling, than is objectively reasonable. This overconfidence also involves matters of character.

Generally, people believe that they are more ethical than their competitors, co-workers, and peers. For example, a recent study showed that 50% of business people polled believed that they were in the top 10% ethically.

Because of the overconfidence bias, people will often take ethical issues lightly. They simply assume that they have good character and will therefore do the right thing when they encounter ethical challenges. In fact, studies show that the overconfidence bias causes people to overestimate how much, and how often, they will donate money or volunteer their time to charities.

So, overconfidence in our own moral character can cause us to act without proper reflection. And that is when we are most likely to act unethically.

Obedience to Authority

Obedience to authority is the tendency people have to try to please those in charge.

Psychological evidence indicates that people tend to respect and follow those whom they perceive to have legitimate authority. This can lead to trouble if it causes people to fail to exercise their own independent ethical judgment.

Most people can anticipate their superiors’ desires and may act to please them even without being explicitly asked. For example, when Toshiba needed to inflate its earnings, implicit pressure from top officers was sufficient to induce division managers to misreport their earnings.

Likewise, when Lance Armstrong, leader of his cycling team, signaled that the team needed to dope in order to successfully compete, his supporting riders quickly complied.

So, a willingness to follow instructions is generally a good thing. But blind obedience to those in charge can have unfortunate consequences when leaders lack ethical conviction themselves.


Neuroethics

Neuroethics refers to the research on ethics done within the field of neuroscience. Neuroethics can also refer to the ethical issues that may arise in the research and study of neuroscience. Neuroscience is the study of the nervous system and the brain.

The field of neuroethics is relatively new, and its findings are far from settled. It examines the brain in relationship to questions like “Is there free will?” and “Is the human moral sense innate, or in other words, ‘hardwired’ in the brain?”

Research in neuroscience shows that the way the brain is wired has much to do with how and why people make moral decisions. In fact, neuroscience shows that a network of various regions of the brain is consistently involved in moral decision-making.

So, while ethics and morality were once exclusively within the province of philosophers and theologians, future research in neuroscience may contribute greatly to the resolution of key questions in these areas.

Moral Reasoning

Moral reasoning applies critical analysis to specific events to determine what is right or wrong, and what people ought to do in a particular situation. Both philosophers and psychologists study moral reasoning.

How we make day-to-day decisions like “What should I wear?” is similar to how we make moral decisions like “Should I lie or tell the truth?” The brain processes both in generally the same way.

Moral reasoning typically applies logic and moral theories, such as deontology or utilitarianism, to specific situations or dilemmas. However, people are not especially good at moral reasoning. Indeed, the term moral dumbfounding describes the fact that people often reach strong moral conclusions that they cannot logically defend.

In fact, evidence shows that the moral principle or theory a person chooses to apply is often, ironically, based on their emotions, not on logic. Their choice is usually influenced by internal biases or outside pressures, such as the self-serving bias or the desire to conform.

So, while we likely believe we approach ethical dilemmas logically and rationally, the truth is our moral reasoning is usually influenced by intuitive, emotional reactions.

Moral Myopia

Moral myopia refers to the inability to see ethical issues clearly.

The term, coined by Minette Drumwright and Patrick Murphy, describes what happens when we do not recognize the moral implications of a problem or we have a distorted moral vision. An extreme version of moral myopia is called moral blindness.

For example, people may become so focused on other aspects of a situation, like pleasing their professor or boss or meeting sales targets, that ethical issues are obscured.

Organizations can experience moral myopia too, as Major League Baseball did during the steroid era. For more than a decade, players got bigger, hit more home runs, and revenues rose dramatically. But the League didn’t see it, even as evidence of steroid use was rampant.

Societies may also suffer moral myopia, as they often have done at the expense of minorities. For instance, the treatment of Native Americans and the enslavement of African-Americans are two examples of moral blindness in the history of the United States.

Moral myopia is closely related to ethical fading. In both cases, people’s perception of reality becomes altered so that ethical issues are indistinct and hidden from view.

Moral Muteness

Moral muteness occurs when people witness unethical behavior and choose not to say anything. It can also occur when people communicate in ways that obscure their moral beliefs and commitments.

When we see others acting unethically, often the easiest thing to do is look the other way. Studies show that less than half of those who witness organizational wrongdoing report it. To speak out risks conflict, and we tend to avoid conflict because we pay an emotional and social cost for it.

For example, in one study, psychologist Harold Takooshian planted fur coats, cameras, and TVs inside 310 locked cars in New York City. He sent a team of volunteers to break into the cars and steal the valuables, asking the “thieves” to act in an obviously suspicious manner. About 3,500 people witnessed the break-ins, but only 9 people took any kind of action. Of those who spoke up, five were policemen.

Indeed, only a relatively small percentage of people who see wrongdoing speak up. But, if we wish to be ethical people, we must strive to combat moral muteness in all areas of our lives.

Moral Equilibrium

Moral equilibrium is the idea that most people keep a running mental scoreboard where they compare their self-image as a good person with what they actually do.

When we do something inconsistent with our positive self-image, we naturally feel a deficit on the good side of our scoreboard. Then, we will often actively look for an opportunity to do something good to bring things back into equilibrium. This is called moral compensation.

Conversely, when we have done something honorable, we feel a surplus on the good side of our mental scoreboard. Then, we may give ourselves permission not to live up to our own ethical standards. This is called moral licensing.

For example, Oral Suer, the hard-working CEO of the Washington D.C.-area United Way, raised more than $1 billion for local charities. Unfortunately, Suer gave himself license to divert substantial sums intended for the charity for his personal use to reward himself for his good deeds.

So, our tendency to maintain moral equilibrium may mean that we will act unethically. Indeed, we must guard against our natural inclination to give ourselves permission to depart from our usual moral standards.

Moral Emotions

Emotions – that is to say feelings and intuitions – play a major role in most of the ethical decisions people make. Most people do not realize how much their emotions direct their moral choices. But experts think it is impossible to make any important moral judgments without emotions.

Inner-directed negative emotions like guilt, embarrassment, and shame often motivate people to act ethically.

Outer-directed negative emotions, on the other hand, aim to discipline or punish. For example, people often direct anger, disgust, or contempt at those who have acted unethically. This discourages others from behaving the same way.

Positive emotions like gratitude and admiration, which people may feel when they see another acting with compassion or kindness, can prompt people to help others.

Emotions evoked by suffering, such as sympathy and empathy, often lead people to act ethically toward others. Indeed, empathy is the central moral emotion that most commonly motivates prosocial activity such as altruism, cooperation, and generosity.

So, while we may believe that our moral decisions are influenced most by our philosophy or religious values, in truth our emotions play a significant role in our ethical decision-making.

Moral Cognition

Moral cognition is the study of the brain’s role in moral judgment and decision-making. As a social science, it involves understanding the rationalizations and biases that affect moral decision-making. Moral cognition also involves scientific study of the brain itself, a field that is evolving along with technology.

Researchers who study moral cognition attempt to provide social and biological explanations for how our brains process information and make moral or immoral choices. Some scientists examine genetic and molecular influences, while others use neuroimaging to map the areas of the brain that direct people’s choices.

Moral thinking appears to be a complicated process. There is no single seat of moral activity in the brain. However, a network of various regions of the brain does consistently appear to be involved in moral decision-making.

So, the study of moral cognition does not aim to tell people what choices they should make. Rather, it attempts to explain how and why people make the moral choices that they do.

Loss Aversion

Loss aversion is the notion that people hate losses more than they enjoy gains.

Studies show that people are more likely to lie and cheat to avoid losing something they already have than to acquire it in the first place. For example, say a person makes an innocent mistake. Then, to avoid injury to her reputation, she may intentionally lie to cover it up.

Loss aversion seemed to play a significant role in the General Motors scandal in 2014. For more than a decade, the company failed to recall cars with faulty ignition switches. Even as evidence of the defect grew, GM officials continued to deny that there was a problem to avoid the expense and embarrassment of a massive recall.

The desire to keep what one already has can be overwhelming. Indeed, the natural aversion to loss can lead us to make unethical and illegal choices that, ironically, might be more costly in the long run.


Incrementalism

Incrementalism is the slippery slope that often causes people to slide unintentionally into unethical behavior. It can happen when people cut small corners that become bigger over time. For example, almost every instance of accounting fraud begins with people fudging small numbers that grow larger and larger.

People’s brains are not adept at perceiving small changes. In addition, continued exposure to unethical behavior is desensitizing and makes those activities seem routine. Indeed, we can easily lose sight of the fact that those activities are immoral and possibly illegal.

Wrongdoers, and people in general, may never even realize that they are making a life-changing decision when they make small, unethical choices. But in truth, as philosopher Jonathan Glover says, incrementalism is how we “slide into participation by imperceptible degrees so that there is never the sense of a frontier being crossed.”


Framing

A frame of reference, or point of view, refers to the way we look at a given situation. How a person views that situation can affect her understanding of the facts and influence how she determines right from wrong.

Some frames minimize or even omit the ethical aspects of a decision. For example, studies show that if people are prompted to frame a situation only in terms of money or economic interests, they often leave out ethical considerations.

In a famous study, a day care center having difficulty with parents picking up their children on time started charging a fine for being late. Parents then reframed the issue from an ethical one (“It’s not nice of me to burden the staff in this way”) to a business one (“I can buy the staff’s time by paying this fine”). Late pick-ups increased rather than decreased due to this change in the parents’ frame of reference.

So, by remembering to consider the ethical implications of any situation, we can keep ethics in our frame of reference when making decisions.

Fundamental Attribution Error

The fundamental attribution error is the tendency people have to overemphasize personal characteristics and ignore situational factors in judging others’ behavior. Because of the fundamental attribution error, we tend to believe that others do bad things because they are bad people. We’re inclined to ignore situational factors that might have played a role.

For example, if someone cuts us off while driving, our first thought might be “What a jerk!” instead of considering the possibility that the driver is rushing someone to the airport. On the flip side, when we cut someone off in traffic, we tend to convince ourselves that we had to do so. We focus on situational factors, like being late to a meeting, and ignore what our behavior might say about our own character.

In one study, when something bad happened to someone else, subjects blamed that person’s behavior or personality 65% of the time. But when something bad happened to the subjects themselves, they blamed themselves only 44% of the time, blaming the situation they were in much more often.

So, the fundamental attribution error explains why we often judge others harshly while letting ourselves off the hook at the same time by rationalizing our own unethical behavior.

Ethical Fading

Ethical fading occurs when the ethical aspects of a decision disappear from view.

This happens when people focus heavily on some other aspect of a decision, such as profitability or winning. People tend to see what they are looking for, and if they are not looking for an ethical issue, they may miss it altogether.

Psychologist Anne Tenbrunsel and colleagues find that innate psychological tendencies often cause us to engage in self-deception, which blinds us to the ethical components of a decision. For example, euphemisms like “We didn’t bribe anyone… we just ‘greased the wheels,’” help people disguise and overlook their own wrongdoing.

Ethical fading is similar to moral disengagement. Moral disengagement is when people restructure reality in order to make their own actions seem less harmful than they actually are. Both ethical fading and moral disengagement help people minimize the guilt they feel from violating ethical standards.

So, while ethical fading is common, we can try to counteract it by learning to recognize when we put ethical concerns behind other factors in making decisions.

Conformity Bias

The conformity bias is the tendency people have to behave like those around them rather than using their own personal judgment.

People seem to be more comfortable mimicking others, even regarding ethical matters.

For example, studies show that people are more likely to act in a prosocial manner, such as contributing to charity or conserving water, if they see or hear that others are doing it too. Knowing that those around us are making an ethical choice indicates it’s the social norm, and makes it easier for us to follow suit.

Unfortunately, the flip side is also true. As psychologist Dan Ariely notes, “Cheating is contagious. When we see others succeed by cheating, it makes us more likely to cheat as well.”

Indeed, the conformity bias can cause people to simply follow the herd rather than use their own independent ethical judgment.

Bounded Ethicality

Bounded ethicality is the idea that our ability to make ethical choices is often limited or restricted because of internal and external pressures.

Most people are usually ethical, but not completely so, just as people are generally rational but not as perfectly logical as Spock from Star Trek. Our ability to be ethical seems to have limits.

For example, outside pressures, such as the tendency to conform to the actions of those around us, can make it hard to do the right thing.  So can internal biases, such as the self-serving bias, which often causes us to subconsciously favor ourselves at the expense of others.

It’s important to understand that everyone is bounded ethically, even Mother Teresa. Indeed, we are all susceptible to the cognitive biases and organizational or social pressures that limit our abilities to make ethical decisions.

Behavioral Ethics

Behavioral ethics is the study of why people make the ethical and unethical decisions that they do. Its teachings arise from research in fields such as behavioral psychology, cognitive science, and evolutionary biology.

Behavioral ethics is different from traditional philosophy. Instead of focusing on how people ought to behave, behavioral ethics studies why people act as they do. Arguably, it is more useful to understand our own motivations than to understand the philosophy of Aristotle.

Research in behavioral ethics finds that people are far from completely rational. Most ethical choices are made intuitively, by feeling, not after carefully analyzing a situation. Usually, people who make unethical decisions are unconsciously influenced by internal biases, like the self-serving bias, by outside pressures, like the pressure to conform, and by situational factors that they do not even notice.

So, behavioral ethics seeks to understand why even people with the best intentions can make poor ethical choices.
