Written and Narrated by
Robert Prentice, J.D.
Business, Government & Society Department
McCombs School of Business
The University of Texas at Austin
“Implicit bias” exists when we unconsciously hold attitudes towards others or associate stereotypes with them. For example, we often harbor negative stereotypes about others without consciously realizing that we do so.
Implicit bias, also called “unconscious bias” or “implicit social cognition,” is a prejudice that is deep-seated in the brain, below the conscious level. Studies have demonstrated implicit bias against racial groups, genders, LGBTQ people, and other marginalized groups. We may even be prejudiced against our own group, although we tend to favor our in-group with positive stereotypes and disfavor out-groups with negative stereotypes.
Implicit bias often runs counter to people’s conscious, expressed beliefs. Few physicians espouse racially discriminatory views, yet doctors tend to recommend less pain medication for black patients than for white patients with identical injuries. In other words, people can be explicitly unbiased yet implicitly biased, according to psychologist Daniel Kelly and colleagues.
Implicit bias has been found in a large array of studies using various tests, but much of the evidence for the phenomenon comes from Harvard University’s Project Implicit website, home of the Implicit Association Test, or the IAT. Literally millions of people have visited this site and taken various tests that ask them to respond rapidly to questions requiring them to associate blacks or whites, males or females, young or old, and so on, with positive or negative words. Professor Brian Nosek and colleagues tested more than 700,000 subjects and found that more than 70% of white subjects more easily associated white faces with positive words and black faces with negative words, concluding that this was evidence of implicit racial bias. In fact, additional evidence indicates that measures of implicit bias better predict people’s conduct than measures of explicit bias.
Here is some good news. Various scientists have criticized the IAT. They point out, for example, that individuals who take the test on different dates often score substantially differently. Even IAT supporters admit that implicit bias, at least as demonstrated by the test, is widespread but relatively minor and has only a small impact upon people’s real-world actions. In other words, the results of the test are not strong enough to predict particular behaviors by individual people.
However, let’s not get too comfortable. Even if the IAT cannot predict the future conduct of any one individual on a given occasion, it still indicates how groups of people will act on average, and that is worrisome.
For example, few people openly advocate for discrimination in hiring, but white applicants receive 50% more responses from potential employers than do black applicants with the same resume. Likewise, college professors are substantially more likely to answer student e-mails if the students’ names indicate that they are probably white than if the names sound like they belong to black students. And in one study, online course instructors were 94% more likely to respond to discussion posts by white male students than by other students.
Because implicit bias operates at a mostly unconscious level, it is difficult for individuals to overcome. No existing training regimen has proven particularly effective at reducing implicit bias. But, fortunately, some research shows that stereotypes can be unlearned and that safeguards can be put in place to minimize their impact. For example, women used to make up only a relatively small percentage of the musicians in orchestras. But when orchestras began holding blind auditions, where the applicants played behind a curtain and their genders were unknown to the judges, the percentage of women chosen to play in symphony orchestras doubled.
Perhaps we can put a dent in implicit bias after all.