
Implicit Bias

Implicit bias exists when people unconsciously hold attitudes toward others or associate stereotypes with them.

Discussion Questions

1. In a presentation, Professor Will Cox shows two news photos published in the wake of Hurricane Katrina.  One shows a young black man walking through swirling water holding a carton of soda.  The other shows a white couple in similar water, holding a bag of bread.  The captions for the photos read, respectively:  “A young man walks through chest-deep water after looting a grocery store” and “Two residents wade through chest-deep water after finding bread and soda.”  Do you think the writers of these captions thought of themselves as racist?

2. Studies show that Latinos receive less pain medication than similarly-situated white patients, that elderly women receive fewer life-saving interventions than elderly men, and that obese children are more likely to be assumed by teachers to be less intelligent than slim children.  Are these examples of implicit bias?

3. Can you think of examples of implicit bias?

4. Do you think that implicit bias is a serious problem?  If so, is it more serious than explicit bias?

Meet Me at Starbucks

Two black men were arrested after an employee called the police on them, prompting Starbucks to implement “racial-bias” training across all its stores.

Teaching Notes

This video introduces the behavioral ethics bias known as implicit bias.  Implicit bias exists when people unconsciously hold attitudes toward others or associate stereotypes with them.  People’s implicit bias may run counter to their conscious beliefs without their realizing it.  Implicit bias is also known as unconscious bias or implicit social cognition.

In teaching implicit bias, it may be a good idea to send your students to Harvard’s Project Implicit website so that they themselves may take one or more of the tests that provide some (not all) of the academic evidence for implicit bias:  https://implicit.harvard.edu/implicit/.  Be aware that while there is widespread acceptance of the phenomenon of implicit bias, there is criticism of the evidence coming from this particular set of tests.  Two of the academics behind Project Implicit, Mahzarin R. Banaji and Anthony G. Greenwald, wrote Blind Spot: Hidden Biases of Good People (Delacorte Press 2013), which is a very accessible book on the topic.  Another very accessible source is a March 27, 2018 article in Scientific American by psychologists Keith Payne, Laura Niemi, and John Doris entitled “How to Think About ‘Implicit Bias,’” at https://www.scientificamerican.com/article/how-to-think-about-implicit-bias/.

Accepting the fact of implicit bias should not be too difficult if you view our Concepts Unwrapped videos on several other kinds of bias generated by the human brain, including the Conformity Bias, the Self-Serving Bias, and the Overconfidence Bias.

An important issue in the ethics of “Big Data” and artificial intelligence has to do with the implicit bias that might affect the creation of algorithms and the analysis of data.  Cathy O’Neil discusses this at length in her book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (Broadway Books 2016).  A quick search of the internet on the topic “implicit bias and artificial intelligence” will bring up many sources for a fruitful discussion of this important topic.  One controversial example is the fact that some versions of facial recognition software seem to be much more accurate when assessing white male faces than black female faces.  This can have significant consequences when such software is deployed in the real world by police officers, insurance companies, and others.

An interesting article is WGBH News’s “Addressing Gender and Racial Bias in Facial Recognition Technology,” available at: https://www.wgbh.org/news/2018/03/21/science-and-technology/addressing-gender-and-racial-bias-facial-recognition-technology.

The “Meet Me at Starbucks” case study that accompanies this video seems to be an informative example of implicit bias in action and an inspiring example of a company trying in good faith to at least begin to address the problem.

Additional Resources

The latest resource from Ethics Unwrapped is a book, Behavioral Ethics in Practice: Why We Sometimes Make the Wrong Decisions, written by Cara Biasucci and Robert Prentice. This accessible book is amply footnoted with behavioral ethics studies and associated research. It also includes suggestions at the end of each chapter for related Ethics Unwrapped videos and case studies. Some instructors use this resource to educate themselves, while others use it in lieu of (or in addition to) a textbook.

Cara Biasucci also recently wrote a chapter on integrating Ethics Unwrapped in higher education, which can be found in the latest edition of Teaching Ethics: Instructional Models, Methods and Modalities for University Studies. The chapter includes examples of how Ethics Unwrapped is used at various universities.

The most recent article written by Cara Biasucci and Robert Prentice describes the basics of behavioral ethics and introduces Ethics Unwrapped videos and supporting materials along with teaching examples. It also includes data on the efficacy of Ethics Unwrapped for improving ethics pedagogy across disciplines. Published in Journal of Business Law and Ethics Pedagogy (Vol. 1, August 2018), it can be downloaded here: “Teaching Behavioral Ethics (Using “Ethics Unwrapped” Videos and Educational Materials).”

An article written by Ethics Unwrapped authors Minette Drumwright, Robert Prentice, and Cara Biasucci introduces key concepts in behavioral ethics and approaches to effective ethics instruction—including sample classroom assignments. Published in the Decision Sciences Journal of Innovative Education, it can be downloaded here: “Behavioral Ethics and Teaching Ethical Decision Making.”

A detailed article written by Robert Prentice, with extensive resources for teaching behavioral ethics, was published in Journal of Legal Studies Education and can be downloaded here: “Teaching Behavioral Ethics.”

Another article by Robert Prentice, discussing how behavioral ethics can improve the ethicality of human decision-making, was published in the Notre Dame Journal of Law, Ethics & Public Policy. It can be downloaded here: “Behavioral Ethics: Can It Help Lawyers (And Others) Be their Best Selves?”

A dated (but still serviceable) introductory article about teaching behavioral ethics can be accessed through Google Scholar by searching: Prentice, Robert A. 2004. “Teaching Ethics, Heuristics, and Biases.” Journal of Business Ethics Education 1 (1): 57-74.

Transcript of Narration

Written and Narrated by
Robert Prentice, J.D.
Business, Government & Society Department
McCombs School of Business
The University of Texas at Austin

“Implicit bias” exists when we unconsciously hold attitudes towards others or associate stereotypes with them.  For example, we often harbor negative stereotypes about others without consciously realizing that we do so.

Implicit bias, also called “unconscious bias” or “implicit social cognition,” is a prejudice that is deep-seated within the brain, below the conscious level.  Studies have demonstrated implicit bias against racial groups, genders, LGBTQ individuals, and other marginalized groups.  We may even be prejudiced against our own group, although we tend to favor our in-group with positive stereotypes and disfavor out-groups with negative stereotypes.

Implicit bias often runs counter to people’s conscious, expressed beliefs.  Few physicians espouse racially discriminatory views, yet doctors tend to recommend less pain medication for black patients than for white patients with the identical injury.  In other words, people can be explicitly unbiased, yet implicitly biased, according to psychologist Daniel Kelly and colleagues.

Implicit bias has been found in a large array of studies using various tests, but much of the evidence for the phenomenon comes from Harvard University’s Project Implicit website, home of the Implicit Association Test, or IAT.  Literally millions of people have visited this site and taken various tests that ask them to respond rapidly to questions that require them to associate blacks or whites, or males or females, or young or old, etc., with positive or negative words.  Professor Brian Nosek and colleagues tested more than 700,000 subjects and found that more than 70% of white subjects more easily associated white faces with positive words and black faces with negative words, concluding that this was evidence of implicit racial bias.  In fact, additional evidence indicates that measures of implicit bias better predict people’s conduct than measures of explicit bias.

Here is some good news.  Various scientists have criticized the IAT.  They point out, for example, that individuals who take the test on different dates often score substantially differently.  Even IAT supporters admit that implicit bias, at least as demonstrated by the test, is widespread but relatively minor and has only a small impact upon people’s real-world actions.  In other words, the results of the test are not strong enough to predict particular behaviors by individual people.
However, let’s not get too comfortable.  Even if the IAT cannot predict the future conduct of any one individual on a given occasion, it still indicates how groups of people will act on average, and that is worrisome.
For example, few people openly advocate for discrimination in hiring, but white applicants receive 50% more responses from potential employers than do black applicants with the same resume. Likewise, college professors are substantially more likely to answer student e-mails if the students’ names indicate that they are probably white than if the names sound like they belong to black students. And in one study, online course instructors were 94% more likely to respond to discussion posts by white male students than by other students.

Because implicit bias operates at a mostly unconscious level, it is difficult for individuals to overcome. No existing training regime has proven particularly effective at de-biasing implicit bias. But, fortunately, some research shows that stereotypes can be unlearned and that safeguards can be put in place to minimize their impact. For example, women used to make up only a relatively small percentage of the musicians in orchestras. But when orchestras began holding blind auditions where the applicants played behind a curtain and their genders were unknown to the judges, the percentage of women chosen to play in symphony orchestras doubled.
Perhaps we can put a dent in implicit bias after all.

Bibliography

Mahzarin Banaji & Anthony Greenwald, Blind Spot: Hidden Biases of Good People (2013).

Marianne Bertrand et al., Implicit Discrimination, 95 American Economic Review 94 (2005).

Jack Glaser & Eric Knowles, Implicit Motivation to Control Prejudice, 44 Journal of Experimental Social Psychology 164 (2008).

A.R. Green et al., Implicit Bias Among Physicians and Its Prediction of Thrombolysis Decisions for Black and White Patients, 22 Journal of General Internal Medicine 1231 (2007).

Anthony Greenwald et al., Statistically Small Effects of the Implicit Association Test Can Have Societally Large Effects, 108 Journal of Personality and Social Psychology 553 (2015).

Jerry Kang et al., Implicit Bias in the Courtroom, 59 UCLA Law Review 1124 (2012).

Daniel Kelly et al., Race and Racial Cognition, in The Moral Psychology Handbook 433 (John M. Doris, ed. 2010).

Gregory Mitchell, “An Implicit Bias Primer” (March 2018), available at https://ssrn.com/abstract=3151740.

Brian Nosek, The Implicit Association Test at Age 7: A Methodological and Conceptual Review, in Automatic Processes in Social Thinking and Behavior (J.A. Bargh, ed. 2007).

Keith Payne et al., How to Think About “Implicit Bias,” Scientific American, March 27, 2018.

Jesse Singal, Psychology’s Favorite Tool for Measuring Racism Isn’t Up to the Job, New York Magazine, Jan. 11, 2017.