

representativeness bias

Most important human judgments are made under conditions of uncertainty. We use heuristics, or rules of thumb, to guide us in such instances as we try to determine what belief or action has the highest probability of being the correct one in a given situation. These rules of thumb are often instinctive and irrational. Psychologists such as Thomas Gilovich, Daniel Kahneman, and Amos Tversky have studied several important heuristics and discovered errors associated with their use. One of these heuristics is the representativeness heuristic: in judging items, we compare them to a prototype or representative idea and tend to see them as typical or atypical according to how well they match that model. At its worst, representativeness is a bias based on a false or questionable stereotype that determines many of a person's decisions. For example, some surgeons always recommend spinal fusion for lower back pain (Groopman); some therapists diagnose most of their patients with multiple personality disorder (MPD), while other therapists never see a case of MPD in their entire career. Some therapists see childhood sexual abuse or satanic ritual abuse as the prototype for the cause of most psychological problems in adults.

The representativeness bias also manifests itself when we take a few traits or characteristics of someone or something and fit them to a stereotype. For example, when told only that a man is quiet, shy, reserved, and self-effacing, what do you think his likely profession is, salesman or brain surgeon? Most people would probably choose brain surgeon because their stereotype of a salesman is of an outgoing, gregarious person. But the odds of any given man being a salesman are much higher than the odds of his being a brain surgeon, so the probability is greater that the fellow is a salesman. This example is representative of the examples of representativeness presented by Tversky and Kahneman. In real life, a snap judgment about a person's occupation based on knowing only a few of that person's personality traits would be a hasty conclusion. A reflective judgment would require at least some assessment of the accuracy of the reported traits (how reliable is the source of information?) and the base rates for salesmen and brain surgeons. Most people would not know what percentage of the population are salesmen and what percentage are brain surgeons, but most people would know that the base rate for salesmen is much higher than that for brain surgeons.
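To make the base-rate arithmetic concrete, here is a minimal sketch in Python. Every number in it is invented for illustration; none comes from Tversky and Kahneman's work:

```python
# Base-rate sketch for the salesman vs. brain surgeon example.
# All figures are hypothetical, chosen only to show the arithmetic.

base_rate = {
    "salesman": 0.05,        # assume ~5% of men are salesmen
    "brain surgeon": 0.0001  # assume ~0.01% are brain surgeons
}

# Probability that the "quiet, shy, reserved" description fits
# a randomly chosen member of each profession (also hypothetical).
fit = {
    "salesman": 0.10,        # few salesmen match the description...
    "brain surgeon": 0.60    # ...while many brain surgeons might
}

# Bayes' rule: P(job | description) is proportional to
# P(description | job) * P(job).
unnormalized = {job: fit[job] * base_rate[job] for job in base_rate}
total = sum(unnormalized.values())
posterior = {job: p / total for job, p in unnormalized.items()}

for job, p in posterior.items():
    print(f"P({job} | description) = {p:.3f}")

# Even though the description "fits" a brain surgeon six times better,
# the base rates dominate: 0.10 * 0.05 = 0.005 for salesman versus
# 0.60 * 0.0001 = 0.00006 for surgeon, i.e., roughly 99% to 1%.
```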

When I was in graduate school in the late 1960s at the University of California at San Diego, the department hired its first black teacher. I arrived there about the time Angela Davis was finishing up her work with Herbert Marcuse, and the daily news was filled with stories about civil rights and anti-war protests. Students were coming from all over the world to study political philosophy with Marcuse and Stanley Moore. I remember how most of the graduate students assumed the new black teacher would be coming to teach political philosophy and would be radical like Bobby Seale or Eldridge Cleaver. I don't remember his name, but I remember he came from Ohio State and his interest was in analytic philosophy, much to our shock and dismay. He didn't fit our stereotype at all.

The representativeness bias is common among true believers in paranormal phenomena. Like the surgeon who recommends the same procedure for all his patients, many people reach first for a paranormal explanation of unusual events. If there are paranormal events, they are rare. Yet many people who have experiences that seem strange to them do not attempt to rule out non-paranormal explanations first. The probability is much greater that any given event has a physical or psychological explanation, or is a coincidence or a hoax, than that it is paranormal. The paranormal need not be ruled out, but it shouldn't be the first thing that comes to mind when, say, your computer acts up or the lights flicker.
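The same base-rate arithmetic applies here. As a hypothetical sketch (every number below is assumed, not measured), suppose genuinely paranormal events are one in a million, that paranormal events always seem strange when they occur, and that ordinary events (glitches, coincidences, hoaxes) seem strange one time in a thousand:

```python
# Hypothetical numbers: how often does a "strange-seeming" experience
# actually have a paranormal cause?
p_paranormal = 1e-6              # assumed base rate of paranormal events
p_strange_if_paranormal = 1.0    # paranormal events always seem strange (assumed)
p_strange_if_ordinary = 1e-3     # ordinary events occasionally seem strange too

p_strange = (p_strange_if_paranormal * p_paranormal
             + p_strange_if_ordinary * (1 - p_paranormal))

p_paranormal_given_strange = p_strange_if_paranormal * p_paranormal / p_strange
print(f"P(paranormal | it seemed strange) = {p_paranormal_given_strange:.5f}")
# ~0.001: even then, about 999 of every 1,000 strange experiences are mundane.
```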

The problem with the representativeness heuristic is that what appears typical sometimes blinds you to possibilities that contradict the prototype. Jerome Groopman, M.D., gives the example of a doctor who failed to diagnose a cardiac problem in a patient because the patient did not fit the model of a person likely to have a heart attack. The patient complained of all the things a person with angina would complain of, but he was the picture of health: in his forties, fit, trim, athletic, a non-smoker who worked outdoors and had no family history of heart attack, stroke, or diabetes. The doctor wrote off the patient's chest pains as due to overexertion. The next day the patient had a heart attack.

Another example from Groopman illustrates both the representativeness error and the availability error. A patient who appeared to be the model for bulimia, anorexia nervosa, and irritable bowel syndrome was misdiagnosed by some thirty doctors over a period of fifteen years. The more doctors confirmed the diagnosis, the more available the diagnosis became for the next doctor. It wasn't until she saw Dr. Myron Falchuk that she found a physician who looked beyond the model the other doctors had used. Falchuk correctly diagnosed the patient as having celiac disease, an autoimmune disorder triggered by gluten that irritates and distorts the lining of the bowel, making it nearly impossible for nutrients to be absorbed.

The key to avoiding the representativeness error is to always be open to the possibility that the case before you isn't typical. Force yourself to consider other possibilities. Something may look like a giant airplane flying across the sky, but it may be an illusion: with no reference point, you cannot correctly estimate the distance between you and the lights you see moving.

The gambler's fallacy is a type of representativeness error. Because, say, red has come up four times in a row on the roulette wheel, the gambler bets on black because he thinks the odds are against five reds in a row. His model of the odds is wrong, however. The ball is as likely to land on red as on black on any given spin (assuming a fair wheel), including a spin following four reds.
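A short simulation illustrates the point. This sketch assumes a simplified wheel with only red and black (ignoring the green zero, which lowers both probabilities equally and has no bearing on the fallacy):

```python
import random

# Simulate a fair wheel where each spin is independently red or black.
def spin():
    return random.choice(["red", "black"])

trials = 1_000_000
after_four_reds = 0   # spins that follow four reds in a row
red_after_four = 0    # how many of those spins came up red

history = []
for _ in range(trials):
    outcome = spin()
    if history[-4:] == ["red"] * 4:
        after_four_reds += 1
        if outcome == "red":
            red_after_four += 1
    history.append(outcome)

# Both numbers hover around 0.5: four reds in a row tell you nothing.
print(f"P(red) overall:           {history.count('red') / trials:.4f}")
print(f"P(red | four reds prior): {red_after_four / after_four_reds:.4f}")
```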

See also affect bias, anchoring effect, availability error, the hidden persuaders, and representativeness bias on the Unnatural Acts blog.

further reading

Ariely, Dan. 2008. Predictably Irrational: The Hidden Forces That Shape Our Decisions. HarperCollins.

Gardner, Daniel. 2008. The Science of Fear: Why We Fear the Things We Shouldn't--and Put Ourselves in Greater Danger. Dutton.

Gilovich, Thomas. 1993. How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. Free Press.

Gilovich, Thomas, Dale Griffin, and Daniel Kahneman, eds. 2002. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press.

Groopman, Jerome, M.D. 2007. How Doctors Think. Houghton Mifflin.

Kahneman, Daniel, Paul Slovic, and Amos Tversky, eds. 1982. Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press.

Kida, Thomas. 2006. Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. Prometheus.

Levine, Robert. 2003. The Power of Persuasion: How We're Bought and Sold. John Wiley & Sons.

Sutherland, Stuart. 1992. Irrationality. Rev. 2nd ed. Pinter and Martin.

Last updated 14-Jan-2014

© Copyright 1994-2016 Robert T. Carroll