
Book Review


Thinking, Fast and Slow
by Daniel Kahneman


A remarkable aspect of your mental life is that you are rarely stumped. True, you occasionally face a question such as 17 × 24 = ? to which no answer comes immediately to mind, but these dumbfounded moments are rare. The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it. Whether you state them or not, you often have answers to questions that you do not completely understand, relying on evidence that you can neither explain nor defend. --Daniel Kahneman, Thinking, Fast and Slow, p. 97, Macmillan, Kindle Edition.

While reading Daniel Kahneman's Thinking, Fast and Slow, you might begin to wonder whether he's psychic. He seems to know so much about you, yet he's never met you. Or, if you are familiar with the skeptical literature on subjective validation, Barnum statements, and cold reading, you might think he's taking advantage of some human tendencies to find personal meaning and significance in impersonal statements. The truth is, what he knows about you is based on scientific studies of people like you and me. And, as is often the case when someone holds up a mirror to our minds, what we see is not what we are willing to accept. The science and the mirror must be distorting things. That's not me, that's Fred and Fran from the office. That's just about everybody else I meet, but it's not me. When my intuition tells me I'm right and the studies tell me I'm wrong, the studies are clearly flawed, but when other people do it, they're deceiving themselves. It's so simple. I don't see why these scientists don't get it!

In Unnatural Acts: Critical Thinking, Science, and Skepticism Exposed!, I write:

Philosophers from the time of Socrates to the present day have been in the forefront of offering incisive criticisms of what most people instinctively believe. It was not that long ago that many philosophy teachers considered themselves the best equipped profession for teaching critical thinking to the next generation. That notion is no longer sustainable. Along with traditional epistemology, we must recognize that psychology (including social psychology, behavioral economics, and evolutionary psychology) plays a fundamental role in any attempt to guide ourselves or others in critical thinking.

Reading Daniel Kahneman's Thinking, Fast and Slow should put an end to any doubts about the importance of psychology to the development of critical thinking. Those who still think that logic and epistemology encompass all there is to critical thinking should read this book. If you are wondering why your logic students do so poorly on the Wason card problem immediately after you have given them the perfect lesson on modus ponens and modus tollens, Kahneman's accounts of numerous psychological studies on how people actually think might provide you with a clue.

We all know that intuition is a pretty good guide through life. We all know, for example, that we think much better when we're in a bad mood. What? You didn't know that? Well, you probably know that there is a growing body of evidence supporting the idea that when we're in a bad mood we lose touch with our intuition, and that's why we think better when we're grouchy or wary. And surely you know that people are more creative when in a good mood, but also less vigilant and more prone to logical errors. You might not know, however, that experienced radiologists who evaluate chest X-rays as “normal” or “abnormal” contradict themselves 20% of the time when they see the same picture on separate occasions (p. 225).

If you have any doubts about the claims in the previous paragraph, I am sure you will not doubt this: "Following our intuitions is more natural, and somehow more pleasant, than acting against them" (p. 194). Unfortunately, there is a strong body of evidence that our natural way of thinking often fails us if accuracy and reasonable beliefs are what we desire.

I know that many readers will find it in bad taste to mention my own book in a review of another. I might be excused, however, since I mention Kahneman's book in the last chapter of Unnatural Acts, "59+ Ways to Develop Your Unnatural Talents in Critical Thinking, Skepticism, and Science." There I write:

I don’t usually recommend books I haven’t read, but Daniel Kahneman’s Thinking, Fast and Slow was published as I was writing this last chapter. Kahneman is a Nobel Prize winner in economics and is known for identifying the many ways we are irrational in our judgments and decision making. I have relied on earlier work by Kahneman and his co-author Amos Tversky for several of the 59 items listed below. Whenever I think of Kahneman I think of the following test: “A bat and a ball cost $1.10 in total. The bat costs $1 more than the ball. How much does the ball cost?” If you understand the significance of this simple test, you understand the core message of Unnatural Acts.
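In case the bat-and-ball problem is new to you, here is the quick algebra behind it (my own worked check, not a passage from either book):

```latex
% Let b be the price of the ball, in dollars. The bat then costs b + 1.00.
\begin{align*}
  b + (b + 1.00) &= 1.10 \\
  2b &= 0.10 \\
  b &= 0.05
\end{align*}
```

The ball costs 5 cents. The answer System 1 immediately offers, 10 cents, feels right but is wrong: a 10-cent ball would make the bat $1.10 and the total $1.20. The point of the test is that the wrong answer comes to mind effortlessly, while checking it takes deliberate work.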

My recommendation was not the result of having read Kahneman's book. It was intuitive rather than reflective. Intuition, however, is an ambiguous term. It can mean anything from the kind of gut feelings that drove George W. Bush to think he could see into the soul of Vladimir Putin and to send soldiers into battle, to the quick thinking of a chess master whose decision to make a move is based on thousands of hours of playing the game. My intuition that Kahneman's book would be an excellent one for those looking to improve their critical thinking ability was based on experience, as are many intuitions that turn out to be right. Recognizing the times when intuition is not to be trusted is not always easy. And even when we do recognize one of those times, overcoming a natural way of thinking is far from a foregone conclusion. As Kahneman observes:

Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy [more of this below] as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely....And I have made much more progress in recognizing the errors of others than my own. (p. 417)

Kahneman's book reviews dozens of studies that have exposed the biases in natural thinking and uncovered new ones you won't read about in any standard logic text. His book is for someone who already knows quite a bit about critical thinking and is interested in recent psychological studies on judgment and decision making. Granted, not all of the studies will be of equal interest. You may not care, for example, that "thinking about stabbing a coworker in the back leaves people more inclined to buy soap, disinfectant, or detergent than batteries, juice, or candy bars" (p. 56). But you might find some of the other studies on priming, representativeness bias, affect bias, hindsight, availability error, the anchoring effect, framing, the illusion of validity, the illusion of knowledge, etc., to be of interest.

My book is for beginners in critical thinking. It was written for that part of the general audience that wants to improve its critical thinking skills. There are no footnotes or references to arcane scientific studies, but there are lots of examples of good and bad thinking. My book is about natural and unnatural thinking, and how the way we think affects everything we do. Natural thinking is instinctive, intuitive, quick and dirty. It works pretty well most of the time, but it can get us into trouble. If we’re not careful, we can deceive ourselves into believing what’s not true or even what goes against our own self-interest. And manipulators who understand natural thinking can use that understanding to hoodwink us into believing what isn’t true or doing what they want us to do. You can reduce the chances of being duped by learning how to think critically, i.e., in unnatural ways. Critical thinking is unnatural. It does not come easily. It requires effort. It can be frustrating. Natural thinking feels good, and it is very difficult to make ourselves take seriously scientific studies that conflict with what we "just know is true" from experience.

One of the key pedagogical tools Kahneman uses is terminology developed by Keith Stanovich and Richard West. They introduced the terms System 1 and System 2 (which they now call Type 1 and Type 2 processes) in an attempt to explain why some people make better judgments than others, i.e., are better thinkers. Obviously, you would not do this kind of study if you thought the answer was a simple matter of intelligence or knowledge. People of equal intelligence and knowledge don't always make equally good decisions. In Rationality and the Reflective Mind, Stanovich describes System 2 as "slow thinking and demanding computation" (Kahneman, pp. 48-49). Kahneman sticks to this idea in the title of his book. Fast thinking is the work of System 1. System 2, on the other hand, is reflective and takes more time and effort than System 1. (Kahneman uses these terms to indicate ways of processing experience and information, not as actual entities or parts of the brain.) For all intents and purposes, System 1 is the natural way of thinking for most of us most of the time. System 2 is unnatural in the sense that it is not instinctive, often makes us uneasy, and requires effort and conscious intent to overcome natural tendencies.

Conscious doubt is not in the repertoire of System 1; it requires maintaining incompatible interpretations in mind at the same time, which demands mental effort. Uncertainty and doubt are the domain of System 2. (p. 80)

System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy. Indeed, there is evidence that people are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted. (p. 81)

To avoid misunderstanding Kahneman's intentions, two points should be clarified: System 1 is often accurate, and System 2 is often wrong, i.e., it makes illogical or fallacious judgments. It is also worth citing a comment Kahneman makes near the end of his book, after he has reviewed dozens of ways that humans follow irrational rules in making judgments:

I often cringe when my work with Amos is credited with demonstrating that human choices are irrational, when in fact our research only showed that Humans [sic] are not well described by the rational-agent model.

Kahneman and Tversky "abandoned the idea that people are perfectly rational choosers," but this is not the same as saying that people are irrational. Somehow this point is harder to grasp than recognizing that a logician who describes the many fallacies humans can commit is not claiming that people are essentially illogical.

Kahneman is a psychologist but, as noted above, he won a Nobel Prize in economics. He and Tversky developed a model of choice in 1979 that they called "prospect theory" as an alternative to the long-dominant model in economics known as "utility theory." Prospect theory tries to account for the irrational as well as the rational in human decision making. Kahneman writes:

Prospect theory turned out to be the most significant work we ever did, and our article is among the most often cited in the social sciences. Two years later, we published in Science an account of framing effects: the large changes of preferences that are sometimes caused by inconsequential variations in the wording of a choice problem. (pp. 271-272)
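To give a flavor of what prospect theory looks like formally, here is a minimal sketch of its value function. The functional form and the parameters (α ≈ 0.88, λ ≈ 2.25) come from Tversky and Kahneman's later (1992) estimates, not from this book, so treat them as illustrative assumptions:

```python
# A minimal sketch of a prospect-theory-style value function.
# The form and parameters (alpha ~ 0.88, lambda ~ 2.25) follow Tversky &
# Kahneman's 1992 estimates -- illustrative assumptions, not the book's text.

ALPHA = 0.88   # diminishing sensitivity: $200 does not feel twice as good as $100
LAMBDA = 2.25  # loss aversion: losses loom roughly 2.25x larger than gains

def value(x: float) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0), in dollars."""
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** ALPHA)

# A 50/50 coin flip to win or lose $100 has an expected monetary value of
# zero, but its subjective value is sharply negative -- which is why most
# people refuse such bets.
print(0.5 * value(100) + 0.5 * value(-100))  # ~ -36.0
```

Unlike utility theory, the function is defined over gains and losses relative to a reference point rather than over total wealth, and it is steeper for losses than for gains.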

Kahneman discusses not only the strengths of prospect theory but its weaknesses as well. He says that prospect theory can't account for the influence of two salient features of many choices: disappointment and regret. In any case, one of the ways in which we are irrational in making choices is our tendency to be overly optimistic. This characterization of excessive optimism seems especially apt when it comes to the stock market, business, and politics.

Optimists are normally cheerful and happy, and therefore popular; they are resilient in adapting to failures and hardships, their chances of clinical depression are reduced, their immune system is stronger, they take better care of their health, they feel healthier than others and are in fact likely to live longer. A study of people who exaggerate their expected life span beyond actuarial predictions showed that they work longer hours, are more optimistic about their future income, are more likely to remarry after divorce (the classic “triumph of hope over experience”), and are more prone to bet on individual stocks. Of course, the blessings of optimism are offered only to individuals who are only mildly biased and who are able to “accentuate the positive” without losing track of reality. Optimistic individuals play a disproportionate role in shaping our lives. Their decisions make a difference; they are the inventors, the entrepreneurs, the political and military leaders—not average people. They got to where they are by seeking challenges and taking risks. They are talented and they have been lucky, almost certainly luckier than they acknowledge. (pp. 255-256)

....optimism is highly valued, socially and in the market; people and firms reward the providers of dangerously misleading information more than they reward truth tellers. (p. 262)

One of the consequences of excessive optimism is what Kahneman calls the planning fallacy: the tendency to assess projects by a best-case scenario rather than realistically. A realistic assessment of a project would include the possibility of failure and an attempt to find examples of similar projects that have already been tried. One of his examples comes from personal experience: he and several others estimated it would take about two years to complete a particular project, when in fact similar projects had taken seven or eight years or more, and many had ended in failure. Kahneman quit the group before the project was concluded, but several of his colleagues continued in their "irrational perseverance" to the bitter end, by which time the project was no longer needed.

Kahneman wonders: "Can overconfident optimism be overcome by training?" His conclusion: "I am not optimistic" (p. 264). In any case, he admits that not all time spent on failed projects due to excessive optimism is wasted. Excessive optimism encourages persistence in the face of obstacles. Spin-off projects that would otherwise never have been started, much less completed, might emerge from the initial effort and end up benefiting many. New skills may be learned or contacts made, leading to unforeseen and beneficial consequences. Even so, there is a valuable lesson to be learned here: "Consistent overweighting of improbable outcomes—a feature of intuitive decision making—eventually leads to inferior outcomes" (p. 321).

One of the more important qualities of critical thinking--and one that is rarely emphasized in logic or critical thinking texts--is giving proper weight to the various pieces of relevant information. Yet one of the most common failures of thinking is to overweight or underweight evidence. Kahneman cites several studies demonstrating that one of the characteristics of System 1 is overweighting low probabilities. This tendency is especially noticeable in loss aversion, our excessive concern with losses: we tend to overweight small risks, and we're usually willing to pay more than we should to eliminate risk entirely.

Overweighting of small probabilities increases the attractiveness of both gambles and insurance policies.

The conclusion is straightforward: the decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle. Improbable outcomes are overweighted—this is the possibility effect. Outcomes that are almost certain are underweighted relative to actual certainty. The expectation principle, by which values are weighted by their probability, is poor psychology. (p. 312)

Emotion and vividness influence fluency, availability, and judgments of probability—and thus account for our excessive response to the few rare events that we do not ignore. (p. 323)

Although overestimation and overweighting are distinct phenomena, the same psychological mechanisms are involved in both: focused attention, confirmation bias, and cognitive ease. (p. 324)

This asymmetry between the power of positive and negative expectations or experiences has an evolutionary history. Organisms that treat threats as more urgent than opportunities have a better chance to survive and reproduce. (p. 282)
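To make the overweighting and underweighting in these passages concrete, here is a minimal sketch of the kind of probability-weighting function prospect theory uses. The one-parameter form and γ ≈ 0.61 again come from Tversky and Kahneman's 1992 estimates, used here as an illustrative assumption:

```python
# Sketch of a prospect-theory-style probability-weighting function.
# The one-parameter form and gamma ~ 0.61 follow Tversky & Kahneman
# (1992) -- an illustrative assumption, not a figure quoted in the book.

GAMMA = 0.61

def weight(p: float) -> float:
    """Decision weight people assign to an outcome of probability p."""
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(f"p = {p:.2f}  ->  decision weight = {weight(p):.3f}")

# p = 0.01  ->  0.055   (possibility effect: a 1% chance is felt as ~5.5%)
# p = 0.10  ->  0.186
# p = 0.50  ->  0.421
# p = 0.90  ->  0.712
# p = 0.99  ->  0.912   (certainty effect: 99% is felt as well short of sure)
```

The two ends of the curve capture both halves of the asymmetry: lottery tickets and insurance policies sell because small probabilities are overweighted, while the gap between 99% and 100% feels much larger than one percentage point.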

I love this concept of cognitive ease and its contrary, cognitive strain. System 2 and critical thinking kick in only when there's a perceived problem. The problem is that System 1 is not very good at recognizing that there is a problem unless it's obvious.

System 1 has been shaped by evolution to provide a continuous assessment of the main problems that an organism must solve to survive: How are things going? Is there a threat or a major opportunity? Is everything normal? Should I approach or avoid? The questions are perhaps less urgent for a human in a city environment than for a gazelle on the savannah, but we have inherited the neural mechanisms that evolved to provide ongoing assessments of threat level, and they have not been turned off. Situations are constantly evaluated as good or bad, requiring escape or permitting approach. Good mood and cognitive ease are the human equivalents of assessments of safety and familiarity. (pp. 90-91)

One example Kahneman gives of instant assessment is our ability to distinguish friend from foe at a glance. Alex Todorov, a colleague of Kahneman at Princeton, has explored the biological roots of rapid judgments about the safety of interacting with a stranger.

He showed that we are endowed with an ability to evaluate, in a single glance at a stranger’s face, two potentially crucial facts about that person: how dominant (and therefore potentially threatening) he is, and how trustworthy he is, whether his intentions are more likely to be friendly or hostile.

The main cues are the shape and expression of the face. If Todorov is right, one would think Mitt Romney wouldn't be having much trouble getting voters to support him. A strong, square chin is associated with competence, strength, and trustworthiness. "The faces that exude competence combine a strong chin with a slight confident-appearing smile." Political scientists following up on Todorov's work found these superficial cues work best "among politically uninformed voters who watch a great deal of television." The studies found similar results in Finland, England, Australia, Germany, and Mexico. Kahneman comments:

...the effect of facial competence on voting is about three times larger for information-poor and TV-prone voters than for others who are better informed and watch less television. Evidently, the relative importance of System 1 in determining voting choices is not the same for all people.

Evidently.

Thinking, Fast and Slow is divided into five sections: Two Systems, Heuristics and Biases, Overconfidence, Choices, and Two Selves. There are thirty-eight chapters in all and two appendices, both papers written with Tversky: "Judgment Under Uncertainty" and "Choices, Values, and Frames." The book represents a lifetime of work in the psychology of judgment. How do we make decisions, good and bad? Why do we consistently make the same mistakes? How can we use what we know about how people think to manipulate them so that they will behave the way we want them to, especially when we want their money? [Note: this is not how Kahneman frames the question!] It is unfortunate, perhaps, but those who are asking the last question are probably the ones who will get the most out of books like Kahneman's. The rest of us can have good intentions about improving our thinking and avoiding the pitfalls of biases and fallacies, but the forces driving our judgments and beliefs are so strong and innate that overcoming them requires much more than good intentions. Without knowledge of studies like those discussed by Kahneman, however, our chances of overcoming those errors in judgment are significantly diminished.

The idea for which Kahneman and Tversky are best known in the world of critical thinking is their notion of heuristics of judgment. These are quick-and-dirty rules by which we make judgments; they serve us well much of the time, but they lead to systematic errors. Judgments are disproportionately influenced by salient events that catch our attention and stir up memories: a political sex scandal, a dramatic event like a plane crash, a personal experience like an unpleasant encounter with the local police. Systematically removing all the possible biases that might affect any particular judgment would be a more tedious task than most people are willing to undertake. Instead, most of us most of the time pick and choose, usually instinctively, only a few of the many facts or factors that should be considered when the goal is to make an accurate and reasonable judgment. In this book, Kahneman devotes nine chapters to heuristics and biases.

Many readers will probably be most interested in what Kahneman has to say about economics. I can tell you that what he has to say isn't pretty. People like to hear stories about how businesses rise and fall, especially if the stories provide "a simple message of triumph and failure that identifies clear causes and ignores the determinative power of luck and the inevitability of regression. These stories induce and maintain an illusion of understanding, imparting lessons of little enduring value to readers who are all too eager to believe them." He notes that Built to Last tells many such stories, but that the gap in profitability and stock returns between the outstanding firms and the less successful firms studied shrank to almost nothing in the period following the study. Even knowing about regression to the mean and communal reinforcement doesn't seem to provide a powerful enough message to overcome the desire to believe that differences in skill and expertise can account for success or failure among corporations and economic forecasters. "A major industry [stock trading] appears to be built largely on an illusion of skill," says Kahneman (p. 212).

Finally, the illusions of validity and skill are supported by a powerful professional culture. We know that people can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers. (p. 217)

A study done by researchers at Duke University indicates how deep these illusions of skill and validity run in the corporate world.

For a number of years, professors at Duke University conducted a survey in which the chief financial officers of large corporations estimated the returns of the Standard & Poor’s index over the following year. The Duke scholars collected 11,600 such forecasts and examined their accuracy. The conclusion was straightforward: financial officers of large corporations had no clue about the short-term future of the stock market; the correlation between their estimates and the true value was slightly less than zero! When they said the market would go down, it was slightly more likely than not that it would go up. These findings are not surprising. The truly bad news is that the CFOs did not appear to know that their forecasts were worthless. (p. 261)

Kahneman wryly comments: "An unbiased appreciation of uncertainty is a cornerstone of rationality—but it is not what people and organizations want" (p. 263).
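The quiet engine behind much of this is regression to the mean. A toy simulation can show why top performers in one period tend to fall back in the next; the setup and numbers below are my own illustration, not from the book. Each firm's yearly result is modeled as a small stable "skill" component plus a large dose of luck:

```python
# Toy demonstration of regression to the mean. Setup and numbers are my
# own illustration, not Kahneman's: yearly performance is a small
# persistent skill term plus a much larger luck term.
import random

random.seed(42)
N = 1000
skill = [random.gauss(0, 1) for _ in range(N)]    # stable from year to year
year1 = [s + random.gauss(0, 3) for s in skill]   # luck dominates
year2 = [s + random.gauss(0, 3) for s in skill]   # same skill, fresh luck

# Take the top 10% of performers in year 1 and see how they do in year 2.
top = sorted(range(N), key=lambda i: year1[i], reverse=True)[: N // 10]
avg1 = sum(year1[i] for i in top) / len(top)
avg2 = sum(year2[i] for i in top) / len(top)
print(f"top decile in year 1: {avg1:+.2f}; the same firms in year 2: {avg2:+.2f}")
# Typical output: around +5.5 in year 1 but well under +1 in year 2.
# Most of the apparent outperformance was luck, so it does not persist.
```

When luck dominates, one year's ranking tells you almost nothing about the next, which is the pattern Kahneman reports for forecasters and fund managers.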

Experts and ordinary folks alike ought to learn a valuable lesson about the illusion that confident personal experience trumps impersonal scientific studies. Consider the case of a physician who had intuitions about which patients were about to develop typhoid fever.

.... he tested his hunch by palpating the patient’s tongue, without washing his hands between patients. When patient after patient became ill, the physician developed a sense of clinical infallibility. His predictions were accurate—but not because he was exercising professional intuition! (p. 240).

Kahneman warns:

Claims for correct intuitions in an unpredictable situation are self-delusional at best, sometimes worse. In the absence of valid cues, intuitive “hits” are due either to luck or to lies. If you find this conclusion surprising, you still have a lingering belief that intuition is magic. Remember this rule: intuition cannot be trusted in the absence of stable regularities in the environment. (p. 241)

If I haven't convinced you by now of the value of Thinking, Fast and Slow, nothing I add will change your mind. There are interesting examinations of causality and belief in gods and souls, unconscious influences ["experiments have confirmed Freudian insights about the role of symbols and metaphors in unconscious associations"], confusing experience with the memory of experience, theory-induced blindness, overconfidence, the effect of eating on judicial decisions and on difficult cognitive reasoning in general, studies on well-being, how facial expressions affect thinking and vice versa, and how unnatural most of us find thinking according to obviously beneficial rules (as in Bayesian reasoning).
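Kahneman and Tversky's famous taxicab problem illustrates that last point. A city's cabs are 85% Green and 15% Blue; a witness identifies a hit-and-run cab as Blue, and testing shows the witness is right about a cab's color 80% of the time. Most people answer that the cab was very probably Blue; Bayes' rule says the probability is only about 41%. A quick sketch of the computation:

```python
# The taxicab problem solved with Bayes' rule.
prior_blue = 0.15    # 15% of the city's cabs are Blue
prior_green = 0.85
hit = 0.80           # witness correctly identifies a cab's color 80% of the time
miss = 0.20          # ...and misidentifies it 20% of the time

# P(cab was Blue | witness says "Blue")
p_says_blue = prior_blue * hit + prior_green * miss
posterior = prior_blue * hit / p_says_blue
print(f"{posterior:.2f}")  # 0.41 -- not the 0.80 that System 1 suggests
```

The base rate (only 15% of cabs are Blue) should drag the answer down, but System 1 fixates on the witness's 80% reliability and ignores it.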

I'll conclude this review with two salient quotes from Kahneman:

Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance. (p. 201)

...do not trust anyone—including yourself—to tell you how much you should trust their judgment. (p. 240)

Kahneman is one of a long line of thinkers, a line that includes William James, Friedrich Nietzsche, and Sigmund Freud, who make us wonder what an ancient Greek philosopher would have made of the temple inscription "Know Thyself" had he known what we know, or think we know.

Robert Todd Carroll
January 21, 2012
