The Wisdom of Not Thinking Too Much
We know that we have evolved to make quick decisions and that following our instincts has served the species pretty well, at least in terms of survival. The other entries in this blog have focused on the cognitive shortcuts and logical fallacies that often accompany thinking that comes naturally. The focus has been on the importance of reflective thinking for making good judgments and reaching decisions we won't regret.

But there are times when a person will do better to stop thinking, stop reflecting, and simply act. Not everyone arrives at the stage where the wise thing to do is to put critical thinking aside. Those who do have spent many years acquiring knowledge, expertise, or performing ability. Their training, practice, and the skillful development of their talents have eliminated the need for reflection in order to do the right thing or make the right call. When it comes time to sing that aria before an adoring audience or swing at a 98 mph fastball in front of 50,000 baseball fanatics, thinking about what you are doing will hinder rather than help you succeed.

When you have analyzed a problem to death in chemistry or physics, sometimes the best thing to do is to stop thinking about the problem and divert your attention to something else. There is no guarantee, but sometimes unconscious processes will provide you with the solution out of the blue. When an unexpected situation arises for which none of your years of training or experience has prepared you, following your instincts may be your best policy.

All of these situations presuppose that you are extremely knowledgeable, have many years of experience, or have reached a performance level recognized as the highest in your field. Herbert Simon, Nobel Prize winner in economics, put it this way: for the true expert, "intuition is nothing more than recognition." For the true expert, the situation provides cues, and the cues give "the expert access to information stored in memory, and the information provides the answer" (quoted in Daniel Kahneman, Thinking, Fast and Slow, p. 11).
Experts in fields where reliable predictions occur with some regularity--such as physics, math, and chemistry--should be viewed differently from experts who make predictions in low-validity fields, where long-term predictions are just guesswork because of the complexity of the system they are trying to master. Political and economic experts, for example, actually do worse than dart-throwing monkeys when it comes to making long-term predictions. (See Philip Tetlock, Expert Political Judgment: How Good Is It? How Can We Know? Tetlock is a psychologist at the University of Pennsylvania who studied expert predictions over a twenty-year period.) The intuition of such experts is about as reliable as the intuition of the "average citizen" asked to make long-term predictions about politics or the economy. It should go without saying that high subjective confidence in one's knowledge or intuition is not a good sign of accuracy or wisdom.
People who are ignorant, inexperienced, and little talented but who follow their instincts are as likely to make a bad decision as to stumble upon a good one. But people who have vast amounts of knowledge, experience, or performance history should do little or no thinking while acting and should trust their instincts when working in their field of expertise. Outside their fields of expertise, of course, experts and talented artists are as vulnerable to the snares and lures of uncritical thinking as the rest of us.
There are also times when each of us should stop gathering more information to help us make a decision or judgment. Information overload can hinder our ability to make good judgments. Often we are better off making a decision by considering only a few obviously important factors rather than by introducing as many pertinent items as we can come up with. The more variables we bring into play, the greater our chances of giving more weight to minor items and less weight to important ones. This point was made clear by Daniel Kahneman and Amos Tversky in experiments showing that giving people more information about a subject led them to poorer decisions.

One example has become a classic. Subjects are told that Linda is "thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations." Then they are asked which of several statements they think would be true of Linda. In test after test, subjects thought it more likely that Linda was a feminist bank teller than that she was a bank teller. There is a fundamental logical error here, which Kahneman and Tversky called the conjunction fallacy. (A conjunction is the joining of two statements with words like 'and' or 'but'.) It should be obvious that a conjunction can never be more probable than either of its conjuncts taken alone (Daniel Kahneman, Thinking, Fast and Slow, p. 156). These conjunction-error studies have been replicated by Christopher Hsee and John List with different scenarios presented to test subjects, with the same results as Kahneman and Tversky's.
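The logic behind the conjunction fallacy can be made concrete with a tiny simulation. The base rates below are made-up numbers for illustration, not figures from the Linda study; the point is structural: anyone who is a feminist bank teller is, by definition, also a bank teller, so the conjunction can never be the more probable option.

```python
import random

# Hypothetical base rates chosen only for illustration.
random.seed(42)

population = []
for _ in range(100_000):
    is_teller = random.random() < 0.05    # assumed base rate for "bank teller"
    is_feminist = random.random() < 0.30  # assumed base rate for "feminist"
    population.append((is_teller, is_feminist))

p_teller = sum(t for t, _ in population) / len(population)
p_teller_and_feminist = sum(t and f for t, f in population) / len(population)

# Everyone counted in the conjunction is also counted in the single
# conjunct, so the inequality holds by construction, not by luck.
assert p_teller_and_feminist <= p_teller
print(f"P(teller)              = {p_teller:.3f}")
print(f"P(teller and feminist) = {p_teller_and_feminist:.3f}")
```

However the base rates are set, the second probability can never exceed the first, which is exactly what subjects in the Linda experiments got wrong.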
Gathering more and more information can give one the illusion of understanding. American psychologist and philosopher Paul Meehl compared the predictions of trained counselors with those of a simple algorithm that used just two or three variables, and found that the simple formulas were significantly more accurate than the experts' more complex judgments. A typical test might involve trying to predict the grade point average of various freshmen at the end of the school year. A simple formula that looked only at high school GPA and the result of one standardized college entrance test was compared with the predictions of counselors who had interviewed each student for 45 minutes and also had access to the results of several standardized tests and a four-page personal statement from each student. In that study, the simple algorithm outperformed 79 percent of the experts. American economist Orley Ashenfelter did a similar experiment involving predicting prices for fine Bordeaux wines. He pitted the experts against a simple formula that considered only weather data: the average temperature over the summer growing season, the amount of rain at harvest time, and the total rainfall during the previous winter. Ashenfelter's formula outperformed the world-renowned experts. (Ashenfelter's work is discussed in Daniel Kahneman, Thinking, Fast and Slow, p. 224ff.)
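A formula of the Meehl/Ashenfelter kind can be sketched in a few lines. The variables follow the weather factors named above, but the weights here are hypothetical placeholders, not Ashenfelter's published regression coefficients; the point is the form, a transparent weighted sum of a few variables applied the same way every time.

```python
def predict_wine_quality(summer_temp_c: float,
                         harvest_rain_mm: float,
                         winter_rain_mm: float) -> float:
    """Toy linear predictor: warm summers and wet winters help,
    rain at harvest hurts. All weights are hypothetical."""
    return (2.0 * summer_temp_c          # hypothetical weight
            - 0.004 * harvest_rain_mm    # hypothetical weight
            + 0.001 * winter_rain_mm)    # hypothetical weight

# A warm, dry-harvest year scores higher than a cool, wet-harvest year.
score_good_year = predict_wine_quality(17.5, 50.0, 600.0)
score_bad_year = predict_wine_quality(15.0, 300.0, 300.0)
assert score_good_year > score_bad_year
```

Unlike an expert taster, the formula never gets tired, never gets anchored on the chateau's reputation, and weighs the same three factors identically on every vintage, which is precisely why such simple models so often win.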
In matters of personal taste, the less information the better. Just drink the wine, taste the jam, let your feelings tell you which print you prefer. Don't be influenced by how much the wine costs. Don't get hung up on the various qualities one might list to distinguish different jams. Don't gather too many details about the various prints you have to choose from. If the one you like is affordable, buy it no matter what your friends or the critics say.
In decisions that are more or less trivial in the big picture--this would include everything from buying a new pen to deciding where to go on vacation or what new couch to buy--the less information the better. We've all heard the expression "paralysis by analysis." When a decision is a minor one, the wisest path is often to focus on two or three important points, rather than drum up a list of every pro and con you can think of and then apply your list to dozens of possible choices.
In decisions that are monumental, such as the decision to send troops to fight in a foreign country or to take a loved one off life support, one should get as much information as possible from trustworthy sources who aren't likely to be biased. In such cases, we should consult both those who are likely to think in ways we agree with and those who are likely to disagree with us. Important decisions require diversity of input. In the end, the evidence may seem to weigh equally for going to war and not going to war, or for taking a loved one off life support and keeping a loved one on life support. You may have no choice but to rely on your gut feeling at that point. (Cf. William James's "The Will to Believe.") The only other alternative I can see is to take a vote among one's advisers or family members (or whatever group is relevant to the decision-making process) and go with whatever the majority thinks.
So, while wisdom requires devotion to critical thinking, it also requires knowing when to turn off critical thinking and rely on intuition, gut feeling, instinct, or whatever you choose to call that non-reflective preference percolating in our ever-fascinating brains.
Robert T. Carroll
February 4, 2013