Can We Trust Our Intuition? An Interview with Dr. Daniel Kahneman


Daniel Kahneman is the world’s pre-eminent investigator of the ways in which the limits of our cognitive abilities shape our judgments. Since the late 1960s, his work, much of which was carried out in close collaboration with the late Amos Tversky (1937–1996), has focused on our intuitive judgments. One of Kahneman and Tversky’s favored methods of studying such judgments involves asking subjects relatively simple questions about cases. An example is the famous “Linda Case,” in which subjects are given the following description of the protagonist:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student she was deeply concerned with issues of discrimination and social justice and also participated in antinuclear demonstrations.

Subjects were then given a list of eight possible outcomes describing Linda’s present employment and activities. Besides a number of miscellaneous possibilities (e.g., elementary school teacher, psychiatric social worker), this list included the descriptions “Linda is a bank teller” and “Linda is a bank teller active in the feminist movement.” Subjects were asked to rank these descriptions by the probability that they were true. A large majority responded that Linda was less likely to be a bank teller than a bank teller active in the feminist movement. This is a straightforward mistake, now known as the conjunction fallacy: every possibility in which Linda is a feminist bank teller is also a possibility in which she is a bank teller, so it can never be more likely that she possesses attributes X and Y than that she possesses attribute X alone.
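
The logic of the error can be made concrete with a toy calculation. The sketch below, in Python, uses made-up illustrative probabilities (not data from the original experiment) to show that however the numbers are chosen, the conjunction “bank teller and feminist” can never be more probable than “bank teller” alone.

```python
# A minimal sketch of the conjunction rule behind the Linda Case.
# The probabilities are invented for illustration, not taken from
# Kahneman and Tversky's data.

# Joint distribution over two binary attributes:
# T = "Linda is a bank teller", F = "Linda is active in the feminist movement".
p = {
    (True, True): 0.02,   # bank teller and feminist
    (True, False): 0.03,  # bank teller, not feminist
    (False, True): 0.60,  # feminist, not bank teller
    (False, False): 0.35, # neither
}

p_teller = sum(prob for (t, f), prob in p.items() if t)
p_teller_and_feminist = p[(True, True)]

print(f"P(bank teller)              = {p_teller:.2f}")
print(f"P(bank teller AND feminist) = {p_teller_and_feminist:.2f}")

# Whatever numbers fill the table, the conjunction cannot exceed either
# conjunct: every (teller AND feminist) outcome is also a (teller) outcome.
assert p_teller_and_feminist <= p_teller
```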

The explanation Kahneman and Tversky offered for the erroneous majority judgment was that respondents were implicitly relying on a heuristic (a mental short-cut) to arrive at their answers. This particular heuristic, representativeness, involves replacing the attribute that was the target of the question (the relative probability of the description’s truth) with an attribute that comes more easily to mind (the degree to which the description resembles the introductory sketch of Linda). In other words, respondents were using the resemblance between each description and the portrait of Linda as a quick proxy for the likelihood that the description was true.
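
Attribute substitution of this kind can be mimicked with a toy model. In the sketch below, both the similarity scores and the probabilities are invented for illustration; nothing here comes from the original study. Ranking by resemblance reproduces the majority’s error, while ranking by probability cannot.

```python
# A toy model of attribute substitution in the Linda Case.
# Each outcome carries an invented similarity score (how well it
# resembles the sketch of Linda) and an invented probability (the
# attribute the question actually asked about).

outcomes = [
    # (description, similarity to Linda's sketch, probability)
    ("bank teller",                  0.1, 0.05),
    ("bank teller, active feminist", 0.7, 0.02),
    ("psychiatric social worker",    0.5, 0.10),
]

# Ranking by the substituted attribute (resemblance)...
by_similarity = sorted(outcomes, key=lambda o: o[1], reverse=True)
# ...versus ranking by the target attribute (probability).
by_probability = sorted(outcomes, key=lambda o: o[2], reverse=True)

print("Ranked by resemblance :", [o[0] for o in by_similarity])
print("Ranked by probability :", [o[0] for o in by_probability])
# The resemblance ranking places the conjunction above "bank teller",
# reproducing the intuitive error; the probability ranking never can,
# since a conjunction's probability is at most that of either conjunct.
```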

The Linda Case illustrates several characteristic features of intuitive judgments: they spring to mind quickly and automatically, require little effort, and are difficult to control or revise, even in the face of conflicting evidence. Moreover, while the heuristics that generate such judgments are often useful (they economize on the mind’s scarce computational resources), they can also lead us systematically astray.

Kahneman and Tversky also famously drew attention to another way in which our intuitive judgments may fail to conform to rational principles: Irrelevant variations in the description of alternatives can evoke a change in judgment, because each description elicits a different intuitive mental representation of the alternatives.

An example is their “Asian Disease Case”: imagine that the United States is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows:

  • If Program A is adopted, 200 people will be saved.
  • If Program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved.

Which of the two programs would you favor?

In this version of the problem, a substantial majority of respondents favors Program A. Other respondents, however, received the same cover story followed by two differently described options:

  • If Program A* is adopted, 400 people will die.
  • If Program B* is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die.

Given these options, a clear majority favors Program B*. Of course, A and A* are identical, as are B and B*. Nonetheless, subjects are significantly more likely to choose the option in which 200 people will certainly be saved and 400 will certainly die over the risky option when the description draws attention to lives saved rather than to lives lost.
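
Because the two framings are arithmetically interchangeable, the equivalence can be checked mechanically. The short Python sketch below restates all four programs as distributions over deaths (the numbers come straight from the problem) and confirms that A matches A*, B matches B*, and every program carries the same expected outcome.

```python
# Verifying that the two framings of the Asian Disease Case describe
# the same options. All numbers come from the problem statement.
from fractions import Fraction

TOTAL = 600  # expected victims if nothing is done

def expected_deaths(outcomes):
    """outcomes: list of (probability, deaths) pairs."""
    return sum(p * d for p, d in outcomes)

# "Lives saved" framing, restated as deaths out of 600.
program_a = [(Fraction(1), TOTAL - 200)]                    # 200 saved for sure
program_b = [(Fraction(1, 3), 0), (Fraction(2, 3), TOTAL)]  # all saved or none

# "Lives lost" framing.
program_a_star = [(Fraction(1), 400)]                       # 400 die for sure
program_b_star = [(Fraction(1, 3), 0), (Fraction(2, 3), TOTAL)]

# A and A* are the same certain outcome; B and B* are the same lottery.
assert program_a == program_a_star and program_b == program_b_star
# Both the certain and the risky program share one expected outcome.
assert expected_deaths(program_a) == expected_deaths(program_b) == 400
print("Same options, same expectation: 400 deaths, 200 saved either way.")
```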

Kahneman and Tversky described the resulting difference in people’s preferences as a “framing effect”: Different ways of presenting the same decision problem elicit different responses, even though rationality requires the same pattern of response.

Kahneman and Tversky’s work on a variety of heuristics and framing effects, and their innovative theory of how people choose in risky situations, spawned a huge research program in psychology and economics, and earned Kahneman the Nobel Prize in economics in 2002.

Since the mid-1980s, Kahneman and others have also investigated the use of heuristics and the existence of framing effects in moral judgments. This work matters for moral theory because one common procedure of moral inquiry is the method of reflective equilibrium: working back and forth between our intuitive judgments about particular cases and the principles we believe govern them, revising either wherever necessary to achieve an acceptable coherence. We reach reflective equilibrium when our intuitive case judgments and moral principles are mutually consistent and some of them support, or best explain, the others.
