How much is 2 + 2? Easy, right? Okay, how about 17 × 24? Uh oh, that’s a bit more challenging. In fact, right now you’re probably thinking, Should I really try to figure this out? How would I figure this out?, as your brain scrambles to remember the process for multiplying numbers in your head. Finally, you give up: ugh, it’s too difficult; forget about it.
What you’ve experienced are two systems of thinking, “System 1” and “System 2” respectively, which form the basis of Nobel laureate Daniel Kahneman’s book, “Thinking, Fast and Slow.” System 1 is our fast, automatic, intuitive, and largely unconscious mode of thinking. System 2 is our slow, deliberate, analytical mode. It was with these two methods of reasoning about the world that Kahneman, senior scholar at the Woodrow Wilson School of Public and International Affairs, where he is also professor of psychology and public affairs emeritus, began his lecture, “Intuition: Marvels and Flaws,” at New York University.
A psychologist by training, Kahneman has never taken a course in economics, yet he was awarded the Nobel Prize in Economics in 2002. How did that happen? He says his work came about from two conversations. His late colleague Amos Tversky showed him a paper called “The Psychological Assumptions of Economics,” which posited that the agent of economic theory is rational, selfish, and has tastes that do not change. To a psychologist, Kahneman said, this is astonishing. The notion that our beliefs and choices are logically connected, that we are internally coherent and consistent, without contradiction, is “preposterous,” and he has spent 45 years finding counterexamples to prove its fallacy.
The other conversation had more to do with how the public views intuition. Kahneman says the common belief is that intuition is magical. But what is intuition? According to the professor, it is knowing something without knowing that you know it. Kahneman feels we need to get rid of the idea that intuition is something magical and see it instead as recognition. A two-year-old points to an object and says, “doggie”; he has no real concept of what a dog is but recognizes a pattern. Experienced chess players and physicians likewise recognize patterns from experience. Kahneman gave a personal example: he can tell his wife’s mood on the telephone in one word. He’s had practice, with many opportunities to learn. Plus, he gets rapid, immediate feedback each time, because one word is followed by another and another, so his intuition is quickly corroborated or not. To Kahneman, there is no magic to intuition; it is another word for expertise. The caveat, however, is that it can develop only in a world that is regular. In an irregular world, you may feel like you have intuition, but you don’t.
Kahneman then addressed the financial industry, where it is clear that most people can’t pick stocks successfully without inside information. They feel they are experts, but they’re not. Some ideas come to mind passively, such as the 2 + 2 = 4 example, through associative memory: links that connect events and stimuli with expected outcomes. Associative memory, the mode of System 1, does the fast thinking, generating coherent stories and causal thinking. It can also, unfortunately, produce fallacies, which usually arise when we are asked difficult questions for which we don’t have answers. When this occurs, an answer to a related question comes to mind, which we use, and although it might feel like expertise, it is not. This is the substitution principle: answering the wrong question.
Kahneman gave an example from a study done 20 years ago, when there was a lot of terrorism in Europe and Americans were frightened of traveling there. Travel insurance was being sold at the airport, and a group of people in an experiment were asked to appraise two travel-insurance policies. One offered a settlement of $100,000 for death from any cause; the other offered $100,000 for death in a terrorist incident. People put a much higher price on the second policy. What were they doing? They were more afraid of dying in a terrorist incident than they were afraid of dying, period. But dying is dying, no matter how it happens. This example shows that we have intuitive responses to questions and are rarely stumped; however, our system delivers the answer to the wrong question, and that often happens under knowable conditions. There are biases that people don’t even know about, which cause them to make mistakes with considerable confidence.
A Q&A period followed Kahneman’s lecture, moderated by Jeffrey Lynford, vice chair of NYU’s board of trustees, with two engaging and entertaining interlocutors: Dr. Gillian Tett, U.S. managing editor at the Financial Times and author of “Fool’s Gold: How the Bold Dream of a Small Tribe at J.P. Morgan Was Corrupted by Wall Street Greed and Unleashed a Catastrophe”; and Nassim Nicholas Taleb, author of “The Black Swan: The Impact of the Highly Improbable.” A derivatives trader turned author and professor, Taleb described himself as a practical, bottom-up kind of guy who knows that it is much easier to sell hamburgers that are 75 percent fat-free than 25 percent fat. He learned this from Kahneman when he was in the business of packaging insurance for portfolios. Among the savvy strategies based on Kahneman’s insights about bias and poor memory: “If your P&L is down for the month, show the year to date, and if the portfolio is up for the month, don’t show components.”
When Tett asked Kahneman whether awareness of cognitive bias should be taught in school, Kahneman answered, somewhat facetiously, that even though he has been studying the flaws of intuition for 45 years, he still makes mistakes. He did, however, acknowledge that developing intelligent gossip, the ability to see bias in other people, will help us anticipate how other people will react. Without a doubt, teaching about bias and the language of bias will enrich people’s “cognitive toolkit.” Knowing about anchoring and overconfidence would have a cumulative spread effect, he ventured, noting that without concepts and words, it is difficult to anticipate an intelligent reaction, because “nothing sticks.” This is where organizations particularly excel: compared to individuals, they are much better at adopting procedures that improve their decision-making processes.
Is there a place for statistics in all this? Kahneman doesn’t think so. People think causally, and statistical thinking is counterintuitive. When it really matters, he suggested that we let System 2 take over. “Occasionally, when you think you might be making a mistake, slowing down and asking for advice might be a good idea.”