Nobel Laureate Daniel Kahneman’s Intuition: Dead or Alive?

An Account of the 14th Annual Lynford Lecture at NYU-Poly
By Margaret Emory

How much is 2 + 2? Easy, right? Okay, how about 17 x 24? Uh oh, that’s a bit more challenging. In fact, right now you’re probably thinking, Should I really try to figure this out? How would I figure this out?, as your brain scrambles to remember the process for multiplying numbers in your head. Finally you give up: Ugh, it’s too difficult; forget about it.
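The contrast can be made concrete in a toy sketch (the function name and digit-by-digit approach are illustrative choices, not anything from the lecture): recalling 2 + 2 is a single lookup, while 17 x 24 forces a deliberate, stepwise procedure like long multiplication.

```python
# Toy illustration of the two answers from the opening paragraph.
# System 1 retrieves "2 + 2" instantly from memory; System 2 must
# grind through "17 x 24" step by step, as in long multiplication.

def long_multiply(a: int, b: int) -> int:
    """Deliberate, System 2-style multiplication: accumulate one
    partial product of `a` for each decimal digit of `b`."""
    total, place = 0, 1
    while b > 0:
        digit = b % 10
        total += a * digit * place  # one partial product per digit
        b //= 10
        place *= 10
    return total

print(2 + 2)                  # effortless recall: 4
print(long_multiply(17, 24))  # effortful computation: 408
```

The point is not the algorithm itself but the difference in effort: the first line costs nothing, while the second requires holding intermediate results in working memory, which is exactly why most people abandon 17 x 24.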

What you’ve experienced are two systems of thinking, System 1 and System 2, respectively, which form the basis of Nobel Laureate Daniel Kahneman’s 2011 top-selling book, Thinking, Fast and Slow (Farrar, Straus and Giroux). System 1 is our fast, automatic, intuitive and largely unconscious mode of thinking. System 2 is our slow, deliberate, analytical mode. It is these methods of reasoning about the world that Professor Kahneman, senior scholar at Princeton University’s Woodrow Wilson School of Public and International Affairs, where he is also a professor of Psychology and Public Affairs Emeritus, addressed at the 14th annual Lynford Lecture, entitled “Intuition: Marvels and Flaws,” held on March 13, 2012 at the Polytechnic Institute of New York University’s Pfizer Auditorium in Brooklyn. Past lecturers have included Nobel laureate Myron Scholes and then-senator Hillary Clinton.

NYU-Poly president Jerry Hultin’s opening remarks described how young engineers and scientists at the school use Systems 1 and 2 daily to become the innovators and entrepreneurs of tomorrow. While the physical space of the MetroTech Commons is transitioning into a quad of activity with faculty and staff on all four sides, “incubators” are springing up throughout the city, providing 450 jobs and internships to students with a look toward life after college. The university is expanding globally in Abu Dhabi, for example, where the prestigious lecture was streaming live to students and faculty. Hultin says, “Great thinking, changing the world and exciting his students” is his mission at NYU-Poly.

Jeffrey Lynford, vice chair of NYU-Poly’s board of trustees and, along with his wife Tondra, a generous supporter of the lecture series, followed with a quote from Alice in Wonderland: “Things keep getting curiouser and curiouser.” He pointed to the “idiosyncratic nature of human decision-making,” whose biases of pessimism and optimism shape our daily lives, causing us to overestimate benefits and underestimate costs, and he referenced the multitude of bubbles that have plagued the financial markets throughout history. The stage was set perfectly for the master of irrationality himself, Daniel Kahneman.

A psychologist by training, Kahneman has never taken a course in economics; yet he was awarded the Nobel Prize in Economics in 2002. How did that happen? He says his work came about from two conversations. His late colleague Amos Tversky showed him a paper called “The Psychological Assumptions of Economics,” which posited that the agent of economic theory is rational, selfish and with tastes that do not change. To a psychologist, Kahneman said, this is astonishing. The notion that our beliefs and choices are logically connected, that we are internally coherent and consistent, without contradiction, is “preposterous,” and he has spent 45 years finding counterexamples to prove its fallacy.

The other conversation had to do more with how the public views the notion of intuition. Kahneman says that the common belief is that intuition is magical. But what is intuition? According to the professor, it is knowing something without knowing that you know it. Kahneman feels we need to get rid of this idea that intuition is something magical and see it rather as recognition. A two-year-old points to an object and says, “Doggie”; he has no idea what it is but recognizes a pattern. Experienced chess players and physicians recognize patterns from experience. Kahneman gave a personal example. He can tell his wife’s mood on the telephone in one word. He’s had practice with many opportunities to learn. Plus, he has rapid and immediate feedback each time, because one word is followed by another and another, so his intuition is quickly corroborated or not. To Kahneman, there is no magic to intuition; it is another word for expertise. The caveat, however, is that it can only develop in a world that is regular. In an irregular world, you may feel like you have intuition, but you don’t.

Kahneman then addressed the financial industry, where it is clear that most people cannot pick stocks successfully without inside information. They feel they are experts, but they are not. Some ideas come to mind passively, such as the answer to 2 + 2, through associative memory: the links that connect events and stimuli with expected outcomes. Associative memory, the mode of System 1, does the fast thinking that generates coherent stories and causal thinking. Unfortunately, it can also produce fallacies, which usually arise when we are asked difficult questions for which we have no answers. When this occurs, an answer to a related question comes to mind, and we use it; although it might feel like expertise, it is not. This is the substitution principle: answering the wrong question.

Kahneman gave an example from a study that was done 20 years ago, when there was a lot of terrorism in Europe. Americans were frightened of traveling to Europe. Travel insurance was being sold at the airport. A group of people in an experiment were asked to appraise two policies of travel insurance. One offered a settlement of $100,000 for death by any incident; the other offered $100,000 for death by terrorists. People put a much higher price on the second policy. What were the people doing? They were more afraid of dying in a terrorist incident than they were afraid of dying. But dying is dying, no matter how it happens. This example shows that we have intuitive responses to questions and are rarely stumped. However, our system delivers the answer to the wrong question, and that often happens under knowable conditions. There are biases that people don’t even know about which cause them to make mistakes with considerable confidence.

A question-and-answer period followed, moderated by Lynford with the addition of two engaging and entertaining interlocutors: Dr. Gillian Tett, author of Fool’s Gold: How Unrestrained Greed Corrupted a Dream, Shattered Markets, and Unleashed a Catastrophe and the US managing editor at the Financial Times; and Nassim Nicholas Taleb, author of The Black Swan. A derivatives trader turned author and professor, Taleb described himself as a practical, bottom-up kind of guy who knows that it is much easier to sell hamburgers that are 75 percent fat-free than 25 percent fat. He learned this from Kahneman when he was in the business of packaging insurance for portfolios. Among the savvy strategies he drew from Kahneman’s insights on bias and faulty memory: “If your P & L is down for the month, show the year to date, and if the portfolio is up for the month, don’t show components.”

When Dr. Tett asked Kahneman whether healthy cognitive bias should be taught in school, Kahneman answered, somewhat facetiously, that even though he has been studying the flaws of intuition for 45 years, he still makes mistakes. He did, however, acknowledge that developing intelligent gossip, the ability to see bias in other people, will help us to anticipate how other people will react. Without a doubt, teaching about bias and the language of bias will enrich people’s “cognitive toolkit.” Knowing about anchoring and overconfidence would have a cumulative spreading effect, he ventured, noting that without concepts and words it is difficult to respond intelligently, because “nothing sticks.” This is where organizations particularly excel: compared to individuals, they are much better at adopting procedures that improve their decision-making processes.

Is there a place for statistics in all this? Kahneman doesn’t think so. People think causally, and statistical thinking is counterintuitive. When it really matters, he suggested that we let System 2 take over. “Occasionally, when you think you might be making a mistake, slowing down and asking for advice might be a good idea.”

For video of the full lecture, go to
