The Neuroscience of Rationalizing Our Mistakes

In 1997, members of the Heaven’s Gate cult prepared themselves for what they believed would be a spaceship coming on the tail of the Hale-Bopp comet to pick up true believers. Many of them engaged in a mass suicide to “shed their Earthly containers,” but there was a small problem — the spaceship never came.

The members of the cult who didn’t kill themselves returned to the store where they had originally purchased an expensive telescope for spotting said flying saucer. Demanding a refund, they explained that they had been able to find the comet but not the spaceship; therefore, there must have been something wrong with the telescope. Obviously.

Many of us have a hard time admitting when we’re wrong; even more, we don’t like admitting that something about our logic (or our belief system) is flawed. So we rationalize, justify and sometimes fictionalize our stories, telling rose-colored lies to downplay our mistakes and make our choices and behaviors seem less faulty.

Rationalization — it’s what helps us sleep better at night.

Right about now, you’re probably shaking your head, thinking “I don’t do that,” but you do. No one is entirely immune to self-justification, and that’s okay, because recent research suggests that these behaviors aren’t entirely our fault. Our brains work in overdrive to preserve our self-image and support our attitudes, even when the evidence indicates otherwise.

The mind reassures us, and because of this we often don’t realize that it is shaping our behavior. The mental stunts we pull when we rationalize result from cognitive dissonance, a term coined by the social psychologist Leon Festinger. Cognitive dissonance occurs whenever a person holds two conflicting ideas, beliefs or opinions at once; the clash is uncomfortable, so we look for ways to reduce it and let our minds rest easy.

For example, some people smoke two or more packs a day even though they know that cigarettes are harmful. Many of them try to convince themselves otherwise, with ingenious and self-deluding justifications: “Smoking isn’t really as harmful as people say” and “Smoking helps me relax and ward off stress, which is a health risk in itself.” They do this because dissonance makes us uncomfortable, and how intensely that discomfort is felt varies from person to person.

Self-justification not only tries to make sense of our mistakes and bad decisions; it also allows us to blur the discrepancy between our actions and our moral convictions. As we use it to keep our self-esteem in balance, we become oblivious to the white lies and reassuring words we whisper to ourselves. In fact, research has shown that the brain has both optical and psychological blind spots that enable us to invest in the delusion that we aren’t so delusional.

Functional MRI scans show that when we’re confronted with dissonant information and use rationalization to compensate, the reasoning areas of our brains essentially shut down while the emotion circuits light up with activity. In other words, emotions trump logic. From these findings, researchers have also concluded that once our minds are made up, they are hard to change; even reading information that goes against our initial point of view only leads us to further justify that we were right.

To investigate cognitive dissonance, neuroscientists at the University of California, Davis, used functional magnetic resonance imaging (fMRI) to study the brains of volunteers who were made to experience the psychological pain of clashing beliefs and actions. Specifically, volunteers spent blocks of time inside the cramped fMRI tube while being distracted with a task. Afterward, they were asked to answer questions about the unpleasant experience.

To induce cognitive dissonance, the researchers then asked the subjects to answer the questions again, this time aiming to get them to state that they had enjoyed the overall experience. Some of the volunteers were told that a nervous patient who needed reassurance would read their answers. Another group was told that they would be paid $1 each time they answered the questions as though they had enjoyed the scanner.

While the volunteers were faking it, two brain regions were particularly active in both groups: the dorsal anterior cingulate cortex (dACC) and the anterior insula. One of the functions of the dACC is to detect conflicts between incompatible bits of information, and it is especially active when a person lies. The anterior insula plays a similar role, monitoring psychological conflicts such as a clash between stated beliefs and actual beliefs. The more the volunteers lied in answering questions about their enjoyment of the fMRI experience, the more those regions lit up.

What’s interesting about this study is that when the participants were later debriefed about their actual attitudes toward the scanner, those who had been asked to fake their enjoyment for the “worried patient” had changed their true beliefs more than the participants who were paid $1. They also showed greater activity in the dACC and later said that they had truly enjoyed the experience. Was it the brain activity that accompanied cognitive dissonance that changed the participants’ minds about being in the fMRI? Or was it the rationalization that increased the activity in this brain region?

Taking a different perspective on our original thoughts, a technique used in cognitive behavioral therapy, can subsequently alter brain activity, which in turn makes us feel and think differently. The greater the cognitive dissonance people feel, the more likely they are to change their beliefs and bring them in line with their actions.

Moreover, we rely on our memory to fill us in on what happened in the past, but the truth is that memory can and does become distorted in self-enhancing directions, pruned and molded by our self-serving bias. We may gradually begin to think that a situation wasn’t entirely our fault, or that it was too complex to handle properly. Before long, we can persuade ourselves to believe in an alternate version of what actually happened.

The distortion is necessary to keep our self-perception consistent. Dissonance theory predicts that we will eventually (and conveniently) forget the good arguments made by our opponents, just as we forget the silly arguments we made ourselves. Distortion helps us selectively disregard discrepant information that doesn’t align with what we want to believe. It is motivated by our need to be right, to preserve our self-esteem and to excuse our mishaps; as the self-serving spins of memory kick in over time, we may forget or distort past events and gradually come to believe our own lies.

Here’s the thing: No one is immune to the need to reduce dissonance, not even those who know the theory inside and out. In reality, we’re all suckers for a happy ending, even one we have to make up ourselves or piece together with tape. But the key is to become more aware of our thought processes so that we can acknowledge serious mistakes instead of justifying them.

If we understand how and when we reduce dissonance, we can become more vigilant about the process and often nip it in the bud. By looking at our actions critically and dispassionately, we stand a chance of breaking out of the cycle of self-justification. We can learn to put a little space between what we feel and how we respond, insert a moment of reflection and ask whether we really want to hold on to a belief that is unfettered by facts. That awareness can help us make sharper conscious choices instead of letting automatic, self-protective mechanisms resolve our discomfort in our favor.
