Cognitive dissonance is closely related to undue influence. The term describes the uneasiness we feel when our beliefs are challenged, and the way we reject unwelcome information without properly considering it.

Clever manipulators keep us from changing our minds – despite the evidence – by playing upon that uneasiness. They shut out any challenge by reinforcing entrenched beliefs. So it is that businesses fail, that cults keep their members, and that empires fall. Cognitive dissonance also explains the slow progress of science – why it takes a generation for an enshrined paradigm to shift and for evidence to overcome “common sense”.

Since Leon Festinger first suggested cognitive dissonance in the 1950s, it has been studied rigorously. Indeed, it may be the most investigated theory in social psychology, with literally thousands of studies to support it.

We all feel uncomfortable when our ideas, our beliefs or our behaviors are challenged. Festinger showed that even the strongest and most obvious evidence will be dismissed, if it disagrees with our core beliefs. We have a natural confirmation bias, so we favor evidence that supports our beliefs at the expense of reason.

As a youngster, I loved to debate ideas. I read Plato’s Symposium in my early teens, and delighted in the idea that logic could be teased out through discussion and friendly questioning. It took me a while to understand how upsetting this is to many people: they want to hold on to their certainties, and they do not welcome any contradiction.

The truth is that we don’t like to be disagreed with, and we’ll use all sorts of tricks to bat away anything disagreeable. But to become truly rational, we must learn to accept (and even appreciate) disagreement. In 1859, John Stuart Mill published his essay On Liberty, where he said, in the grand style of the times:

“The steady habit of correcting and completing his own opinion by collating it with those of others, so far from causing doubt and hesitation … is the only stable foundation for a just reliance on it [his opinion]; for, being cognizant of all that can … be said against him, and having taken up his position against all gainsayers – knowing that he has sought for objections and difficulties instead of avoiding them, and has shut out no light which can be thrown upon the subject from any quarter – he has a right to think his judgement better than that of any person, or any multitude, who have not gone through a similar process.”

At the age of 17, when street recruiters asked me if I’d heard about Jesus or Krishna, I’d hear them out, and talk with them about their views. A born-again school teacher spent two hours on a sunny afternoon trying to convince me that I should join his church. Eventually, he backed away – I mean he literally walked backwards – saying, “I don’t understand the Bible, but I know it’s all true!”

This phrase has served me well for a lifetime, because the contradiction told me just how fervently we can ignore logic, sense and reason when we are filled with the belief in our “knowledge”.

This “knowledge” is the “sense of certainty” that William James called “noetic”. It bedevils the reasoning of even the most intelligent person. Certainty should only be based upon demonstrable proof. All too often it is based on feeling alone: “I know it is right!” all too often means, “I believe it is right.”

Many years ago, while involved in Scientology, I read three hostile books, and was happy to talk with critics (and managed to convince two of them to take courses). But once I realized that Ron Hubbard, the group’s founder, had frequently contradicted himself in his autobiographical accounts, I knew that he could not be trusted.

For instance, in a lecture given in September 1950, Hubbard admitted that he had failed a course in “atomic and molecular phenomena” (his university grade sheets agree), but by the 1960s, he was claiming to be a “nuclear physicist”.

I was surprised that very few of my fellow believers agreed when I offered them the facts. Hubbard was a fabulist on a grand scale – not a nuclear physicist, or a war hero, or the student of gurus, as he claimed in some accounts (and denied in others).

How could a liar discover the “road to truth”? A man who said, “The road to truth must be taken with true steps”? Why should I trust a man who said, “Honesty is sanity”, and then lied his head off?

As the years have passed, I’ve realized that I enjoy the challenge of disagreement, as long as the disagreement is agreeable: I don’t want to be shouted at or insulted, but I’m very happy to listen to evidence that may change my beliefs – at least until it descends into palpable nonsense. I try to follow the advice of economist John Maynard Keynes, who said, “When the facts change, I change my mind.”

I embrace dissonance, because otherwise I’d be impervious to reason. Very few of the beliefs that the 17-year-old me held remain intact. I hope that will continue to be the case: that my beliefs will keep evolving. Change is good, if change is sensible, and the ability to change is a sound defense against undue influence and manipulation.

What do you think about this article? Do you agree? Have you read Jon’s book? Do you have a story about cognitive dissonance that you’d like to share? We’d love to hear from you!