Critical thinking necessitates the ability to check the accuracy of information. The first obstacle to this is our tendency to accept information that agrees with our beliefs, and to reject information that disagrees with them. This tendency is called 'confirmation bias'.
Confirmation bias makes it impossible to consider evidence objectively. If you try to reason anyone out of a belief, you will often be frustrated when your evidence is rejected out of hand. We cling to our beliefs and find it hard to put them aside for long enough to make reasoned decisions. It is to our own benefit to learn how to assess our own bias, if we are to escape the selfish reasoning of totalists, who are extremely good at manipulating confirmation bias.
In its advertising, New Scientist tells us that '9 out of 10 people hold a delusional view'. Confirmation bias often maintains such delusional views. Simply presenting evidence is not enough to change most people's minds. Cognitive dissonance sets in once beliefs are challenged, and it is easiest and most comfortable to simply reject the evidence rather than suffer the anxiety of dissonance.
So, the first step of critical thinking is the understanding that our views are not the be-all and end-all: we have to suspend belief in our beliefs when they are challenged, and take a more objective view.
When I left Scientology, way back in 1983, I put aside the whole subject with the determination that I would examine each aspect of this highly complicated (and endlessly verbose!) set of ideas and either accept or reject it. I realised that my thinking had been blinkered and directed by Scientology, so I looked to conflicting ideas.
For instance, Scientologists rely on Adelle Davis for nutritional advice, but when I checked her out, I found that she had hired people simply to list scientific papers in her books, without bothering to read any of those papers. At best, her work is outdated; at worst, it is dangerous. I read a book by a professor of nutrition, and the information there conflicted with much that Scientologists believe. The book was based upon experiment and observation and written by a qualified professional, rather than being the half-baked opinions of a self-made guru.
In time, I ended up rejecting all of the significant ideas of Scientology and replacing them with evidence-based ideas that had been meticulously tested (despite its claims to be scientific, not a single study supports any of Hubbard's 'technology').
It was immensely liberating to reconsider my basic assumptions about the world, and I have continued to challenge even the most established of my beliefs. In fact, whenever I feel convinced of any idea, I seek out contradiction.
Life is much more fun when you aren’t pinned down by dogma, and I’m sure that a fluid mind will stay healthy far longer than a rigid and bigoted one.
I learned to challenge my own confirmation bias. Of course, the natural reaction when challenged is to become defensive, but it is much better to count to ten, put aside hurt feelings and think about the complaint.
In practical terms, we should be open to evidence even when it disagrees with our beliefs, our intuition or our feelings. This does not mean that we should simply accept everything we’re told and shift our opinions accordingly, but that we need to be aware of confirmation bias, so that we can more fully consider the evidence.
We should also be aware of our beliefs, our intuition and our feelings, but we should not evade reason because of them; instead, we should measure them against the evidence. As Étienne de la Boétie put it, 'we should adopt reason as our guide and become slaves to nobody.'
What do you think about this article? Do you agree? Have you read Jon’s new book? Do you have a story about unquestionable assumptions that you’d like to share? We’d love to hear from you!