
Why we believe in truth and honesty, even when we shouldn’t


There is a phenomenon called “truth bias” that leads us to believe people are telling us the truth, even when we have some evidence to the contrary. Sure, there are people at the margins whom we distrust or ignore, perhaps because they have given us reason to distrust them, or because they belong to a particular group or part of society. But on balance, we humans will believe people’s lies, simply because we have a natural bias towards assuming honesty and truth.

The term “truth bias” was coined in 1984 by McCornack and Parks during the development of their model of deception detection. They identified a general human tendency to judge the communication of others as truthful. The theory has since been expanded to identify key characteristics of truth bias, chief among them that it is a social default: we forgive minor discrepancies in stories, exaggerations, and potentially even lies, because relationships would become strained if we constantly questioned the integrity and veracity of what people say.

Truth bias, referred to in the scientific literature as meta-cognitive myopia, highlights the very real problem of the mental separation between “primary information” and “meta information”. Primary information is the statement itself, typically the first or most influential version of it that we hear, which becomes seared into our minds as the baseline for our thinking. Meta information comprises the cues that surround the primary information, such as the context in which it is offered, the person who offered it, or even whether you are being explicitly told that something is a lie. You’ll hear this meta information, and you might even retain some of it, but your brain will still assign significant weight to the primary information and very little to the meta information. This is one of the reasons lobbying is so powerful: when fossil-fuel companies deny the climate crisis, for example, their claims are not dismissed out of hand, because they provide primary “evidence” that supports their position. Few stop to question the vested interest these companies have in the continued use of gas and oil, and those who do are often considered radical or extreme.

Truth Default Theory (TDT) extends truth bias by looking specifically at it in the context of communication. It holds that we live in a truth-default state, presuming others to be honest when they communicate with us, either because we don’t consider deception a possibility or because we don’t identify sufficient information to alert us that we are being deceived. This has very real consequences. During the early days of the COVID-19 pandemic, initial statistics highlighted that people over 50 were more likely to die than people under 50. We were not told that people under 50 would not die, and there were plenty of cases of people under 50 who did, but still, the average person under 50 felt an extra sense of security that perhaps influenced their behaviour. Similarly, during the 2016 Clinton-Trump presidential race, Clinton was subjected to a smear campaign that circulated plenty of false statements. Even when those statements were publicly decried and demonstrated to be false, the bias against Clinton stuck and the smear campaign kept working. The phenomenon becomes worse when combined with groupthink and confirmation bias. It is what leads people to believe truly outrageous conspiracy theories, or simply to discount science in favour of opinion. It even affects public policy, when the bias of the individuals in power, or the impact of lobbying on those people, skews legislation in favour of one group or another.

Right now, we could make up a plausible statement, and even if we quickly discredited it and demonstrated its falsehood, it would stick in your mind and potentially even form the basis of your future judgements. We humans can’t help ourselves. Even when we know, or are shown, that something isn’t true, a small part of our brain will harbour the belief that there must be some truth to it, and it will continue to have influence. Evolution and survival play their part in the equation: we still have an innate survival instinct that encourages certain behaviours, despite our having advanced well beyond our cave-dwelling ancestors. Think about it. If your friends and family were running from a predator, would you stop to consider whether the predator was real, or whether there even was a predator? No. You’d start running too. This primitive survival instinct plays its role in truth bias, causing us to prioritise primary information (threat = run) and discount meta information (can we see the predator, hear it, smell it?). The problem is that society has developed and changed, and these base reactions don’t accommodate the nuances of societal living.

As if truth bias weren’t concerning enough on its own, we must also add the challenge of memory. Memory is notoriously fallible, yet we tend to treat it as largely infallible. Whenever we see, hear, or experience something, we commit it to memory. We then believe that memory to be true, discounting the fact that:

  • a) it is full of holes and our brain fills in the blanks,
  • b) memories degrade over time,
  • c) memories are open to suggestion that can change how we remember something, and
  • d) there is no such thing as a perfect memory: we remember our own experience of events, viewed through the lens of our own biases.

Despite these obvious flaws, we rely on memory in many circumstances with very real consequences. From how we tell a story about an event to recounting memories as evidence in court trials, memory is used in communication, and is very definitely tainted by truth bias somewhere along the way.

A study by Pantazi (Oxford University) and Klein and Kissine (Free University of Brussels) highlighted just how powerful truth bias is. They ran a study with a group of participants acting as “jurors”. They presented the participants with information pertaining to two criminal defendants, along with the sentences the defendants had been given, before asking them to propose an appropriate prison term and determine how dangerous each defendant was. Participants were also told that some of the information they’d been given was not true, and were even told exactly which pieces of information were false. The question Pantazi et al. wanted to answer was whether participants would appropriately discount the false information, or whether it would influence their decisions. Ultimately, the study found that when people received negative information about a defendant, they were influenced by it, even when they had been explicitly told it was not true. Even more strikingly, Pantazi et al. were able to quantify the impact of memory, identifying that participants tended to misremember false evidence as being true.

Later, Pantazi et al. repeated the experiment with professional judges and drew the same basic conclusions. Experienced judges, who sit in judgement of individuals and routinely receive information that may affect their conclusions, struggled to remember falsehoods as false and didn’t exhibit an ability to discount falsehoods during sentencing. This led Pantazi et al. to an additional conclusion: negative information in particular is more difficult to forget or discount.

For those with ill intentions, truth bias creates space for exploitation, because it reduces an individual’s tendency to consider the possibility of deception. It is the same reason that advertising works, that magicians can perform their tricks, and that spin doctors can offer plausible stories to the masses.

Because it is so embedded in our psyche, truth bias is difficult to combat, but critical thinking skills help to diminish its effects and prevent falsehoods from becoming entrenched in your thinking. Encouraging your inner cynic, being aware of patterns of behaviour (who has the most influence over you, who is most likely to exaggerate or bend the truth), and training yourself to be alert to red flags can all help combat your truth bias. Ultimately: question what you are told, and practise your critical thinking, always.
