
The role of “illusory truth” in misinformation


Building on the notion of truth bias, the illusory truth effect (ITE) demonstrates why it is crucial to keep practising critical thinking. Truth bias is our predisposition to believe that others are telling the truth, regardless of context, while the illusory truth effect (also known as the illusion of truth) describes our tendency to believe information more strongly the more times we encounter it. ITE highlights that even when we know a specific piece of information is false, repeated encounters with it diminish our ability to identify the falsehood, and may even change our behaviours. The effect was first identified in 1977 by Hasher et al. in their research on how accurately people judge statements as true or false.

There are thousands of examples of this happening, but a common area of influence is the “old wives’ tales” surrounding medical ailments. Despite there being no proven link between Vitamin C and the prevention of colds, millions are spent every year on Vitamin C gummies, tablets, and supplements in a bid to ward off colds. Vitamin C is actually a preventative measure for scurvy, but over the years the myth has emerged that it has an impact on the common cold. During the COVID-19 pandemic, Donald Trump adopted the strategy of peddling hydroxychloroquine as a beneficial drug for coronavirus treatment before any link had been proven. The result was that tens of thousands of patients requested the prescription from their clinicians. Even now, when clinical trials have found hydroxychloroquine to be ineffective against COVID-19, the belief that it is a miracle cure still persists.

The challenge with the illusory truth effect is exposure. Even the best critical thinkers are susceptible, because of how heavily we rely on heuristic processing. On average, our brains make around 35,000 decisions in a single day, so the brain favours shortcuts that help us make sense of the world without becoming cognitively exhausted. These shortcuts may draw on past experience or problem-solving strategies, but the most common one is familiarity. Familiarity lets us process information faster on the basis of past experience, but it is this very familiarity that enables misinformation to take hold. Each time we are exposed to a specific piece of information, even one we know to be false, it starts to feel familiar, and it becomes more entrenched in our psyches. Ultimately, people feel more positively towards things they have encountered before.

To demonstrate familiarity, consider this problem: if it takes 5 people 5 minutes to make 5 widgets, how long would it take 100 people to make 100 widgets? Most people answer “100 minutes” very quickly, because the pattern feels familiar: 5-5-5 suggests 100-100-100. But each person takes 5 minutes to make one widget, so 100 people making one widget each still only need 5 minutes; the real pattern is 100-100-5. Familiarity elicits the wrong answer very frequently.
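For readers who prefer to see the arithmetic spelled out, here is a minimal Python sketch of the rate-based reasoning. It is an illustration added for this article; the function name and parameters are our own, not part of the original puzzle.

def time_to_make(widgets, people, minutes_per_widget_per_person=5.0):
    # Each person makes one widget every 5 minutes, so the group's
    # output rate scales with headcount.
    widgets_per_minute = people / minutes_per_widget_per_person
    return widgets / widgets_per_minute

print(time_to_make(5, 5))      # 5.0 minutes, as the puzzle states
print(time_to_make(100, 100))  # 5.0 minutes, not 100: each person still needs 5
print(time_to_make(100, 5))    # 100.0 minutes: the scenario "100 minutes" would actually describe

The familiar pattern-matched answer of 100 minutes in fact describes a different problem: 5 people making 100 widgets.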

Psychologist Daniel Kahneman, whose work underpins much of behavioural economics, has done extensive research in this field, popularising the split between “System 1” and “System 2” processing in the brain. System 1 is quick, automatic processing that works without our conscious awareness. System 2 is the more complex processing that requires deeper thinking. Research shows that wherever possible we prefer to rely on System 1, because it taxes our brains less, but it is here that misinformation creeps in. Other researchers have reasonably hypothesised that this division only matters in areas where we lack knowledge; in other words, if we already know something about a topic, we will draw on that knowledge and dismiss the misinformation. Later research, however, has identified that the illusory truth effect can operate even when people know the right answer, a phenomenon called knowledge neglect.

This is important in the fight against misinformation because we are exposed to fake news stories at a far higher rate than true ones. A study by Vosoughi et al. (2018) highlighted that fake news stories reach people six times faster than true stories, and a study by Meyer (2018) highlighted that they are 70% more likely to be retweeted than real stories. This makes it inevitable that we will encounter fake news several times a day, and the illusory truth effect means it will shape our thinking more than we want it to. What’s more, this assumes fake news stories arise organically, when in reality there are propaganda machines that rely precisely on the fact that the more we are exposed to something, the more we will begin to believe it.

How to combat the illusory truth effect

What is clear from research is that we all know fake news exists, but one of the core reasons it spreads is that we each believe we will never fall for it. As a result, we often don’t realise that we have fallen prey to it, or that we might be perpetuating it. Our own biases and cognitive processing are to blame (see also confirmation bias, truth bias, and groupthink).

Ultimately, it comes back to critical thinking. Regular practice of critical thinking skills not only improves our ability to spot falsehoods and reduce their effects, but also supports us in the moment when we encounter misinformation.
