Combatting the Spread of Disinformation: The Role of Behavioural Science

The term “fake news” has become increasingly prevalent in public discourse, raising awareness that people cannot simply believe everything they see or read online. During the COVID-19 pandemic, knowing which information and health advice to trust proved to be a matter of life and death, making the ability to distinguish fact from fiction crucial. The question is: how can we know what to trust? By delving into the psychology of fake news, we can learn how to combat disinformation. This article examines how behavioural science can be applied to make people and society more resilient to disinformation.

What is fake news, and why is it a problem?

Fake news refers to information that is factually incorrect. When discussing “fake news”, we distinguish between misinformation and disinformation. Both refer to false information; the key difference lies in the intention behind its creation and dissemination. Misinformation is false information that is spread without the intention to deceive, often the result of people sharing information without checking its accuracy or reliability; those who spread it do not realise that it is false. Disinformation, by contrast, is deliberately created and spread with the intention to deceive. It is often used as a tool for online manipulation, as those who create and spread it seek to exploit people’s vulnerabilities to influence their decision-making. For the remainder of this article, we will focus on disinformation specifically.

The problem with disinformation is twofold. Firstly, it can cause harm when people believe incorrect information, as they might make decisions that run counter to what is best for their health and safety, their loved ones, or society as a whole. Consider, for example, how disinformation skewed people’s perceptions of health risks during the COVID-19 pandemic: not only did it lead people to reject expert advice, it also lowered compliance with public health guidelines and protective behaviours. And COVID-19 is not the only area where disinformation poses a threat; it can have harmful effects in a variety of domains and contexts. For instance, there have been stories about childhood vaccinations causing autism, statements denying the existence of climate change, and even claims that Barack Obama was born outside the US and was therefore not eligible for the presidency. Important to note here is the “illusory truth effect”: a psychological phenomenon whereby people are more likely to believe something is true if they have heard it multiple times, even if it is false.

Secondly, false information spreads quickly. Social media is designed to make sharing information easy, and this ease has given rise to a behaviour called “blind sharing”: sharing information without reading it. A Columbia University study shows that almost 60% of links in Twitter retweets are never clicked before being shared, and that 70% of Facebook users read only the headlines of articles before commenting and sharing. These results show that critical thinking is not always a priority on social media platforms. Disinformation is easily shared and amplified on social media, fuelling conspiracy theories and hoaxes. Fact-checks of false information spread more slowly than the false information itself. Moreover, studies show that correcting false information does little to prevent conspiracies from spreading further on social media platforms.

Debunking: Why a fact-check isn’t enough

What do you usually do when you realise someone’s beliefs are incorrect? Quite often, we try to correct the other person: we explain that they are wrong and why. If you show them enough proof that they are wrong, they will have to change their beliefs, right? Yet research shows that simply providing evidence that information is incorrect – also known as “debunking” – is often not enough to convince people to change their beliefs. A challenge we face here stems from the “continued influence effect”: fact-checking does not fully put an end to the influence of disinformation, even when people agree with the debunking messages. Why is that the case?

To understand events, people construct “mental models”: internal representations of what is happening in the world around them. People are more likely to accept a story as true when it is logical, coherent, and complete. When parts of their mental model get debunked, people are left with a gap in their understanding of events. When later asked about those events, they may fall back on the disinformation to keep the story coherent. In short, even though people agree with the fact-check, they may continue to use disinformation to make sense of the world. Another explanation of the continued influence effect lies in how we remember things: people might, for example, confuse disinformation with the corrected information. To successfully replace disinformation, debunking messages should be detailed enough for people to fill the gap in their mental models and abandon the incorrect information entirely. They should therefore always include an alternative explanation for the situation that is plausible, detailed, and well-argued.

Even if people do manage to correct their beliefs about whether information is true or false, their feelings about the topic may remain the same. A study on false statements and voting intentions illustrates this. Participants were presented with true and false statements by Donald Trump and asked to rate how much they believed each one. They were then shown corrections of the false statements and confirmations of the true ones, and they updated their belief ratings in response. You might expect this to affect their voting intentions as well. Yet, even though Trump supporters updated their beliefs about whether the information was true or false, their voting intentions and feelings towards him remained the same.

Thus, research shows that debunking messages spread more slowly than disinformation itself, and that correcting disinformation after the fact is a challenge. Moreover, debunking is often costly, labour-intensive, and slow. It can be effective under the right circumstances, but it has not been effective enough to solve the disinformation crisis thus far.

Prebunking: A vaccination against fake news

Debunking is a reactive strategy: it corrects information after the damage has been done. More recently, researchers have started exploring proactive strategies, leading to the rise of “prebunking”. The aim of prebunking is to make people resilient to disinformation before they see, hear, or read it.

Prebunking is based on inoculation theory, a psychological framework for building resistance against persuasion and disinformation. The main idea is that a weakened dose of disinformation can help build resistance against future exposure to persuasive or disinformative messages – much like a vaccine against fake news. To achieve this, prebunking requires two core elements: a forewarning that manipulation attempts may be coming, followed by weakened examples of what disinformation could look like, presented before exposure to actual disinformation occurs.

From theory to application: Interventions to make people more resilient to disinformation

While inoculation is promising, two main challenges limit its scalability and generalisability. First, much of the research has focused on the traditional, passive form of inoculation, which simply provides people with a pre-emptive refutation. What has received more attention lately is active inoculation, in which people are encouraged to come up with their own reasons to counter disinformation, as generating your own arguments is believed to create stronger resistance than simply accepting arguments from others. Second, rather than inoculating against specific topics, inoculation can also function as a more general approach – a “broad-spectrum vaccine” – against disinformation. To this end, the focus has shifted from pre-emptive examples of specific false claims to the techniques used to fabricate them, as well as to making people aware of their own vulnerabilities and of the manipulative intent of others.

To boost people’s resilience to disinformation, prebunking has been used in developing interventions, including the fake news game “Bad News”. The game simulates a social media environment in which players are exposed to weakened doses of disinformation and to the common techniques used to create it. Studies have shown that such fake news inoculation games are effective in: a) improving people’s ability to spot disinformation, b) increasing people’s confidence in that ability, and c) decreasing the likelihood that people share disinformation with others. The game is fictional yet inspired by real-world events, and it is a great example of an active, experiential, and gamified application of inoculation theory aimed at strengthening people’s resistance to disinformation.

What’s next?

Given the abundance of fake news in today’s information environment, there is a growing need for ways to combat both the spread of and belief in disinformation. Fake news inoculation games have proven effective in boosting resilience against online disinformation, even across various cultural, linguistic, and political settings. Similar inoculation games, or “vaccinations” against disinformation, should be developed and applied in other domains where online disinformation poses a threat. That is why Tilt will continue to design innovative products and services that help people recognise (and deal with) online manipulation more effectively and with less effort.

Further reading

Want to read more about how to counteract disinformation? Here are some additional articles to explore:

– How to Fight Fake News with Behavioural Science. [link]
– The Good News About Bad News: Gamified Inoculation Boosts Confidence and Cognitive Immunity against Fake News. [link]
– Countering Misinformation and Fake News Through Inoculation and Prebunking. [link]
– Active Inoculation Boosts Attitudinal Resistance against Extremist Persuasion Techniques – A Novel Approach towards the Prevention of Violent Extremism. [link]