Misinformation is a growing global problem, with false rumours having the potential to affect the course of elections or lead to violence. In new research Jens Koed Madsen looks at how we can ‘inoculate’ against misinformation by teaching people about how it can be created and spread. He finds that for these techniques to be effective, people must believe that misinformation affects them and believe that the source of misinformation training is trustworthy.
Misinformation threatens modern societies, harming individuals, communities, and economies, and it poses a growing challenge to science, democracy, and public health. In early August, the real-world impacts of misinformation could be seen in the UK, where false information spread about the perpetrator of the Southport murders may have exacerbated the far-right riots that followed. In recent weeks, false rumours about Haitian migrants in Springfield, Ohio, spread to the national stage so quickly that former President Donald Trump repeated them during the presidential debate with Vice President Kamala Harris on September 10th.
While the role of misinformation in these particular cases has yet to be properly determined, several academic studies point to its potential negative consequences. On an individual level, exposure to misinformation lowers compliance with public health guidelines and intentions to get vaccinated against COVID-19, which may compromise future herd immunity, and it has even been linked to mob lynchings. On a societal level, misinformation can weaken the foundations of democratic societies, eroding trust in institutions and creating an environment of political polarisation, social unrest, and distrust in traditional media outlets. Misinformation also undermines people’s confidence in the scientific consensus on climate change, and it has been linked with Republicans’ belief that the 2020 presidential election was stolen.
Given the potential harm of misinformation, it is hardly surprising that researchers have explored the role that psychological and behavioural science can play in understanding and mitigating its spread. As Professor Ulrike Hahn of Birkbeck, University of London notes, we must understand the information environment in order to meet ‘the need to safeguard the accuracy of our information.’
Inoculation as a path toward misinformation mitigation
The ‘inoculation’ approach is one method designed to curb misinformation. It is inspired by a medical analogy: just as weakened doses of a pathogen trigger the body to produce antibodies against future infections, inoculation theory holds that the same can be achieved with information. By pre-emptively exposing people to weakened doses of misinformation, or teaching them how misinformation is created and spread via games like the Fake News game, people can build up psychological resistance against misinformation they encounter in the future. In the Fake News game, players aim to spread misinformation. In doing so, they learn that techniques such as emotional appeals or polarising language gain better traction, thereby familiarising themselves with those techniques. After playing the game, people are usually better at picking out headlines that use these persuasive techniques.
The inoculation framework has been tested in many contexts, including misinformation about vaccines, climate change, and immigration. It can also be used to reduce the perceived reliability of polarising voices, which helps stem their impact, since the way sources are perceived can itself polarise people’s beliefs. Researchers have also explored technique-based inoculations, which seek to create psychological resistance against the common techniques or tropes that underlie misinformation, rather than against individual examples. This is akin to media literacy training that teaches people to think critically so they can engage better with arguments they encounter later; it is often referred to as pre-bunking. By training critical thinking and the recognition of cues, inoculation offers a potential path toward addressing the effects of misinformation without impeding people’s freedom of speech.
“Misinformation” (CC BY-ND 2.0) by 3dpete
Inoculation hesitancy
While the results of inoculation are promising, it is worth considering the practical challenges that may stand in the way of its impact. In new research, we probe two potential barriers to rolling out inoculation training. First, we show that people tend to believe that misinformation is a problem for other people. That is, they believe it is a societal problem, just not their own personal problem. This is akin to the better-than-average effect, an illusory sense of superiority on a particular topic. For example, when asked to evaluate their driving skills, 93 percent of Americans believe they are better than the median driver, which is statistically impossible. For inoculation efforts to tackle misinformation, such as media literacy and training interventions, to work, people must believe that they themselves need the training. If they believe misinformation is a problem for other people, they may be less likely to seek out or engage with training to mitigate it.
Second, we show that people say they are less willing to engage with inoculation initiatives if they believe the source of the initiative is untrustworthy. This is in line with several reasoning studies showing that people are concerned about the credibility and dependencies of sources when they evaluate evidence. This is a reasonable strategy: if you believe a source is unreliable or dependent on other information sources, you should treat its information differently than if you believed the source to be reliable and entirely independent. We find similar attitudes toward inoculation training. This is challenging, as inoculation techniques are usually created by academics and scholars. If people who fall for misinformation distrust academics, they may also be less likely to engage with solutions developed by academics.
More work is needed on misinformation inoculation interventions
Beyond individual hesitancy, it is worth considering how inoculation interventions function in the broader information ecosystem. Misinformation is not just a product of how we engage with information; it is also shaped by elements such as the systems social media platforms use to recommend content, the media landscape more generally, and the social networks we inhabit. Only a small body of research has explored how inoculation may spread through social networks and how this influences its efficacy. This calls for interdisciplinary work to understand how information travels through different ecosystems.
Our findings pose interesting challenges for the capacity to roll out inoculation. They indicate that training cannot be conceived in isolation from the subjective experiences of the people you are trying to reach and teach. More broadly, they are an invitation to engage more directly with people, to find out why they might hesitate to engage with training and media literacy interventions. This may require trust-building, community outreach, and bottom-up collaborative approaches. Inoculation is an important element in combatting misinformation without limiting debate, but there are important practical challenges to consider in rolling out these interventions.
- This article is based on the paper, ‘Inoculation hesitancy: an exploration of challenges in scaling inoculation theory’ in Royal Society Open Science.
- Please read our comments policy before commenting.
- Note: This article gives the views of the author, and not the position of USAPP – American Politics and Policy, nor the London School of Economics.
- Shortened URL for this post: https://wp.me/p3I2YF-eiH