
Ozan Kuru

January 20th, 2025

Are misinformation interventions different or fundamentally the same?


Estimated reading time: 7 minutes


The growing field of misinformation research has spawned numerous conflicting interventions aimed at minimising its impact. Ozan Kuru argues that by taking a critical perspective on these interventions and their core similarities, the field can overcome disciplinary differences to create a more consistent approach to misinformation interventions.


Over the past decade and particularly the past five years, misinformation and interventions aimed at mitigating it have received extensive scholarly attention. However, these interventions face significant epistemological, empirical, practical, socio-political, and ethical challenges.

At the same time, the field of misinformation research is fragmented across disciplines, theories, methods, and funding schemes. This multidisciplinary nature is a strength, but only to the extent that competition and coordination are kept in healthy balance.


Consequently, there is often an overemphasis on surface-level differences between interventions and relatively little attention to their underlying similarities and differences. This raises the question of whether these differing approaches can be reconciled by engaging in more “critical coordination” and paying closer attention to the underlying components of interventions. If so, this could pave the way for more effective evidence accumulation and interpretation, better practical implementation, and more serious attention to critical equity and ethics issues.

The story becoming the story?

In addition to many studies investigating the effects of individual interventions, large-scale studies have compared numerous interventions simultaneously. These range from fact-checking, accuracy prompts and nudges to misinformation warnings, media literacy tips, inoculation, and various others. The diversity and nature of these interventions raise questions about how far they actually differ from one another, for example:

  1. What is the difference between “literacy training” and “psychological inoculation” interventions? Do we have to label literacy interventions that include motivational triggers about a threat as “psychological inoculation”?
  2. Similarly, echoing the critiques of the blanket use of the term “nudge,” how should we distinguish between “accuracy prompts,” “accuracy reminders,” and “accuracy nudges”? Is this merely an interchangeable use of terms or something overlooked that needs reconsideration?
  3. How do “media literacy tips” differ from “accuracy prompts” if “media literacy tips” embedded in social media can include “accuracy prompts”?
  4. Also, “lateral reading and verification strategies” can take the form of “media literacy tips” and can be achieved by technique-based “psychological inoculation.”
  5. Finally, should we consider “literacy training” and “literacy tips” (or “different types of news tips”) as distinct types of interventions when the most notable difference seems to be the dosage/intensity of intervention?

Being more critical about competing research lines, teams, and theories is important given the rising asymmetry in the field, where some theories and interventions, such as inoculation, receive extensive focus and funding while their distinctions from other existing strategies are not fully articulated. This reflects a wider issue in evidence-based policy and practice, which Farley-Ripple, Oliver, and Boaz characterised compellingly:

“We use different language to talk about what we do and how, and we promote our work in different spaces. We may replicate each other’s work, or solve the same problems over and over again, seldom realizing that we are working in parallel.”

There is also the issue of “the story becoming the story,” as Gelman identified in the context of behavioural economics. This refers to how the compelling storytelling and appealing narratives surrounding some interventions, such as power posing and nudges, with their analogies and metaphors, may explain their popularity and “success story” relative to comparable interventions.


Misinformation interventions also face severe challenges in public perception (their intended audience). Recent work shows that the general public is confused about and not knowledgeable of misinformation interventions and finds them politically biased, while under-represented and historically marginalised communities distrust various interventions involving authorities.

The lack of critical coordination therefore both reflects and exacerbates these issues by hindering conceptual clarity, theoretical progress, evidence accumulation, translational efforts, and ultimately public engagement.

Critical coordination

How can we engage in critical coordination? A first step would be comparing the interventions’ underlying similarities and differences instead of more surface-level elements (such as analogies and theoretical stories). This can help delineate individual interventions. Such theoretical cooperation and epistemological pluralism could also help break down academic silos and create a shared basis for collecting evidence and sharing practical insights.

In a recent study, I offered a framework for comparing interventions through a three-component strategy:

  1. Identify and critically interrogate the underlying commonalities and differences between interventions.
  2. Test the underlying components individually, keeping the dosage of the shared components as equivalent as possible, without being concerned about surface-level characteristics.
  3. Test conditioning factors (or mediation pathways) that specifically target those underlying components to investigate their independent roles.

The study compared literacy training and inoculation interventions by focusing on their informational (knowledge) and motivational (forewarning about misinformation) components. The findings showed the benefit of forewarning for recognising misleading health statistics without fuelling cynicism towards accurate information. By focusing on the underlying components that lead to differences, we can better explain why the effectiveness of these two interventions differs, and we can see how far the most popular intervention (inoculation), which draws extensive funding and publications, differs from similar efforts such as literacy education.

To facilitate this approach further, synthesising insights from our study and the other categorisations mentioned earlier, we offer a list of underlying dimensions and design elements (Table 1).

This table provides examples of key dimensions under six broader categories and a few central guiding questions for each category. Rather than an exhaustive checklist, the table serves as a guide, which scholars may utilise or re-arrange in various ways based on their priorities and specific research contexts.

Table 1. A Categorization of Dimensions in Misinformation Interventions
Notes. While most dimensions are presented as a dichotomy, they should be considered as the extreme ends of a continuum. The categorization is based on a review of studies included in recent systematic reviews and meta-analyses [Huang et al. (2024); Lu et al. (2023); Hartwig et al. (2024); Droog et al. (2024)] as well as comparative studies on misinformation interventions [Arechar et al. (2023); Fazio et al. (2024); Hoes et al. (2024); Kozyreva et al. (2023); Spampatti et al. (2024)].

Why critical coordination is important

More effective interventions

Critical coordination can bring more conceptual clarity and sharper distinctions. This can, in turn, help improve the accumulation of evidence, explain inconsistent or non-replicated findings, and enable more diverse and combined interventions: ones that bridge different individual-level strategies, individual and social/policy-related strategies, quantitative and qualitative insights, and theory-based and practice-based insights in designing interventions.

Improving translational and practical work

Critical coordination can help practitioners navigate the already extensive evidence on misinformation and reduce public and practitioner confusion. For instance, some scholars have called for building shared evaluation metrics to assess interventions, such as “evidence readiness levels” or a “pyramid of consecutive criteria,” to make implementation planning more systematic.

The work by the Prosocial Design Network is one example of such careful translational efforts. These could be further supplemented by conducting more expert surveys to investigate scientific consensus and dissensus and by coordinating searchable misinformation intervention databases. However, these surveys and databases must have input from diverse studies and researchers. Such efforts will ultimately result in better communication with practitioners and better use of limited resources.

Supporting equity and ethics

Critical coordination can also bring more attention to equity and ethics in designing and implementing interventions. Critically evaluating the underlying components of interventions and interrogating the positionality among implementers, interventions, and target audiences would centre the views of under-represented communities and promote transparency in intervention funding and procedures.

For instance, when interventions funded by Global North governments are implemented in the Global South, or when interventions receive funding or data from tech companies, positionality regarding historical and current asymmetric relationships (colonialism, military invasions, data ownership and privacy) and algorithmic transparency should be interrogated. Studies should include clear transparency and funding disclosures, engage with multiple stakeholders by diversifying research teams, and provide more equitable research funding opportunities. Awareness of critical approaches to the interventions’ assumptions, context, and agency is crucial to building trust with the public.


This post draws on the author’s paper, Literacy training vs. psychological inoculation? Explicating and comparing the effects of predominantly informational and predominantly motivational interventions on the processing of health statistics, published in Journal of Communication.

The content generated on this blog is for information purposes only. This Article gives the views and opinions of the authors and does not reflect the views and opinions of the Impact of Social Science blog (the blog), nor of the London School of Economics and Political Science. Please review our comments policy if you have any concerns on posting a comment below.

Image credit: LSE Impact Blog.



About the author

Ozan Kuru

Dr. Ozan Kuru is an assistant professor at the Department of Communications and New Media, a principal investigator at the Centre for Trusted Internet and Community at the National University of Singapore, and an affiliate at the International Panel on the Information Environment.

