
Ian Plunkett

October 23rd, 2024

2024 Elections: Do New Disinformation Frontiers Threaten Global Democracy?


As over a billion people head to the polls in 2024, the integrity of elections worldwide faces unprecedented challenges. In this guest blog, Ian Plunkett from Blue Owl Group shares the findings of the group's recent on-the-ground studies of five pivotal elections in Taiwan, Indonesia, South Africa, Mexico, and the European Union.

Global Democracy: At a Crossroads

The backdrop to these elections is troubling. International IDEA’s 2023 report showed that half of the countries examined experienced declines in at least one indicator of democratic performance, marking the sixth consecutive year of this trend. With over 60 elections taking place over the course of this year, the stakes for global democracy have never been higher.

Disinformation Evolves: Beyond Candidate Attacks

During my time leading Global Policy Communications at Twitter from 2015 to 2021, I worked closely on emerging issues of election integrity, disinformation, and the changing nature of our information ecosystem. Our research uncovered a significant shift in disinformation tactics. Rather than merely targeting individual candidates, bad actors are now undermining the foundations of democracy itself — they sense blood in the water. In Taiwan, narratives portraying democracy as “messy, ineffective, and chaotic” gained traction, eroding public trust in democratic institutions. In Mexico, we observed how disinformation fuelled a climate of fear, distrust, and real-world harm — over 130 incidents of violence against political candidates were recorded.

The scale of this threat is substantial. In Taiwan, approximately one-third of observed disinformation campaigns had links to foreign actors, primarily China. The European Union saw a 40% increase in false narratives about election integrity compared to the previous election cycle. Climate change and the war in Ukraine were exploited and weaponised to expand the Overton window in harmful ways. The culprits? In many cases, the political parties themselves are central to the spread — often emboldened by an exhausted public — particularly those on the furthest right and left edges of the political spectrum.

[Figure: How much did China influence Taiwan's media?]

AI Development: A Subtle Yet Powerful Influence

Contrary to some predictions, AI’s role in election manipulation has been more nuanced and muted than expected. That said, its use is growing. In Indonesia, Prabowo Subianto and Gibran Rakabuming Raka successfully deployed the AI-generated “gemoy” rebrand to boost their support among Indonesian youth. The tactic obscured the candidate’s hard-line military background and diverted young voters’ attention from his alleged record of human rights violations by portraying him as a cute and cuddly avatar. Prabowo, in reality, is neither cute nor cuddly, and the rebrand represents the start of a new form of information distortion. While we did not observe widespread AI abuse, it’s important to remember that the speed and growth of this technology is without modern precedent. As a result, the tools used in 2025 will be far more sophisticated than those deployed in an election context this year.

Geopolitics: Shaping the Vectors of Disinformation

Our analysis revealed a clear geopolitical divide in foreign interference. While Taiwan and the EU faced external disinformation campaigns, countries like Indonesia and South Africa experienced minimal foreign meddling. This pattern underscores the need for context-specific approaches to election security. It also raises the question of whether there are tactics we are simply not yet aware of. The 2024 United States election — already seeing hacking operations from Iran and widespread disinformation campaigns from its own voter base — will prove to be a hugely consequential litmus test.

Platform Policies: Gaps in the Defense

Despite efforts by major social media platforms, significant vulnerabilities persist. In the EU, content moderation remains overwhelmingly English-centric, leaving millions of non-English-speaking voters exposed to unchecked disinformation. Even more concerning, 87% of Meta’s counter-misinformation funds are allocated to English-language cases, despite English speakers accounting for just 9% of global Facebook users. This dynamic has been exacerbated as many services, most notably X, have turned content moderation into a culture-war issue.

Civil Society: Crucial Yet Underfunded

Across the five elections we analysed, civil society organisations emerged as vital defenders against disinformation. In Taiwan, fact-checking teams worked tirelessly to debunk falsehoods in real time. The EU, particularly in the context of the recently implemented Digital Services Act (DSA), has seen strides forward too, albeit with some legitimate concerns around freedom of expression and appropriate red lines. However, these organisations often operate with limited resources. For example, in South Africa, researchers struggled to access crucial platform data, hampering their ability to track hate speech and disinformation effectively.

Independent Media: A Vital Check on Disinformation

As Nobel Peace Prize winner Maria Ressa recently noted, “In a quicksand world constantly erupting in violence, independent journalism is a lifeline. It is the bedrock of democracy and equitable development.” OECD data shows that the global print advertising market fell by nearly 40% between 2019 and 2024, devastating news outlets. Meanwhile, only 0.19% of total Official Development Assistance — just $500m — was allocated to supporting media and information in 2022.

Shrinking revenue sources are exacerbated by the sharp decline in the number of democracies around the world and the steady rise of illiberalism over the past decade. In 2023 alone, 42 countries experienced democratic backsliding or a rise in autocratic systems of government, affecting 2.8 billion people — 35% of the global population. Our research bears out this specific challenge: a distrusted and dying media ecosystem permits disinformation to spread like a virus, far beyond our collective ability to inoculate against it. Democracy and independent media exist in symbiosis: as one fractures, so too does the other.

The Taiwan Model: A Beacon of Hope

Taiwan’s response to disinformation offers valuable lessons. Their “whole-of-society” approach, involving collaboration between government, tech companies, civil society, and the public, proved remarkably effective. When disinformation campaigns struck, the response was swift and coordinated: fact-checkers debunked rumours, the electoral commission countered false claims, and even influencers helped spread accurate information. It was both sophisticated in its scale and microscopic in its execution. More time must be spent learning from others in this regard. 

Urgent Call to Action

Based on our findings, we propose the following actions:

  • Governments must implement clear regulations on AI use in political campaigning, particularly regarding advertising, with appropriate oversight and enforcement mechanisms. 
  • Social media platforms should significantly increase investment in linguistically diverse content moderation in the context of election disinformation, aiming for more balanced enforcement across languages.
  • Governments must establish a robust funding mechanism for civil society organisations combating disinformation, recognising their crucial role in defending democratic integrity.
  • Independent media, particularly those entities that historically have high integrity and high standards, must be appropriately bolstered to prevent market failure.

 

The fight against election disinformation is not just a technical challenge — it’s a critical battle for the future of democracy. Our research indicates that the current approach is insufficient to meet the evolving threat. As we look towards future elections, we must act decisively to strengthen our democratic processes against the tide of disinformation.

Yet, as we point fingers at bad actors and flawed systems, we must confront an uncomfortable truth: we, as a society, are complicit in this crisis of disinformation. Our collective tolerance for lies in public discourse has corroded the very foundations of our democracies. We consume media properties that twist facts, reward certain Internet platforms that prioritise division over truth, and elect leaders who shamelessly manipulate reality to suit their agendas. By cheaply giving away that most precious and innate of our faculties — our attention — we’ve become architects of our own democratic decline.

This tacit acceptance of dishonesty has profound implications — for our elections, our climate, our communities, and how we govern. It’s given rise to a political landscape where truth is optional, where campaigns are built on online takedowns rather than policies, and where our strongman and strongwoman leaders increasingly resemble reality stars rather than public servants.

The harsh reality is this: until we, the citizens, demand and uphold a higher standard of truth in our public sphere, no amount of fact-checking or platform regulation will save us. The future of democracy doesn’t just hinge on combating external threats — it depends on our willingness to look in the mirror and change our own behaviour, and to demand better from those who serve up what we read and consume and from the platforms we use every day. The question is, are we brave enough to face this reflection and do the hard work of rebuilding a culture of truth?

Ian Plunkett graduated with an MSc in Political Science & Communications from LSE in 2014. He’s the CMO of Alder Renewables & a Senior Advisor at Blue Owl Group, which he helped to co-found with colleagues from his time leading Global Policy Communications at Twitter. 

[This article represents the view of the author and does not necessarily reflect the stance of Polis or the LSE]

