Junyan Zhu

Rachel Isaacs

May 29th, 2024

Campaign microtargeting and AI can jeopardize democracy

Online political campaign microtargeting is likely to be a big part of this general election campaign. Generative AI could also be thrown into the mix, with the risk of fake audio recordings and images misleading voters. Junyan Zhu and Rachel Isaacs argue that, because these campaigning methods risk undermining democratic accountability, stricter regulation to enhance the transparency of online campaigns is needed.


In recent elections, political parties and campaigners have used social media platforms to deliver highly targeted political messaging, and online political advertising has become a significant part of campaigning over the past decade. This kind of microtargeting might have intensified under the Data Protection and Digital Information Bill, which aimed to loosen the rules on political parties using personal data to target voters. Coupled with an increase in campaign spending limits and the introduction of AI in political campaigning, which puts online misinformation on steroids, these developments raise serious concerns about campaign transparency and democratic accountability. With political campaigns inevitably becoming more technology-driven, we believe that evidence-based and timely policymaking is indispensable, especially for understanding the detrimental impacts of microtargeting and AI-driven campaigning and for discerning what constitutes acceptable democratic practice.

As part of a research paper studying the campaign strategies that major UK political parties deploy through online advertising, we manually coded over 5,000 political ads placed on Facebook between December 2018 and December 2023 by the Labour Party, the Conservative Party and the Liberal Democrats.
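We coded the ads manually, but for readers curious about how such a dataset could be assembled, the sketch below shows one way to pull UK political ads programmatically from Meta's Ad Library API. The endpoint and parameter names follow Meta's public documentation; the access token, API version and page IDs are illustrative placeholders, not a description of our actual pipeline.

```python
import os
import requests

# Sketch: collecting UK political ads from Meta's Ad Library API.
# Endpoint and parameters follow Meta's public documentation; the token,
# API version and page IDs below are placeholders.
AD_ARCHIVE_URL = "https://graph.facebook.com/v19.0/ads_archive"

def fetch_ads(page_id, date_min="2018-12-01", date_max="2023-12-31"):
    """Yield political ads run by one Facebook page over a date range."""
    params = {
        "access_token": os.environ["META_ACCESS_TOKEN"],  # requires Ad Library API access
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": '["GB"]',
        "search_page_ids": page_id,
        "ad_delivery_date_min": date_min,
        "ad_delivery_date_max": date_max,
        "fields": "id,page_name,ad_creative_bodies,"
                  "ad_delivery_start_time,ad_delivery_stop_time,delivery_by_region",
        "limit": 100,
    }
    url, query = AD_ARCHIVE_URL, params
    while url:
        resp = requests.get(url, params=query)
        resp.raise_for_status()
        payload = resp.json()
        yield from payload.get("data", [])
        # Follow the pagination cursor until the archive is exhausted.
        url = payload.get("paging", {}).get("next")
        query = None  # the "next" URL already carries the query string

# Hypothetical page IDs standing in for the three parties' official pages.
PARTY_PAGES = {"labour": "PAGE_ID_LABOUR",
               "conservatives": "PAGE_ID_CON",
               "libdems": "PAGE_ID_LD"}

if __name__ == "__main__":
    for party, page_id in PARTY_PAGES.items():
        ads = list(fetch_ads(page_id))
        print(f"{party}: {len(ads)} ads retrieved")
```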

We found numerous near-identical adverts built from the same template but with slight variations in messaging. For instance, a Tory advertisement claimed that re-election with more Tory seats would prevent the chaos of a Corbyn-led government or a hung parliament. The ad existed in two versions: 1) “this will help us Get Brexit Done” and 2) “this will help get parliament working again for you”.

Conservative Party ad from the 2019 campaign, with an image of Boris Johnson in the foreground and a threatening image of Jeremy Corbyn, Nicola Sturgeon and Jo Swinson at the bottom.


Conservative Party ad from the 2019 campaign, with a threatening image of Jeremy Corbyn, Nicola Sturgeon and Jo Swinson at the bottom.

Both ads, placed by the Conservative Party, were active on Facebook over the same period, from 21 to 24 November 2019. In otherwise identical advert text, one invoked Brexit while the other emphasised the gridlock in Parliament. In the EU referendum, Colchester voted Leave (53.6 per cent), whilst Battersea voted Remain (78 per cent). It is likely, therefore, that Brexit-related ads were targeted at, and appealed to, voters in Leave-inclined areas based on their geographical location.
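Spotting such template variants by hand is laborious. A rough way to surface them automatically is to compare ad texts pairwise and flag those that are almost, but not exactly, identical. The snippet below is a minimal illustration using Python's standard-library difflib; the ad texts are paraphrases of the variants described above, and the 0.8 cut-off is an arbitrary assumption, not a figure from our study.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Illustrative ad texts paraphrasing the two Conservative variants described above.
ads = {
    "variant_brexit":     "Re-elect us with more seats to stop the chaos of a Corbyn-led "
                          "government or a hung parliament - this will help us Get Brexit Done.",
    "variant_parliament": "Re-elect us with more seats to stop the chaos of a Corbyn-led "
                          "government or a hung parliament - this will help get parliament "
                          "working again for you.",
    "ad_unrelated":       "Our plan will fund 50,000 more nurses for the NHS.",
}

THRESHOLD = 0.8  # arbitrary cut-off: near-identical, but not verbatim, copies

# Compare every pair of ads and flag likely template variants.
for (name_a, text_a), (name_b, text_b) in combinations(ads.items(), 2):
    ratio = SequenceMatcher(None, text_a, text_b).ratio()
    if THRESHOLD <= ratio < 1.0:
        print(f"Template variant? {name_a} vs {name_b} (similarity {ratio:.2f})")
```

Running this flags the two Conservative variants (similarity around 0.8) while leaving the unrelated ad alone; in practice one would apply the same comparison across a whole party's ad archive.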

Tailoring messages based on perceived popularity is an important aspect of the evolving nature of political campaigning via social media. It is important to consider the impact of such tailoring on democracy and accountability. Does it become more challenging to hold parties to their promises if they offer different pledges to different demographics in order to secure voters’ support?

Studies have shown that political parties in Britain, Canada and the US have primarily used microtargeting to secure support from their base; persuading independent and undecided voters or weak partisans of other parties has only been a secondary aim. As deliberative theories of democracy suggest, political decisions should follow fair and reasonable discussion among citizens. If voters are increasingly and selectively exposed only to content they are likely to agree with, these democratic principles are at risk of being undermined.

Reinforcement of political beliefs via microtargeting has the potential to increase polarisation, with effects on the political landscape if not democracy itself. Some have argued that political microtargeting online will self-regulate, as parties need to maintain a consistent image to remain credible. The 2016 campaign successes of Brexit and Trump, despite the whirlwind of contradictions by politicians and parties all captured online, suggest otherwise.

On a positive note, some hope that personalising ads according to specific issues that voters care about could have mobilising potential. Studies on Get-Out-The-Vote campaigning, for example, have found microtargeting to significantly increase voter turnout. Others, however, are less optimistic about the implications of political microtargeting for democracy, expressing concerns over voter manipulation.

The changing political landscape

At the moment, the processing of personal data for targeted advertising is regulated by the UK General Data Protection Regulation and the Privacy and Electronic Communications Regulations, which require explicit consent from data subjects for direct online marketing. The new Data Protection and Digital Information Bill did not become law in time for this election. Had it passed, however, it would have loosened the campaigning rules for political parties using personal data to target voters with tailored social media messages, as long as this was done “for the purposes of democratic engagement”. Such a change would have given political parties greater access to personal data, likely increasing the scale and sophistication of microtargeted political adverts across digital platforms in upcoming elections.

Meanwhile, in late 2023 the Government used a statutory instrument to raise the limit on what political parties can spend on campaigning, from £19.5 million to £35 million, an increase of roughly 80 per cent. This change is likely to be particularly beneficial for the Tories, a well-resourced party capable of raising large sums for campaigning, whereas smaller parties will struggle to compete. The historical pattern is clear: in every general election from 2001 to 2019 the Conservatives outspent Labour, except in 2005, when Labour marginally outspent them by 0.5 per cent. More recently, in the third quarter of 2023, the Conservatives spent more than three times as much as Labour.
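As a quick check on the quoted figure, the rise from £19.5 million to £35 million works out at just under 80 per cent:

```python
# Quick arithmetic check on the spending-limit increase quoted above.
old_limit, new_limit = 19.5, 35.0  # campaign spending limits in £ million
increase_pct = (new_limit - old_limit) / old_limit * 100
print(f"Increase: {increase_pct:.1f}%")  # ~79.5%, i.e. roughly 80 per cent
```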

AI in political campaigns

The patterns we have observed with parties tailoring their messaging to certain audiences are likely to intensify in the run-up to the General Election. Aside from this, generative AI is expected to emerge as a new player in political campaigning.

One notable case of AI’s role in recent elections is Slovakia’s 2023 parliamentary elections, when fake audio recordings released two days before the vote suggested that Michal Šimečka and a journalist were discussing how to rig the election. The recording was later proven to be a deepfake, but the timing of its release had the potential to sway a tight race; Šimečka was defeated. Another example that sparked controversy was the spread on social media of deepfake images of Trump with Black voters, created by Trump supporters in an attempt to persuade African Americans to vote for him.

Online disinformation has been a persistent problem in the digital landscape, exemplified by the proliferation of fake news and misinformation during the 2016 Brexit referendum and US presidential election. As we brace for the forthcoming election cycle, a threat that looms large is AI-driven disinformation: fabricated audio, video and deepfakes. Even OpenAI’s own CEO, Sam Altman, has pleaded with the US Congress to regulate AI in political campaigns.

Research has shown that microtargeting techniques have the potential to magnify the impact of a deepfake. Given concerns about both fine-grained targeting and new AI-based information technologies, we believe the central focus of this debate should be the imperative need for stringent regulation of online content, robust fact-checking mechanisms, and rules that enhance the transparency of online campaign material, helping voters navigate the information landscape.


Acknowledgements: Samuel Elson, a second-year undergraduate in Sheffield Politics, contributed to drafting the writing on data use and progress on legislation. Lucy Goodson, a second-year undergraduate in Sheffield Politics, contributed to the ideas on the use of AI in political campaigns.

All articles posted on this blog give the views of the author(s), and not the position of LSE British Politics and Policy, nor of the London School of Economics and Political Science.

Image credit: photosync on Shutterstock


About the authors

Junyan Zhu

Junyan Zhu is Research Associate in the Department of Politics and International Relations at the University of Sheffield.

Rachel Isaacs

Rachel Isaacs is a master's student in Social Research at the Sheffield Methods Institute, University of Sheffield.

Posted In: General Election 2024 | Media and Politics
This work by British Politics and Policy at LSE is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported.