
Luke McDonagh

Emma Perot

Luis Porangaba

March 13th, 2025

Should we introduce a new right against AI replicas?


There is increasing concern among artists about the impact of new AI technologies. One of those worries is AI’s ability to create digital replicas of people’s voices and images. Luke McDonagh, Emma Perot and Luis Porangaba ask whether it’s time the UK introduced a new right to protect against this.




Given recent high-profile stories about AI cloning or manipulation of the voices of famous personalities such as Scarlett Johansson, David Attenborough and Adrien Brody, it is no surprise that artists and celebrities fear that new AI technologies could have huge consequences for the creative industries. A common concern is that AI could enable widespread unauthorised use – and, according to many, misappropriation – of an artist’s voice, image, or persona via the creation of a “digital replica”. To what extent can (and should) the law prevent this?

Actors’ unions in the UK and US have long maintained that consent should always be sought when seeking to use an actor’s voice in the commercial arena.

A recent report we contributed to shows that, at present, the UK does not have a distinct personality right, image right or right of publicity, but rather a patchwork of legal protections. Despite the absence of a personality right, it is standard practice in the UK for media entities and commercial firms in some sectors to seek permission, and enter into a contract, when using a celebrity’s persona or image in an endorsement or advertisement. Actors’ unions in the UK and US have long maintained that consent should always be sought when seeking to use an actor’s voice in the commercial arena. Yet, with the rapid spread of easy-to-use AI technologies, these standard practices risk significant disruption.

An ongoing strike by US video game voice actors indicates that AI may already be disrupting established norms in the creative industries. Indeed, the US is currently contemplating reform via a federal “No Fakes” law aimed at preventing the creation of a digital replica without the individual creator’s or actor’s consent. Would a similar reform to UK law give artists greater certainty and limit unauthorised use of their voice, image, or likeness? Before we can answer this question, we must consider the patchwork of existing protections: performers’ rights, passing off, and data protection.

Existing UK Law on performers’ rights, data protection and passing off

If their performances have been used to train AI, performers could be remunerated under the existing performers’ rights in the UK Copyright, Designs and Patents Act (CDPA) 1988, which could potentially apply where the AI is active in the market. However, these rights are limited to capturing performances (e.g., recorded songs or films) and do not protect a persona in the abstract. As such, they offer limited redress for performers affected by digital replicas. Meanwhile, a performer’s data protection rights, including the GDPR’s requirements on consent and transparency, could provide some avenue for redress, but in practice this remains highly uncertain.

The question is: can unauthorised commercial use of a digital replica be prevented by the law of passing off?

A more plausible option for preventing an unauthorised digital replica of a person would be to turn to the law of passing off. Passing off has three elements: (a) goodwill or reputation in a name, image, likeness, etc.; (b) misrepresentation to the public, such that members of the public are likely to believe that the claimant has endorsed the defendant’s goods or services; and (c) damage to the claimant.

The question is: can unauthorised commercial use of a digital replica be prevented by the law of passing off? In some cases, yes. The English courts could develop the law to include digital replicas of famous celebrities who have accrued goodwill. While passing off has traditionally been applied to protect against false endorsement in merchandising and advertising, it may offer enough flexibility to be extended to other fora that use personas, such as films, television, and video games.

Yet passing off is unlikely to provide an avenue for redress to individuals who do not meet the current standard of goodwill. Unless courts decide to revisit this standard, only well-known actors and celebrities, i.e. those with a substantial reputation, are likely to be able to avail themselves of it (should it be extended to include digital replicas). This potentially limits legal recourse for small-scale artists, such as background actors and video game voice actors.

The historical, longstanding position in the UK is that image, voice, and likeness are not protected as such.

Moreover, passing off is founded on preventing misrepresentation. As with the recent use of the late Michael Parkinson’s voice, if an AI “clone” voice is used without misrepresentation – i.e. without the pretence that it is the real person – passing off may not be available.

A new personality right in UK law?

Despite the gaps in the current law, we argue it would be premature to introduce a new statutory right such as a personality-based AI replica right. Evidence from various industries would be needed to fully understand and predict the implications, transaction costs, and normative trade-offs involved.

As noted above, the historical, longstanding position in the UK is that image, voice, and likeness are not protected as such. This traditional position has fostered a permissive environment, set rational standards of behaviour, and created expectations that a new statutory right could unsettle.

Perhaps of greatest concern is the unintended consequence of limiting freedom of expression. Any new right would have to be tightly defined and limited in scope to ensure that biographical and historical information can be conveyed and that news reporting is not unduly hindered.

This is crucial because telling the story of a famous individual in, say, a “biopic” film or TV series necessitates a “replica” of that individual: an actor is expected to look, speak and behave like the person portrayed. At present, unauthorised “biopics” are legal in the UK, though they must comply with relevant non-IP laws such as defamation and privacy law.

It is understandable that artists are concerned about the impact of new AI technologies. Yet, a rush to grant a new UK personality right to individuals, even if well intentioned, might also result in negative consequences.

For the individual person or celebrity, a new right could prove lucrative, in that use of their life story could attract a mandatory licensing fee. However, this creates its own problems. The person would have a veto allowing them to censor or restrict the content created. Arguably, this would be detrimental to the public interest, with potentially significant downstream effects on creative industries that currently face no such restrictions.

Concerns about the term of protection of any new right (and rules on inheritance and succession) would also need to be addressed. In many civil law jurisdictions, personality-based rights, such as the author’s moral right of integrity in France, are understood to be perpetual, continuing even after the artist’s economic rights have expired. In the digital replica context, such a perpetual term could be excessive.

It is also worth raising the concern that the individual could be pressured into signing over the personality right to a more powerful player. Inequality of bargaining power in the creative industries often results in such deals being struck, such as when actors or authors waive their moral rights via contract. This occurs in part because of a lack of understanding of what individuals are agreeing to, and in part because of the pressure exerted on the individual to sign.

It is understandable that artists are concerned about the impact of new AI technologies. Yet, a rush to grant a new UK personality right to individuals, even if well intentioned, might also result in negative consequences. Amid the flurry of AI hype, a note of caution is required.


All articles posted on this blog give the views of the author(s), and not the position of LSE British Politics and Policy, nor of the London School of Economics and Political Science.

Image credit: TSViPhoto on Shutterstock



About the author

Luke McDonagh

Luke McDonagh is Associate Professor at LSE Law. His most recent monograph is Performing Copyright: Law, Theatre and Authorship.

Emma Perot

Emma Perot is a Lecturer in Law at the University of the West Indies.

Luis Porangaba

Luis Porangaba is Senior Lecturer in Intellectual Property Law, School of Law, University of Glasgow.
