
Sonia Livingstone

May 11th, 2017

Children’s rights and the GDPR – are the new consultations creating light or further confusion?


The General Data Protection Regulation (GDPR) is due to become law throughout the EU on 25 May 2018. Recent consultations from the UK’s Department for Culture, Media and Sport (DCMS) and the Information Commissioner’s Office (ICO) have sought views on, among other aspects, the implications for children and their data. LSE Professor Sonia Livingstone explains some of the issues that need particular consideration from a children’s rights perspective.

The ICO consultation on consent guidance for the GDPR raised a host of questions regarding the datafication of our once-private lives. Questions of consent for “information society services” (i.e. online services with a commercial dimension) are particularly difficult when it comes to children. What can children consent to? What understanding of data, privacy and the online commercial environment is needed for their consent to be informed? Or, should parents be the ones to give consent for children’s use of such services? Or, again, should age limits restrict children below a certain age from using such services at all? And if so, what age limit and on the basis of what evidence?

These are just a few of the many questions puzzling even experts in this field, as my colleagues and I have been blogging about recently; I give an overview here.

The stakeholder community concerned with children’s rights, safety and wellbeing is deeply worried about the process of consultation surrounding the GDPR in the UK and in Europe. A campaign is now building to reduce the age of consent in the GDPR (Article 8) from 16 to 13, on the grounds that children have the right to participate in the online world. But there are equally serious grounds for concern that at 13 children do not understand how their data are being commercially exploited and their privacy put at risk. And it does not help that those claiming that children can give informed consent cite old research relating to an “internet” far less commercial and complex than the one children face today.

The ICO had earlier promised a consultation on children, though now it seems merely to say that “We are continuing our analysis of the GDPR provisions specific to children’s personal data and will look to publish some outputs this year.” While we wait, and hope for clear direction – and time to implement it – before the GDPR becomes law in May 2018, there are some significant indicators in the draft guidance on consent of what is to come. But for many of them it is hard to see how they will work in practice:

  • When are children’s data to be collected on the basis of legitimate interests or consent (child’s or parent’s)? The ICO says “You may find it beneficial to consider ‘legitimate interests’ as a potential lawful basis instead of consent. This will help ensure you assess the impact of your processing on children and consider whether it is fair and proportionate.” This appears to be an explicit invitation to companies to avoid the need for consent, especially in the absence of specific ICO guidance on the circumstances in which private actors might correctly act in this way.
  • What is to happen when the service provider professes not to know, or genuinely does not know, whether a user is a child whose consent should be sought? These problems are legion, given the number of children using services “under age”, for which Facebook, for instance, simply blames the parents, and about which the GDPR seemingly imposes no requirement on companies to make their own assessment of age. Nor does it even require the data protection authorities to take action in cases where companies appear deliberately not to ask – as in the case of Instagram.
  • This last problem is most pressing in relation to services used by children, as opposed to those directly offered to or targeted at children – the linguistic slippage between “used by” and “targeted at” is endemic. But crucially, it is services actually used by children, irrespective of the services’ intent, that should protect them from exploitation.
  • The ICO further explains that “parental consent will always expire when the child reaches the age at which they can consent for themselves. You need therefore to review and refresh children’s consent at appropriate milestones.” But how this will – or even can – work once a child’s data has been shared with other providers is far from clear. How can the originating data collector possibly maintain records that would permit withdrawal of the data when the child is of age? How will all the small services proving problematic for children comply? The ICO guidance available so far does not explain. And children are greatly worried on this point.
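The ICO’s expiry rule quoted above – parental consent lapses once the child can consent for themselves – is, at least within a single service, a simple date calculation. The sketch below is purely illustrative and not from any ICO or GDPR text: the function name, record shape and the configurable age threshold (13 or 16, depending on the derogation a member state chooses) are my own assumptions. It shows the easy part of compliance; as noted above, the genuinely hard part is honouring expiry once the child’s data has been shared with other providers.

```python
from datetime import date

# Hypothetical threshold: Article 8 sets 16 by default but allows
# member states to lower it, to a minimum of 13.
AGE_OF_DIGITAL_CONSENT = 13

def parental_consent_valid(birth_date: date, today: date) -> bool:
    """Parental consent is valid only while the child is below the
    age at which they can consent for themselves (hypothetical model)."""
    # Compute age in whole years, accounting for whether the
    # birthday has occurred yet this year.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age < AGE_OF_DIGITAL_CONSENT
```

On this model, a service would need to re-check validity at every “appropriate milestone” and seek fresh consent from the child once the function returns False – which is exactly the record-keeping burden the guidance leaves unexplained for data already passed to third parties.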

Coinciding with the ICO consent consultation was a consultation from DCMS on GDPR derogations, which includes, as part of an easily missed list, the thorny question of the Processing of Children’s Personal Data by Online Services (Article 8). As I have pointed out before, the intent of Article 8 of the GDPR is to reduce children’s vulnerability to commercial and data risks under the age of 16. But it does so by the simple expedient of preventing their access to all information society services unless they have a parent’s express consent and, presumably thereby, oversight and protection. This is, in essence, a crude and unsatisfactory mechanism to try to bring about a worthy end.

13 or 16?

Do children need such protection? Based on an analysis of 2016 Ofcom data (which is indicative but insufficient, with more research urgently needed on children’s understanding of privacy, consent and the commercial digital environment), broadly speaking, teenagers’ commercial media literacy increases from the ages of 12 to 15 (although not necessarily much more thereafter). This suggests that requiring parental consent up to the age of 16 would have benefits in terms of children’s privacy and data protection. Since the evidence suggests that children progressively gain in media literacy with age, experience and maturity, it can be concluded that they should rightfully be protected by parents and regulation when younger than 16.

However, as children grow older, setting rules that they are not allowed to be on social networking sites, for instance, becomes less effective. A requirement for parental consent for older teenagers could, therefore, result both in increased deception and evasion on teenagers’ part and inequalities in who can or cannot obtain parental consent in practice. Also, and even more important in terms of children’s rights in the digital environment, if the UK selects 16 rather than 13 as the age for parental consent of children’s internet use, it will be to the likely and reasonable dismay of teenagers, reducing if not eliminating their opportunities for creative, educational, civic and communicative activities online.

It is possible, perhaps, that the observed gap in commercial literacy between 13 and 16 year olds could be filled by more and better media education in school for all children, certainly from 11 years old (ready for 13), if not earlier, so that they learn the critical skills needed to protect themselves in the commercial environment. If, therefore, the age is to be reduced to 13, one would wish to see a clear and sustained commitment that schools will provide effective and compulsory media literacy education to all in years 7, 8 and 9 if not beginning earlier.

But this would not be enough on its own. Can we really expect our schools to teach children to spot the kinds of algorithmic manipulation of their emotions that, it was recently reported, Facebook may be considering? I say “may” because Facebook has denied any such thing. But then, as an ex-Facebook executive recently stated, they would, wouldn’t they? So surely, if the UK reduces the age of consent in the GDPR to 13, it would have to be accompanied by regulation (and not self-regulation, which is hardly proven to be effective) guaranteeing fairer dealing with children by companies, so that children have a fair chance to understand properly, from a younger age, how online services are funded, how their data are treated, and what choices and forms of redress are available to them. Even more important would be ensuring that children are provided with real choices on a granular basis, so that they are not faced with the take-it-or-leave-it “choice” to give up their data to join their friends on a popular service or keep their privacy at a real social and informational cost.

Data protection rules were not designed with children in mind. However, as stated in the recently adopted EU Guidelines for the promotion and protection of the rights of the child, policy must ensure a “rights-based approach encompassing all human rights … based on the universality and indivisibility of human rights, the principles of participation; non-discrimination, transparency and accountability.” It is time to embed into the policy process a “children’s rights mindset” at the start of any institutional or regulatory design. Empirical work undertaken by scholars highlights the frustrations children face in adapting to practices constructed by adults, which often leave them with few or no viable options.[1]

My firm hope, therefore, is that the UK government will:

  • Either, reduce the age to 13 to permit children’s online participation but simultaneously find a legal way to prevent companies marketing to, profiting from and profiling or otherwise exploiting a child’s data under the age of 16. And substantially improve media education for all children.
  • Or, keep the age at 16 but find ways to publicly fund or otherwise support children’s online participation through provision of non-commercial services of widespread value and interest to them so that they can benefit from the digital age without loss of privacy or unfair exploitation of their data.

The provisions in the GDPR to protect children, their developmental needs and interests at present are unclear at best and woefully inadequate at worst. In practical terms, it is hard to imagine how children’s rights both to participate and to privacy can be protected, whether 13 or 16 is chosen in the UK. It is this larger matter that demands urgent and serious attention.

[1] With thanks to Joseph Savirimuthu, University of Liverpool, for this point.

This post gives the views of the author and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science.

About the author

Sonia Livingstone

Sonia Livingstone OBE is Professor of Social Psychology in the Department of Media and Communications at LSE. Taking a comparative, critical and contextual approach, her research examines how the changing conditions of mediation are reshaping everyday practices and possibilities for action. She has published twenty books on media audiences, media literacy and media regulation, with a particular focus on the opportunities and risks of digital media use in the everyday lives of children and young people. Her most recent book is The class: living and learning in the digital age (2016, with Julian Sefton-Green). Sonia has advised the UK government, European Commission, European Parliament, Council of Europe and other national and international organisations on children’s rights, risks and safety in the digital age. She was awarded the title of Officer of the Order of the British Empire (OBE) in 2014 'for services to children and child internet safety.' Sonia Livingstone is a fellow of the Academy of Social Sciences, the British Psychological Society, the Royal Society for the Arts and fellow and past President of the International Communication Association (ICA). She has been visiting professor at the Universities of Bergen, Copenhagen, Harvard, Illinois, Milan, Oslo, Paris II, Pennsylvania, and Stockholm, and is on the editorial board of several leading journals. She is on the Executive Board of the UK Council for Child Internet Safety, is a member of the Internet Watch Foundation’s Ethics Committee, is an Expert Advisor to the Council of Europe, and was recently Special Advisor to the House of Lords’ Select Committee on Communications, among other roles. Sonia has received many awards and honours, including honorary doctorates from the University of Montreal, Université Panthéon Assas, the Erasmus University of Rotterdam, the University of the Basque Country, and the University of Copenhagen. 
She is currently leading the project Global Kids Online (with UNICEF Office of Research-Innocenti and EU Kids Online), researching children’s understanding of digital privacy (funded by the Information Commissioner’s Office) and writing a book with Alicia Blum-Ross called ‘Parenting for a Digital Future’ (Oxford University Press), among other research, impact and writing projects. Sonia is chairing LSE’s Truth, Trust and Technology Commission in 2017-2018, and participates in the European Commission-funded research networks, DigiLitEY and MakEY. She runs a blog called www.parenting.digital and contributes to the LSE’s Media Policy Project blog. Follow her on Twitter @Livingstone_S

Posted In: Children and the Media | Data Protection | Featured | Media Literacy | Privacy
