The General Data Protection Regulation (GDPR) is due to become law throughout the EU on 25 May 2018. Recent consultations from the UK’s Department for Culture, Media and Sport (DCMS) and the Information Commissioner’s Office (ICO) have sought views on, among other aspects, the implications for children and their data. LSE Professor Sonia Livingstone explains some of the issues that need particular consideration from a children’s rights perspective.
The ICO consultation on consent guidance for the GDPR raised a host of questions regarding the datafication of our once-private lives. Questions of consent for “information society services” (i.e. online services with a commercial dimension) are particularly difficult when it comes to children. What can children consent to? What understanding of data, privacy and the online commercial environment is needed for their consent to be informed? Or, should parents be the ones to give consent for children’s use of such services? Or, again, should age limits restrict children below a certain age from using such services at all? And if so, what age limit and on the basis of what evidence?
These are just a few of the many questions puzzling even the experts in this field, as my colleagues and I have been blogging recently; I offer an overview here.
The stakeholder community concerned with children’s rights, safety and wellbeing is deeply troubled by the process of consultation surrounding the GDPR in the UK and in Europe. A campaign is now building to reduce the age of consent in the GDPR (Article 8) from 16 to 13, on the grounds that children have the right to participate in the online world. But equally there are serious grounds for concern that at 13 children do not understand how their data are being commercially exploited or how their privacy is put at risk. Nor does it help that those claiming children can give informed consent cite old research relating to an “internet” far less commercial and complex than the one children face today.
The ICO had earlier promised a consultation on children, though now it seems merely to say that “We are continuing our analysis of the GDPR provisions specific to children’s personal data and will look to publish some outputs this year.” While we wait, and hope for clear direction – and time to implement it – before the GDPR becomes law in May 2018, there are some significant indicators in the draft guidance on consent of what is to come. But for many of them it is hard to see how they will work in practice:
- When are children’s data to be collected on the basis of legitimate interests or consent (the child’s or a parent’s)? The ICO says “You may find it beneficial to consider ‘legitimate interests’ as a potential lawful basis instead of consent. This will help ensure you assess the impact of your processing on children and consider whether it is fair and proportionate.” This appears to be an explicit invitation to companies to avoid the need for consent, especially in the absence of specific ICO guidance on the circumstances in which private actors may legitimately proceed on this basis.
- What is to happen when the service provider professes not to know, or simply does not know, whether a user is a child whose consent should be sought? These problems are legion, given the number of children using services “under age”, for which Facebook, for instance, simply blames the parents, and about which the GDPR seemingly imposes no requirement on companies to make their own assessment of age. Nor, even, does it require the data protection authorities to take action in cases where companies appear deliberately not to ask – as in the case of Instagram.
- This last problem is most pressing in relation to services used by children, as opposed to those directly offered to or targeted at children – the linguistic slippage between “used by” and “targeted at” is endemic. But crucially, it is services actually used by children, irrespective of the services’ intent, that should protect them from exploitation.
- The ICO further explains that “parental consent will always expire when the child reaches the age at which they can consent for themselves. You need therefore to review and refresh children’s consent at appropriate milestones.” But how this will – or even can – work once a child’s data have been shared with other providers is far from clear. How can the originating data collector possibly maintain records that would permit withdrawal of the data once the child comes of age? How will all the small services that are proving problematic for children comply? The ICO guidance available so far does not explain, and children themselves are greatly worried on this point.
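To make the record-keeping problem concrete, here is a minimal sketch (in Python, purely illustrative: the class, its field names and the age threshold are my assumptions, not anything specified by the ICO or the GDPR) of the bare minimum a data controller would need to track to know when parental consent expires and which downstream recipients would then have to be contacted:

```python
from dataclasses import dataclass, field
from datetime import date

# Assumed threshold: the GDPR lets member states set this anywhere from 13 to 16.
AGE_OF_DIGITAL_CONSENT = 13

@dataclass
class ConsentRecord:
    child_dob: date
    consent_given_by: str            # "parent" or "child"
    consent_date: date
    shared_with: list = field(default_factory=list)  # downstream data recipients

    def age_on(self, today: date) -> int:
        # Age in completed years on a given date.
        years = today.year - self.child_dob.year
        if (today.month, today.day) < (self.child_dob.month, self.child_dob.day):
            years -= 1
        return years

    def needs_refresh(self, today: date) -> bool:
        # Parental consent "expires" once the child can consent for themselves,
        # so the controller must then seek the child's own consent.
        return (self.consent_given_by == "parent"
                and self.age_on(today) >= AGE_OF_DIGITAL_CONSENT)

record = ConsentRecord(child_dob=date(2006, 3, 1),
                       consent_given_by="parent",
                       consent_date=date(2017, 6, 1),
                       shared_with=["ad-network.example", "analytics.example"])

if record.needs_refresh(today=date(2019, 6, 1)):
    # The part the guidance leaves open: if refreshed consent is refused or
    # withdrawn, every recipient in shared_with must also be told to act on it.
    print("Consent must be refreshed; downstream recipients:", record.shared_with)
```

Even this toy version shows where the difficulty lies: detecting the age milestone is trivial for a single controller, but the shared_with list must be complete and kept current for any refresh or withdrawal to propagate, and nothing in the guidance so far explains how that is to be achieved across providers.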
Coinciding with the ICO consent consultation was a consultation from the DCMS on GDPR derogations, which includes, as part of an easily missed list, the thorny question of the Processing of Children’s Personal Data by Online Services (Article 8). As I have pointed out before, the intent of Article 8 of the GDPR is to reduce children’s vulnerability to commercial and data risks under the age of 16. But it does so by the simple expedient of preventing their access to all information society services unless they have a parent’s express consent and, presumably thereby, oversight and protection. This is, in essence, a crude and unsatisfactory mechanism for trying to bring about a worthy end.
13 or 16?
Do children need such protection? Based on an analysis of 2016 Ofcom data (which is indicative but insufficient; more research is urgently needed on children’s understanding of privacy, consent and the commercial digital environment), broadly speaking, teenagers’ commercial media literacy increases between the ages of 12 and 15 (although not necessarily much thereafter). This suggests that requiring parental consent up to the age of 16 would have benefits for children’s privacy and data protection. Since the evidence suggests that children progressively gain in media literacy with age, experience and maturity, it follows that they should rightfully be protected by parents and regulation when younger than 16.
However, as children grow older, rules that forbid them from using social networking sites, for instance, become less effective. A requirement of parental consent for older teenagers could therefore result both in increased deception and evasion on teenagers’ part and in inequalities between those who can and cannot obtain parental consent in practice. Also, and even more important in terms of children’s rights in the digital environment, if the UK selects 16 rather than 13 as the age of parental consent for children’s internet use, this will be to the likely and reasonable dismay of teenagers, reducing if not eliminating their opportunities for creative, educational, civic and communicative activities online.
It is possible, perhaps, that the observed gap in commercial literacy between 13 and 16 year olds could be filled by more and better media education in school for all children, certainly from 11 years old (ready for 13), if not earlier, so that they learn the critical skills needed to protect themselves in the commercial environment. If, therefore, the age is to be reduced to 13, one would wish to see a clear and sustained commitment that schools will provide effective and compulsory media literacy education to all in years 7, 8 and 9 if not beginning earlier.
But this would not be enough on its own. Can we really expect our schools to teach children to spot the kinds of algorithmic manipulation of their emotions that, it was recently reported, Facebook may be considering? I say “may” because Facebook has denied any such thing. But then, as an ex-Facebook executive recently stated, they would, wouldn’t they? So surely, if the UK reduces the age of consent in the GDPR to 13, this would have to be accompanied by regulation (not self-regulation, which is hardly proven effective) guaranteeing fairer dealing with children by companies, so that children have a fair chance to understand properly, from a younger age, how online services are funded, how their data are treated, and what choices and forms of redress are available to them. Even more important would be ensuring that children are offered real choices on a granular basis, so that they are not faced with the take-it-or-leave-it “choice” between giving up their data to join their friends on a popular service and keeping their privacy at a real social and informational cost.
Data protection rules were not designed with children in mind. However, as stated in the recently adopted EU Guidelines for the promotion and protection of the rights of the child, policy must ensure a “rights-based approach encompassing all human rights … based on the universality and indivisibility of human rights, the principles of participation; non-discrimination, transparency and accountability.” It is time to embed a “children’s rights mindset” into the policy process at the start of any institutional or regulatory design. Empirical work by scholars highlights the frustrations children face in adapting to practices constructed by adults, practices that often leave them with few or no viable options.[1]
My firm hope, therefore, is that the UK government will:
- Either, reduce the age to 13 to permit children’s online participation but simultaneously find a legal way to prevent companies marketing to, profiting from and profiling or otherwise exploiting a child’s data under the age of 16. And substantially improve media education for all children.
- Or, keep the age at 16 but find ways to publicly fund or otherwise support children’s online participation through provision of non-commercial services of widespread value and interest to them so that they can benefit from the digital age without loss of privacy or unfair exploitation of their data.
The GDPR’s provisions to protect children and their developmental needs and interests are at present unclear at best and woefully inadequate at worst. In practical terms, it is hard to see how children’s rights both to participate and to privacy can be protected, whether 13 or 16 is chosen in the UK. It is this larger matter that demands urgent and serious attention.
[1] With thanks to Joseph Savirimuthu, University of Liverpool, for this point.
This post gives the views of the author and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science.
Hi!
Thanks for the article.
Here’s another issue with Article 8 of the GDPR that I was asked about by one of our volunteers who works with children in foster care homes:
– How does Article 8 apply to children in care (residential care)?
– Who will decide for them?
1. The residential care institution responsible for the child, I guess… it is the legal representative of the child during the period the court decides the child needs to be in residential care.
2. Until when? Until the protection measure that brought the child into residential care ends?
2.1 When the child turns 18, no doubt, but what happens when the child goes back to the family? Will the parents have to validate the decisions made by the residential care institution responsible for the child?
– Have any of these points been taken into consideration?
Best regards
Tito de Morais
It feels a bit off that the limit set by the USA doesn’t get a mention. Many of the big social media services are based in the USA and follow restrictions set by US legislation.
From my own experience, the age-13 limit looks to match US law. While details of the sexual age of consent vary between states, there is a federal age-18 limit that defines child pornography. It’s likely that internet companies have those ages in their terms and conditions of service.
Whatever limits are set in UK law, 13-year-olds need to be better educated, and the way that UK law treats the 16-18 range is also a problem. Essentially, they can lawfully do things, but talking about them on the internet can get them into all sorts of trouble.
We may not have much choice for the age limits, but there is an ample supply of catering-size cans of worms buried in the wider landscape.