The General Data Protection Regulation, due to become law across the EU in May 2018, proposes introducing 16 as the minimum age at which a person can join an online service without the consent of their parents. In a post based on his open letter to the European Data Protection Supervisor and the Chair of the Article 29 Working Party, John Carr, Expert Adviser to the European NGO Alliance for Child Safety Online and member of the Executive Board of the UK Council for Child Internet Safety, explains why there needs to be more discussion about children’s rights in the GDPR.
The General Data Protection Regulation (GDPR) will have a substantial impact on the way in which adult citizens’ rights to data privacy are protected and enjoyed. However, at its core, it is possible to see much of the GDPR as an evolution of a pre-existing regime which, as a result, is reasonably well understood by a body of lawyers, companies, NGOs and regulators who work in the space.
The same cannot be said in relation to the position of children.
The old order
A critical question which is addressed very directly in the GDPR is the age at which a young person may decide for themselves whether or not to use an online service without the service provider having to obtain the verifiable consent of a parent.
In the legal regime being replaced by the GDPR, established by the 1995 Data Protection Directive, the words “children” and “age” do not appear at all. Not once. The Directive merely makes a general declaration about the importance of data being processed “fairly”. In that context, an individual’s age or level of understanding would be a relevant factor, but there is little jurisprudence to help spell out what this might mean in practice or at a detailed level.
A de facto age standard emerged
In reality a de facto single standard based on the age of 13 emerged more or less EU-wide, by virtue of decisions taken by the giant US social media platforms pursuant to the terms of a US federal law, the Children’s Online Privacy Protection Act (COPPA) of 1998.
Because of COPPA, businesses directly addressing themselves to young children – meaning persons under 13 – had to obtain verifiable parental consent for the child to be able to join or remain a member of their service. Disney is an example of a company that routinely does this for many if not most of its online services.
However, in relation to sites that sought out an older clientele, COPPA imposed no obligation to age-verify anyone, so the major social media platforms simply drew a line at 13: they said nobody under that age could be a member. While this obviated the need for the sites to engage with the messy and potentially expensive business of seeking parental consent or age verification, it also led to very high levels of misrepresentation by young people who wanted to “hang out in the cool places”. The businesses concerned actively marketed themselves in ways which were bound to have this effect.
The term “more or less EU-wide” is important. In 2007 Spain decided to be different: it expressly abandoned 13 and chose 14 as its local minimum. Holland already had a standard of 16. It should be noted that today Google’s Terms and Conditions state 14 and 16 respectively as the relevant minimum ages for users in those countries whereas, inexplicably, Facebook refers only to 13 everywhere.
The GDPR marks a radical departure from past practice with regard to children. The Commission’s original proposal, published in the draft consultation document issued in 2012, was to recognise and establish in law a single EU-wide age of consent for data, and for this to be 13. Essentially this would have entrenched the de facto status quo, forcing Spain and Holland (not to mention the UK and possibly others) to change their existing rules. However, the proposal was thrown out at the last minute. In its place the final version of Article 8 gave Member States the power to choose any minimum age between 13 and 16, by way of a “derogation”. Absent such a derogation, the age will become 16 automatically in May 2018, when the GDPR takes effect.
A question of evidence
The EU is locked into a path which means Member States must now choose within the age range specified in Article 8. However, if the right people move swiftly enough it should still be possible for the Commission, the European Data Protection Supervisor (EDPS) or an individual Data Protection Authority (DPA) to evaluate whatever research currently exists and initiate new, highly targeted research – to help guide national Parliaments or Governments when they make their final decision on derogation.
The new research ought to take account of the contemporary internet and the services used by children. It should have regard to and analyse children’s ability to understand the nature of the commercial environments they are moving into when they join or use different services, how their data is collected and processed and how it is used. Above all the research should be mindful of children’s rights to be consulted on matters affecting them.
It is acknowledged that a great many adults have a poor understanding of data privacy matters, but that is not a reason for accepting that the same should be true for children. Typically, adults can call on other protective factors.
The matter of grooming
Only in Ireland, Malta and Cyprus is the age of consent to sex higher than any of the available options for consent to data transactions provided in Article 8.
This means that, other than within these three countries, the age of consent for data purposes could be the same as or lower than the age of consent to sex. Potentially, therefore, anyone who visits or uses a site or service after May 2018 will, on the face of it, be entitled to assume that everyone they encounter is old enough to engage in sexual activity. Will this not compromise or impact upon the operation of the grooming laws in a most unhelpful way? Can this be avoided? This is particularly important because, in the absence of an efficient and compulsory age verification regime, if the past is anything to go by there will still be considerable numbers of children on these sites and services who have lied about their age and are, in fact, below the age of consent both to sex and to data.
The only available age limit which avoids a potential complication vis-à-vis the grooming laws is 13, because in no EU Member State is the age of consent to sex as low as 13.
Who is a child under the GDPR?
The GDPR does not define what a child is. However, every EU Member State is a signatory to the UNCRC, and the EU itself recognises the primacy of treaties such as the UNCRC. The UNCRC defines a child as someone who is under the age of 18.
It would therefore be highly desirable for DPAs and the EDPS to confirm at the earliest opportunity that, unless the context specifically provides otherwise, wherever the GDPR refers to children it means persons under the age of 18. This is particularly important because of what the GDPR says about “profiling”: where children are concerned, it is forbidden.
Urgent attention is needed
In this short blog I have only been able to highlight some of what I think are the bigger and more pressing points. In my open letter I touch on several others, but there urgently needs to be an EU-wide discussion of these matters. It ought to involve substantial representation from DPAs, the EDPS, the Commission, other privacy practitioners and online child rights experts.
How and when might such a discussion be organised? Why has the Article 29 Working Party shown no interest in addressing the position of children under the GDPR, despite repeated requests?
This post gives the views of the author and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science.