The 2016 European General Data Protection Regulation (GDPR) contains several provisions highly relevant to children and young people. In this post, Sonia Livingstone, Professor of Social Psychology at LSE’s Department of Media and Communications, discusses empirical evidence to explore issues around the application of the GDPR to children’s online activities.
Throughout this blog series, we have been decoding the implications of the GDPR for children, noting that a range of provisions within the GDPR are of particular importance to their rights to participation and protection, to the balance between parental responsibility and teen autonomy, and to the challenging media literacy task set for both parents and children by the opaque, ‘black-boxed’ algorithmic operation of the internet industry.
It’s increasingly clear to expert observers that the GDPR will be difficult to implement when it takes effect in the UK in May 2018 (though implement it we must, even post-Brexit). Can the existing evidence help to anticipate and resolve the uncertainties, or point out valuable directions? I’ll concentrate here on Ofcom’s Children and parents: media use and attitudes report 2016, as this covers a lot of relevant information about children’s online activities, media literacy, risks, and their parents’ responses.
But first, which questions are pressing? Arguably all provisions in the GDPR are relevant to children along with all other internet users – for instance, importantly, Article 35 requires the conduct of impact assessments to identify risks to the rights and freedoms of all data subjects prior to data processing. Just how robustly this will be implemented, and grounded in what evidence, is yet to be determined, and civil society will be watching closely.
The GDPR’s provision for children is riddled with uncertainties
However, given that children merit particular protections because they may be less aware of online risks to their personal data and rights (Recital 38), the GDPR explicitly mentions children in relation to:
- Article 8: Verified parental consent required for under 16s (or as young as 13, if member states so determine) to use ‘information society services’.
- Article 12: Transparency of communication in a ‘concise, transparent, intelligible and easily accessible form, using clear and plain language’ for data subjects, especially children.
- Recital 71: No data profiling of children (now seemingly defined as under 18). This is stated in a recital, not an article, so does this mean that children’s personal data should not (and will not) be collected for commercial profiling purposes? Can children’s data even be reliably distinguished from the rest?
- Recital 65: The right to be forgotten (for all, but especially if consent was given when a child). This is often called for by children (and is, indeed, also a concern for parents) and is much to be welcomed (though unlikely to prove straightforward in practice).
- Article 57: Effort to ‘promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing’, with special attention to children.
Several questions arise about the relation between children’s online activities, the role of parents and parental consent, and the responsibility of industry and regulators. One of these questions is: How do children understand the privacy and commercial aspects of the internet?
Ofcom’s evidence on children’s media literacy
Each year, Ofcom reports on the results of their survey of a nationally representative sample of children and parents under the broad rubric of ‘media literacy’, providing valuable cross-sectional data and building up a picture of trends over time. Ofcom groups children into 8-11 and 12-15 year olds: to inform the question of whether the age of consent should be set at 13 or 16 (that is, whether young teenagers can decide for themselves about the opportunities and risks of internet use), I’ll focus on 12-15 year olds. (But note that on the indicators below, the media literacy of 8-11 year olds is generally lower. Also, the media literacy of adults is not always higher, raising questions about whether their parents can and do make informed decisions about young children’s use of services that collect their personal data.)
Where are the gaps in 12-15 year olds’ understanding of the reliability and commercial basis of the internet?
- Around a quarter (27%) of those who use search engines think that information on the websites returned by a Google search can be trusted. While the BBC website is preferred for “true and accurate information about things that are going on in the world,” this has declined from 52% in 2015 to 35% in 2016, while turning to Google for such information has risen from 17% to 30%.
- 20% of 12-15 year olds who use social media sites say that most or all of the information they see on social media sites is true. Of those who visit online news sites, 56% think most or all information on sites or apps about news is true.
- Half know that YouTube (51%) and Google (49%) are funded by advertising, and nearly a third don’t know how they are funded. Only 45% could identify that search results distinguished by an orange box with the word ‘Ad’ in it were adverts, paid to appear at the top of the results; 17% thought these were the best results and 17% didn’t know.
- 55% of 12-15-year-old internet users are aware of personalised advertising (an increase from 45% in 2015). They are also more likely than in 2015 (57% vs 47%) to be aware that vloggers might endorse products because they are paid to do so.
- 17% agree that “I will give details about myself to a website or app to be able to get something that I want”. 13% of those with a social media profile agree that “getting more followers is more important to me than keeping my information private”, compared to 68% who disagree. And 58% agree that “I can easily delete information that I have posted about myself online if I don’t want people to see it.”
Let’s not panic – the implication is that many children have a fair idea of the commercial nature of the internet and how it treats their personal data. But the gaps in knowledge are important. Generally, they are greater among younger children and those from poorer or less educated homes. The evidence does not, therefore, offer a ringing endorsement of the idea that those aged 13+ can use the internet at will. It also suggests that schools are not teaching children all they need to know in the digital age, and that platforms and content providers are not dealing fairly with the children using their services.
Trading participation for protection rights
Although no child rights advocate would wish to trade rights as the GDPR seems to require, the evidence is clear: many children do not understand, or are confused about, the commercial nature of the internet. Thus keeping the age of parental consent at 16 may prove beneficial, as may improving the fairness and transparency of online services and raising public awareness of the risks.
On the other hand, if the UK decides to keep the age of parental consent at 16 rather than reducing it to 13, and if companies then either restrict their services to 16+ (rather than bothering to obtain parental consent) or parents refuse consent for their 13-15 year olds when asked, we must also ask: what might be the cost to children’s online opportunities? To assess this, we can note that, in addition to most children’s considerable pleasure in online communication and entertainment, among 12-15-year-old internet users in 2016:
- 44% used an internet-enabled device to make a video, 18% music, 16% an animation, 13% a website, 11% a meme or gif, 9% an app or game, 6% a vlog and 4% a robot.
- 30% of 12-15 year olds have gone online for civic activities such as signing a petition, sharing news stories, writing comments or talking online about the news.
Although only a minority of 12-15s, albeit a fairly large one, engage in these creative and civic activities, in principle any or all of this cohort is free to do so. In practice, then, around one third would lose out if their internet access became significantly restricted, and the loss of such a freedom goes wider still. Since young people’s digital skills and literacies are much called for to enhance learning, participation and future employability, the potential cost is sizeable.
The pressure is on government, regulators and industry to make this work
Will the GDPR help or hinder? The evidence suggests it may indeed increase children’s online data protection (and, thereby, protection from privacy and marketing risks), given that many young users do not understand the conditions of the services they use, and therefore cannot defend their rights to privacy, fairness or redress. It may also reduce other risks: cyberbullying, unwanted pornography, grooming, etc.
But as other posts in this series suggest, this is likely to come at the cost of regulatory confusion and evasion, parent-child tensions or deceit, the social exclusion of teens whose parents don’t or won’t consent, and loss of children’s communicative, creative and civic opportunities to learn and participate in society, as is their right and society’s need.
This post gives the views of the author and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science.