The Information Commissioner’s Office recently ran a consultation on an age-appropriate design code for information society services (a requirement of the Data Protection Act 2018, which supports and supplements the implementation of the GDPR). LSE researchers Jun Yu, Mariya Stoilova and Professor Sonia Livingstone explain here the key arguments and suggestions that came out of an expert seminar held at the LSE in September 2018.
The European General Data Protection Regulation (GDPR) provides protection for children’s data and safety, but anxiety is mounting over the increasing commercial collection and exploitation of children’s data, with potential consequences inimical to children’s selfhood and agency, their rights to data privacy, and their families’ financial situations. A recent report by the Norwegian Consumer Council warns that many big tech companies employ psychologists to design a “sticky” online environment that nudges children to spend more time online and thereby generate greater profits, sometimes using “dark patterns”.
Yet technologies in today’s ‘datafied’ era are too sophisticated and fast-changing for children (or adults!) to fully understand. Nor should one imagine that children will be resilient in the face of systematic indifference, and sometimes abuse, by online actors. Within this context, the Information Commissioner’s Office (ICO) has called for evidence-based proposals for creating a statutory ‘age-appropriate’ code of practice for online service providers in the UK. This post summarises the key points and suggestions from an LSE expert seminar held on 7 September 2018.
The main objective of the Code, mandated by section 123 of the Data Protection Act, is to establish an age-appropriate online environment through systematic re-design. This includes providing practical guidelines that service providers can use to regulate harmful content, creating a safer digital environment where children can actively engage and develop safely. As Baroness Kidron noted, the purpose is to “retro-fit 150 years of societal knowledge, norms and values into the design of the digital environment” in ways that respect and value children’s rights and well-being. By setting “normative expectations” about children’s relationship with technology, the Code should set the standards for positive developments in the future.
Findings from the LSE’s comprehensive mapping of the existing evidence (by Sonia Livingstone, Mariya Stoilova, and Rishita Nandagiri) indicate that young people often care deeply about privacy in interpersonal relationships, but may not recognise how their privacy is also at stake in their relationships with institutions (such as school) or why the commercial owners of online platforms are interested in their personal information. In other words, children are “naïve experts” who know how to deploy certain privacy-protection strategies at the interpersonal level, but are not always fully capable of assessing what is appropriate to share in different relational contexts. Their privacy concerns do not always trigger self-protective behaviours, especially when there are compensatory benefits or incentives (e.g. socialising with friends), when they feel in control of their information, or when they trust their audiences and platforms.
Proposals for a safer online environment
But it is surely not fair to leave the burden on children, given that they have little ‘real’ choice in selecting online service providers or controlling what is shared about them. Their data are not only “(voluntarily) given, traced, or inferred”, but often “taken”, because there is no option but to give them. So, how can we practically build an environment that pays due regard to children’s rights and well-being?
The experts at the seminar made a number of suggestions that can potentially feed into the Code:
- Location of responsibilities: responsibility for managing privacy and protecting personal data should be moved from children and parents to online service providers.
- The best interests of the child: the best interests of all children under 18 years old should be actively respected in policy and practice, and must be ensured by online service providers even when children are not their main target audience.
- Data minimisation: data must be service-critical and the default privacy setting should be set to the highest level. For example, automated collection of geo-location data or sharing with third parties should be turned off by default and personal data should be collected only for a good reason.
- More special categories of data: data such as children’s physical location deserve the same sensitive treatment as health and biometric data.
- Rights by design: children should be able to check, contest, rectify, erase, or edit information about themselves (similar systems already exist for editing profiles and inferences) and to benefit from collective redress.
- Easy-to-understand guidance: children should be able to understand the privacy settings and the intent of corporate access to and collection of their data without having to read through terms and conditions. A “labelling infrastructure” (like traffic lights) could be introduced to indicate the level of privacy on offer.
- Media literacy: children have the right to know what happens to their data online, so media literacy and privacy education should start at an early age to help children better understand the digital environment, the nature of consent, and the outcomes involved, especially given children’s partial awareness of the complicated nature of online privacy risks.
- Adult support: adults should support children, but balancing protection with children’s autonomy is essential, because the interests of adults and children do not always align and overprotection may compromise child development.
- The right to make mistakes: this recognises that children are children (playful, naughty, curious) and should not suffer long-term consequences from such actions.
Informing the online environment
It is vital to ensure that the design of the new online environment is “age-appropriate” because, as child development studies demonstrate, children have different needs, skills, and vulnerabilities at different ages. Their desire for privacy increases as they grow older, developing a sense of, and need for, “personal space” and acquiring a more complex understanding of privacy over time. Such age differences should map onto the design of online service provision.
However, establishing what these age ranges should be is challenging. While it is reasonable to conceive that there are key periods for child development (such as ages 11-14, when children reach puberty and/or move to secondary school, a period currently not part of the ICO’s tentative age brackets), psychological variables and individual differences such as gender, ethnicity and socio-economic background create disparities amongst children of the same age. To fill knowledge gaps, the new government-funded “The Nurture Network” will focus on better design for children’s mental health in relation to digitally mediated peer-to-peer, school, and family contexts. Still, more research with children and their parents is needed to enable the inclusion of children’s own voices and their participation in the decision-making process.
Implementing these and other solutions called for by civil society actors may face opposition from industry if they hinder an existing profit model that increasingly relies on ubiquitous, continuous and automated modes of data collection and use. It will also require further consultation and discussion among all stakeholders. While challenges no doubt remain, it is a good sign that children’s privacy online is attracting attention, including thoughtful and evidence-based consultation responses to the ICO’s call from 5Rights, the Children’s Charities’ Coalition on Internet Safety, Childnet, the Children’s Data and Privacy Online project (LSE), the Digital Policy Alliance, Home Life Data and Children’s Privacy (Veronica Barassi), the Internet Advertising Bureau, Internet Matters, The Diana Award, and many others. We look forward to the ICO’s response and to the continuing discussion.
This article gives the views of the authors, and not the position of the LSE Media Policy Project, nor of the London School of Economics and Political Science.