The UK Information Commissioner’s Office (ICO) recently sought views on a draft code of practice on Age appropriate design, aimed at online services likely to be accessed by children. Here, LSE’s Mariya Stoilova and Sonia Livingstone explain the code’s key challenges that they believe must be resolved. Next week, they will launch new findings from the project Children’s data and privacy online: growing up in a digital age, and an online privacy toolkit.
Children and adults alike seem to be ‘playing tag’ with a sophisticated and fast-changing digital environment whose privacy parameters are hard to define and whose consequences are even harder to predict. While children’s online activities are increasingly recorded, tracked, aggregated, analysed and monetised, our recent review of the existing evidence shows that children find it hard to understand how and why their data is used. But is the digital climate changing for the better?
Following the introduction of the European General Data Protection Regulation (GDPR) in May 2018, which promises better protection for children’s data and e-safety, the UK’s Information Commissioner has embarked on a new code of practice which sets specific protections for children’s personal data in the spirit of the United Nations Convention on the Rights of the Child (UNCRC) – both following and reinforcing the GDPR. The code identifies 16 standards of age-appropriate design for services which process personal data and are likely to be accessed by children. This includes a range of platforms frequented by children, such as social media, online games, web streaming, internet-connected toys, and learning apps.
The ICO’s draft code, recently out for public consultation, is a promising offer of a child-centred re-design of the online environment which can set standards for positive developments in the future. It requires designers and developers to consider the best interests of children, taking into account differing ages, capacities and developmental needs. The code also prohibits the detrimental use of children’s personal data and requires impact assessments to mitigate the risk of harm. The provisions also set requirements related to:
- Transparency: privacy information, terms, policies and community standards must be concise, prominent and in clear language
- Privacy by default: settings must be ‘high privacy’ by default
- Data minimisation: minimum amounts of personal data should be collected (only related to service provision and to active and knowing engagement)
- Data sharing: children’s data should not be disclosed
- Disabling geolocation: location tracking should be off by default, and where a child’s location is made visible to others, it should revert to off at the end of each session
- Parental controls: making it clear to the child when they are being monitored
While certainly a step in the right direction, the Code still needs to resolve a number of challenges, which we identified in our response to the consultation (see also the contribution by 5Rights and our earlier response). There are six key challenges which we find particularly important:
- The balance of protection and participation: the requirements set out in the Code might prompt some companies to raise their minimum age restrictions to 18 years, excluding many children who would benefit from a child-friendly version of the service and limiting their opportunities for online participation. Grouping all children into one category with high privacy protection would also prevent their gradual development of digital skills, awareness of risks, and resilience – exposing them suddenly to a very different environment when they reach adulthood.
- Age-appropriateness: establishing what counts as age-appropriate design can be difficult, as differences within (as well as across) age groups can be substantial. Further developments need to ensure that policy pays special attention to those who may be more vulnerable, such as indigenous or ethnic minority children, migrants, children in poor or rural settings, or those who have some form of disability.
- Effective age-verification: reflecting the difficulty of establishing the real age of all users, the Code needs to provide more guidance on the age-verification process, for example by setting out criteria and standards for robust age-verification.
- Shifting the responsibility to parents: the need to verify children’s age might require greater parental involvement and responsibility. This might put additional pressure on parents, some of whom lack the digital skills required to handle these controls, the time to do so, or an awareness of why their involvement might be important. Efforts need to be made to ensure that parental controls do not impede children’s rights to independence and agency, and do not create inequalities.
- Mechanisms of effective regulation, including of non-UK companies: due to the transnational nature of the internet, regulation is generally difficult. The majority of international internet-related policies and processes have emerged through consensus-building across multiple stakeholder groups (governments, the private sector, civil society), with the aim of developing shared principles and an agreement on internet governance. Even though the Design Code draws on the EU-adopted GDPR, it will be the first piece of regulation of its kind, which is likely to create difficulties with the international application and enforcement of data protection and privacy standards.
- Not seeing the Code as a ‘silver bullet’, but also developing children’s media literacy: the effective protection of children’s data and privacy cannot be achieved by the regulation of information society services alone. Children’s media literacy plays an important part in how they understand, manage and safeguard their privacy. Efforts need to be made to create a learning environment which allows children to develop not only the necessary technical skills but also a broader understanding of how media and data are created, analysed, distributed, applied, used and commercialised.
We need a child-centred approach that prioritises children’s own voices and experiences within the wider framework of evidence-based policy development. To support this, our team is launching new findings from the project Children’s data and privacy online: growing up in a digital age and an online privacy toolkit developed with the help of children. You can watch the video to get a preview of our findings.
We will use the new evidence to facilitate a discussion with the Information Commissioner’s Office, experts, and various stakeholders at our launch event on 24 June. For more details follow us at @Livingstone_S and #ChildPrivacyOnline.
This article represents the views of the authors, and not the position of the LSE Media Policy Project, nor of the London School of Economics and Political Science.