
Ayça Atabey

June 8th, 2022

Innovating in children’s best interests for a ‘fair’ digital world

Estimated reading time: 3 minutes

The Digital Futures Commission aims to make children’s best interests a primary consideration in the design of the digital environment. We keep a lookout for good practices and guidelines to help digital innovators embed children’s best interests in their products and services. The Age Appropriate Design Code (the Code) is the first statutory Code of Practice for children’s data protection. Matching the Code’s child rights focus are UNICEF’s Manifesto on Good Data Governance for Children and Policy Guidance on AI for children. Common to all three is the concept of ‘fairness.’ In this blog, Ayça Atabey discusses what is meant by fairness in today’s digital world and why it matters.

All 15 standards of the Code reflect data protection principles set out under the UK General Data Protection Regulation (UK GDPR), particularly the fairness principle, which should lie at the heart of all processing activities involving children’s data. The standards provide

practical measures and safeguards to ensure processing under the GDPR can be considered ‘fair’ in the context of online risks to children.

Exploring the child rights implications of the fairness principle has become more urgent with the COVID-19 pandemic, following a dramatic increase in the use of AI-driven technologies making children’s lives ever more digital by default. Although use of AI-driven technologies creates many opportunities for children, the increasing use of algorithmic analytics and data collection also creates significant risks. Notably, there have been considerable concerns over AI and discrimination, since

Predictive analytics can amplify existing discrimination and bias. Artificial Intelligence is increasingly used to make critical decisions for children, such as allocation of welfare benefits or where schools should be built. When these systems use biased data sets, discrimination can result. (UNICEF Manifesto on Good Data Governance for Children)

Image by Impact Photography from Shutterstock

Important as this is, the fairness principle in data protection law has a broader scope than ‘non-discrimination’: even when data processing is not discriminatory, it might still be ‘unfair’ because it does not prioritise children’s best interests. For example, commercially exploitative data processing activities linked to adverse effects on children, or processing data in ways children wouldn’t ‘reasonably expect’, might not be ‘discriminatory’, but they would still infringe the fairness principle and children’s best interests. For a prevalent example, consider EdTech where, as Hillman notes

Data collection and algorithmic modeling propel user profiling and control in ways students and even their teachers may not be aware of or understand.

UNICEF’s Policy Guidance on AI for children also underscores the importance of ‘prioritising fairness and non-discrimination for children’, requiring that

since there is no one optimal technical definition of fairness to prevent bias, developers need to consider the trade-off of multiple fairness definitions.

How can developers do this? The Code and the recently published IEEE Standard for an Age Appropriate Digital Services Framework provide good guidance for weighing this trade-off.
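To make the trade-off concrete, here is a minimal sketch in Python. It is purely illustrative and not drawn from the Code or the UNICEF guidance: the data are invented, and demographic parity and equal opportunity are just two of the many fairness definitions a developer might weigh. The toy classifier satisfies one definition perfectly while maximally violating the other, which is exactly the tension the Guidance asks developers to consider.

```python
# Hypothetical sketch: two common fairness metrics can disagree,
# illustrating why developers must weigh multiple definitions of fairness.
# All data here are invented for illustration.

import numpy as np

def demographic_parity_diff(y_pred, group):
    """Absolute difference in positive-prediction rates between groups 0 and 1."""
    return abs(y_pred[group == 0].mean() - y_pred[group == 1].mean())

def equal_opportunity_diff(y_true, y_pred, group):
    """Absolute difference in true-positive rates between groups 0 and 1."""
    tpr0 = y_pred[(group == 0) & (y_true == 1)].mean()
    tpr1 = y_pred[(group == 1) & (y_true == 1)].mean()
    return abs(tpr0 - tpr1)

# Toy data: 8 children in two groups, a true outcome and a model's prediction.
y_true = np.array([1, 1, 0, 0, 1, 1, 0, 0])
y_pred = np.array([1, 1, 0, 0, 0, 0, 1, 1])
group  = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# Both groups receive 50% positive predictions, so demographic parity holds,
# yet every positive in group 1 is missed, so equal opportunity is violated.
print(demographic_parity_diff(y_pred, group))         # -> 0.0
print(equal_opportunity_diff(y_true, y_pred, group))  # -> 1.0
```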

Fairness and transparency

The fairness principle is linked to other principles such as ‘transparency’. As the Information Commissioner’s Office notes, “on a wider level transparency is also intrinsic to the fairness element of Article 5(1).” With children’s best interests in mind, the child-centred fairness and transparency rules mean that organisations must reflect the needs of different groups of children when communicating information to them. This approach also promotes accessibility and inclusiveness, which are key if all children are to benefit from data-driven technologies equally.
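As a purely illustrative sketch of what reflecting the needs of different groups of children might look like in practice, the snippet below selects a different transparency notice for each of the age bands the Code uses (0–5, 6–9, 10–12, 13–15, 16–17). The wording and function names are hypothetical, not taken from the Code.

```python
# Hypothetical sketch: tailoring transparency information to the Code's age bands.
# The notice texts below are invented for illustration.

AGE_BAND_NOTICES = {
    (0, 5):   "Notice addressed to parents/carers; child-facing cues are non-verbal.",
    (6, 9):   "We use your game scores to suggest new levels. Ask a grown-up to help.",
    (10, 12): "We collect your scores and play times to suggest things you may like.",
    (13, 15): "We process your activity data for recommendations; you can opt out in settings.",
    (16, 17): "Full notice: what we collect, why, how long we keep it, and your rights.",
}

def notice_for_age(age: int) -> str:
    """Pick the transparency notice matching the child's age band."""
    for (low, high), text in AGE_BAND_NOTICES.items():
        if low <= age <= high:
            return text
    raise ValueError("No child age band matches this age")

print(notice_for_age(8))  # prints the 6-9 band notice
```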

Again, compliance with transparency rules alone is insufficient to guarantee that data processing is fair. Here, it is important to note that data protection principles apply cumulatively, meaning that the violation of any one of them causes organisations to be in breach of the UK GDPR even if they have demonstrated compliance in all other areas. Therefore, organisations must comply with the fairness principle. The Code is a great way to achieve this as it explicitly states that online services should follow the Code to help them “process children’s data fairly”.

What does ‘fairness’ look like in a digital world?

The fairness principle of the UK GDPR and the child’s best interests standard of the Code are aligned with UNICEF’s Policy Guidance on AI for children and the UNICEF Manifesto. The Guidance requires digital innovators to focus on fairness and non-discrimination principles, while the Manifesto calls on innovators to prioritise children’s best interests, bearing in mind children’s evolving capacities and their diverse identities and circumstances.

Aligned with UNICEF’s inclusive approach, the Code promotes data protection by design and by default through a children’s best interests lens. Data protection by design requires embedding privacy and data protection principles (including the fairness principle) into the design of data processing activities and business practices. This can be achieved in several ways, including but not limited to giving information to children in a straightforward way they can understand and avoiding deceptive/manipulative language and design at all levels. This is highly relevant to the Digital Futures Commission’s work on creating frameworks and resources for digital designers.
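To illustrate what data protection by design and by default could look like at the level of code, here is a minimal, hypothetical sketch of a child-facing service whose settings start in their most privacy-protective state and only change after an explicit, informed choice. The setting names are invented; the Code prescribes outcomes, not implementations.

```python
# Hypothetical sketch of privacy-by-default settings for a child-facing service.
# Setting names are illustrative, not drawn from the Code itself.

from dataclasses import dataclass

@dataclass
class ChildPrivacySettings:
    # The most privacy-protective option is the default for every field.
    geolocation_enabled: bool = False             # location sharing off by default
    profiling_enabled: bool = False               # no behavioural profiling by default
    data_shared_with_third_parties: bool = False  # no onward sharing by default
    visible_to_strangers: bool = False            # profiles private by default

    def enable(self, setting: str) -> None:
        """Turning a protection off must be an explicit, informed choice."""
        if not hasattr(self, setting):
            raise ValueError(f"Unknown setting: {setting}")
        # In a real service, age-appropriate information (and, where required,
        # parental involvement) would be presented before this point.
        setattr(self, setting, True)

settings = ChildPrivacySettings()       # starts fully privacy-protective
settings.enable("geolocation_enabled")  # changes only after an explicit opt-in
```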

The Digital Futures Commission offers digital innovators a holistic, child-rights approach to rethinking and redesigning the digital environment in two key parts of children’s lives: play and education. For example, digital innovators can deploy a Children’s Rights Impact Assessment (CRIA) as a tool to put children’s best interests at the heart of the design of the digital environment. To embed children’s rights in data-driven education systems, the Digital Futures Commission has also explored ways to bridge data governance gaps so that data processed from children in educational contexts respects children’s rights. To enhance children’s playful opportunities, we offer Playful by Design principles, underpinned by academic research and informed by children’s voices. We workshopped these with game designers before integrating our different workstreams to develop a comprehensive and accessible innovators’ toolkit, grounded in children’s rights and accompanied by resources and practical steps, to help create digital products and services in children’s best interests and the ‘fair’ digital world that children deserve.

Notes


This blog is part of the Guidance for Innovators series. You can view all the Digital Futures Commission’s blogs here.

This text was originally published on the Digital Futures Commission blog and has been re-posted with permission.

This post represents the views of the authors and not the position of the Parenting for a Digital Future blog, nor of the London School of Economics and Political Science.

Featured image: Photo by Videootit

About the author

Ayça Atabey

Ayça Atabey is a lawyer and a researcher, currently enrolled as a PhD student at Edinburgh University. She has an LLM (IT Law) degree from Istanbul Bilgi University and an LLB (Law) degree from Durham University. Her PhD research focuses on the role that the notion of ‘fairness’ plays in the protection of vulnerable data subjects. Her work particularly involves the intersection between data protection, information privacy, and human rights issues. She is a research assistant for the Digital Futures Commission. Prior to this, she worked as a lawyer in an international law firm and has been working as a researcher at the BILGI IT Law Institute.

Posted In: In the news