The Information Commissioner’s Office’s consultation on an age-appropriate design code for information society services (a requirement of the Data Protection Act 2018, which supports and supplements the implementation of the GDPR) is open for submissions until September 19. The code will set out the design standards the Information Commissioner will expect providers of online services that process personal data and are likely to be accessed by children to meet. Wendy M. Grossman looks here at some of the key questions to consider.

Last week, defenddigitalme, a group that campaigns for children’s data privacy and other digital rights, and Sonia Livingstone’s group at the London School of Economics assembled a discussion of the Information Commissioner’s Office’s consultation on age-appropriate design for information society services, which is open for submissions until September 19. The eventual code will be used by the Information Commissioner when she considers regulatory action, may be used as evidence in court, and is intended to guide website design. It must take into account both the child-related provisions of the General Data Protection Regulation and the United Nations Convention on the Rights of the Child.

There are some baseline principles: data minimization, and comprehensible terms and conditions and privacy policies. The latter is a design question: since most adults either can’t understand or can’t bear to read terms and conditions and privacy policies, what hope is there of making them comprehensible to children? The summer’s crop of GDPR notices is not a good sign.

There are other practical questions: when is a child not a child any more? Do age bands make sense when the capabilities of one eight-year-old may be very different from those of another? Capacity might be a better approach – but would we want Instagram making these assessments? Also, while we talk most about the data aggregated by commercial companies, government and schools collect much more, including biometrics.

Most important, what is the threat model? What you implement and how is very different if you’re trying to protect children’s spaces from ingress by abusers than if you’re trying to protect children from commercial data aggregation or content deemed harmful. Lacking a threat model, “freedom”, “privacy”, and “security” are abstract concepts with no practical meaning.

There is no formal threat model – as the Yes, Minister episode “The Challenge” (series 3, episode 2) would predict, one would come too close to setting “failure standards”. The lack is particularly dangerous here, because “protecting children” means such different things to different people.

The other significant gap is research. We’ve commented here before on the stratification of social media demographics: you can practically carbon-date someone by the medium they prefer. This poses a particular problem for academics, in that research from just five years ago is barely relevant. What children know about data collection has markedly changed, and the services du jour have different affordances. Against that, new devices have greater spying capabilities, and, the Norwegian Consumer Council finds (PDF), Silicon Valley pays top-class psychologists to deceive us with dark patterns.

Seeking to fill the research gap are Sonia Livingstone and Mariya Stoilova. In their preliminary work, they are finding that children generally care deeply about their privacy and the data they share, but often have little agency and think primarily in interpersonal terms. The Cambridge Analytica scandal has helped make them aware of the corporate aggregation that’s taking place, but through familiarity they may come to trust people such as their favorite YouTubers, and constantly available things like Alexa, in ways the adults around them don’t. The focus on Internet safety has left many thinking that’s what privacy means. In real-world safety, younger children are typically more at risk than older ones; online, the situation is often reversed because older children are less supervised, explore further, and take more risks.

The breath of passionate fresh air in all this is Beeban Kidron, an independent – that is, appointed – member of the House of Lords who first came to my attention by saying intelligent and measured things during the post-referendum debate on Brexit. She refuses to accept the idea that oh, well, that’s the Internet, there’s nothing we can do. However, she *also* genuinely seems to want to find solutions that preserve the Internet’s benefits and incorporate the often-overlooked child’s right to develop and make mistakes. But she wants services to incorporate the idea of childhood: if all users are equal, then children are treated as adults, a “category error”. Why should children have to be resilient against systemic abuse and indifference?

Kidron, who is a filmmaker, began by doing her native form of research: in 2013 she made the full-length documentary InRealLife, which studied a number of teens using the Internet. While the film concludes on a positive note, many of the stories depressingly confirm some parents’ worst fears. Even so, it’s a fine piece of work, because it’s clear she was able to gain the trust of even the most alienated of the young people she profiles.

Kidron’s 5Rights framework proposes five essential rights children should have: remove, know, safety and support, informed and conscious use, digital literacy. To implement these, she proposes that the industry should reverse its current pattern of defaults which, as is widely known, 95% of users never change (while 98% never read terms and conditions). Companies know this, and keep resetting the defaults in their favor. Why shouldn’t it be “hide by default”?

This approach sparked ideas. A light that tells a child they’re being tracked or recorded so they can check who’s doing it? Collective redress is essential: what 12-year-old can bring their own court case?

The industry will almost certainly resist. Giving children the transparency and tools with which to protect themselves, resetting the defaults to “hide”…aren’t these things adults want, too?

This post originally appeared on net.wars and is republished with permission and thanks. It gives the views of the author and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science.
