In this post, Rishita Nandagiri, Sonia Livingstone and Mariya Stoilova discuss their study of how children themselves understand their data and privacy online, conducted as part of ICO-funded research. Evidence has shown that children are particularly unaware of issues surrounding commercial use of data. Sonia Livingstone is Professor of Social Psychology in the Department of Media and Communications at the London School of Economics and Political Science. Mariya Stoilova is a Post-doctoral Research Officer on the Global Kids Online project at LSE, and an Associate Lecturer in Psychosocial Studies at Birkbeck, University of London. Rishita Nandagiri is a PhD Candidate at the LSE’s Department of Social Policy (Demography and Population Studies) and an external Graduate Associate member of the Centre for Cultures of Reproduction, Technologies and Health, the University of Sussex.

“Well we don’t actually know where the information is going. You can sign up for an app and tell them your name and your age and stuff and they’ll say at the bottom that it’s all private and stuff, but then it goes somewhere. There’s the question of where does it go” (boy, age 16)

“I think we share a lot more than we actually think we do” (girl, age 16)

In a datafied age, do children grasp how the information given and collected about them online leaves traces that are aggregated, profiled and monetised? Data protection regulation offers some protection, but how do children think about their privacy online, what decisions do they make, and what should happen next?

In our research since April 2018, we have found lots of available evidence on how children understand their online privacy in relation to family and friends (and strangers) but very little on what they know of how their school treats their data or, importantly, what businesses do with their data. Our systematic mapping revealed that:

  • Children negotiate sharing or withholding of personal information in a context in which networked communication and sharing practices shape their decisions. This context affects how well they can balance privacy with participation, self-expression and belonging.
  • While the commercial use of children’s data is at the forefront of current privacy debates, the empirical evidence suggests that commercial privacy is the area children are least able to comprehend and manage on their own.
  • The evidence mapping demonstrates that age and other differences among children might influence their engagement with privacy online, but more evidence on these differences is needed.

How is the work progressing?

We then held focus groups with children aged 11-16 (in school years 7, 9 and 11) around the country – and interviewed some of their parents and teachers too.

It turned out that online privacy and data are things kids want to talk about. They have heard about Cambridge Analytica and the GDPR! They know that when they use a site or app, they’re asked about cookies or data or privacy. And they know that when they search for something, they’re likely to get adverts linked to their search. But children don’t get much chance to talk about what this all means, or what they should do about it. They are sceptical that their parents or teachers know enough to really explain! More results will be reported soon…

What challenges have we experienced?

One interesting challenge has been explaining to schools, children and parents that our project is not about internet safety. In fact, interpersonal risk is such a familiar concern that data protection in relation to commercial organisations gets little attention from parents or schools. So we’ve had to do a lot of explaining – and that’s the challenge our toolkit is designed to solve.

Now we’re in the middle of the next challenge – compiling a set of online tools we reviewed with the help of a children’s jury to work out what’s really useful in explaining to children of different ages about their data. We’ve found plenty of resources on internet safety, and on media literacy in general, but less on data literacy – so we thank our project advisors for their great suggestions. We look forward to launching our data and privacy online toolkit soon.

What’s been surprising in the research findings so far?

Given some of the assumptions made about children, we have been interested to discover from our focus group discussions that:

  • Children care a lot about their privacy online, and are often outraged when we explain how their data is monetised by big business. As they say, it’s “none of their business” to know and keep so much personal information about them. Even though it is, precisely, companies’ business.
  • Children often trust their parents and turn to them when they face difficulties online, being an increasingly risk-averse generation. This is encouraging, but it means there’s a real need for organisations ready to advise parents about data and privacy.
  • But by contrast with internet safety, where parents (and teachers) now feel fairly well informed, the parents we interviewed told us they really feel helpless in protecting their children’s personal data from digital companies – often they conjured dystopian visions of the future from The Matrix or Black Mirror or similar sci-fi shows.

What’s next for research on children’s data and privacy online?

We would love to conduct a survey with a nationally representative sample of UK children to see how their understanding of data and privacy online changes with age and other circumstances, and to test our toolkit to see what really works for them. If we can find the funding!

We are also exploring research possibilities in other countries, to understand the role of cultural assumptions about privacy, and national variation in parenting, schooling and the digital context.

Why does all this matter?

Children are often pioneers in the digital environment, yet they are uniquely vulnerable to data breaches and exploitation. Consequently they constitute a special case for privacy and data protection, and addressing their needs and rights will surely rise up the public and policy agenda as commercial exploitation of personal data increases in a datafied world where children are often the early adopters.

Many are watching these developments because children’s experiences highlight concerns that arise for many other groups who are being short-changed by the digital environment. To put it differently, we cannot continue to think of children as the exception, nor of users as invulnerable and inviolable. The government’s new White Paper on Online Harms represents one way forward. The Information Commissioner’s upcoming consultation on Age-Appropriate Design is another. These regulatory initiatives are vital complements to media education initiatives for, as our research already shows, what children can be taught about their data and privacy online depends heavily on how the digital environment is designed and regulated.

Notes


Thanks to the ICO for funding our research: Read more about our project here.

This post originally appeared on The Children’s Media Foundation site and has been reposted here with permission.

This post gives the views of the authors and does not represent the position of the LSE Parenting for a Digital Future blog, nor of the London School of Economics and Political Science.