
Sonia Livingstone

February 27th, 2017

Where are the age restrictions on children’s use of Instagram?


The UK Secretary of State for Culture, Media and Sport Karen Bradley has just announced a new Internet Safety Strategy to crack down on risks to children such as cyber-bullying, sexting and online trolls. One way that tech companies claim to protect children is through setting age limits – usually 13 years old – for the use of their social media properties. LSE Professor Sonia Livingstone (who will be advising DCMS on the new strategy) and John Carr, member of the Executive Board of the UK Council for Child Internet Safety, decided to investigate whether and how age restrictions are enforced on photo-sharing app Instagram (owned by Facebook), which is one of the most widely used social networks with more than 600 million users worldwide.

You’d think two child internet safety experts would already have been on Instagram, checking out its offer. But this only occurred to us a couple of weeks ago, and unfortunately we were unimpressed with what we found.

Here’s what Sonia did:

  1. Visit the App Store, click Install, then Open.
  2. Ponder the automatic invitation to Continue with your named Facebook login, and digest the fact that personal data is shared across Facebook and Instagram (already problematic for WhatsApp).
  3. Sign in via mobile number (or email) (not via Facebook) and receive a verification code from Facebook (not Instagram).
  4. Once in Instagram and asked for a photo, name and password, followed by “Create username” (any), hit NEXT, noting (if attentive) that “By signing in you agree to our Terms and Privacy Policy.”

The “Terms and Privacy Policy” was clickable, and presumably Instagram (and Facebook) knows if anyone actually clicks it. But since it isn’t necessary (there’s no “I agree” step), and since as we learned recently, Instagram’s T&Cs are not comprehensible by kids or, arguably, their parents, it seems likely that few do.

Nothing in the sign-up process mentions age at all. Even if it’s assumed all children lie about their age and that parents encourage them in this (assumptions the evidence does not support), since you’re not asked, you don’t even need to lie. Indeed, you may not even know there’s an age limit.

Nor, when you sign up, is there any suggestion of reviewing your privacy settings (Sonia was a bit upset by being immediately followed by an Iranian news agency which showed gory photos of a person attacked by a dog – and quickly learned the hard way to make her profile private).

John was determined to check that Instagram complies with COPPA so, installing Instagram via Google Play, he considered the little icon which suggested there would be some “Parental Guidance” but could not get it to open before the app had been downloaded. He notes:

At no point in the signing-up process was I asked my age or asked to confirm my age. When I had joined, and because I was determined, I was able to click on something that said I had to be 13 to use the service. Then this appeared:

“In the event that we learn that we have collected personal information from a sub-13-year old without parental consent we will delete it.”

But hang on… if I am not asked my age and I am not asked if my parents exist, never mind whether or not they consent to anything, what exactly do you think Instagram’s owners, Facebook, are trying to say?

Promises versus compliance

Why are we telling you this? Because in meeting after meeting on child internet safety, we have been told how the “big players” take their responsibilities to child safety seriously; it’s all the little foreign companies that need bringing into the fold.

As a result, we’ve sat in meeting after meeting discussing child internet safety guidelines, guidance, codes of conduct etc., with no plans or seeming need for monitoring or compliance. Here, for sure, we see the limits of self-regulation.

For instance, in the UK, the key guide “for Providers of Social Media and Interactive Services” recently produced by the UK Council for Child Internet Safety enjoins providers to:

“Be clear on minimum age limits, and discourage those who are too young.”

“Consider different default protections for accounts that are opened by under 18s.”

“Consider using available age verification and identity authentication solutions.”

“Involve parents/guardians if you collect personal data from under-18s.”

Does this happen on Instagram? Not that we could see. The NSPCC’s Net Aware assessment of social media services by children and parents does not give Instagram good marks for risk, signing up, reporting or privacy. Should we now check all the other social media sites? Perhaps, as there’s surely a need for mystery shopper exercises. But then one must deal with the push back from companies, as when the EC tried an independent evaluation of whether industry promises are delivered.

Instagram – highly successful among big global brands and, therefore, increasingly profitable – says it provides a mechanism for adults to report on under-age users they may know of so that the account can be deleted “if the reported child’s age” can “reasonably” be verified as under 13.

So, parents must report on their kids? Hardly a recipe for domestic harmony or learning about digital citizenship. Do parents actually do this? To the best of our knowledge Instagram (or Facebook) doesn’t report on this. But the blogosphere is full of parental accounts of difficulties with the company in precisely this respect.

Who’s affected?

Ofcom’s latest audit of UK children’s media use shows that Instagram is used by nearly half (43%) of the 26% of online 8-11 year olds who have a social media profile. This works out to about 11% of online 8-11 year olds. Since about 90% of 8-11s go online, that’s around 1 in 10 of all 8-11s who use Instagram. Ofcom’s data isn’t sufficiently granular to go further, but from their overall age trends in social media use, we can guess that it’s fewer among the 8 year olds, more among the 11s and even more among the 12 year olds. So, it’s a minority of kids who use Instagram under-age. But it’s still a lot of kids. And the risk of harm is real.
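For readers who want to check the arithmetic behind these figures, a quick back-of-the-envelope calculation (the percentages are Ofcom's; the rounding is ours, and the 90% online figure is approximate):

```python
# Ofcom figures for UK 8-11 year olds (expressed as fractions)
have_social_profile = 0.26  # share of online 8-11s with a social media profile
use_instagram = 0.43        # of those with a profile, share who use Instagram
go_online = 0.90            # approximate share of all 8-11s who go online

# Share of online 8-11s who use Instagram: 43% of 26%
online_share = have_social_profile * use_instagram
print(f"{online_share:.1%} of online 8-11s")   # ~11.2%, i.e. "about 11%"

# Share of all 8-11s who use Instagram
overall_share = online_share * go_online
print(f"{overall_share:.1%} of all 8-11s")     # ~10.1%, i.e. "around 1 in 10"
```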

Will the General Data Protection Regulation, when it comes in next year, perhaps raising the age to 16 and bringing a lot more teens into the category of “under-age” users, make matters better or worse for children? Would it help if the government agreed with the Children’s Commissioner of England’s call for an independent Children’s Digital Ombudsman? We have some hopes of both approaches.

But one must also wonder if companies clever enough to network millions of people couldn’t also figure out how – clearly and kindly – to prevent under-age users, along with meeting their other commitments made in self-regulatory codes at British and European levels. Or is this a key task now for the just-announced UK Government’s Internet Safety Strategy?

This post gives the views of the author and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science.

About the author

Sonia Livingstone

Sonia Livingstone OBE is Professor of Social Psychology in the Department of Media and Communications at LSE. Taking a comparative, critical and contextual approach, her research examines how the changing conditions of mediation are reshaping everyday practices and possibilities for action. She has published twenty books on media audiences, media literacy and media regulation, with a particular focus on the opportunities and risks of digital media use in the everyday lives of children and young people. Her most recent book is The class: living and learning in the digital age (2016, with Julian Sefton-Green). Sonia has advised the UK government, European Commission, European Parliament, Council of Europe and other national and international organisations on children’s rights, risks and safety in the digital age. She was awarded the title of Officer of the Order of the British Empire (OBE) in 2014 'for services to children and child internet safety.' Sonia Livingstone is a fellow of the Academy of Social Sciences, the British Psychological Society, the Royal Society for the Arts and fellow and past President of the International Communication Association (ICA). She has been visiting professor at the Universities of Bergen, Copenhagen, Harvard, Illinois, Milan, Oslo, Paris II, Pennsylvania, and Stockholm, and is on the editorial board of several leading journals. She is on the Executive Board of the UK Council for Child Internet Safety, is a member of the Internet Watch Foundation’s Ethics Committee, is an Expert Advisor to the Council of Europe, and was recently Special Advisor to the House of Lords’ Select Committee on Communications, among other roles. Sonia has received many awards and honours, including honorary doctorates from the University of Montreal, Université Panthéon Assas, the Erasmus University of Rotterdam, the University of the Basque Country, and the University of Copenhagen. 
She is currently leading the project Global Kids Online (with UNICEF Office of Research-Innocenti and EU Kids Online), researching children’s understanding of digital privacy (funded by the Information Commissioner’s Office) and writing a book with Alicia Blum-Ross called ‘Parenting for a Digital Future (Oxford University Press), among other research, impact and writing projects. Sonia is chairing LSE’s Truth, Trust and Technology Commission in 2017-2018, and participates in the European Commission-funded research networks, DigiLitEY and MakEY. She runs a blog called www.parenting.digital and contributes to the LSE’s Media Policy Project blog. Follow her on Twitter @Livingstone_S

Posted In: Children and the Media | Data Protection | LSE Media Policy Project
