
Emma Goodman

Sonia Livingstone

November 27th, 2018

Protection of children online: does current regulation deliver?


Estimated reading time: 5 minutes


More children are going online, more frequently, via more devices and services, at ever-younger ages. The internet and accompanying technological developments offer huge opportunities for children, but they also pose risks to minors’ safety, wellbeing and rights. A policy brief published today argues that, despite new and revised legislation, the current European legislative framework is insufficient. Two of the report’s authors, LSE’s Professor Sonia Livingstone and Emma Goodman, explain why.

There is growing evidence of harm to children online, but addressing it is complex: the risks are often highly sensitive, their causes vary, long-term effects are hard to anticipate, and solutions require action from multiple stakeholders.

The European Parliament has done a great deal to protect the rights of children in the digital environment, as have other stakeholders operating in the single market. But significant harm continues, and calls for action are escalating. So more must be done.

Recent policy developments

Over the past two decades, policy has prioritised self-regulation, public awareness-raising, technological tools and solutions, and the fight against child sexual abuse online. Recent efforts to protect minors must be understood in relation to the European Commission’s wider efforts to further the Digital Single Market. Our policy brief discusses:

  • The General Data Protection Regulation (GDPR) includes several provisions aimed at enhancing the protection of children’s personal data online, such as obliging providers of information society services that process personal data to use clear and plain language that children can easily understand. Although the GDPR’s goal is not specifically to protect children from harm, it may have consequences for child protection.
  • Seeking to create a more level playing field, the recently revised Audiovisual Media Services Directive (AVMSD) has updated content and advertising rules to create a single unified standard for the obligations of audiovisual media service providers regarding content that might harm minors. This means that video-sharing platforms such as YouTube fall under the revised directive, as does audiovisual content shared on social media services such as Facebook.
  • The Better Internet for Kids (BIK) Policy Map documented wide support for the BIK strategy, demonstrating many successes for child online safety policies. But many gaps remain in policy governance and stakeholder participation, with disappointingly few improvements since the last BIK mapping exercise in 2014.
  • Other developments include the EU Human Rights Guidelines on Freedom of Expression online and offline, which include “media and internet literacy,” and the “Code of conduct on countering illegal hate speech online,” which seems to offer a successful model for co-regulation, although there’s no specific focus on children.

The challenges

Pressing challenges facing GDPR, AVMSD and BIK centre on the implementation of legislation (especially GDPR), the effectiveness of self-regulation (especially AVMSD) and the media literacy of the public (presumed, one way or another, in most initiatives and instruments).

  • Most obviously, parents and children struggle to understand the available options and tools, as well as the risks they face and their responsibilities. Many are also frustrated with and worried by the sense of an unresponsive digital environment that doesn’t cater to their needs, respond to their concerns, or provide the tools they need. And they are confused by the different approaches to provision of options and tools offered by different companies, with parents often unable to find the support they want and children often able to evade the protections in place.
  • While the requirement in the AVMSD that video-sharing platforms put in place measures to protect minors and others is a welcome move, it places a considerable burden on providers to self-regulate in a transparent and effective manner. Any measures implemented must remain compatible with digital intermediaries’ liability exemptions under the E-Commerce Directive, but there is no clear guidance on how this is to be achieved.
  • Media literacy is often cited as a solution to societal problems that involve the media. As new issues continue to arise (e.g. the need for critical information literacy given the rise of disinformation and ‘fake news’), it is widely agreed that the need for media literacy will only grow. However, little is known about actual levels of media and information literacy, it is difficult to see how media literacy could be delivered effectively to the adult population, and current legislation does not tackle the issue.

What should be done

We suggest that the European Parliament should establish and promote clear common standards and assist coordination among stakeholders. Parents, children, teachers and the wider public, as well as the industry, need clarity about what they can expect, and more needs to be done to help the market develop kitemarks, effective filters and other necessary measures.

As is widely said but rarely implemented, this must be complemented by stepping up educational and awareness-raising efforts to promote the media literacy of the population. All this not only needs to be done but also to be seen to be done, to build public trust at a time of crucial change.

Specifically, we recommend:

  • The creation of a comprehensive Code of Conduct for the converged digital environment that sets minimum standards for providers of services used by children. Ideally these would be embedded into the design of devices and services with the child’s best interests as paramount. To address the accumulating problems of self-regulation, the Code should be underpinned by strong backstop powers, independent monitoring and evaluation, and a trusted and sufficiently-resourced body to ensure compliance. It should guide intermediaries in their child protection responsibilities and provide clear consumer information and protections if services are not intended for children. Thus it should be useful to the industry and thereby support the digital single market, reducing business uncertainty and standardising norms and practice across member states.
  • The adoption of a Recommendation that promotes an integrated approach to media literacy, defined broadly to encompass critical understanding, creative production and participation, as well as protective actions and technical skills. The scope should be updated as the digital environment evolves. It should be promoted consistently through all relevant EU policies and applied in national contexts from nursery years onwards, in both formal and informal education and in relevant cultural and information institutions, while also encouraging wider voluntary participation. The reporting obligation in the AVMSD is vital, as is appropriate follow-up action.
  • For effective coordination, which is significantly lacking at present, we recommend that the Commission convene a permanent High Level Expert Group to integrate the Code of Conduct and the Recommendation on media literacy, and to encourage beneficial actions by Member States. This would provide the essential coordination across multiple stakeholder actions and ensure clear common standards. Further, its work should be inclusive, accountable, timely, independently evaluated and evidence-based. It (or a related body) should also be public-facing, with a single and well-publicised point of contact to reach and support the public.
  • We recommend the provision of dedicated European funding for regular pan-EU data collection, ensuring robust, up-to-date evidence to guide the development of EU policy on the protection of minors in the digital age.

All these actions must include the meaningful participation of children themselves (as is their right to be consulted) and those relevant experts able to represent children’s best interests.

Unless we take action on online child protection, legal uncertainty and disputes will continue and we run the risk of excessive legislation being enacted in response.

The full policy brief is available here: http://eprints.lse.ac.uk/90731/

This article gives the views of the authors, and not the position of the LSE Media Policy Project nor of the London School of Economics and Political Science. 

About the authors

Emma Goodman

Sonia Livingstone

Sonia Livingstone OBE is Professor of Social Psychology in the Department of Media and Communications at LSE. Taking a comparative, critical and contextual approach, her research examines how the changing conditions of mediation are reshaping everyday practices and possibilities for action. She has published twenty books on media audiences, media literacy and media regulation, with a particular focus on the opportunities and risks of digital media use in the everyday lives of children and young people. Her most recent book is The class: living and learning in the digital age (2016, with Julian Sefton-Green). Sonia has advised the UK government, European Commission, European Parliament, Council of Europe and other national and international organisations on children’s rights, risks and safety in the digital age. She was awarded the title of Officer of the Order of the British Empire (OBE) in 2014 'for services to children and child internet safety.' Sonia Livingstone is a fellow of the Academy of Social Sciences, the British Psychological Society and the Royal Society for the Arts, and a fellow and past President of the International Communication Association (ICA). She has been visiting professor at the Universities of Bergen, Copenhagen, Harvard, Illinois, Milan, Oslo, Paris II, Pennsylvania, and Stockholm, and is on the editorial board of several leading journals. She is on the Executive Board of the UK Council for Child Internet Safety, is a member of the Internet Watch Foundation’s Ethics Committee, is an Expert Advisor to the Council of Europe, and was recently Special Advisor to the House of Lords’ Select Committee on Communications, among other roles. Sonia has received many awards and honours, including honorary doctorates from the University of Montreal, Université Panthéon Assas, the Erasmus University of Rotterdam, the University of the Basque Country, and the University of Copenhagen. She is currently leading the project Global Kids Online (with UNICEF Office of Research-Innocenti and EU Kids Online), researching children’s understanding of digital privacy (funded by the Information Commissioner’s Office) and writing a book with Alicia Blum-Ross called ‘Parenting for a Digital Future’ (Oxford University Press), among other research, impact and writing projects. Sonia is chairing LSE’s Truth, Trust and Technology Commission in 2017-2018, and participates in the European Commission-funded research networks DigiLitEY and MakEY. She runs a blog called www.parenting.digital and contributes to the LSE’s Media Policy Project blog. Follow her on Twitter @Livingstone_S

