
Sonia Livingstone

May 23rd, 2018

Contemporary child protection on the internet


On 25 May 2018, the General Data Protection Regulation comes into force in Europe. The technical tools designed for child protection often cannot keep pace with the rapid innovation of digital applications, and in this post Jutta Croll explores the complexities of keeping children safe online. Jutta heads the project ‘Child Protection and Children’s Rights in the Digital World’ and chairs the board of Stiftung Digitale Chancen, a foundation actively campaigning against the digital divide. Its mission is to counteract the exclusion of disadvantaged groups from the development of the information society and to further their digital competencies. [Header image credit: R. Sanders, CC BY-SA 2.0]

The internet industry has been developing tools to empower parents in the effort to keep their children safe online – but do they work? Child protection must cover all areas of children’s lives: a safe everyday life in the real world as well as growing up well with media. Articles 19, 34 and 36 of the UN Convention on the Rights of the Child demand the protection of children from violence, sexual abuse and exploitation; today these rights must be interpreted and applied anew with regard to risks arising from, or reinforced through, the internet. But what might contemporary child protection on the internet look like? What are the preconditions for children to use the internet for information and participation, for education and play, without being unreasonably endangered? And what role does the protection of children’s data play in this regard?

In the early years of the internet, filter programmes were seen as an appropriate tool. Yet software that identifies certain content based on word lists and blocks access to it was not initially developed to protect children from violent or pornographic content. In fact, its purpose was to prevent company staff from surfing improper websites during work hours. The filtering was based either on word lists or on lists of domain names or URLs. From the beginning, however, this type of filter software was accompanied by accusations of censorship: the legitimate interest in preventing access to certain types of content for child protection reasons was perceived as a potential gateway to further-reaching limitations on the right to freedom of information.
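The two early filtering approaches described above can be sketched in a few lines. This is a minimal illustration, not any real product's logic; the word list and domain list are hypothetical placeholders, and it also shows why such crude keyword matching over-blocks legitimate content.

```python
# Minimal sketch of early content filtering: domain/URL blocklists plus
# word-list matching on page text. Both lists below are illustrative
# placeholders, not taken from any actual filtering product.
from urllib.parse import urlparse

BLOCKED_WORDS = {"violence", "gambling"}      # hypothetical word list
BLOCKED_DOMAINS = {"example-adult-site.com"}  # hypothetical domain list

def is_blocked(url: str, page_text: str) -> bool:
    """Return True if the URL's domain is blocklisted, or if the page
    text contains a blocked keyword (a crude, context-blind test)."""
    domain = urlparse(url).netloc.lower()
    if domain in BLOCKED_DOMAINS:
        return True
    words = set(page_text.lower().split())
    return bool(words & BLOCKED_WORDS)

print(is_blocked("https://example-adult-site.com/page", "welcome"))
# A news report *about* gambling triggers the same keyword rule,
# illustrating the over-blocking problem noted in the benchmark studies:
print(is_blocked("https://news.example.org/report", "gambling ring broken up"))
```

Because the keyword check has no notion of context, a journalistic article about violence is blocked just like violent content itself – the mistake the SIP benchmark reports repeatedly observed.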

The debate over whether filtering content is an adequate means of contemporary child protection is also mirrored in how the naming of the respective software products has changed over time. In the European Parliament's decision establishing a multiannual Community Programme for promoting safer use of the internet and new online technologies, published in 2005 (854/2005/EC), filtering technologies were named explicitly as an instrument for tackling unwanted and harmful content. In implementing the programme, the EU Commission released a call for tender for a benchmarking assessment of such technologies and commissioned the task under the title SIP Benchmark. The first benchmarking report, released by Deloitte in 2009, was followed by a second call for tender in the same year. There the software products to be assessed were termed ‘Parental Control Tools’, a name with less of an association with censorship of content. The new term also gave higher priority to parents’ responsibility for the protection of children on the internet.

The tests in the following two benchmark studies (SIP II and SIP III) were therefore more focused on supporting parents, and other persons responsible for children, in their educational tasks. In addition to the effectiveness of the filtering, the tools’ resistance to circumvention, their additional functionalities and their usability were assessed as well. For many products, good filtering results could be attested only for English-language content with a sexual connotation. The tools are less effective at detecting violent or otherwise problematic content, and they often block access to unproblematic content by mistake. In parallel with efforts to improve filtering, the companies developed their products further and added functionalities to control, restrict or monitor children’s internet usage – some of the tools even allow this to be done remotely via a second device in the hands of the parents.

Since the mid-2000s, new options for communication and the production of content via the internet have emerged, collectively known as Web 2.0 and ‘user-generated content’. Today, social media applications provide the main platforms for interaction between users. Technical tools for child protection that filter content based on word lists or domain lists can neither handle the huge amount of user-generated content nor comprehensively address the risks resulting from users’ interaction. Herein lies a new potential danger to children that massively challenges contemporary child protection: the technical tools designed for child protection often cannot keep pace with the rapid innovation of digital applications.

In article 3, the UN Convention on the Rights of the Child gives highest priority to the best interests of the child in all decision-making processes concerning the child. Article 5 obliges the signatories to respect parental rights and demands from the persons responsible for a child's upbringing “to provide, in a manner consistent with the evolving capacities of the child, appropriate direction and guidance in the exercise by the child of the rights recognized in the present Convention.” For many parents and persons responsible for children’s education, it is very difficult to decide what an appropriate manner of guidance, consistent with the evolving capacities of the child, would be with regard to the child’s internet usage.

The I-KiZ – German Centre for Child Protection on the Internet – has addressed that question and developed the Intelligent Risk Management Model. In it, contemporary child protection rests on three pillars: ‘Design and Content of the Service’, ‘Technology’ and ‘Media Literacy’. Depending on the age of the child, these pillars can bear different weights. For younger children, technical tools provide stronger protection; as children grow older, their media literacy develops and their individual responsibility becomes more important. For adolescents, the ability to manage risks on their own should be fostered through media literacy education and supported by the design of the services themselves. The ability to make decisions independently and to weigh risks develops individually and depends on personal maturity and circumstances. Especially with regard to the internet and the use of digital media, it is necessary to provide orientation for parents, who are very often less acquainted with the digital world, social media platforms and digital services than their children. They therefore need technical tools for protection, but also pedagogical recommendations that act as guard rails for safety.

On 25 May 2018, the General Data Protection Regulation (EU GDPR) comes into force in Europe, regulating the collection, storage and processing of personal data. In recital 38, the specific protection needs of children with regard to their personal data are acknowledged for the first time. Article 8 accordingly regulates the conditions applicable to a child’s consent in relation to information society services. For children below the age of 16 years, the GDPR requires consent given or authorised by the holder of parental responsibility over the child for the use of information society services offered directly to a child. At the same time, the GDPR allows member states to set by law a lower age for those purposes, provided that it is not below 13 years.
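The Article 8 age rule can be expressed as a small sketch: the digital age of consent defaults to 16, but a member state may lower it to no less than 13, and parental authorisation is required below that threshold. The country-to-age mapping here is purely illustrative, not a statement of any state's actual legislation.

```python
# Sketch of the GDPR Article 8 rule described above: parental consent is
# required below the national digital age of consent, which defaults to
# 16 and may be lowered by a member state to a minimum of 13.
DEFAULT_AGE = 16   # GDPR default threshold
MIN_AGE_FLOOR = 13  # lowest threshold a member state may choose

# Hypothetical national choices, for illustration only.
MEMBER_STATE_AGE = {"AA": 16, "BB": 13}

def needs_parental_consent(age: int, country: str) -> bool:
    """True if an information society service offered directly to a
    child needs consent from the holder of parental responsibility."""
    threshold = MEMBER_STATE_AGE.get(country, DEFAULT_AGE)
    # A national threshold outside [13, 16] would violate Article 8.
    assert MIN_AGE_FLOOR <= threshold <= DEFAULT_AGE
    return age < threshold

print(needs_parental_consent(15, "AA"))  # below 16 -> consent required
print(needs_parental_consent(14, "BB"))  # at or above 13 -> no consent needed
```

The point the sketch makes concrete is that the same 14-year-old may or may not need parental authorisation depending on the member state – which is precisely why the choice of threshold is under debate.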

Across Europe, stakeholders from media education, research, data protection authorities and technicians dealing with authentication and age verification are concerned with the question of which age threshold might be reasonable and appropriate. The project children’s-rights.digital and the coordination office for children’s rights at the German Children’s Fund ‘Deutsches Kinderhilfswerk’ are currently holding expert dialogues in order to build a knowledge base for the various questions regarding child protection and children’s rights arising from the GDPR. We thereby intend to contribute to contemporary child protection on the internet and to provide guard rails for children’s safety online.

Notes


This post was originally published on children-rights.digital and has been reposted with permission.

This post gives the views of the authors and does not represent the position of the LSE Parenting for a Digital Future blog, nor of the London School of Economics and Political Science.

About the author

Sonia Livingstone

Sonia Livingstone OBE is Professor of Social Psychology in the Department of Media and Communications at LSE. Taking a comparative, critical and contextual approach, her research examines how the changing conditions of mediation are reshaping everyday practices and possibilities for action. She has published twenty books on media audiences, media literacy and media regulation, with a particular focus on the opportunities and risks of digital media use in the everyday lives of children and young people. Her most recent book is The class: living and learning in the digital age (2016, with Julian Sefton-Green). Sonia has advised the UK government, European Commission, European Parliament, Council of Europe and other national and international organisations on children’s rights, risks and safety in the digital age. She was awarded the title of Officer of the Order of the British Empire (OBE) in 2014 'for services to children and child internet safety.' Sonia Livingstone is a fellow of the Academy of Social Sciences, the British Psychological Society, the Royal Society for the Arts and fellow and past President of the International Communication Association (ICA). She has been visiting professor at the Universities of Bergen, Copenhagen, Harvard, Illinois, Milan, Oslo, Paris II, Pennsylvania, and Stockholm, and is on the editorial board of several leading journals. She is on the Executive Board of the UK Council for Child Internet Safety, is a member of the Internet Watch Foundation’s Ethics Committee, is an Expert Advisor to the Council of Europe, and was recently Special Advisor to the House of Lords’ Select Committee on Communications, among other roles. Sonia has received many awards and honours, including honorary doctorates from the University of Montreal, Université Panthéon Assas, the Erasmus University of Rotterdam, the University of the Basque Country, and the University of Copenhagen. 
She is currently leading the project Global Kids Online (with UNICEF Office of Research-Innocenti and EU Kids Online), researching children’s understanding of digital privacy (funded by the Information Commissioner’s Office) and writing a book with Alicia Blum-Ross called ‘Parenting for a Digital Future’ (Oxford University Press), among other research, impact and writing projects. Sonia is chairing LSE’s Truth, Trust and Technology Commission in 2017-2018, and participates in the European Commission-funded research networks DigiLitEY and MakEY. She runs a blog called www.parenting.digital and contributes to the LSE’s Media Policy Project blog. Follow her on Twitter @Livingstone_S

Posted In: Research shows...