
Blog Administrator

July 18th, 2014

Children’s Online Risks Diversifying; Some Self-Created

Estimated reading time: 5 minutes

Many parents worry about what their children might encounter online, and some may use filters to block sexually explicit content. But LSE's Benjamin De La Pava reports that new research shows the risks children face are increasingly related to user-generated content, including what children themselves are sharing.

Particularly in the public eye, discussions about internet safety for children tend to suffer from tunnel vision that privileges pornography over all other risks, as Tony Anscombe from Anti-Virus Guard (AVG) complained in his blog last week. While porn might still feature most prominently in tabloid headlines and ministerial speeches, the recent Child Internet Safety Summit showed that expert policy discussions are broadening this debate. Most importantly, new evidence demonstrated that the risks children face are changing and diversifying, and increasingly include those associated with the content they create themselves.

A multitude of risks

Released to coincide with the Summit, a new report from Net Children Go Mobile highlights shifts in the online risks reported by children in the UK between 2010 and 2013. On the bright side, it reveals no substantial change in exposure to pornography or in the likelihood of meeting online contacts offline.

More problematically, however, there has been a notable broadening of the online risks encountered by children in the UK, and these all concern user-generated content. Specifically, the UK's 11- to 16-year-olds report encountering more race hate (23%), self-harm (17%) and pro-anorexia (14%) content. Even so, figures on children's experiences in the UK are a source of relative optimism compared to the European average, both for making new contacts online (17% compared to 26%) and for receiving sexual messages (4% compared to 11%).

What they do now may haunt them later. Photo by Lars Plougmann, CC BY-SA 2.0

Different children, different needs

When it comes to facing these risks, it would be misguided to suppose that content filters, policies, recommendations, applications and safety measures apply to all children and parents equally. Rather, it is vital to understand the differences between children of different ages and backgrounds and what they do online. These differences lead to variations in their skills, their activities, the risks they face and how they respond to them.

Companies, teachers, policymakers and parents need to act upon this knowledge, not in spite of it. Furthermore, while there are signs that parents are engaging more with their children's online lives, a proportion still give little to no support to their children. Reaching those parents is a pressing challenge for future efforts to protect children online.

Digital footprint danger

One danger children face online that is rarely discussed in mainstream public discourse is that of leaving lasting traces that can have a serious impact on a child's future. At the Summit, John Carr argued convincingly that mismanaging one's digital footprint can have real-life consequences, especially later on as a young adult on the job market. The recent Google Spain case in the EU, which reinforced an individual's right "to be forgotten", set a precedent for treating internet services such as search engines as controllers of personal data, and therefore subject to the obligations of data protection rules.

Children in the EU may be able to exercise this right to be forgotten if, as young adults, they want to clean up their digital footprint. However, the EU's Data Protection Directive is under revision, putting the current rules at risk, and children outside the EU so far cannot look forward to such protections. The generation of "net children" growing up immersed in a culture of sharing, surrounded by systems gathering data on their online behaviour and preferences, may not be able to count on removing links to what they are doing now, so creating awareness of the digital footprint is an important start.

Wider picture, wider responsibility

No matter how responsible a child is taught to be with his or her online persona, this task does not fall solely to the child or the parents. The industry has a responsibility to help by making it clear and accessible how data will be gathered and kept safe, and what can be removed. One of many ways forward is the development of clear, easy-to-understand guidelines for industry tools to ensure transparency about both the developer's and the user's rights.

We need to widen the picture of the risks in order to improve and direct future safety efforts. This also entails discussing what can and should be asked of each stakeholder. While it would be short-sighted to deny the importance of headline-grabbing risks like sexual content, cyberbullying and 'stranger danger', it is equally important to look carefully at other, less publicised situations that might affect all children online, including the risks involved in their digital footprint and the use of their personal data.

This post gives the views of the author, and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics. 

