Naureen Khan, International Advisor at the National Society for the Prevention of Cruelty to Children, responds to the Mobile Internet Censorship Report, arguing that current filtering systems are not working and that we need an “opt-in” solution in which responsibility is shared.
The internet and associated technologies are an essential part of young people’s lives.
- Internet use is increasingly individualised, privatised and mobile: 9-16 year old internet users spend 88 minutes per day online, on average.
- According to Ofcom, nearly one in two 8-11 year olds (50%) and nearly nine in ten 12-15 year olds (88%) own a mobile phone.
Claire Perry MP’s report from her ‘Independent Parliamentary Inquiry into Online Child Protection’ notes that ‘children spend increasing amounts of time online, are often more “tech savvy” and knowledgeable than their parents and know how to circumvent or avoid device filters. The result is that children are stumbling across or seeking out pornographic material and that this ready exposure to porn, especially the violent degrading material so easily available via an unfiltered internet connection, is having disturbing consequences.’
The Open Rights Group and the LSE Media Policy Project acknowledge that ‘making sure parents have the tools to give their children safer access to the mobile internet is a worthwhile goal’. The mobile industry and Internet Service Providers such as TalkTalk have taken positive steps to protect children from inappropriate content online. However, the current system of device-level, parent-activated controls is not working: only a minority of parents choose to operate them, and this number is falling. In practice, parents find it difficult to put content filters on the range of internet-enabled devices in their homes, and families lack the right information and education on internet safety.
An opt-in system would ensure that the responsibility for protecting children online is shared between parents and ISPs, because both have a role to play; acting alone, neither can protect children. This is broadly how TV viewing and film ratings already work to protect children from inappropriate content. The internet should not be seen as any different.
The findings presented in the report are informed by a small-scale study without a robust and scientific methodology. It is therefore difficult to generalise from the findings, although we agree with the authors that all forms of filtering should be fully transparent. There should be mechanisms in place to allow for an independent review of systems and processes, which would allow filtering for legitimate child protection purposes. The Government’s forthcoming consultation presents a good opportunity to both discuss and find a solution to these issues.
Finally, the report quotes the UN Special Rapporteur for Freedom of Expression, Frank La Rue, and highlights the criteria he sets out for restricting access to information online. We agree with those criteria, and we also agree with Frank La Rue when he states that there are ‘legitimate types of information which may be restricted’ in order ‘to protect the rights of children’. The most effective tool currently available for restricting inappropriate content is an opt-in system.
To date, the internet has developed in a relatively unregulated way. As the statistics above show, the internet is so much a part of the fabric of children’s lives that the time has now come for some careful, child-centred and evidence-based regulation.