Culture Secretary Maria Miller has called executives from major internet companies around the table to do more about harmful content online. This banging of heads together was prompted by the recent child murders, after which child pornography was found on the computers of the perpetrators. The Culture Secretary’s meeting is the latest in a line of government officials calling on internet companies “to do something”. However, as I have pointed out previously, while industry should indeed do something – indeed, a lot more – to protect children online, there are many pieces to this puzzle, and more stakeholders who should be at the table.
Most importantly, we need two separate strategies – one to address illegal content and one for legal but harmful content. The problem of illegal content requires industry to work with law enforcement to block such content and ensure the prosecution of offenders. The problem of legal but harmful content requires industry to work with the public and third sectors to provide tools, raise awareness and change practices. Confusing these two strategies risks undermining both.
Industry could do more
Let’s hope the talks between Miller and the industry were not as confused as the wider media discussion has been – the distinction between illegal content and legal content that may be harmful to children is crucial. With illegal content, harm is at issue in both creation and reception, and internet companies can do more to help track down the creators and distributors of such content, as well as to prevent people from viewing it, intentionally or accidentally.
Google’s latest offer to combat illegal content (child abuse and extreme pornography) is welcome, as is the agreement reached at the meeting with Miller that several of the major internet companies, including many ISPs, would donate £1 million to increase the capacity of the Internet Watch Foundation (though we need the IWF and CEOP to tell us whether this is sufficient to ensure that images are removed and offenders caught, or whether it remains a drop in the ocean).
The debate over search terms – i.e. that Google could identify those typing in certain terms – is a thorny one. Yes, they could and, I think, should flash a warning page, and could even show users their own IP address. Users need to know they are being tracked, and they need to learn what is illegal. But search terms are imprecise guides to illegal intent, so care is needed.
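To see why, consider what a naive keyword watchlist would do in practice: it flags legitimate queries alongside suspect ones. A minimal sketch in Python – the watchlist terms and example queries are invented purely for illustration:

```python
# A naive watchlist-based flagger, purely illustrative.
# The terms and queries below are invented for this example.
WATCHLIST = {"abuse", "exploit"}

def flags_query(query: str) -> bool:
    """Return True if any watchlist term appears in the query."""
    words = query.lower().split()
    return any(term in words for term in WATCHLIST)

queries = [
    "report online child abuse",                      # a concerned parent
    "how to exploit a buffer overflow lab exercise",  # a security student
]

for q in queries:
    print(flags_query(q), "-", q)
# Both queries are flagged, yet neither signals illegal intent -
# hence the need for care before warning or tracking users.
```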
Search trends are also useful for experts tracking illegal content, and for creating better filters for legal but harmful content. This is where more proactive, behind-the-scenes efforts are needed – industry can and should harness what it knows of users’ practices. But there should also be independent oversight of how this works, to avoid covert censorship and unaccountable ‘policing’ by the private sector.
In terms of legal but potentially harmful content, independent oversight is now also urgent for the plethora of filtering ‘solutions’ entering the market, as pressure ramps up for ‘active choice’ for parents and a host of end-user solutions appears on diverse devices. As I’ve asked before – with no answers on the horizon – how do we know if these work? What are the rates of under- and over-blocking? Where is the evidence that parents can actually use them (or that kids can get around them)? Most important, are they protecting children in practice – especially those who are younger, more vulnerable or without sympathetic parental support?
Put simply, without industry provision of transparent and effective tools, parents won’t use them, civil liberty experts won’t trust them, child welfare professionals won’t rely on them, and some in the wider public will continue to cry ‘censorship’. We need better filters that parents can easily understand, use and trust. We also need a trusted body that is charged with testing filters, not just for usability for parents, but also in relation to the risks of over-blocking.
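To make that testing role concrete, here is a minimal sketch of how under- and over-blocking rates could be measured against an expert-labelled set of URLs. It is written in Python; the filter, the URLs and the labels are all invented for illustration:

```python
# A minimal sketch of how a testing body might score a filter against
# a labelled test set. The filter and URLs below are invented examples.

def score_filter(filter_blocks, test_set):
    """filter_blocks: function url -> True if the filter blocks it.
    test_set: list of (url, is_harmful) pairs labelled by experts.

    Returns (under_blocking, over_blocking) rates:
    - under-blocking: share of harmful URLs the filter lets through
    - over-blocking: share of harmless URLs it wrongly blocks
    """
    harmful = [u for u, bad in test_set if bad]
    harmless = [u for u, bad in test_set if not bad]
    under = sum(not filter_blocks(u) for u in harmful) / len(harmful)
    over = sum(filter_blocks(u) for u in harmless) / len(harmless)
    return under, over

# Invented example: a crude filter that blocks any URL containing "adult".
crude_filter = lambda url: "adult" in url

test_set = [
    ("example.com/adult-content", True),
    ("example.org/violent-images", True),            # slips through
    ("example.edu/adult-education-courses", False),  # wrongly blocked
    ("example.net/homework-help", False),
]

under, over = score_filter(crude_filter, test_set)
print(f"under-blocking: {under:.0%}, over-blocking: {over:.0%}")
# Prints "under-blocking: 50%, over-blocking: 50%" for this toy set.
```

Even a toy exercise like this makes clear why published, comparable figures would matter more than vendors’ assurances.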
If and when good filtering tools become available in which we can have confidence, we should expect an awareness-raising campaign – funded by industry? – to encourage their use, especially for young children. People simply don’t know that Google and YouTube have safe search functions on their home pages. And they don’t know – who does? – which filter is best or whether to trust the one offered by their ISP.
Others need to be at the table
While a vital part of the solution, industry cannot and should not do this alone. Parents and children have rights and concerns that need to be part of the discussion. Freedom of expression issues are crucial, and fully grasping these requires independent legal, regulatory and technical expertise.
In relation to legal but harmful content accessed by children, there will always be a ‘grey area’ to be managed by parents and educators, especially in relation to teenagers, who cannot live a highly filtered online existence and must be permitted to experiment and explore. Helping parents become more tech-savvy, giving them conversation starters for talking about sexual matters with their children, alerting them to the warning signs of a child in difficulties – all this is important.
What exactly goes into the school curriculum should also be much more widely debated. Parents hope schools can deal with difficult issues, but teachers are worried about getting this wrong – about being criticised for raising intimate sexual themes or alerting innocent children to the existence of sexual violence. Nor do they themselves grasp just how the internet is implicated in the representation, distribution and amplification of potentially harmful images. Contra the Government’s recent consultation on the ICT curriculum, this is a matter for PSHE teachers, not ICT teachers.
Conclusion
Let’s hope Miller extracts some concrete promises from her meeting with industry. But it’s vital not to be satisfied with the promises alone. What’s also crucial is following through to ensure the effectiveness of strategies to combat – in different ways – both illegal and legal but harmful content. This requires independent assessment not only of the strategies put in place but of their impact on reducing children’s experience of harm. Even then, this can only be one necessary part of the larger picture of enabling our children to grow up as confident and resilient citizens, online and offline.
This post was based on the comments made during the BBC programme Newsnight on 17 June 2013. The post gives the views of the author, and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics.
Very good points – but let’s accept that neither the IWF nor CEOP (with its budget cut) is the best organisation to assess its own budgetary needs. That should be a matter for external audit if it is increasingly a co-regulatory function.
In terms of multi-stakeholder involvement, I suggest that the 63% of households without a child living in them should also be represented. That would be a democratic solution.