The Report on Mobile Internet Censorship usefully brings some transparency to an important issue. It is perhaps curious that most people seem not to know that mobile phone companies have operated an ‘adult content’ filter for quite some time, yet the debate over active choice vs. opt-in or opt-out policies for the fixed internet has been hugely controversial. As an observer of these debates, I would comment that:
- It is too simple to pitch child protection against freedom of expression as if the two goals could be traded off against each other. Offline, society has achieved a broadly consensual, though undoubtedly complex, demanding and evolving, array of means to achieve both goals and, additionally, it has developed some widely trusted processes for regulating and, indeed, challenging these means as necessary. Now we must do the same online. My preferred analogy is that of town planning – a sphere in which we have evolved broadly accountable procedures by which the public environment is managed differently from that of private spaces and by which particular access conditions or careful rules for entry ensure that children do not encounter certain parts of adult society. Of course there are many grounds for concern and complaint – town planning is never popular! – but we cannot imagine life without it, and its complexities are, I suggest, preferable to talk of either bans or laissez-faire solutions.
- Thus it is time to move beyond emotive, even alarmist language on both sides. This is a truly difficult and complex landscape: the harms on all sides are still insufficiently understood, and the policy tools available are still developing. There are some key empirical questions still unresolved. What do consumers, understood in all their diversity, really want in terms of both content and technical tools? How far can regulatory and awareness-raising activities influence parents so as to protect and enable their children online, preferably in accordance with their own values and the needs of their children? What proportion of parents can and will take up ‘active choice’ and will that leave a vulnerable minority of children unprotected?
- Rather than accusing either side of ‘failing’, the important thing is that this debate is taking place, policy options are being tried and tested, and a range of solutions is being actively considered. For internet content (mobile and fixed), one or another form of filtering appears to be the main technical solution on offer. For now, it seems, the question is not whether to filter, but how. And the ‘how’ question should include not only the choice among active choice, opt-in and opt-out but also questions of transparency, accountability and effectiveness. For each of these, we are used to high standards being set offline, and it is time to see such standards develop online as well.
- So let us commend the industry for its active efforts to develop the technical solutions that many parents and the public are calling for. At this point, I suggest that the more solutions being trialled, the better, and I would urge that we experiment with active choice, opt-in, opt-out and more, and conduct an independent and comparative evaluation of the results of all these solutions. This is important because research shows that children are encountering unwanted, upsetting or inappropriate content online, that parents are very concerned yet feel disempowered to act, and that current take-up of available end-user technical solutions is only partial.
- But since the report suggests that some over-blocking occurs (though, as a percentage of content transmitted – the usual measure of over-blocking – this appears very small indeed), since filters raise plenty of other problems besides, and since it is unclear that current processes of redress are adequate, let us also demand of the industry that these tools be transparently operated, open to challenge and correction, constantly improving, and independently evaluated for their effectiveness.