
Should children’s internet use be filtered? Multi-stakeholder prudishness impedes open deliberation

Sonia Livingstone

October 25th, 2012

Around one in three European parents say that they use filtering tools on their child’s computer; one in three is worried about what or who their child may encounter on the internet. But these are not necessarily the same people – parents are more likely to use filters if they are themselves regular or confident users, which suggests that some parents may be worried but lack the skills or knowledge to use parental tools. So, is there a problem? What should be done? And can there be a pan-European solution, given that levels of concern and of filter use vary hugely, from the UK and Ireland, where half of parents use them, to Romania and Lithuania, where fewer than one in ten do?

At the recently held Safer Internet Forum in Brussels, teenagers and their parents expressed strong views against filtering tools. It seems that no-one likes the language of control (as in ‘parental controls’). Kids wish to be trusted and guided, and to have their rights respected. Parents wish to trust, though they also wish to be informed and supported. The result was legitimate and important advocacy for open, honest and unembarrassed communication between parents and children. So perhaps filtering tools are unnecessary?

Problematically, in such discussions no-one wants to talk about the content to be controlled. In the early days of the internet, conference presenters showed pornography, race hate or violent content so that we knew what we were talking about; now it seems we’ve become rather prudish. So we don’t know if people are (or are not) concerned about mild sexual material (such as Page 3). Or explicit sexual content (that would not be shown on prime time television, or that is restricted to over-18s in the cinema). Or extreme or violent pornography that society keeps away from children offline but that can be found online.

At the Safer Internet Forum, then, although many preferred parental responsibility to parental tools, there were dissenting voices. Is the task less to control the kids than to control those who put potentially harmful content online? Can this be done by self-regulation (as for social networking sites) or by self-managing online communities (as YouTube is attempting)? Does adherence to online services’ terms and conditions meet parental concerns, or is public/state intervention required? Or are parental tools still needed? And what is to be done about the parents who appear to pay no attention to their children’s online activities or experiences, especially when those children are already disadvantaged or vulnerable?

Young people also bear some responsibility and, depending on their maturity, many would prioritise their right to view sexual content – as part of their rights, their sexual development, and the building of digital literacy and resilience. Again the debate is impeded by our prudishness in asking how teenagers access pornography. In some Eastern European countries, screens are still full of the explicit pop-ups that seem to have disappeared in Western Europe (how did that happen, and was it censorship?). But although accidental exposure is declining, we do expect teenagers to go looking for pornography. Yet we find it hard to ask them what they found, and we don’t really know how younger children (some would include teenagers too) react when they witness, say, sexual violence against women – ethically, too, this is hard to find out.

Stakeholders in these debates suffer from a further prudishness, I discover – now that listening to the voices of youth is, rightly, in vogue, it is hard to counter or qualify their views. So those who think youth are too liberal, too confident in their own resilience, or too ignorant of online dangers have a hard time gaining credibility for their concerns. But as was pointed out, even these carefully selected youth were accompanied on their trip to Brussels – so they do need support from adults, and they are, surely, as influenced by adult society online as offline (aren’t we all?). Problematic also is the gap between what people know and what they do: as one young person said, ‘we want you to trust us, but we all post without thinking’. And a significant number report being upset on occasion by what’s online.

Thus I prefer to listen to the evidence. EU Kids Online surveyed 25,000 9-16 year olds across Europe and found that a small percentage of children express real concerns about the internet. This doesn’t make life easier (how shall we meet the needs of the minority without restricting the freedoms of the majority?), but at least research provides a clearer grasp of the scale of the problem. In short, the problem is real, but it is not overwhelming. The evidence also shows, disappointingly for some, that using filtering tools is not associated with a reduction in children’s exposure to online risk. We should think about this. Maybe filtering is the wrong way to go. But maybe the reason is that the children who need protection don’t get it? Or maybe the tools don’t work (indeed, the SIP Bench study reveals their continuing limitations – too much overblocking, for instance)? And perhaps the kids can get around them (at the Forum, we heard some strategies from the youth for doing just this, though most children respect their parents’ advice).

In the UK, the government’s recently closed consultation will soon report on whether parental filtering must be actively offered to parents (‘active choice’) or whether a more top-down solution will be judged necessary. In Europe, EC Vice President Neelie Kroes’ CEO Coalition is progressing its work on parental control tools as one of five priorities for a better internet for children. This is a multi-stakeholder issue, and no simple or single solution is likely to emerge. But having listened to arguments on all sides at the Forum, and having weighed the evidence, I suggest that we should try harder to develop parental filters, as one among several important future directions (recall that it took us decades to produce a strategy for content regulation on other media that has widespread support). So here are four recommendations to companies providing online services used by children:

(1)   We need greater availability both of really simple filters that any parent can install, and of carefully tailored filters that parents can choose to fit the maturity of their child and the values of their family. User-centred design is key, and testing with real-life families, in the messy complexities of the home, is vital. Such tools should be offered, free, to all internet users – when they first acquire a new device or a new internet service, and with periodic reminders thereafter.

(2)   Tools should not simply ban or restrict without explanation or transparency. The days of spying on, censoring and surveilling kids set a really bad starting point for today’s challenges, and should be firmly left behind. Children’s rights should be respected, and tools should be positively designed so as to promote parent-child discussion about guidelines, norms and values (just as dialogue boxes prompt users to run through a range of safety checks).

(3)   Providers must recognise the legitimate concerns of those advocating freedom of expression, including child rights advocates. Thus they must not over-block content, and they must report their over- and under-blocking rates. What is filtered should be transparent, accountable and accurate. And providers must be quick to respond to mistakes or biases in their filtering processes.

(4)   Last, providers cannot offer the complete solution, and they should not promise ‘complete safety’ or ‘peace of mind’ to parents. Parents need to be aware that filters are fallible, that the internet is complex and changing, that user-generated content poses particular challenges, and that children have rights to freedom and privacy even from their parents. If something goes wrong, parents need a range of strategies beyond simply confiscating the smartphone or taking away the laptop. Rather, they must talk to their children, seek help if necessary, and know what they are talking about by becoming competent internet users themselves.

About the author

Sonia Livingstone

Sonia Livingstone OBE is Professor of Social Psychology in the Department of Media and Communications at LSE. Taking a comparative, critical and contextual approach, her research examines how the changing conditions of mediation are reshaping everyday practices and possibilities for action. She has published twenty books on media audiences, media literacy and media regulation, with a particular focus on the opportunities and risks of digital media use in the everyday lives of children and young people. Her most recent book is The Class: Living and Learning in the Digital Age (2016, with Julian Sefton-Green). Sonia has advised the UK government, European Commission, European Parliament, Council of Europe and other national and international organisations on children’s rights, risks and safety in the digital age. She was awarded the title of Officer of the Order of the British Empire (OBE) in 2014 ‘for services to children and child internet safety’. Sonia Livingstone is a fellow of the Academy of Social Sciences, the British Psychological Society and the Royal Society for the Arts, and a fellow and past President of the International Communication Association (ICA). She has been visiting professor at the Universities of Bergen, Copenhagen, Harvard, Illinois, Milan, Oslo, Paris II, Pennsylvania, and Stockholm, and is on the editorial board of several leading journals. She is on the Executive Board of the UK Council for Child Internet Safety, is a member of the Internet Watch Foundation’s Ethics Committee, is an Expert Advisor to the Council of Europe, and was recently Special Advisor to the House of Lords’ Select Committee on Communications, among other roles. Sonia has received many awards and honours, including honorary doctorates from the University of Montreal, Université Panthéon Assas, the Erasmus University of Rotterdam, the University of the Basque Country, and the University of Copenhagen. She is currently leading the project Global Kids Online (with UNICEF Office of Research-Innocenti and EU Kids Online), researching children’s understanding of digital privacy (funded by the Information Commissioner’s Office) and writing a book with Alicia Blum-Ross called ‘Parenting for a Digital Future’ (Oxford University Press), among other research, impact and writing projects. Sonia is chairing LSE’s Truth, Trust and Technology Commission in 2017-2018, and participates in the European Commission-funded research networks DigiLitEY and MakEY. She runs a blog called www.parenting.digital and contributes to the LSE’s Media Policy Project blog. Follow her on Twitter @Livingstone_S.

Posted In: Children and the Media | Communications Review | Media Literacy
