Sonia Livingstone

December 18th, 2012

Government response to the consultation on parental controls is good news but raises new questions

The outcome of the Government’s consultation on parental internet controls, published on the 14th of December, urges all ISPs “to actively encourage people to switch on parental controls if children are in the household and will be using the internet.” This is, more or less, taking forward the ‘active choice’ recommendation of the Bailey Review, namely that – initially as new users, then also as existing users – parents should be faced with the choice of whether or not to use parental controls.

This is welcome news to those concerned with improving children’s internet safety in the UK, and it is likely to set a precedent also for policy makers in other countries. Why are such measures needed?

– Because, as EU Kids Online has found, a sizeable minority (11%) of 9-16 year olds in the UK have seen pornography online (and 2% have seen violent sexual imagery). Among 11-16 year olds, 13% have seen or received hate messages, 8% pro-anorexia or pro-drug messages, and 2% have visited a suicide site in the past year.[1]

– Also because, while most parents believe it their responsibility to keep their children safe, many are unsure about the risks, confused about what protections to implement, or too embarrassed to deal with sexual or intimate matters in relation to their children.

– A recent survey by Ofcom, which has lots of interesting statistics on parental views and experiences, suggests that around half of parents don't talk to their kids about online safety, even though such conversations are often cited as everyone's preferred solution.

– Most worryingly, EU Kids Online found that 11% of UK parents do nothing at all to keep their child safe online (though, to be sure, it also found that around half already use filters,[2] the highest proportion across Europe, and most keep an eye on their children's internet use or check up on what they do online).

– Our research also found that parents of vulnerable children tend to be less confident about how to keep their child safe online: so relying on parents to ensure children's safety risks promoting a strategy that helps internet-savvy parents ahead of the less savvy.

Although the consultation closed in September, the Government has only just responded – because it was overwhelmed by public and stakeholder input, from some 3,500 individuals and organisations in all. People really care about this issue. For those keen for public policy in general and communications policy in particular to be conducted in a transparent, inclusive and deliberative manner, this is a healthy state of affairs.

Interestingly, most of the consultation responses did not support the government's decision, contra some of the panicky media stories we've seen this week. This may be because the consultation was hard to follow – as an expert in this field, I was puzzled by a number of the consultation questions, wanting to know what type or level of pornography would be filtered, how filtering could apply differently to different household members, how easy it would be to change my mind, how effectively the filters work, and so on, before saying yes or no to the questions asked. As a respondent, I was ready to say 'yes' to tools that are precise, effective, flexible and independently audited. But is that what is on offer?

However, it seems likely that many responses came from campaigning groups rather than being representative of all parents, so that doesn’t necessarily tell us what the average parent wants. We do know that many parents are both worried and confused about online risks and safety provision. EU Kids Online found that one in three parents are very worried about what their child sees or who they are contacted by online (more than they worry about crime, alcohol or drugs, for instance, where most parents do expect government intervention).

What worries parents a lot about their child, by age and gender (%)

| | Boys 9-12 | Girls 9-12 | Boys 13-16 | Girls 13-16 | All |
|---|---|---|---|---|---|
| How they are doing at school | 46 | 44 | 42 | 35 | 41 |
| Being treated in a hurtful or nasty way by other children | 47 | 44 | 36 | 33 | 40 |
| Being injured on the roads | 48 | 40 | 38 | 32 | 39 |
| Being contacted by strangers on the internet | 35 | 37 | 35 | 43 | 38 |
| Being a victim of crime | 31 | 25 | 40 | 32 | 33 |
| Seeing inappropriate material on the internet | 35 | 29 | 27 | 32 | 31 |
| Getting into trouble with the police | 22 | 12 | 33 | 21 | 22 |
| Drinking too much alcohol/taking drugs | 12 | 8 | 26 | 28 | 19 |
| Their sexual activities | 11 | 8 | 16 | 28 | 16 |
| None of these | 17 | 26 | 22 | 24 | 22 |

Question: Thinking about your child, which of these things, if any, do you worry about a lot? (Multiple responses allowed)

Base: UK parents of children aged 9-16 who use the internet.

Rather than trying to guess what parents really want from the Government's consultation, we should pay attention to a recent, representative survey conducted by YouGov for TalkTalk.[3] This found that:

– 37% of UK adults with children in the household think that active choice (where customers are asked when they sign up to broadband if they want their internet to be filtered or not) should be applied as standard to best protect children online.

– A further 30% said their internet service should only be filtered if they ask for it.

– Just 22% thought that default filtering of harmful content, such as pornography, is the best system, where the internet is filtered unless they ask for it not to be.

– 11% said none of these or that they weren't sure.

We can add up these figures in several ways. While only 37% want what the government has decided to do, broadly speaking, an additional 22% want a tougher system, making a majority (59%) in favour of intervention (though the 22% are clearly in the minority when it comes to the question of default filtering in the home).
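To make the arithmetic explicit, here is a minimal sketch in Python. The percentages are the YouGov figures quoted above; the grouping into a "pro-intervention" majority is the reading argued here, not a category from the survey itself:

```python
# Tally the YouGov response options quoted above. The grouping into
# "pro-intervention" is an interpretation, as argued in the text.
responses = {
    "active choice at sign-up": 37,
    "filter only on request": 30,
    "default filtering (opt-out)": 22,
    "none of these / not sure": 11,
}

# Active choice and default filtering both involve some intervention.
pro_intervention = (
    responses["active choice at sign-up"]
    + responses["default filtering (opt-out)"]
)

print(f"options sum to {sum(responses.values())}%")         # 100%
print(f"favouring some intervention: {pro_intervention}%")  # 59%, a majority
```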

Undermining support for the government's solution is the fact that many parents have had poor experiences of filters, which may make them sceptical of filters' capacity to solve internet safety problems.[4] They don't work in many of the languages spoken in British homes today. They generally don't help with user-generated content (so are no good for bullying or sexting, for instance). Many under-block and/or over-block, without reporting how often this occurs. And how do you tell which ones are better or worse?[5]
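As a purely hypothetical illustration of what such reporting could look like: the counts below are invented, but they show how under- and over-blocking rates might be computed and published if vendors chose to test their filters against a labelled set of pages.

```python
# Hypothetical counts for a filter tested against a labelled page set.
# These numbers are invented for illustration only.
harmful_blocked = 900    # harmful pages correctly blocked
harmful_passed = 100     # harmful pages that slipped through (under-blocking)
benign_blocked = 50      # legitimate pages wrongly blocked (over-blocking)
benign_passed = 8950     # legitimate pages correctly allowed

under_block_rate = harmful_passed / (harmful_passed + harmful_blocked)
over_block_rate = benign_blocked / (benign_blocked + benign_passed)

print(f"under-blocking rate: {under_block_rate:.1%}")  # 10.0%
print(f"over-blocking rate:  {over_block_rate:.1%}")   # 0.6%
```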

Also undermining such support is genuine distrust of Government among the UK public: if pornography is monitored or restricted today, what will be restricted next month? It's time to address head-on the consultation responses concerned with civil liberties, mission creep and the like.[6]

Clearly, industry has a job to do here in improving what it offers to parents and consumers. Additionally, government has a job to do in building trust. The best way to address both these challenges would be to establish a transparent and independent body – not necessarily to regulate the internet industry but, to borrow Lord Justice Leveson's clever solution for press regulation, to regulate a self-regulated industry.

Surely there is now a pressing need to better understand the contexts in which such tools are used, so as to identify the design requirements that could meet parents' and children's needs and concerns regarding children's online safety. To achieve this, future tools should be user-friendly, flexible and easily customisable. Can they even cease to be parental 'controls' and become parental 'mediation' tools that guide, inform and enable as well as limit children's online experiences?

In the spirit of encouraging active and open communication regarding e-safety between parents (and teachers) and children, it would be great to see a new generation of parental tools that allow for more customisation of the online environment so as to cater for diverse backgrounds, contexts of use, family interactions and parenting styles. Such tools should also take into consideration children's rights, especially those related to privacy and information access – including privacy from their parents.

Filtering out pornographic or other inappropriate online content represents one useful part of what must be a multi-stakeholder approach – involving industry, government, parents, teachers, police and others. The government has taken a welcome step forward. But there is much more to do if this is to work.


[1] These figures were reported to us by children under conditions of privacy from their parents and the researcher; and they are confirmed by research from Ofcom and others. If anything, they are likely to underestimate the ‘real’ incidence as children can be embarrassed or worried about admitting what they have seen. Further, as internet use rises, so does the incidence of risk: we are now witnessing children using the internet more, at younger ages, and on more diverse platforms and devices.

[2] What exactly they mean by this is unclear, since we know very little about actual usage rates of filtering software, or assessments of its effectiveness, and we especially lack evidence based on in-home observation by independent research (rather than as self-reported by children or parents or, indeed, by companies).

[3] These figures are from the September 2012 YouGov survey of 2,010 UK adults online. All figures have been weighted and are representative of all UK adults (18+).

[4] Parents may be right, if this is what they think. EU Kids Online finds that, when we control statistically for the effects of age (and gender, online activities, access and country), any apparent benefit of parental controls in reducing risk seems to disappear. In short, using a filter or not does not seem to reduce the chance of a child encountering online risks. On the one hand, parents install filters for younger children, who encounter little risk anyhow (though that may change as they use the internet more). On the other hand, filters don't deal with most of the risks parents worry about (meeting strangers, bullying, user-generated content). But even for pornography, where you'd expect using a filter to reduce exposure, the effect is minimal – perhaps such exposure these days is more deliberate than accidental, and if kids wish to find it, they'll get around the filter?
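To illustrate the confounding argument with simulated data (not EU Kids Online's): if younger children are both more likely to have filters and less likely to encounter risk, a naive comparison makes filters look protective even when they do nothing, while controlling for age removes the apparent effect. A minimal sketch, assuming numpy and statsmodels are available:

```python
# Simulated illustration of age confounding filter use and risk exposure.
# No real data: the "true" filter effect here is zero by construction.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
age = rng.integers(9, 17, n)                      # children aged 9-16

p_filter = np.clip(1.2 - 0.1 * age, 0.05, 0.95)   # younger -> more filtering
filtered = rng.random(n) < p_filter

p_risk = 0.05 * (age - 8)                         # older -> more risk
risk = rng.random(n) < p_risk                     # independent of 'filtered'

# Naive comparison: filtered children appear "safer" (they are just younger).
print("risk given filtered:  ", risk[filtered].mean())
print("risk given unfiltered:", risk[~filtered].mean())

# Logistic regression controlling for age: the filter coefficient is ~0.
X = sm.add_constant(np.column_stack([filtered.astype(float), age]))
fit = sm.Logit(risk.astype(float), X).fit(disp=0)
print(fit.params)  # [intercept, filter (~0), age (positive)]
```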

[5] Actually, the EC’s SIP Bench project to test and compare filters includes a handy site that answers this question. But who knows about it?

[6] I'm puzzled as to why everyone accepts virus-checking and blocking software. Everyone has it. It filters their spam ("censorship"?) and saves them a lot of trouble too. I don't hear any protests over its inclusion with computer sales or internet provision. Is there a way to define content highly inappropriate for children so as to achieve similar public acceptance?

About the author

Sonia Livingstone

Sonia Livingstone OBE is Professor of Social Psychology in the Department of Media and Communications at LSE. Taking a comparative, critical and contextual approach, her research examines how the changing conditions of mediation are reshaping everyday practices and possibilities for action. She has published twenty books on media audiences, media literacy and media regulation, with a particular focus on the opportunities and risks of digital media use in the everyday lives of children and young people. Her most recent book is The Class: Living and Learning in the Digital Age (2016, with Julian Sefton-Green). Sonia has advised the UK government, European Commission, European Parliament, Council of Europe and other national and international organisations on children's rights, risks and safety in the digital age. She was awarded the title of Officer of the Order of the British Empire (OBE) in 2014 'for services to children and child internet safety.' Sonia Livingstone is a fellow of the Academy of Social Sciences, the British Psychological Society, the Royal Society for the Arts and fellow and past President of the International Communication Association (ICA). She has been visiting professor at the Universities of Bergen, Copenhagen, Harvard, Illinois, Milan, Oslo, Paris II, Pennsylvania, and Stockholm, and is on the editorial board of several leading journals. She is on the Executive Board of the UK Council for Child Internet Safety, is a member of the Internet Watch Foundation's Ethics Committee, is an Expert Advisor to the Council of Europe, and was recently Special Advisor to the House of Lords' Select Committee on Communications, among other roles. Sonia has received many awards and honours, including honorary doctorates from the University of Montreal, Université Panthéon Assas, the Erasmus University of Rotterdam, the University of the Basque Country, and the University of Copenhagen. She is currently leading the project Global Kids Online (with UNICEF Office of Research-Innocenti and EU Kids Online), researching children's understanding of digital privacy (funded by the Information Commissioner's Office) and writing a book with Alicia Blum-Ross called 'Parenting for a Digital Future' (Oxford University Press), among other research, impact and writing projects. Sonia is chairing LSE's Truth, Trust and Technology Commission in 2017-2018, and participates in the European Commission-funded research networks, DigiLitEY and MakEY. She runs a blog called www.parenting.digital and contributes to the LSE's Media Policy Project blog. Follow her on Twitter @Livingstone_S.
