
Sonia Livingstone

June 18th, 2013

Protecting Children Online: Two Strategies in Which Industry Could Do More


Culture Secretary Maria Miller called executives from major internet companies to get around the table and do more about harmful content online. This banging of heads was inspired by the recent child murders, after which child pornography was found on the computers of the perpetrators. The Culture Secretary’s meeting is the latest in a line of government calls on internet companies “to do something”. However, as I have pointed out previously, while industry should indeed “do something” – indeed, a lot more – to protect children online, there are many pieces to this puzzle, and more stakeholders who should be at the table.

Most importantly, we need two separate strategies – one to address illegal content; one for harmful content. The problem of illegal content requires industry to work with law enforcement to block illegal content and ensure prosecution of offenders. The problem of harmful content requires industry to work with the public and third sectors to provide tools, raise awareness and change practices. Confusing these two strategies risks undermining both.

Industry could do more

Let’s hope the talks between Miller and the industry were not as confused as the wider media discussion about illegal and harmful content – the distinction between illegal content and legal content that may be harmful to children is crucial. With illegal content, harm is an issue in both creation and reception, and internet companies can do more to help track down the creators and distributors of such content, as well as to prevent people from viewing it, intentionally or accidentally.

Google’s latest offer to combat illegal content (child abuse and extreme pornography) is welcome, as is the agreement reached at the meeting with Miller that several of the major internet companies, including many ISPs, would donate £1 million to increase the capacity of the Internet Watch Foundation – though we need the IWF and CEOP to tell us whether this is sufficient to ensure that images are removed and offenders caught, or whether it is still a drop in the ocean.

The debate over search terms – i.e. the suggestion that Google could identify those typing in certain terms – is a thorny one. Yes, it could and, I think, should flash a warning page and even report the user’s IP address back to them. Users need to know they are tracked, and they need to learn what’s illegal. But search terms are imprecise guides to illegal intent, so care is needed.

Search trends are also useful for experts tracking illegal content, and for creating better filters for legal but harmful content. This is where more proactive, behind-the-scenes efforts are needed – industry can and should harness what it knows of users’ practices. But there should also be independent oversight of how this works, to avoid covert censorship and unaccountable ‘policing’ by the private sector.

In terms of legal but potentially harmful content, independent oversight is now also urgent for the plethora of filtering ‘solutions’ entering the market as pressure ramps up for ‘active choice’ for parents, and for the host of end-user solutions on diverse devices. As I’ve asked before – but with no answers on the horizon – how do we know if these work? What are the rates of under- and over-blocking? Where is the evidence that parents can actually use them (or that kids can get around them)? Most important – are they protecting kids in practice, especially those who are younger, more vulnerable, or without sympathetic parental support?

Put simply, without industry provision of transparent and effective tools, parents won’t use them, civil liberty experts won’t trust them, child welfare professionals won’t rely on them, and some in the wider public will continue to cry ‘censorship’. We need better filters that parents can easily understand, use and trust. We also need a trusted body that is charged with testing filters, not just for usability for parents, but also in relation to the risks of over-blocking.

If and when good filtering tools become available, we could have confidence in them, and we should expect an awareness-raising campaign – funded by industry? – to encourage their use, especially for young children. People simply don’t know that Google or YouTube have safe search functions on their home pages. And they don’t know – who does? – which filter is best or whether to trust the one offered by their ISP.

Others need to be at the table

While a vital part of the solution, industry cannot and should not do this alone. Parents and children have rights and concerns that need to be part of the discussion. Freedom of expression issues are crucial, and fully grasping these requires independent legal, regulatory and technical expertise.

In relation to legal but harmful content accessed by children, there will always be a ‘grey area’ to be managed by parents and educators, especially in relation to teenagers, who cannot live a highly filtered online existence and must be permitted to experiment and explore. Helping parents get more tech-savvy, offering conversation starters to get them talking about sexual matters with their children, alerting them to the warning signs of a child in difficulties – all this is important.

What exactly goes into the school curriculum should also be much more widely debated. Parents hope schools can deal with difficult issues, but teachers are worried about getting this wrong – about being criticised for raising intimate sexual themes or alerting innocent children to the existence of sexual violence. Nor do teachers themselves grasp just how the internet is implicated in the representation, distribution and amplification of potentially harmful images. Contra the Government’s recent consultation on the ICT curriculum, this is a matter for PSHE, not ICT, teachers.

Conclusion

Let’s hope Miller extracts some concrete promises from her meeting with industry. But it’s vital not to be satisfied with the promises alone. What’s also crucial is following through to ensure the effectiveness of strategies to combat – in different ways – both illegal and legal but harmful content. This requires independent assessment not only of the strategies put in place but of their impact on reducing children’s experience of harm. Even then, this can only be one necessary part of the larger picture of enabling our children to grow up as confident and resilient citizens, online and offline.

This post was based on the comments made during the BBC programme Newsnight on 17 June 2013. The post gives the views of the author, and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics.

About the author

Sonia Livingstone

Sonia Livingstone OBE is Professor of Social Psychology in the Department of Media and Communications at LSE. Taking a comparative, critical and contextual approach, her research examines how the changing conditions of mediation are reshaping everyday practices and possibilities for action. She has published twenty books on media audiences, media literacy and media regulation, with a particular focus on the opportunities and risks of digital media use in the everyday lives of children and young people. Her most recent book is The Class: Living and Learning in the Digital Age (2016, with Julian Sefton-Green).

Sonia has advised the UK government, European Commission, European Parliament, Council of Europe and other national and international organisations on children’s rights, risks and safety in the digital age. She was awarded the title of Officer of the Order of the British Empire (OBE) in 2014 ‘for services to children and child internet safety’.

She is a fellow of the Academy of Social Sciences, the British Psychological Society, the Royal Society for the Arts, and a fellow and past President of the International Communication Association (ICA). She has been visiting professor at the Universities of Bergen, Copenhagen, Harvard, Illinois, Milan, Oslo, Paris II, Pennsylvania, and Stockholm, and is on the editorial board of several leading journals. She is on the Executive Board of the UK Council for Child Internet Safety, is a member of the Internet Watch Foundation’s Ethics Committee, is an Expert Advisor to the Council of Europe, and was recently Special Advisor to the House of Lords’ Select Committee on Communications, among other roles.

Sonia has received many awards and honours, including honorary doctorates from the University of Montreal, Université Panthéon Assas, the Erasmus University of Rotterdam, the University of the Basque Country, and the University of Copenhagen. She is currently leading the project Global Kids Online (with the UNICEF Office of Research–Innocenti and EU Kids Online), researching children’s understanding of digital privacy (funded by the Information Commissioner’s Office) and writing a book with Alicia Blum-Ross called ‘Parenting for a Digital Future’ (Oxford University Press), among other research, impact and writing projects. Sonia is chairing LSE’s Truth, Trust and Technology Commission in 2017–2018, and participates in the European Commission-funded research networks DigiLitEY and MakEY. She runs a blog at www.parenting.digital and contributes to the LSE Media Policy Project blog. Follow her on Twitter @Livingstone_S

Posted In: Children and the Media | Filtering and Censorship | Internet Governance
