
Sonia Livingstone

June 3rd, 2013

Legal & Illegal Porn: Don’t Leave Child Protection Only to Industry


Estimated reading time: 5 minutes


The recent tragic killing of another child in the UK has reignited calls for blocking online pornography. In this context it is vital to maintain a clear distinction between illegal content (i.e. child abuse images) and legal content (most pornography).

For example, on 31 May a BBC headline referred to calls to “block more online porn sites”. This is misleading in that it appears to confuse recent debates over children seeing ‘adult content’ (i.e. pornography which is legal, but potentially harmful for children) with images of child abuse (in which both the abuse and the creation and distribution of images of it are illegal).[1] This is most unfortunate as there is, I suggest, a legitimate debate to be had about whether and how to restrict the former (given important considerations of free speech and concerns about censorship), but there can be no legitimate debate about whether to restrict the latter – illegal content should be banned in practice as well as in the letter of the law. The BBC should not confuse these matters.

There are two grey areas that complicate this distinction. First there is extreme pornography (graphic violence against women, for instance), which is illegal in some countries, including in the UK, but difficult to enforce. Here I think society should have a debate and make some decisions: if this is illegal, it should be banned, but the means by which it is banned must not over-block or over-restrict legitimate freedoms. Conversely, if we’re not going to enforce such a law, why have we made such content illegal in the first place?

The second complication is content that is legal for adults but illegal for children. Society is used to permitting the circulation of legal/non-abusive pornography depicting adults, but it is equally used to ensuring regulations, and regulators, which prevent children accessing it (e.g. sales via regulated sex shops, ‘adult’ video ratings, use of credit card pay walls, etc.). Online, this is clearly not working, hence the outrage in relation to pornography (of all kinds) on the internet.

What should be done? Note that in its confusing headline, the BBC has not only muddled what is illegal and what is legal but also who is the focus of the called-for regulation: in the case of regulating illegal content, the purpose is to prevent potential perpetrators from accessing such content; in the case of regulating legal content, the purpose – at least as recently called for in the UK media – is to prevent children themselves from accessing such content. These are entirely separate issues.

In distinguishing types of content I rely on the law because the law includes a public process of adjudication – hearings, evidence, balancing of probabilities, etc. To inform the law, we need evidence. But it is vital to understand that there will be no definitive, beyond-all-reasonable-doubt evidence that (all) porn causes damage (to all exposed to it) – for good methodological and ethical reasons. There is, however, sufficient evidence that some porn, along with other factors, is one of the causes of some harm to some people under some circumstances – as reviewed in Harm and Offence in Media Content – hence my call for appropriate, proportionate, transparent, independent, evidence-based action.

In the UK, the Internet Watch Foundation has the responsibility to locate and take down illegal online content (notably, images of child abuse). In its statement of 31 May, the IWF seems to say that it is doing its best, but that it relies on reports from individuals to identify potentially illegal content, and that insufficient reports are forthcoming. It seems timely to ask whether this is an appropriate mechanism for identifying illegal content, and whether the IWF or CEOP has the resources to cope.

The latest news on CEOP’s website is a week old, and implies that more should be done to educate parents about the risks of child abduction. This is right but insufficient, and it’s worrying that John Carr, spokesperson for the children’s charities, judges that there are too many potential offenders for CEOP to arrest.

As for content that is legal for adults, another approach is required. I suggest that online as offline, the means should urgently be found to manage the conditions by which children (according to their age, capacity, vulnerability and circumstances) may or may not access pornography (according to its type, explicitness, degree of violence or degradation). Not only must these means respect both children’s age/circumstances and the nature of the pornography, but they must also not over-block legitimate and legal use by adults. This is important, and raises crucial issues of public trust.

Because of its size in the search engine market, Google has been particularly called upon to do more and show moral leadership, and the company has responded that it already has zero tolerance for illegal content. Nonetheless, there appears to be some doubt in the mind of the UK public – or at least in its media – regarding its effectiveness. Google’s strategy for legal pornography, by contrast, leaves everything to the user (namely, to turn on ‘safe search’ via a tool many seem unaware of) and does not appear to deal with that which is legal for adults but not for minors.

More generally, I am concerned that as a society we seem to be leaving such measures to industry, which deals with pornography according to the largely unaccountable, proprietary processes of ‘customer care’ and ‘terms and conditions’. This gives no right of redress, no transparency in what is blocked or why, and no analysis of over-blocking. Previously on this blog I have argued that, to allay legitimate concerns about censorship, the UK needs a publicly accountable, independent body to control the conditions of children’s access to the internet that will be as trusted by the public as are the bodies that manage their access to television, film and computer game content. I don’t say it will be easy, but public pressure is clearly behind the attempt.

 

Note: This article gives the views of the author, and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics.


[1] Please note that in this post it would be inappropriate for me to comment on the particular circumstances of the very tragic case that has occasioned these recent discussions about pornographic content on the internet.

About the author

Sonia Livingstone

Sonia Livingstone OBE is Professor of Social Psychology in the Department of Media and Communications at LSE. Taking a comparative, critical and contextual approach, her research examines how the changing conditions of mediation are reshaping everyday practices and possibilities for action. She has published twenty books on media audiences, media literacy and media regulation, with a particular focus on the opportunities and risks of digital media use in the everyday lives of children and young people. Her most recent book is The class: living and learning in the digital age (2016, with Julian Sefton-Green). Sonia has advised the UK government, European Commission, European Parliament, Council of Europe and other national and international organisations on children’s rights, risks and safety in the digital age. She was awarded the title of Officer of the Order of the British Empire (OBE) in 2014 'for services to children and child internet safety.' Sonia Livingstone is a fellow of the Academy of Social Sciences, the British Psychological Society, the Royal Society for the Arts and fellow and past President of the International Communication Association (ICA). She has been visiting professor at the Universities of Bergen, Copenhagen, Harvard, Illinois, Milan, Oslo, Paris II, Pennsylvania, and Stockholm, and is on the editorial board of several leading journals. She is on the Executive Board of the UK Council for Child Internet Safety, is a member of the Internet Watch Foundation’s Ethics Committee, is an Expert Advisor to the Council of Europe, and was recently Special Advisor to the House of Lords’ Select Committee on Communications, among other roles. Sonia has received many awards and honours, including honorary doctorates from the University of Montreal, Université Panthéon Assas, the Erasmus University of Rotterdam, the University of the Basque Country, and the University of Copenhagen. 
She is currently leading the project Global Kids Online (with UNICEF Office of Research-Innocenti and EU Kids Online), researching children’s understanding of digital privacy (funded by the Information Commissioner’s Office) and writing a book with Alicia Blum-Ross called ‘Parenting for a Digital Future’ (Oxford University Press), among other research, impact and writing projects. Sonia is chairing LSE’s Truth, Trust and Technology Commission in 2017-2018, and participates in the European Commission-funded research networks DigiLitEY and MakEY. She runs a blog called www.parenting.digital and contributes to the LSE’s Media Policy Project blog. Follow her on Twitter @Livingstone_S

Posted In: Children and the Media | Filtering and Censorship | Media Literacy
