
Blog Administrator

July 17th, 2014

Social Media Offences: About the Crime, Not the Medium


Among the most egregious of social media offences is what has become known as “revenge porn”, and the problem has received far more attention in the UK since former Culture Minister Maria Miller urged action on it last month. Our Jessica Mason has been following recent policy debates, including a current inquiry in the House of Lords. She argues that a thorough review of the legal framework is needed.

The House of Lords Communications Committee has been hearing oral evidence in its inquiry into social media offences. The inquiry, while not exclusively focused on “revenge porn”, takes place amidst a surge in UK media coverage of revenge porn, spurred by MP Maria Miller’s statements in the House of Commons and Justice Secretary Chris Grayling’s promise that the government is looking into the issue. Although the recent scrutiny-light, superfast passage of the new Data Retention and Investigatory Powers Bill looks set to complicate the chances of a proper review of some aspects of that framework, what is needed is a thorough evaluation of the complete legal framework within which this kind of crime can, and should, be dealt with.


It is worth noting that, although I will use the term because it has become widely known, ‘revenge porn’ is not an adequate name: non-consensual intimate image distribution is not restricted to ex-lovers seeking revenge. Sexually explicit material is often doctored, stolen, or hacked, and it is frequently distributed with the intention to humiliate or harm, the material accompanied by personally identifiable information and threats of blackmail or assault.

Current handling of non-consensual intimate image distribution

The Committee has so far sought to understand how social media offences as a whole are prosecuted under existing laws and whether those laws are sufficiently flexible to cover new technologies and means of communication. It questioned law enforcement experts from the Crown Prosecution Service and the Association of Chief Police Officers, as well as representatives from Facebook and Twitter, to better understand how they handle harassing, threatening or grossly offensive material online and whether new laws are needed to prosecute online offences.

The ongoing political discussions led by Maria Miller and several civil society groups do not argue that there are no laws to deal with revenge porn, but rather that revenge porn should be made a sexual offence. The intense levels of online violence against women, gender discrimination, and crimes that disproportionately target women would require a separate inquiry of their own.

The Crown Prosecution Service prosecutes social media offences according to a set of guidelines developed in consultation with the public, but these guidelines did not incorporate recommendations to explicitly address gender discrimination and harassment.

Are existing laws sufficient? Clarifications and improvements are needed

For the most part, those testifying to the Lords’ committee seemed to agree that no new laws are needed in the UK to prosecute social media offences, as existing laws are sufficiently flexible. The CPS representative did remark that ‘breach of order’ offences could be improved to better protect children and to carry harsher punishments. Currently, the Children and Young Persons Act restricts reporting on children involved in criminal prosecutions, but it extends only to more traditional publications, not to social media. The Sexual Offences (Amendment) Act does punish those who publish victims’ information on social media, but only with a fine, so a harsher punishment should be considered. The CPS representative also indicated that either a longer statute of limitations for prosecution or speedier responses to requests for information from social media companies would aid in prosecuting these crimes.

As Chief Constable Stephen Kavanagh pointed out during his testimony, digital crimes present police with new types of crime scenes and evidence, which necessitates retraining at all levels of the service, clarification of jurisdictions, and an integrated national approach given the cross-jurisdictional nature of these crimes.

Appropriately, Kavanagh suggested a review of how to enable law enforcement to ethically enter digital environments to monitor and prevent crimes, but such a step may already be moot since Parliament looks set to pass the new Data Retention and Investigatory Powers Bill. This bill, essentially a revival of the failed Communications Data Bill, covers the retention and interception of, and law enforcement access to, all sorts of data related to communications services in the UK.

What’s the role of major social media companies?

Of course, anyone can easily find information on how Twitter and Facebook deal with content removal with a Google search or a few clicks around either site. You can learn all about each site’s rules, how to report content, and how to keep your account secure.

Having reported content before, I can testify that the system is anonymous and efficient. Within 24 hours the Facebook team got back to me with a notification of the outcome of my report (whether the content was removed or not; it was) and an option to click for more details. I can also go to the support dashboard on the left-hand side of my Facebook news feed and see a history of my reports.

At the massive scale at which these sites operate (over 1.2 billion active Facebook users; 500 million tweets a day), things are not going to be perfect, and the public can and should challenge the content removal decisions they make, but the systems to empower users to report abuse directed at themselves or others are there.

What is troubling about the Lords’ inquiry is that the Committee chose to question only Facebook and Twitter, which have such systems in place, and did not focus its attention on the growing number of sites that exist for the purpose of harassment, humiliation, and sharing non-consensual images. It is unclear to many victims how they can seek content removal or redress from such sites, which often laugh in the face of requests to remove content and may even blackmail victims, solicit bribes, or publish their pleas for content removal.

Flexible, technology-neutral laws needed, but so are reviews and adjustments

Harassment, blackmail, discrimination, death threats, and even revenge porn all pre-date the internet, but the scale of the net and its possibilities for anonymity allow these crimes to be more harmful than ever before.

Those testifying to the committee were right: we cannot start making laws tied to particular technologies or the means by which crimes are committed; such laws will quickly become antiquated. Rather than reactionary law-making that might not actually address the problems while risking compromises to privacy rights and freedom of expression, a full review of the legal framework should be conducted, including the penalties for crimes already on the books. If the viral nature of social media creates greater harm and humiliation for a victim, sentencing should be harsher. Finally, offences may need to be re-categorized, as in the case of making revenge porn a sexual offence, to fully reflect the nature of the crime. We need to address the crime committed, judged not by the means used but by the harm it causes to the victim or the public.

This post gives the views of the author, and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics.


Posted In: Internet Governance | Privacy
