
Blog Administrator

February 13th, 2018

A quiet revolution: the Digital Charter is an opportunity to strike a new deal with online platforms


It is a busy time for anyone interested in media policy developments here in the UK. In this post, Mark Bunting, a member of Communications Chambers and a visiting associate of the Oxford Internet Institute, gives his take on the Digital Charter that was announced recently by the UK Department for Digital, Culture, Media and Sport (DCMS), arguing that we need a new deal with platforms such as Facebook, and setting out how he thinks the Charter could be developed. 

Away from the Brexit spotlight, a quiet revolution is unfolding in UK digital policy. For years, the conventional wisdom was that the Internet couldn’t, and shouldn’t, be regulated. Now, with online platforms in the dock for everything from hate crime to the future of news, the Government is acting. Its strategy could have profound implications for online platforms, the open Internet, and how a balance is struck between online freedoms and protection from harm.

Theresa May has repeatedly urged social media companies to do more to stamp out online abuse, block extremist content and protect children. This week she outlined plans, first trailed in October’s Internet Safety Green Paper, to encourage them to do so, including an annual internet safety transparency report, easier ways for people to report bullying and harmful content, and a social media code of practice. Momentum is gathering. This week’s speech follows last week’s release by DCMS of its ‘Digital Charter’, and was followed by the launch of a review of press sustainability, with “the role and impact of online platforms” squarely in the frame.

The Charter’s ambition – to “set new online standards for years to come…[and] agree norms and rules for the online world and put them into practice” – is bold. To date, details are scant. But in some areas, the approach is becoming clearer. Most significantly, the Government intends to look at online platforms’ legal liability for content shared on their sites. This raises questions about established Internet law that in Europe dates back to the 2000 E-Commerce Directive, which exempts online platforms from liability for illegal content they unknowingly host.

The significance of this step is clear. It’s right that we look to platforms to help address the risks of deeply unpleasant content. But treating them like publishers means shoehorning new businesses into legal frameworks from another technological age. Liability exemption allowed open platforms to flourish, giving millions of ordinary people, creative innovators and small businesses the ability to communicate, share stories and reach a global market. Asking platforms to discriminate between political speech and incitement to violence, or alternative opinion and fake news, or legitimate criticism and abuse, means giving them even greater power, and outsources complex legal and ethical judgements to private companies. Their decisions will inevitably be controversial; no technology, and no army of online moderators, will always get the balance right between free expression and hate speech. Context is everything, as reports that YouTube deleted Syrian activists’ evidence of war crimes demonstrate.

The challenge now is the same as it has been since the dawn of the Internet: to balance the benefits of openness with a legitimate, and clearly defined, role for platforms in managing its risks. We should encourage a cautious approach, not throw ever more problems at technology firms based on unproven assumptions about algorithms’ ability to solve them.

Germany’s Network Enforcement Law shows the risks of hasty legislation. The so-called ‘Facebook law’ threatens fines of up to €50m if platforms do not take down hate speech and other kinds of illegal content within 24 hours of being notified of it; it incentivises platforms to shoot first and ask questions later. Barely a month after it came into full effect, Angela Merkel has already said that changes may need to be made, after concerns emerged about over-zealous content removal, including the deletion of posts from politicians and satirical news sites.

We need a new deal with platforms, one that encourages a responsible approach, in which proportionate efforts to address online harms are balanced against the benefits of open online markets.

So where could the Digital Charter go next? First, it should spell out principles of good governance – in other words, how we expect platforms to run their services and govern their communities. This would include standards of transparency and accountability, recourse and appeal. Platforms should be incentivised to publish, and regularly update, plans regarding harmful and illegal content, including objectives, targets and ways of measuring progress. They should be encouraged to follow principles of ‘accountable design’ in developing their algorithms, and commission independent evaluation of their impacts. The forthcoming social media code of practice provides an opportunity to clarify these expectations.

Second, it should create an independent forum for systematic collaboration between Government, regulators and platforms – an Online Standards Forum – not to regulate the companies, but to work with them to set priorities across the wide range of policy areas and Government departments their services touch. The Forum would help establish appropriate governance by platforms, and publish assessments of the impact of platform policies. It could be a new body, link to an existing regulator like Ofcom, or form part of the planned Centre for Data Ethics and Innovation, for which funding was provided in the last Budget.

Third, the Charter should confirm that platforms will not be made liable for the content they host. But the Government could explore the possibility of a new quid pro quo, in which continued liability exemption is linked to major platforms having robust policies on harmful and illegal content – that is, policies that are developed, implemented and enforced consistently with principles of good governance. A voluntary approach is preferable because it is more flexible, and greater clarity about expectations might be enough to encourage platforms to develop more accountable governance. But in case a voluntary approach proves insufficient, legislative options should be examined.

Karen Bradley, Matt Hancock’s predecessor as DCMS Secretary of State, was rightly keen to avoid legislation. But even if liability exemption is retained, a new deal with platforms may require new institutions, with new capabilities and new responsibilities. This may also prompt questions about whether the extensive regulatory regimes that govern broadcast content are still necessary and proportionate. The existing legal framework for content regulation dates back to the 2003 Communications Act, which barely mentioned the Internet. If I were a DCMS lawyer, I’d be sharpening my pencils, just in case.

Mark Bunting is a member of Communications Chambers and a visiting associate of the Oxford Internet Institute.

This post represents the views of the author and not those of the Media Policy Project or the LSE.


Posted In: Algorithmic Accountability | Intermediaries | LSE Media Policy Project
