
Charlie Beckett

March 16th, 2018

Will veiled threats from the EU tackle the disinformation problem or can the UK get tough alone?

Estimated reading time: 5 minutes

The European Commission’s high level group on ‘fake news’ has done what it was asked to do: it has set out a problem definition and mapped out some policy principles and broad recommendations.

This article is by LSE Truth, Trust and Technology commissioner Dr Damian Tambini (LSE).

The Report

There are no particular surprises. The key points of the report are:

  • A narrowing of the definition of the problem. The term ‘fake news’ is jettisoned and the problem to be addressed is deliberate disinformation.
  • A brief overview of existing industry attempts to deal with deliberate disinformation online, such as fact checking and flagging.
  • A call for a multi-stakeholder code of practices. Details are sketchy, but this looks like a familiar process whereby public authorities use moral persuasion to bring together multiple industry players in order to solve societal problems.
  • A call for more support for media literacy.
  • Support for more research, coordination of research, and more transparency.

The report endorses the UN Special Rapporteur’s warning about the dangers of policy-making in this area. Both state and private censorship are identified as potential problems, but the thrust of the report is to avoid policy solutions that could constitute or exacerbate state censorship.

Veiled Threats?

Like many attempts to encourage self-regulation, all of this is wrapped in a veiled, and ultimately not particularly credible, threat:

“as a second step, the commission is invited to re-examine the matter in spring 2019, and decide, on the basis of an intermediate and independent evaluation of the effectiveness and efficiency of these matters, whether further actions should be considered for the next European Commission term. This may cover options for additional fact-finding and/or policy initiatives, using any relevant instrument, including competition instruments or other mechanisms to ensure continuous monitoring and evaluation of the implementation of the code.” (p6)

By including this in the report, the high level group has, in effect, admitted that there is an element of doubt about whether moral persuasion and self-regulation alone can deal with this issue. Given the complexity of the task, the high level group is right to point this out. It is proposing a multi-stakeholder code across multiple industry sectors, involving fierce competitors in costly processes of self-restraint, across multiple European countries. This is a big ask, so industry might understandably be wondering: what if we do nothing? How credible is this threat of review and new measures after a year?

The high level group, as an independent body, cannot issue such a threat with any credibility. The real issue is whether this will now be endorsed in full by the Commission, and whether the key recommendations and the proposed process of review will be put into practice. The report refers to an expected EC Communication. This would be a non-binding piece of EU policy: the ‘softest’ instrument available, and one that cannot prescribe legislative change. Clearly the approach will be to set out some general principles and objectives for multi-stakeholder co-operation during this Commission, with the option of delivering on the vague veiled threats with tougher measures during the next.

Brexit

All of this makes for some interesting reflection in the context of Brexit. Just hours after the publication of the high level group report, Matthew Hancock, Secretary of State for Digital, Culture, Media and Sport, addressed the UK Parliament Inquiry on Fake News. He was quite clear that he wished to reserve the right of the UK Parliament to reverse the e-commerce directive liability exemptions for social media platforms, in the context of the debate about fake news and disinformation. So perhaps we can see the prospect of a fork in the road: the UK could, in theory, choose to take a stricter approach to online intermediaries than would be permitted under European law.

Many of the recommendations of the high level group would require significant new funding: for example, the establishment of state-level monitoring and of a centre of excellence coordinating research across the European Union. It is doubtful whether others, such as real transparency, could be delivered without legislation. The report (p6) specifically calls on EU and national authorities to step up their actions, and there are multiple calls for public authorities, media organisations and academic institutes to provide support. In a situation of severe constraints on public spending, with news organisations failing daily, this will depend not only on political will in member states but on hard cash.

To sum up: the HLG sets out good principles and fires the starting gun on the debate, but the real detail remains to be worked out, and that will happen at the member-state level. Member state governments will wait for an EC recommendation on the topic, which could signal a new epoch of a deeper, more European media policy led by Brussels. Britain will go it alone, and there might be a debate here about more, rather than less, internet censorship. The ‘fake news’, or rather the disinformation, debate is not going away.

This article is by LSE Truth, Trust and Technology commissioner Dr Damian Tambini (LSE), who is on Twitter as @DamianTambini.

About the author

Charlie Beckett

Posted In: Featured | T3 | Tech