The UK Government’s Online Harms White Paper promises a new culture of transparency, trust and accountability. Writing with Thorsten Brønholt, LSE Visiting Fellow Jonny Shipp highlights developments in UK Government thinking and invites Internet firms to collaborate on the Internet Commission’s independent evaluation framework, which looks at how they are tackling illegal content, hate speech, cyberbullying, self-harm and fake news on their platforms.
Digital revolution: time for a re-think
As it celebrates its 30th anniversary, the Internet revolution has lost its way. From democratising and emancipatory beginnings, the information superhighway has become a hotbed of crime, harm and offence, powered by designed addiction and fake news. Unparalleled connectivity, automation and efficiency have left people around the world scrambling to adapt to technology’s disruptive impact on everyday life and on the institutions of business and government. Five giant evils of confusion, cynicism, fragmentation, irresponsibility and apathy have come to characterise today’s information crisis.
Voices from all walks of public life have started to insist on more effective governance of online affairs, to counter the worst excesses of the commercial forces that shape this new world. The Internet giants are now themselves calling for regulation and have become interested in a sustainable and more human-centric Internet. Claiming a failure of self-regulation, UK policymakers now propose a statutory duty of care for social media platforms as part of “ambitious plans for a new system of accountability and oversight for tech companies”.
Transparency and accountability
The Internet Commission’s expert team is piloting independent evaluation of how companies are tackling illegal content, hate speech, cyberbullying, self-harm and fake news on their platforms. A year ago we set out a mission to increase transparency about how online content is managed in order to help firms to improve their practice and legislators to sharpen their focus. This is a first step in advancing digital responsibility, our broad definition of which is largely aligned with the very wide scope of the UK Government’s White Paper.
Over the last year we have shared some of our insights with the UK Government, and we very much welcome its attention to procedural accountability, including the right of appeal, and its recognition of the diversity in size and type of the digital organisations involved. In our discussions we highlighted the growing importance of artificial intelligence among the resources applied to content moderation at scale. Because of this, we see a need for a renewed emphasis on the balance between safety and freedom of expression.
Our main conclusion so far: it will be more helpful to focus on accountability than on transparency. This acknowledges the long-argued position that transparency, and the destruction of secrecy, is no magic formula for rebuilding trust. We believe it must be possible to build a more widespread understanding of the progress firms are making to tackle online harms, whilst navigating the fact that the Internet is mostly privately owned. Benchmarking operational information can facilitate learning and improvement. Whilst respecting confidentiality, expert intermediaries can facilitate this benchmarking and also work with firms to provide stakeholders with a fair and independent evaluation of progress on digital responsibility issues.
Calling pioneers and first movers
The Internet Commission is working confidentially with Internet and digital businesses to build insight and enable balanced and independent reporting of progress and key challenges. In the context of emerging regulation in Europe and beyond, this is a big opportunity for pioneering firms to get ahead and help to shape the new regulatory environment.
New Internet regulation is certainly coming, but it will be some years in the making. This White Paper is one of a series of steps by national governments and international institutions that will establish new laws to create a better Internet. This process will likely reshape expectations of corporate accountability and digital responsibility: I believe that the winning Internet firms will be those that choose to collaborate, entering a process that helps to shape the future.
Our first independent evaluation framework focuses on content moderation at scale. In developing it we considered and incorporated key aspects of the Santa Clara Principles, and we have consulted industry, civil society, and political institutions including the European Commission, the United Nations, and government departments and agencies in Australia and the UK. We aim to publish this framework (our questions) by the summer and to follow with a first report (an evaluation of practices) by early 2020. Please contact me if you would like to explore taking part.
This article represents the views of the author, and not the position of the LSE Media Policy Project, nor of the London School of Economics and Political Science.