As it celebrates its 30th anniversary, the internet revolution has lost its way. From democratising and emancipatory beginnings, the information superhighway has become a hotbed of crime, harm and offence, powered by designed addiction and fake news. Unparalleled connectivity, automation and efficiency have left people around the world scrambling to adapt to technology’s disruptive impact on everyday life and on the institutions of business and government. Five giant evils of confusion, cynicism, fragmentation, irresponsibility and apathy have come to characterise today’s information crisis.

Voices from all walks of public life have started to insist on more effective governance of online affairs, to counter the worst excesses of the commercial forces that shape this new world. The internet giants are now themselves calling for regulation and have become interested in a sustainable and more human-centric internet. Citing a failure of self-regulation, UK policymakers now propose a statutory duty of care for social media platforms as part of “ambitious plans for a new system of accountability and oversight for tech companies”.

Transparency and accountability

The Internet Commission’s expert team is piloting independent evaluation of how companies are tackling illegal content, hate speech, cyberbullying, self-harm and fake news on their platforms. A year ago we set out a mission to increase transparency about how online content is managed, in order to help firms improve their practice and legislators sharpen their focus. This is a first step in advancing digital responsibility, our broad definition of which largely aligns with the wide scope of the UK government’s Online Harms White Paper.

Over the last year we have shared some of our insights with the UK government, and we very much welcome its attention to procedural accountability, including the right of appeal, and its recognition of the diversity in size and type of the digital organisations involved. In our discussions we highlighted the growing importance of artificial intelligence among the resources applied to content moderation at scale. Because of this, we see a need for a renewed emphasis on the balance between safety and freedom of expression.

Our main conclusion so far is that it will be more helpful to focus on accountability than on transparency. This acknowledges a long-argued position that transparency, and the destruction of secrecy, is no magic formula for rebuilding trust. We believe it must be possible to enable more widespread understanding of the progress firms are making in tackling online harms, while navigating the fact that the internet is mostly privately owned. Benchmarking operational information can facilitate learning and improvement. While respecting confidentiality, expert intermediaries can facilitate this benchmarking and also work with firms to provide stakeholders with a fair and independent evaluation of progress on digital responsibility issues.

Calling pioneers and first movers

The Internet Commission is working confidentially with internet and digital businesses to build insight and enable balanced and independent reporting of progress and key challenges. In the context of emerging regulation in Europe and beyond, this is a big opportunity for pioneering firms to get ahead and help to shape the new regulatory environment.

New internet regulation is certainly coming, but it will be some years in the making. The white paper is one of a series of steps from national governments and international institutions that will establish new laws to create a better internet. This process will likely reshape expectations of corporate accountability and digital responsibility: we believe that the winning internet firms will be those that seek to collaborate, entering a process that helps to shape the future.

Our first independent evaluation framework focuses on content moderation at scale. In its development, we considered and incorporated key aspects of the Santa Clara principles, and have consulted with industry, civil society, and political institutions such as the European Commission, the United Nations, and government departments and agencies in Australia and the UK. We aim to publish this framework (our questions) by the summer and to follow with a first report (evaluation of practices) by early 2020. Please contact me if you would like to explore taking part.



  • This blog post appeared first on LSE Media Policy Project.
  • The post gives the views of its author, not the position of LSE Business Review or the London School of Economics.

Jonny Shipp is the founder and project lead at the Internet Commission, an independent initiative for a more transparent and accountable internet. He is also a visiting fellow at LSE’s department of media and communications. He is particularly interested in the questions: What drives digitalisation? How are connectivity, openness, security, trust and entrepreneurship interacting to reshape everyday life? What is a better digital life? Which public policies will deliver this, in particular in Europe and Latin America?

Thorsten Brønholt is a project manager turned academic. He is a doctoral candidate at the University of the West of Scotland, researching how digital platforms shape and reshape us as individuals and societies. Thorsten holds an M.Sc. and a B.Sc. in political science from the University of Copenhagen. He has worked in project management (digitalisation) in the municipality of Copenhagen, in telecommunications management, and in IT, and once upon a time he co-founded a short-lived Berlin-based IT start-up. As an academic, Thorsten is particularly interested in the ethics and philosophy of the digital age.