
Blog Administrator

August 23rd, 2018

The essential elements of the new Internet governance: diversity, optimism and independence


In May 2018, the UK Government set out its plan to publish a Code of Conduct and develop proposals to legislate on Internet safety. In July 2018, the Internet Commission's Dialogue on Digital Responsibility convened leading thinkers and civil society advocates to debate related proposals from others about transparency reporting, an accountability framework and the idea of a duty of care owed by social media platforms to their users. LSE Visiting Fellow Jonny Shipp discusses the takeaways from the workshop and elaborates on the long-term challenges that policymakers now face.

Policymakers and the Silicon Valley giants are engaged in a battle over freedom and regulation. The Internet Commission is catalysing a new, positive cycle of organisational accountability, transparency and multi-stakeholder dialogue. The Dialogue on Digital Responsibility brings policymakers, academics and activists together with online firms to make progress on digital responsibility and accountability.

One of the first meetings in this dialogue was convened in July 2018, when leading UK and international NGOs were invited to discuss emerging approaches to Internet regulation. Participants discussed proposals on transparency reporting, on legislating for platform accountability, and on the idea of imposing a duty of care on Internet platforms towards their end users. By posing specific questions, we identified key areas of agreement, challenge and disagreement among civil society stakeholders, and on that basis we prepared a briefing note setting out our conclusions, including an overview table.

Platform accountability

In his recent paper, Keeping Consumers Safe Online: Legislating for platform accountability for online content, Mark Bunting, a member of Communications Chambers and a visiting associate of the Oxford Internet Institute, sets out his thinking on how to legislate for platform accountability. He argues that lawmakers could establish statutory requirements for Internet platforms to clarify how they will handle harmful and illegal content, and to demonstrate that they take a fair and reasonable approach to balancing the fundamental rights of freedom of expression, respect for privacy, dignity and non-discrimination, protection of intellectual property, and the right to conduct business.

While Internet platforms may have increasingly sophisticated processes to manage or regulate content online, there are so far no systematic means for governments, civil society or the platforms themselves to know the impact of these processes and account for them. The Internet Commission’s work on transparency reporting is relevant here. Its accountability model maps the key areas of social impact and provides a framework for the development of key metrics or performance indicators, with a focus on content management processes. Based on these indicators, transparency reporting will enable digital organisations to demonstrate accountability and best practice whilst also supporting evidence-based public policy development.

The argument for a duty of care

In their project for the Carnegie UK Trust, Professor Lorna Woods and Will Perrin argue that lawmakers must break away from any approach to content regulation that frames Internet platforms as publishers. The online spaces of social media should instead be understood as quasi-public spaces analogous to workplaces or buildings. In the workplace, it is the duty of employers, “so far as is reasonably practicable,” to ensure the health, safety and welfare of workers, ensuring that the materials and tools they must use are safe (Health and Safety at Work Act 1974). Visitors to buildings are similarly protected by a duty of care to ensure that they are “reasonably safe in using the premises for the purposes for which he is invited or permitted by the occupier to be there” (Occupiers’ Liability Act 1957).

Woods and Perrin describe a co-regulatory system in which a regulator would collaborate with civil society and industry to establish a virtuous circle of harm reduction in social media: harms would be measured and surveyed, plans agreed and implemented to prevent the most significant harms, followed by further measurement and improvements. A key challenge for legislators is that action probably needs to be based on risk assessments rather than conclusive evidence of harm. Some at the Internet Commission’s July workshop argued that this approach is not compatible with a human rights law approach to protecting freedom of expression. The implications for intermediary liability also need to be better understood.

A new culture of Internet governance

Beyond the discussion about possible legislative approaches, three important points were made about the culture of today’s Internet governance institutions. First, the organisational culture of existing regulators in the UK is not up to the task of dealing with strong, successful, rich, global firms on the one hand, and a confused public and a collection of single-issue civil society groups on the other. Second, the current multi-stakeholder model for Internet governance was criticised because it does not provide adequate and sustained funding for proper, considered and expert representation of civil society interests; some also felt that policy debates lack sufficient technical expertise. Third, there was consensus on the need for oversight that is independent of both government and industry.

Diversity, optimism and independence

The Internet Commission believes that reversing today’s negative spiral of unintended consequences, ad hoc regulation and loss of public trust in digital services will require Internet governance to be renewed and strengthened, and that this renewal must be characterised by diversity, optimism and independence.

Despite appearances, the opportunity of the Internet to support an open and cosmopolitan way of life is frustrated by the fact that our media surrounds us with the perspectives of people like us. The new Internet governance must help to reshape our news, social media and cultural media to promote diversity and intercultural understanding by better enabling chance encounters with the unfamiliar (Zuckerman, 2013).

Lawmakers and the institutions they shape must balance risks and opportunities. The opportunities which have until recently been quite uncritically received from Silicon Valley must be unpacked, debated, understood and more proactively shaped by policymakers. It is at least as important for citizens and society to successfully direct the potential of new technologies as it is for them to tackle the unintended negative consequences of digitalisation. A shared optimism about the positive potential of digitalisation should legitimise the explicit shaping of digital development by capital markets and public administrations (Mazzucato, 2017).

With governments inclined towards “regulation by outrage” and the tech sector on the run, it is for capital, intergovernmental, academic and civil society institutions to collaborate to turn today’s battle between policymakers and the Silicon Valley giants into a new settlement for the future. Their independence from the short-term perspectives of both governments and industry opens the possibility of them shaping a new model of Internet governance.

About the Internet Commission

Backed by an advisory board from industry, academia and civil society, I am leading the development of the Internet Commission together with Julian Coles, Jessica Sandin and Dr Ioanna Noula. We are working to turn our accountability model for digital organisations into action by leading a dialogue on digital responsibility and developing a transparency reporting framework.

This post gives the views of the author and does not represent the position of the Media Policy Project nor of the London School of Economics.
