
Garfield Benjamin

September 23rd, 2020

Gaps in UK regulation of online platforms make it difficult to tackle systemic issues – here are some ways we can fix this


How best to regulate online search and social media platforms is under deliberation in the UK, as well as across Europe and beyond. Garfield Benjamin, a postdoctoral researcher at Solent University, explains some of the key ideas he puts forward in his new policy report on regulating privacy and content online, aiming to help both regulators and citizens think bigger.

Regulating online platforms has proven difficult, and not only because of the overused idea that technology moves faster than law. It often boils down to policy-makers’ unwillingness to tackle the underlying systemic issues of big tech, exacerbated by active lobbying from the tech industry and the persistent myth that ‘innovation first, regulation later’ will ever work.

Furthermore, a deeper issue stems from the differing natures of regulation and online platforms. Even as incremental progress has been made in regulating very specific issues or contexts – such as data protection in the GDPR, or certain types of content in the proposed online harms regulation – the boundaries between privacy and online content have broken down in the way platforms operate.

Our personal data is now tightly interwoven with the way we view content online. From targeted ads to search patterns to social media to news to political content, what tech companies know about us defines what they show us. In order for regulation to properly tackle these issues separately, and the massive inequalities and harms they cause when combined, regulators need to be able to cut across issues and make real changes in the way online platforms work. We need to stop seeing privacy and online content as two separate issues.

I recently released a policy report – Digital Society: Regulating privacy and content online – that mapped out some of the gaps in UK regulation, gathered public perspectives on support for increased regulation, examined the key underlying issues, and proposed seven recommendations for broader and more effective regulation. The current regulatory environment is a messy patchwork of overlaps, contradictions and holes, spanning a number of regulators and government departments – ICO, Ofcom, ASA, CMA, CDEI, DCMS, and many other acronyms. While some mechanisms exist for inter-regulator cooperation – like the cross-Whitehall group assembled to look specifically at artificial intelligence – they haven’t empowered action at the scale needed to combat the real and potential harms of Facebook, Google, Twitter and others.

There is broad public support for this kind of action. I commissioned surveys of a representative sample of the UK public*, which found a strong case for more integrated regulation. There is widespread support for greater regulation of the use of personal data online (73%), fake news online (75%) and hate speech online (71%). Trust in platforms is low, and 67% of people supported regulating online privacy and content with the same set of laws and oversight bodies. People are concerned about these issues, don’t feel enough is being done by platforms or governments, and feel that users should have more influence.

There is optimism as well, however. While only a quarter of people think the Internet currently reduces inequality, and there is widespread awareness of the potential and real harms of online platforms, half of respondents still think the Internet could be (re)designed in ways that reduce inequality. The public want action, but regulators need the resources and political support to make these changes.

A few key issues underpin these problems, and the current regulatory framework is not equipped to address them. I have already mentioned the need to regulate more widely, but there is often resistance to creating yet another new regulator; we need to find other ways of bringing together the laws and regulators we already have. This includes greater regulation of how these technologies are designed, to prevent harms and promote justice rather than just issuing fines after the fact.

Exclusion and lack of trust are key components. Few people feel represented in the tech industry (or in politics and the media, for that matter), and this has knock-on effects in the unequal ways technology and policy are designed and deployed. There is a need for greater digital literacy, not just in practical terms but in building awareness, critical thinking and community support around how platforms operate. We need to move on from thinking about privacy and technology in terms of individual rights and property, and instead think about identity, power, collective action, and changing the norms and expectations of information online.

To create the regulatory frameworks for this to happen, my new report proposes seven recommendations to empower regulators and citizens to take bigger action:

  1. Regulate privacy, data and content online together by establishing an Office for Digital Society as a formal mouthpiece to bring relevant existing regulators together;
  2. Build regulation on principles linked to rights by placing equity, diversity, dignity and justice at the centre of policy;
  3. Provide a platform for representation by involving affected communities in policy and regulation;
  4. Give regulators meaningful powers and the resources to exercise them by ensuring the necessary funding, expertise and ability to effect change;
  5. Strengthen design-side regulation by taking a more proactive approach in recommendations, regulation and requirements;
  6. Promote public understanding by expanding practical, critical and participatory skills across education, industry and government;
  7. Plan for future development by building a clear path for adapting and expanding the remit of the Office for Digital Society.

A recurring theme in this new study – and broader research into online platforms – is that the way privacy and online content currently work amplifies existing inequalities and creates new ones. Fining companies for specific breaches or writing self-regulated codes of good practice is not enough. Systemic change is needed. Particularly in times of crisis and isolation, online platforms take the role of (often defunded) public services, a conscious move in their desire to dominate access to information. It’s time they took this responsibility seriously and it’s time for regulators to take serious action on the systemic issues with online platforms.

* All figures, unless otherwise stated, are from YouGov Plc. For the first survey, total sample size was 2,014 adults; fieldwork was undertaken between 22nd – 23rd January 2020. For the second survey, total sample size was 2,026 adults; fieldwork was undertaken between 17th – 18th February 2020. The surveys were carried out online. The figures have been weighted and are representative of all GB adults (aged 18+).

This article represents the views of the author, and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.

Featured image by Deniz Fuchidzhiev from Unsplash

About the author

Garfield Benjamin

Garfield Benjamin is a postdoctoral researcher at Solent University. His research spans cultural theory and creative media practice, focusing on the relation between humans and (digital) technology. His work emphasises the future of society, whether that be distant or imminent, in order to inform our understanding of the present.

Posted In: Internet Governance
