
Lee Edwards

October 11th, 2021

Can the Online Safety Bill be more than a toothless tiger (or a Facebook Flop)?

Estimated reading time: 5 minutes

In light of recent revelations about Facebook’s conduct, which have led a whistleblower to testify before the US Congress, LSE’s Lee Edwards considers the implications for the UK government’s draft Online Safety Bill, currently undergoing pre-legislative scrutiny.

The past fortnight has seen the release of the Facebook Files – the result of a Wall Street Journal investigation into claims by Facebook employees about company practices that prioritise profit over societal good, the US over other societies, and subscriptions over preventing harm. Questions about the unregulated power of online behemoths like Facebook have been on the political, social and cultural agenda in many countries for several years. The Files illustrate why such questions are critically important when thinking about our futures in increasingly digitised societies.

In the UK, the government is addressing the power of online service providers through the Online Safety Bill. It is designed to monitor the fulfilment of corporate promises and ensure providers are accountable for their actions. The Bill could act as an important brake on corporate power. But the Facebook Files illustrate how its current flaws could equally result in it being little more than a toothless tiger, missing its regulatory target.

The draft Bill is currently being scrutinised by a Joint Pre-legislative Scrutiny Committee (see Committee Chair Damian Collins MP’s Infotagion Podcast for concise and informative summaries of their deliberations so far). In September, LSE made a submission to the Committee based on the views of academics, legal experts and practitioners, gathered during a briefing held in July. The submission raises a number of points, but in the context of the Facebook Files, three are essential for the Committee to address.

  1. Addressing collective harm. The draft Bill defines harm as individual harm. Content that directly results in harm to an individual must be removed from online providers’ services, and the providers must have in place strategies to protect users from such harms. Yet, many of the harms that online providers cause are exercised at a societal level. Individual and societal harms are connected, but the latter are harder to trace to a specific type of content. Moreover, the effect of content such as hate speech, anti-vax content, and climate misinformation is difficult to measure, but as whistleblower Frances Haugen has argued in the case of Facebook, there is no denying that the mechanics of algorithmic technology and the structures of online platforms facilitate increased reach and impact of this kind of content, potentially harming our social fabric and the quality of democratic engagement. An Online Safety Bill that does not try to address these effects will only ever have a limited capacity to counter platform power.
  2. Clarifying responsibility for regulatory practice. The draft Bill shifts the power to take regulatory action to the private sector rather than government. Online providers are required to act against infringements of freedom of speech and against content that results in online harm, and they must protect journalistic and ‘democratic’ content. Quite apart from the difficulty of defining these types of content, the wisdom of enabling the private sector to act as a regulator is questionable. Decisions to remove content can be fraught, and there is no clear line between content that is illegal and content that constitutes free speech. The responsibility for removal cannot be taken lightly, and Mark Zuckerberg himself has already suggested that it is not always welcome. The same challenges would face independent regulators, but they are formally accountable to government and ultimately to the public, so their actions are underpinned by a degree of societal legitimacy. Private companies are not subject to the same public accountability. Moreover, as the Facebook Files demonstrate, simply asking a company to act in the public interest does not remove its need to pursue private sector goals. The Bill’s current form only exacerbates this tension.
  3. Citizenship and digital rights. Perhaps the most fundamental issue with the Bill is its lack of attention to the contextual issues of citizenship and digital rights. As participants at our briefing argued, a rights-based approach to the Bill would set out the kinds of expectations that the public should have of online providers, against which the Bill’s provisions could be both framed and evaluated. Equally, including the public as citizens rather than users in the Bill would emphasise not only their rights as members of a democracy, but also their membership of a wider collective and their relationship to each other. Yet, the Bill effectively atomises the public by presenting them as individual users of services, or – in the case of the Bill’s media literacy provisions – as consumers who need to be educated into greater awareness and prevention of online harms. Rights are nowhere to be seen. The result is that online providers need only worry about their relationship with ‘users’, and not with society as a whole. Their licence to operate becomes based on a contractual exchange, with conditions set out in terms that they have drafted, and into which the public has had very little input.

The Bill does make provision for a regulator (Ofcom) that will scrutinise providers’ performance and reporting. In this, it lays the groundwork for revelations like the Facebook Files to be pre-empted through regular monitoring and reporting. The government also retains significant influence over regulation through the office of the Secretary of State. Both actors represent the public interest to some extent. But in the end, what matters for legislators, for online providers and for the public is what ends up written into the final Bill. It is an ambitious piece of legislation and the problems with it are hardly surprising. But without change in the areas noted above, it risks becoming a moderating influence at best and, at worst, a doorway to even greater platform power.

This article represents the views of the author and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.

About the author

Lee Edwards

Lee Edwards is Professor of Strategic Communications and Public Engagement in the Department of Media and Communications at LSE, where she also serves as Director of Graduate Studies and Programme Director for the MSc Strategic Communications. She teaches and researches public relations from a socio-cultural perspective, focusing in particular on how power operates in and through public relations work.

Posted In: Internet Governance
