
Lorna Woods

December 1st, 2022

The Online Safety Bill – Status Report

Estimated reading time: 5 minutes

The Online Safety Bill returned to the headlines this week after the government announced some ‘improvements’ to the Bill, whose progress has been on hold since July. Professor Lorna Woods of the University of Essex explains the difference between the original Bill and the proposed changes, and what is likely to happen next.

The Online Safety Bill is returning to Parliament. While it will finish its interrupted Report Stage on 5 December based on the model introduced by the Johnson regime, the Sunak regime promises changes to the Bill to offer a “triple shield” of protection online in lieu of current protections for adults. So, what does the Bill do and how do the new proposals change things? To understand the change, a review of the original Bill is necessary. What follows is a summary of the Bill, not an assessment of whether it is good, bad or ugly.

Note also that the Bill will, it seems, finish its Report Stage in its July 2022 version. The Government will then – in a highly unusual step – send the Bill back to committee to give MPs the chance to scrutinise amendments implementing the most significant of the Sunak-era policy changes. The Bill will then go to the Lords, most likely in January. Consequently, the precise detail of what is proposed is not yet available. What we do know is found in a press release from DCMS and a Written Ministerial Statement, as well as a couple of amendments already tabled.

The Bill is complex, partly because it does three distinct but interconnected things:

  • introduces a regulatory regime in relation to social media services and search services;
  • introduces age verification requirements in relation to pornography providers; and
  • introduces new criminal offences.

While it seems the pornography provisions will not be changed, the other two aspects of the Bill are to be amended.

The Original Bill

The Regulatory Regime (Part 3)

The Bill as introduced imposed obligations on two types of services: “user-to-user” services (essentially social media) and search. Search engines would be under less onerous obligations than user-to-user services. The obligations were in essence structured around risk assessment requirements and obligations to mitigate risks, the latter termed ‘safety duties’. More detail on how to do a risk assessment and how to comply with safety duties will be found in guidance and codes produced by the regulator, Ofcom. The duties distinguished between different categories of content:

  • content the use, possession, dissemination, publication, viewing or accessing of which would constitute a criminal offence (“illegal content”);
  • fraudulent advertising;
  • content that is harmful to children (but not including anything in the illegal content category); and
  • content that is harmful to adults.

In general, there are stricter duties with regard to illegal content than content harmful to children (with rules in relation to fraudulent advertising dealt with separately and subject to slightly less stringent rules than the rest of the criminal rules), which in turn are stricter than the duties in relation to content harmful to adults. While the illegal content duty and the harmful to children duty both require the service provider to take proportionate measures to effectively mitigate and manage risks of harm, the duty in relation to content harmful to adults is to summarise in the terms of service the findings of the most recent risk assessment. Moreover, only a sub-set of user-to-user services (those deemed higher risk and placed in ‘Category 1’) will be subject to the duties in relation to content harmful to adults. These Category 1 providers also have obligations in relation to democratic content and journalistic content. All services have duties in relation to freedom of expression and privacy.

Each of these content categories has a sub-set designated as ‘priority’, in relation to which the safety duties specify more detailed rules. A Written Ministerial Statement from July identified likely categories in addition to the priority illegal categories already listed in the Bill. The distinction between the categories of content can be seen in relation to priority content too: for illegal content there is an emphasis on ensuring that such content is not promoted and is taken down swiftly when the provider is notified about it. For children, the rules focus on ensuring that children do not encounter priority content.

By contrast, the rules in relation to adults are essentially about transparency – the provider should tell users how it responds to priority harms to adults. This includes the possibility of telling users it chooses to do nothing. It means that headlines dealing with the proposed changes to the Bill, such as “Plan to make big tech remove harmful content axed”, are wrong, simply because that was not required in the first place.

The harmful but legal duties also include the requirement to provide ‘user empowerment tools’. As part of this general obligation in relation to priority content harmful to adults, the Bill specifies that users should be free not to interact with unverified accounts (and services should provide tools for users to verify themselves).

Providers will be required to engage in transparency reporting delivered to Ofcom, which will then provide its own general transparency report based on the information supplied.

The Pornography Regime (Part 5)

A separate section deals with pornography providers – this relates to porn created or commissioned by the service provider, rather than user generated content. It contains no content-based duties (eg there is no obligation to take down illegal content on notification) but requires the providers “to ensure that children are not normally able to encounter” pornographic content. (For more detail on how the current regime deals with pornography see here.)

The Criminal Offences (Part 10)

The criminal offences added by the Online Safety Bill are not strictly part of the regime, though they will add to the scope of ‘illegal content’. They aim to tidy up some issues with communications-related offences identified by the Law Commission. They also include a cyberflashing offence.

The Amendments

Much of the Bill will remain unchanged. The key areas of change involve:

  • replacing harmful but legal with the “triple shield”
  • including further criminal offences
  • increased transparency measures.

“Triple Shield”

This comprises the following principles for protection of adults:

  • Content that is illegal should be removed. At the time of writing, it is unclear how this fits with the existing proposals with regard to illegal content – whether it simply re-states what is already there or whether a set of amendments will replace those provisions. Hopefully the former is the case.
  • Terms of service should be enforced, though it is unclear how much freedom services have in setting those terms of service and whether there is a baseline of issues that must be addressed (for example, the types of content the user empowerment duties (below) relate to). Insofar as the statement says “legal content that a platform allows in its terms of service should not be removed”, presumably it does not cover the situation where a claimant has obtained a court order to have content taken down for violation of civil law rights, even though the content might not be contrary to the criminal law.
  • User empowerment means that users should be provided “with the functionality to control their exposure to unsolicited content that falls into [certain] categor[ies]”. Measures envisaged include human moderation, blocking content flagged by other users, and sensitivity and warning screens. The duty will specify legal content related to suicide, content promoting self-harm and eating disorders, and content that is abusive or incites hate on the basis of race, ethnicity, religion, disability, sex, gender reassignment, or sexual orientation. This would include misogyny. Notably, it does not seem to give adults the choice as to whether to engage with porn, violent material, and a whole range of other material that is subject to controls in the offline world. Category 1 services will still need to give users the option to verify themselves and choose not to interact with unverified users. While this latter point seems unchanged, it is not clear to what extent the user empowerment tools provisions currently in the Bill have changed.

It seems likely that these are the amendments that will be considered when the Bill goes back to Committee.

More Crimes

The Written Ministerial Statement indicates a number of offences will be introduced:

  • One has already been tabled for consideration at Report: so-called “epilepsy trolling”, an offence in relation to flashing images.
  • The Government has adopted the Law Commission’s recommendation in relation to criminalising encouraging or assisting serious self-harm. Note that although there are amendments listed already seeking to make this behaviour an offence (NC 16), this is not necessarily the same as the Government proposal which the Statement confirms will be introduced in the Lords.
  • The Government will introduce in the Lords an offence covering the sharing of people’s intimate images without their consent, based on recommendations from the Law Commission as well as the campaign by Dame Maria Miller MP. Maria Miller had, indeed, tabled an amendment to deal with this issue for discussion at Report (NC 45; see also NC 46, NC 47 and NC 48). There may be a question as to how wide-ranging the Government’s amendment will be by comparison with the range of offences already tabled.
  • The Government will also add the controlling or coercive behaviour offence to the list of priority illegal offences, once the Bill reaches the Lords.

The weekend saw news reports of the introduction of a “down-blousing” offence as well as one covering deepfake porn, but neither was listed in this statement – these seem to fall within the bailiwick of the Ministry of Justice. Rather, the statement noted that separate from the Bill there would be “additional laws to tackle a range of abusive behaviour including the installation of equipment, such as hidden cameras, to take or record images of someone without their consent.” There were no details as to timing on this.

The Bill included two new communications offences (cls 151 and 152); it seems cl 151 will be removed. The threatening communications offence (cl 153) will stay. Neither the Malicious Communications Act nor section 127 of the Communications Act 2003 will be repealed, but the overlap between them and the new false and threatening communications offences will be removed, though it is not clear how or where.

Transparency and Other Provisions

Service providers are currently obliged to report to Ofcom in their transparency reporting but not to the general public. With the exception of content harmful to adults, however, providers have not been obliged to publish the outcome of their risk assessments. Now the Government is proposing that the largest platforms should publish summaries of their risk assessments for content that is harmful to children, so as to give parents greater information about the relative safety of services their children use. Whether the obligation to publish such a summary remains in relation to content harmful to adults is unclear. It also does not make sense that summaries in relation to risk assessments for illegal content should not be required if the intention is to give users (or the parents of users) enough information to make informed choices. We will not see the drafting on this until the Bill reaches the Lords.

Ofcom will be given the power to require platforms to publish details of enforcement action it takes against them (in addition to the existing provisions giving it power to publish details of enforcement action (cl 129)). This clause will be considered at Report stage (see NC 51) and it seems to apply across all enforcement actions, although the press release talks about this in the context of protections for children.

More detail will also be required as to how platforms enforce minimum age limits, if specified (presumably a separate issue from the age verification used to show that a platform is not accessed by children). These amendments will be tabled when the Bill returns to the Commons.

The Written Ministerial Statement also adds the Children’s Commissioner, the Victims’ Commissioner and the Domestic Abuse Commissioner as statutory consultees for Ofcom when developing guidance and codes. Given that these Commissioners’ remits cover England, it might be expected that the amendments to the Bill will also include the respective commissioners in other parts of the UK.

Next Steps

The next set of amendments will start to implement these policy changes, but the most significant will be discussed when the Bill returns to Committee stage in the Commons. Given this phased roll-out of amendments, in which the Government may well seek to amend its own previous amendments, tracking progress may well prove complex. The difficulties will be compounded by the possibility of linked measures (eg deepfake porn as well as measures on hidden cameras) being introduced through different legislative vehicles.

This article represents the views of the author and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.

About the author

Lorna Woods

Professor Lorna Woods, OBE is Professor of Internet Law at the University of Essex and a member of the Human Rights Centre. Professor Woods has received an OBE for her services to internet safety policy. Her most recent project, with Carnegie UK Trust, is on the regulation of social media, introducing and arguing for a systemic approach. This work underpinned the UK government’s approach to legislation; she has been invited to give evidence to numerous Parliamentary select committees both in the UK and abroad, and regularly presents on law and tech at policy conferences. Recent publications include "Obliging Platforms to Accept a Duty of Care" in Moore and Tambini (eds) Regulating Big Tech: Policy Responses to Digital Dominance (OUP, 2021) and a co-edited collection, Perspectives on Platform Regulation Concepts and Models of Social Media Governance Across the Globe (Nomos, 2021). Professor Woods also researches digital human rights, including a chapter on freedom of expression in Peers et al (eds) The Charter of Fundamental Rights: A Commentary (2nd ed) (Hart, 2021).

Posted In: Internet Governance
