29 Jun 2023

The UK’s Online Safety Bill: notes on Committee stage and the final stretch

After six weeks of debate at Committee stage in the UK’s House of Lords, the Online Safety Bill—a landmark and controversial piece of legislation, which will impose a wide-ranging regulatory framework on online platforms operating in the UK—is nearing its final form.

During Committee stage, members of the Lords undertook a line-by-line examination of the Bill, and debated and voted on proposed amendments. While these discussions understandably focused on the UK context, the Bill, once passed, will become part of a wider platform regulation ‘canon’, alongside the EU’s Digital Services Act, Australia’s Online Safety Act, Ireland’s Online Safety and Media Regulation Act, and Singapore’s Online Safety Act, and will set a precedent for platforms operating globally. Decisions made by the House of Lords will therefore have ramifications for other ongoing regulatory proposals, such as the ‘Fake News Bill’ in Brazil, Nigeria’s Draft Code of Conduct for Internet Intermediaries, the Digital India Act, and emerging frameworks in Canada and New Zealand, and could also influence discussions in global processes such as the UNESCO guidelines.

Below, we summarise key developments from the Committee stage discussions, and look at how the Bill shapes up from a human rights perspective as it enters the final stages of its development.


Regulatory burden on businesses and tech platforms

Many members of the Lords raised concerns that the Bill’s requirements would be unduly burdensome for smaller businesses, tech platforms and start-ups, an issue we have also flagged in our earlier analysis of the Bill.

The Bill’s impact assessment estimated that around 80% of in-scope entities have fewer than ten employees, meaning that compliance with increasingly stringent regulatory requirements could significantly strain their operational capacity and risk stifling innovation. The need to avoid unnecessary and disproportionate regulatory burdens is heightened in the case of service providers working for the public benefit, which can be a vital source of information and content for users. The Wikimedia Foundation has highlighted this in its ongoing advocacy efforts, calling on the government to approve an amendment exempting public interest platforms from the scope of the Bill.

In terms of enforcement, one of the key changes agreed at Committee stage was an amendment on sanctions for individual social media managers, in response to pressure on the government to introduce prison sentences for social media employees who fail to comply with child safety duties. We previously raised concerns about the wording of some of the amendments put forward for this purpose, which would have represented a sharp departure from the approaches taken by like-minded countries and further incentivised social media companies to take a risk-averse approach to content moderation, leading to the over-removal of legal content. We remain uncomfortable with criminal sanctions on individual social media company employees for non-compliance with content moderation duties, and with the ways in which such provisions could be misused or mimicked by authoritarian governments. We are, however, pleased that the agreed text provides clearer parameters around the scope of the new criminal offence, imposing criminal liability only as a last resort: after an information notice, a provisional notice of contravention and a confirmation decision have been issued by Ofcom and ignored by the company.


Encrypted channels

An ongoing concern for GPD is the impact of the Online Safety Bill on end-to-end encryption. In recent months, human rights organisations, security experts and online platforms have called on the government to reconsider provisions which would empower Ofcom to order online platforms to use “accredited technologies” to identify child sexual exploitation material which has been shared on both public and private channels. This could force service providers to introduce proactive monitoring of all user content, jeopardising end-to-end encryption and compromising the privacy and security of online communications. At Committee stage, several Lords voiced support for end-to-end encryption, highlighting its benefits and how it protects children online. The lack of clarity on exactly how the Bill will impact encrypted channels was also raised.

Along with over 80 civil society organisations, experts and academics, we have signed on to a letter coordinated by Open Rights Group urging the government to ensure that end-to-end encrypted channels are removed from the scope of the Bill. With Apple also releasing a statement to similar effect, pressure is mounting on the government to provide clarity on encryption at Report stage.


Need to tackle disinformation

Members of the Lords also discussed how the Bill deals with misinformation and disinformation, and the role of the advisory committee which will be set up to advise Ofcom on the issue (chapter 7 of the Bill). Members seemed particularly interested in tackling health-related misinformation: proposed amendment 52 sought to require platforms to undertake risk assessments for health misinformation and disinformation and to develop targeted policies to tackle the issue. While the amendment was withdrawn, it highlights a specific concern around this category of content, as well as the need for governments to ensure the public availability of accurate information on health risks, to avoid creating an information void that can lead to misinterpretation.

Although the Bill uses the terms misinformation and disinformation only in the context of the advisory committee, this theme is also reflected in the ‘false communications’ offence added earlier this year, as part of the government’s shift away from forcing platforms to address ‘harmful’ content and towards creating new categories of ‘illegal’ content. The clause states that a person commits an offence if they send a message conveying information they know to be false with intent to cause “non-trivial psychological or physical harm to the likely audience.” While the offence is linked to intent to cause harm, it does not require actual harm. This broad framing is concerning given that it is often difficult to determine whether something is ‘true’ or ‘false’ (for example, whether a piece of content should be considered satire or disinformation). Failing to link the offence to actual harm therefore risks it being broadly interpreted and used to censor legitimate speech.


Child protection and safeguards

The importance of placing children’s rights at the centre of the Online Safety Bill was widely recognised by members of the Lords, and there was broad consensus on the need for higher standards and safeguards to protect children from online abuse and exploitation. Several proposals were made to add explicit reference to the UN Convention on the Rights of the Child (UNCRC) to the Bill, following the examples of Sweden, South Africa, Canada, Scotland and Wales, which, as Lord Russell of Liverpool highlighted, have all incorporated the UNCRC into domestic child protection legislation. While all proposals relating to the UNCRC were withdrawn at Committee stage, this dialogue represents a positive step in UK Parliamentarians’ understanding of online rights and protections, with increasing recognition of the need to ensure platform regulation is aligned with broader human rights frameworks and commitments.

Furthermore, as noted by Baroness Kidron, in previous debates on online safety “child safety was incorrectly cast as the enemy of adult freedom”. Discussions at Committee stage were a welcome departure from this narrative and demonstrated more nuanced thinking around child online safety than we have previously seen. However, while acknowledging the specific onus on the Bill to protect children, several members also cautioned that the Bill should not be oversold as a “silver bullet” that will make the online environment completely risk-free.


Looking ahead to Report stage

Now that Committee stage is over, the Bill will move to Report stage, taking place over three sessions scheduled for the 6th, 10th and 12th of July. During these sessions, members of the Lords will further examine the Bill, discuss amendments, and decide on any outstanding issues. The lists of amendments to be discussed and voted upon will be published on the House of Lords website.

We’ll continue to monitor discussions, advocating for the Lords to:

  • support a nuanced and balanced Bill that is rights-respecting, includes proportionate protections for freedom of expression, access to information and freedom of association on digital platforms, and aligns with international human rights standards and obligations, including the UNCRC;
  • be mindful of the need to protect the privacy and security of online communications, and oppose measures that would undermine end-to-end encryption;
  • ensure there are adequate protections for public interest platforms, as set out by the Wikimedia Foundation; and finally,
  • consider how decisions in these final stages of deliberation could impact content moderation in other jurisdictions, particularly in contexts where there may be a lack of independent regulatory authorities or judicial oversight.