UK Online Safety Bill under the microscope

The Online Safety Bill is expected to be adopted in the UK around November 2023. It seeks to regulate online providers of user-to-user content and search services, as well as providers of pornographic content. It will impose a duty of care on them to conduct risk assessments and to take proportionate measures to address the risks identified.

The duty of care will require online providers to address various types of risks, including harmful content, cyberbullying, and online grooming. Providers will need to take steps to remove harmful content, prevent its re-upload, and ensure that users are protected from it. They will also need to ensure that age verification and age assurance measures are in place to prevent children from accessing inappropriate content.

Are you in scope?

The bill will have different tiers of regulation for user-to-user platforms, with only one tier for search services. It will impose additional duties on very large user-to-user services, known as Category 1 services. Ofcom will work alongside the Government to determine the threshold for Category 1. Research will be required to determine where that threshold should lie, which will probably be based on the scale of the business, its functionality, and a third criterion that is not yet known. Further, to address the “tipping risk” of excluding companies close to becoming a Category 1 provider, and given the importance of managing online safety risks, a new subcategory will be created for services on the brink of becoming Category 1.

Is the bill likely to change?

While the basic structure of the Online Safety Bill is unlikely to change as it passes through the final stages of the UK parliamentary process, there will be amendments and clarifications to ensure its effectiveness. Definitions of “primary priority content that is harmful to children”, “priority content that is harmful to children”, “age assurance” and “age verification” have already been added at the report stage in the House of Lords, with Ofcom tasked with determining what constitutes adequate measures.

The House of Lords also introduced a chapter dedicated to deceased children whose death may have a connection to social media, clarifying the position for the families of such children, with Ofcom tasked with producing guidance to help providers comply with their duties in this area.

How should you approach risk assessments?

The Online Safety Bill will impact a vast number of entities, from big tech companies to small indie platforms, with an estimated 25,000 entities captured in the UK alone. However, Ofcom will likely prioritise the big tech platforms in the first instance. Companies within the bill's scope are therefore advised to proactively conduct risk assessments and to be able to demonstrate that they have considered potential risks. Ofcom will focus on companies that it believes are not attempting to comply, but it is important to note that the duties of care will not apply until the codes of practice are published (during the course of 2024).

Regarding the practice of risk assessments, Ofcom will seek assurances that the process is being thought about at senior levels of an organisation. The more the organisation communicates to Ofcom about governance, the more likely Ofcom will be satisfied with the organisation's response and approach. Organisations may wish to plan ahead accordingly.

Public consultations will take place in the coming months, and the direction Ofcom will take remains uncertain. As such, it is crucial for companies to stay informed and prepared to adapt to any changes.

Ofcom recently announced that it plans to release the initial draft codes of practice soon after it acquires its powers. Additionally, it anticipates publishing draft guidance on age verification in autumn 2023, followed by further codes in the coming months.

Policing content on platforms: legal but harmful?

The concept of ‘legal but harmful’ content for adults has been somewhat watered down from previous proposals. The bill now gives users the ability to choose whether or not they want to see harmful content (e.g., via a tick box), which in practice is not far removed from the previous proposals: platforms will still have to find a way to identify and deal with this type of content.

Policing content on platforms: illegal content

It will be difficult in many cases for providers to identify content as illegal where illegality requires a mental element or the absence of a valid defence. Ofcom will have to do some work interpreting the relevant clause of the bill, which requires that the provider make that judgment on the basis of ‘reasonably available’ information, and there is a good opportunity for businesses to influence the direction that Ofcom takes here.

Criminal liability for directors

The UK Government has stressed that the bill is intended to have extraterritorial effect, for both penalty notices and individual criminal liability, which could be imposed on companies that fail to comply with the bill's requirements. Criminal liability will only come into play if Ofcom asks a director or company to take action and they ignore the request or fail to respond positively. If this happens, Ofcom could issue a penalty notice, and the director could be held individually responsible for the failure to comply.

What about end-to-end encryption?

A particular issue to note is the power Ofcom will have to issue notices requiring ‘accredited technology’ to identify, take down and prevent users from encountering CSEA (‘child sexual exploitation and abuse’) content, whether it is communicated publicly or privately by means of the relevant service. However, a new requirement has been added for a skilled person's report to be obtained before Ofcom can issue such a notice. This power would seem to encompass the use of technology that would be incompatible with end-to-end encryption, though Ofcom may only issue such notices if it considers it ‘necessary and proportionate’ to do so. As will be the case elsewhere in the legislation, the meaning of ‘proportionate’ will therefore be important in this context.

What about internal user-to-user functionalities?

If an organisation makes use of a wholly internal user-to-user functionality, then it will not be in the bill's scope. Organisations with user-to-user platforms that are mostly internal but include some external functionality may want to switch off the external parts of the platform to remain out of scope.

The Online Safety Bill is a significant piece of legislation that will impact a wide range of online providers. Companies should begin to take proactive steps to comply with the bill's regulations, including conducting risk assessments and staying informed about potential changes. There is still time for companies to influence the direction of the codes of practice that will need to be published by Ofcom.

For more information, contact James Moss and Anthony Rosen.