The draft Online Safety Bill (“OSB”) is a significant piece of legislation that, when enacted, will make provision for the regulation of certain internet services, with the primary aim of tackling illegal and harmful content disseminated online. The UK Parliament’s Joint Select Committee (“JSC”) published its recommendations on the draft OSB on 14 December 2021. We outline the main recommendations and key takeaways from the JSC’s report below.
Services in scope
The JSC recommends that the definition of in-scope services be amended to include all services which are likely to be accessed by children, in order to catch higher-risk websites such as pornography sites.
Content in scope
The JSC recommends that references to harmful "content" be amended to "regulated content and activity", to reflect how the design of platforms, not just the content they host, can be harmful.
The JSC supports the introduction of new criminal offences, including cyberflashing, promotion of serious self-harm, sending threatening communications and disinformation relating to election material.
The JSC recommends that news publisher content should not be “moderated, restricted or removed” unless the content is illegal. There should be protection for content of “democratic importance”.
Duties to tackle content
The report recommends that Ofcom implement a Code of Practice setting out standards for the identification and verification of identity, in the context of online anonymity.
The report recommends that service providers produce an Online Safety Policy, accessible to users, enabling them to make informed choices when deciding which platforms to use.
To assist services, the report recommends that Ofcom produce a binding Code of Practice on how to protect children from illegal and harmful content, including how to achieve proportionate deployment of age assurance methods.
The JSC takes the view that Codes of Practice (which should be binding) should be produced in many areas, including terrorism, CSEA, age assurance and freedom of speech.
The JSC recommends that each regulated service have a designated "safety controller" who could be held criminally liable if the service provider fails to comply with its obligations repeatedly, causing a significant risk of serious harm to its users.
The report provides useful insight into how the draft Bill might develop in certain fundamental respects. In particular, the following points stand out from the recommendations:
The JSC has homed in on certain aspects of the online world as warranting closer regulation – namely, services that pose a risk to children and the use of algorithms. This is no real surprise given some of the headline-grabbing allegations made about large platforms' practices last autumn. It is safe to expect that these areas will remain the central focus during the Bill's passage through Parliament.
We have gleaned some helpful indications as to how Ofcom may assess risk (which will in turn guide how services should assess risk too). This has been a key unanswered question to date and has prevented services from understanding how they might fit into the new regime, and so from taking preparatory steps.
Baked into the entire legislative proposal is the idea that any steps required of services will be proportionate to the risk they pose. Whilst we now have a better idea of how risk might be assessed, we still don't know how Ofcom will answer the question of what constitutes proportionate action in light of that risk. However, what has also emerged from the JSC's work is that the Ofcom Codes of Practice will be absolutely key to compliance. If they are made binding, we understand that the choice to "opt out" and achieve compliance in an "equivalent" way may no longer be open to services. In that case, Ofcom will need to be as prescriptive as possible in its Codes of Practice as to how services can achieve proportionality.
It appears that amendments to the Bill, if the JSC recommendations are adopted, could go some way to rationalising unfamiliar legal concepts such as "harmful content" and so reduce the burden placed upon services to make difficult judgment calls as to what is in and out of scope. We would expect the "reasonably foreseeable harm" test to remain a hotly debated topic as the Bill progresses, given the subjectivity of this test when applied to different categories of users.
A revised Bill is expected to be published shortly, following which it will be placed before Parliament for consideration and debate. The stated goal is for the Bill to be passed by the end of 2022, but it remains to be seen whether this is achievable given the number of issues identified by the JSC.