The Online Safety Bill: What to expect

Tuesday 31 January 2023

The government’s Online Safety Bill is progressing through Parliament and is due to be debated in the House of Lords on 1 February 2023.

The Bill will, amongst other things, introduce a new set of laws to protect children and adults online. It is intended to affect all social media and technology companies by making organisations more responsible for their users’ safety on their platforms.

The much-debated Bill has caused controversy for the stringent nature of its new regulatory regime, which could place a substantial burden on tech companies. It has also undergone a name change along the way.

The Bill has attracted attention over whether it would ban online speech which is legal offline. Within the Tory party itself, concerns were raised that the Bill amounted to online censorship; this recently resulted in the abandonment of the proposed prohibition on “legal but harmful” content.

Where is the Online Safety Bill up to?

The Bill is currently at the second reading stage in the House of Lords. A phased approach is expected to be taken to implementing the new law if and when the Bill is passed.

As such, it would be prudent for organisations affected by the Bill to consider at this stage what new measures (if any) they will need to implement to review and regulate the content they publish online.

What is the Online Safety Bill intended to do and how will it work in practice?

The Bill appears to target social media companies, but it will in fact apply to any organisation that uses social media and other online platforms to host user-generated content, as well as to search engines, where the focus is on minimising harmful search results.

According to the government, the Bill has five policy objectives:

  1. to increase user safety online;
  2. to preserve and enhance freedom of speech online;
  3. to improve law enforcement’s ability to tackle illegal content online;
  4. to improve users’ ability to keep themselves safe online; and
  5. to improve society’s understanding of the harm landscape.

The Bill is intended to deal with illegal content and activity, and is specifically aimed at protecting two groups:

  1. Children

Companies will be required to:

  • remove illegal content quickly or prevent it from appearing in the first place. This includes removing content promoting self-harm;
  • prevent children from accessing harmful and age-inappropriate content;
  • enforce age limits and age-checking measures;
  • ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments; and
  • provide parents and children with clear and accessible ways to report problems online when they do arise.
  2. Adults

Companies will be required to:

  • remove all illegal content;
  • remove content that is banned by their own terms and conditions; and
  • empower adult internet users with tools so that they can tailor the type of content they see and avoid potentially harmful content if they do not want to see it on their feeds.

The Bill is set to impact international companies if the offending content is accessible to UK users.

“Illegal” content 

The aim of the Bill is to prevent new illegal content appearing online, whilst ensuring that existing illegal content is filtered or removed.

Examples of illegal content that the Bill specifically targets include:

  • Controlling or coercive behaviour
  • Fraud
  • Hate crime
  • Inciting violence
  • Promoting or facilitating suicide
  • Promoting self-harm
  • Revenge porn
  • Selling illegal drugs or weapons
  • Sexual exploitation
  • Terrorism.

“Harmful” content

‘Harm’ is vaguely defined in the Bill as ‘physical or psychological harm’. This has been extended to include circumstances where:

  1. As a result of the content, individuals act in a way that results in harm to themselves or that increases the likelihood of harm to themselves; or
  2. As a result of the content, individuals do or say something to another individual that results in harm to that other individual or that increases the likelihood of such harm.

These criteria have been criticised for being too subjective, on the basis that legal content which might be deemed harmless to one user might be considered harmful or age-inappropriate to another. Examples of legal but potentially harmful content include:

  • pornographic content;
  • online abuse, cyberbullying or online harassment (falling short of criminal conduct); and
  • content that falls short of a criminal offence but which promotes or glorifies suicide, self-harm or eating disorders.

Duty of Care

The Bill will implement a duty of care model, requiring organisations to put proportionate measures in place to prevent children and adults from accessing prohibited content online. Organisations will be required to update their terms and conditions to set out explicitly what content is not permitted, and to enforce any breaches.

There are concerns that the new provisions will make social media companies, in particular, overly censorious in their approach to online publications, as the burden of monitoring illegal content online is onerous and expensive. The sheer scale of content uploaded every minute is almost impossible to monitor efficiently.

Ofcom

Ofcom has been confirmed as the regulator tasked with ensuring that companies implement the processes required by the Bill. Ofcom will issue codes of practice relating to the particular duties of care owed by providers of user-to-user services and search engines.

Non-compliant companies risk fines of up to £18 million or 10 per cent of their annual global turnover, or having their websites blocked altogether.

A more recent amendment to the Bill has introduced criminal sanctions for senior managers who fail to comply with information requests made by Ofcom. This provision is currently being debated in Parliament and may yet be amended.

Concluding thoughts

The word ‘proportionate’ is used throughout the Bill as an indicator of the extent of the preventative measures companies must implement. At present, there is a lack of clarity about what counts as a ‘proportionate measure’. This is surely not a one-size-fits-all standard, and it remains to be seen how it will be interpreted for smaller organisations versus large multinational corporations.

Whilst the Bill’s overall approach is aimed at safeguarding freedom of expression, it remains to be seen whether it will have the opposite effect in practice. Commentators have pointed to the difficulty of targeting potentially harmful material without inadvertently suppressing legitimate material.
