UK Online Safety Bill set to impose criminal liability for tech companies

Wednesday 19 April 2023

A major discussion point in the UK centres around the Online Safety Bill (‘the Bill’), which is currently at the committee stage in the House of Lords.

The Bill has provoked mixed feelings among the public — some see it as a draconian measure that poses a risk to freedom of speech, while others believe that it doesn’t go far enough to prevent harmful content online. Here, our Senior Associate and data protection expert Eleanore Beard breaks down what the Bill means for both internet users and technology companies.

 

What is the Online Safety Bill?

The Online Safety Bill is intended to introduce a new set of laws to protect children and adults from illegal or harmful content and will impose a duty of care to keep people safe online. It is hoped that the Bill will ensure that illegal content (which includes material relating to abuse, self-harm and other content which could cause harm) is quickly removed or prevented from appearing at all.

The Secretary of State wrote an open letter in December 2022 about what the government wanted to achieve from the Bill:

“The strongest protections in this legislation are for children and young people. This Bill will protect them by:

  • Removing illegal content, including child sexual abuse and terrorist content
  • Protecting children from harmful and inappropriate content, from cyberbullying and pornography to posts that encourage eating disorders or depict violence
  • Putting legal duties on social media companies to enforce their own age limits — which for almost every single platform are set at age 13, and yet are rarely enforced
  • Making tech companies use age checking measures to protect children from inappropriate content
  • Making posts that encourage self-harm illegal for the first time — both for children and adults
  • Ensuring more transparency on the risks and dangers posed to children on the largest platforms, including by making tech companies publish risk assessments

Adults will be covered by their own separate ‘triple shield’ of defence. You will be protected from posts that are illegal; from content that is prohibited by the social media companies in their own terms and conditions; and you will be given more control over the content you see on your own social media feeds.”

The Bill will also be supported by the ICO’s Age-appropriate design code.

 

What powers will the Online Safety Bill provide?

The Bill will give Ofcom powers to fine businesses and block access to sites, and will introduce criminal liability for executives of companies caught by the Bill who fail to cooperate with Ofcom. The Bill is intended to give tech companies increased responsibility for users’ safety on digital platforms, and the Government has published a fact sheet to provide information on (and examples of) the new communications offences.

In the Law Commission’s review — ‘Modernising the Communications Offences’ — it was recommended that the Malicious Communications Act 1988 and section 127 of the Communications Act 2003 be replaced, as they were out of date and unable to adequately deal with the harmful behaviour now being seen online. The Bill will replace these existing and outdated laws.

 

New offences and criminal liability

The Bill will, in summary, break down the new communications offences into three different types of behaviour: harmful communications offences, false communications offences and threatening communications offences.

Notably, the harmful communications offences will shift focus from the content of a communication to its potentially harmful effect. A new offence of ‘cyberflashing’ will amend the Sexual Offences Act 2003 to include a specific offence that targets the unsolicited sending of sexual images using digital technology.

The new offences will ensure that there is criminal liability for such harmful behaviours and place a new duty of care on companies to protect their users, while (according to the Government) still attempting to ensure that users’ freedom of expression is protected.

 

What the Online Safety Bill means for social media platforms

If the Bill is implemented, tech companies will not only have to remove harmful content, but also prevent harmful content from appearing in the first place. Failure to do so may result in significant fines, with Ofcom given powers to impose fines of up to £18 million or 10% of a company’s global turnover (whichever is greater). This may not only affect UK companies, as powers will be granted to take action against companies based outside of the UK if a platform is accessed by users in the UK.

The Bill, if passed, will impact many companies that handle data on social media platforms and digital messaging services. These companies will need to review their internal processes and potentially implement new compliance frameworks to meet the Bill’s requirements. Changes will also likely be needed to how social media platforms are operated and how terms and conditions and guidance are enforced.

 

An overdue safety measure or freedom of speech concern?

As the parent of a pre-teen (who can barely raise their head above an iPad and is influenced by any number of social media platforms), I find it frightening to think of the types of content (like cyberflashing) that children can be exposed to. With this in mind, the Bill is arguably long overdue and will no doubt be welcomed by much of the public. However, others may be concerned by the potential for censorship and the loss of freedom of speech.

The Bill has received backlash from some large tech companies, with WhatsApp and Signal among those that have published an open letter stating their intention to withdraw from the UK if the Bill is passed into law. Critics argue that the Bill may actually weaken users’ privacy online by undermining robust encryption software.

Current encryption software scrambles messages so that even the company running the service cannot view their content. The Bill would require private encrypted messaging apps and other services to adopt technology that identifies and removes harmful material.

Will Cathcart, the Head of WhatsApp at Meta, recently told the BBC that “[WhatsApp] users all around the world want security — 98% of our users are outside the UK, they do not want us to lower the security of the product”. Cathcart also confirmed that he “… won’t lower the security of WhatsApp. We have never done that — and we have accepted being blocked in other parts of the world.”

While the Government states that it has tried to find a balance between the need to ensure that companies have liability when users are not adequately protected and the right to freedom of expression, we will have to wait to see what the final version of the Bill achieves.

If you or your organisation have any queries or concerns in relation to the Online Safety Bill or users’ safety online, please contact me and our data protection team at eleanore.beard@brabners.com.
