The Online Safety Act

The recently introduced Online Safety Act is intended to bring legislation in line with current-day internet usage and to expand online safety protections. Its widened scope has the potential to impact a diverse range of sectors.

24 January 2024



The Online Safety Act (the “Act”) aims to protect children and tackle illegal content online. The Act received Royal Assent on 26 October 2023, and follows a trend towards protecting online users in other jurisdictions (e.g. the proposed Kids Online Safety Act in the US and the EU Digital Services Act).

The Act responds to substantial changes in online behaviour since the early days of the internet. As the scale of online activity has grown, so too has the prevalence of mental health harms linked to internet use.

The power dynamics which existed in the early days of the internet have also shifted. Online platforms were originally seen as mere conduits, on a par with internet service providers, but are now recognised as having far more control and oversight.

The existing liability framework in the United Kingdom stems from the E-Commerce Directive 2000, which established a two-tier liability system attributing different liability to (i) publishers and (ii) intermediaries. Intermediaries benefit from a safe harbour: liability for platforms is only triggered where they fail to act once placed on notice that they are hosting or sharing illegal content.

Who is in scope?

The Act applies to providers of online services, specifically search services and user-to-user services.

Ofcom (the communications regulator currently in charge of broadcasting, telecommunications, and the postal industries) has been appointed as the regulator and enforcer. It anticipates that more than 100,000 online services could be in scope from a diverse range of sectors including social media, file sharing, gaming, dating, and adult services.

The Act categorises services depending on whether they meet certain “threshold conditions” according to their size, risk, and reach. Additional duties are imposed for larger services with higher reach and multiple risks. The thresholds for these categorised services will be set by legislation following advice from Ofcom.

What does the Act do?

The Act imposes duties of care on two types of intermediary service providers:

  1. User-to-user services; and
  2. Search engine services.

As with the General Data Protection Regulation in the UK and EU, these duties apply to any organisation providing in-scope online services to users in the United Kingdom, regardless of the jurisdiction in which it is established.

Duties of care

The duties of care imposed by the Act are not traditional duties of care (such as the duty of care owed when driving). Rather, they are enforceable only by Ofcom. The Act therefore does not create a cause of action for individuals against online platforms that breach their duties of care.

The duties of care also impose obligations differently depending on the scale and reach of the service provider. General duties apply to all service providers, with additional duties imposed on providers whose services are likely to be accessed by children, and still further duties applying to high-risk, high-reach providers.

The aim of the duties is to create systems and processes that limit exposure to harmful content. A duty is therefore not breached by any individual instance of harmful content being shown; rather, companies within scope must have effective governance measures in place to reduce exposure.

The duties can be categorised into five classes:

  1. Risk assessment duties. These require online service providers to assess the types of harmful content to which they facilitate access, as well as the risk of harm that content poses.
  2. Safeguarding duties. These duties are aimed at ensuring that providers have proportionate measures in place to prevent or mitigate the harm of content to which they provide access.
  3. Individual rights duties. Providers must have reporting and complaints mechanisms in place, along with filtering tools and an opportunity to object to content being removed.
  4. Transparency duties. These require providers to have clear and transparent terms that give individuals information about the safety practices they have in place.
  5. Other rights duties. Providers need to have regard to other rights such as free speech and privacy.

Harmful content

Harmful content falls into three main categories:

  1. Illegal content (e.g. specifically listed offences, child safety, and terrorism content);
  2. Content that is lawful but harmful to children (e.g. content promoting eating disorders); and
  3. Fraudulent advertising.

What will Ofcom do if an organisation fails to comply?

Ofcom has a central role in designing and operating the online safety regime. A key way in which it will do this is through codes of practice prescribing how service providers can comply with the duties. Ofcom also has extensive enforcement powers, including the ability to issue service cessation orders or to impose penalties of up to the greater of £18m or 10% of global turnover.

Section 51 of the Act states that the duties will only apply once the codes of practice have been approved by Parliament. These codes will be issued in relation to the three categories of harmful content to set out how each of the duties will apply to them, as well as defining when service providers become categorised services.


Ofcom will approach the publication of codes of practice in three phases:

  • Phase 1: Illegal harms. The consultation was published in November 2023, with responses required by February 2024. The code of practice and subsequent enforcement are expected to be in force from the end of 2024.
  • Phase 2: Child safety duties and pornography. The consultation was published in December 2023, with responses required by March 2024. The code of practice and ensuing enforcement are expected to be in force from mid-2024.
  • Phase 3: Duties on categorised services. Ofcom is set to publish advice to the Secretary of State on categorisation and transparency reporting in spring 2024. Ofcom intends to publish a register of categorised services by the end of 2024, with the final codes for categorised services to be approved and enforcement to begin by early 2026.


If you have any questions or need any assistance regarding the above, please contact Joseph Fitzgibbon to discuss.


This article was co-authored by Trainee Thomas Mackie