UK Online Safety Act: Protection of Children Codes come into force

Alert | 6 min read

White & Case Tech Newsflash

On 25 July 2025, having successfully passed through the required Parliamentary approval process, the first Protection of Children Codes of Practice for user-to-user and search services (the "Children's Safety COPs") came into force as part of the UK's Online Safety Act 2023 (the "OSA") online safety framework. This marks the completion of 'Phase 2' of Ofcom's implementation of the OSA, under its three-phase 'roadmap'.1

Purpose of the Children's Safety COPs

Under sections 11 and 12 of the OSA, user-to-user services2 (i.e. online services which allow users to share content that may be encountered by other users) that are used, or are likely to be used, by significant numbers of UK children are required to:

  • Conduct a children's risk assessment at least annually, to assess the risk of harm to children on the service from various types of "content that is harmful to children" ("harmful content"). The OSA specifies the types of content, which include: (a) "primary priority content" (such as pornographic content); (b) "priority content" (such as bullying content); and (c) "non-designated content" (such as content that promotes depression, hopelessness and despair). Providers had to complete their first children's risk assessments by 24 July 2025.
  • Comply with certain "safety duties protecting children" (the "children's safety duties"), including requirements to use measures, systems and processes to protect children from harmful content.

The Children's Safety COPs3 were prepared by Ofcom, the UK's online safety regulator, under Chapter 6 of Part 3 of the OSA. They describe specific measures which online service providers are recommended to implement in order to comply with their children's safety duties.

In preparing its recommendations in the Children's Safety COPs, Ofcom took into account responses from a wide range of stakeholders – including online platforms, relevant government and police officials, and children's safety organisations – to its January 2023 call for evidence4 and May 2024 consultation5 about appropriate measures for protecting children online. Ofcom also considered the views of children across different age groups, through workshops, home-based activities and interviews.

The Children's Safety COPs are particularly significant because, under section 49 of the OSA, a provider is deemed compliant with its children's safety duties if it properly implements on its service the applicable measures in the Children's Safety COPs.

Recommended measures

The applicable measures vary for each service, depending on factors such as the size of the service's user base and its risk levels for different types of harmful content. They cover areas such as: age assurance (i.e. measures to determine whether or not a particular user is a child); content moderation; users' ability to report content and raise complaints; and recommender systems. Examples of the measures, and the services to which they apply, include:

  • Robust age checks: For certain services that permit hosting or disseminating primary priority content (such as pornographic content6) or priority content, implementing "highly effective age assurance" (e.g. credit card or photo ID checks7) to prevent children from accessing such content or the part of the service where it is located.
  • Safer algorithms: For services whose children's risk assessment concluded that the service is "medium" or "high" risk for certain kinds of harmful content, ensuring that content recommender systems exclude, or give a low degree of prominence to, potentially harmful content in users' content feeds.
  • Effective content moderation: For 'large' services (i.e. services which have more than seven million monthly active UK users) or 'multi-risk' services (i.e. services whose children's risk assessment concluded that the service is "medium" or "high" risk for two or more kinds of harmful content), establishing internal content moderation policies which set out rules, standards and guidelines around how harmful content is to be dealt with on the service, and how such policies should be operationalised and enforced.

Ofcom expects that the children's safety duties and the measures in the Children's Safety COPs "will have a big impact on the online lives of children in the UK".8

Enforcement

Ofcom has stated that it has put the need to ensure children's protection online "at the heart of" its decision-making, and that it "will be looking for evidence" that providers also put managing the risk of harm to children at the heart of their decision-making and governance.

Perhaps reflecting that prioritisation, and the increasing familiarity that service providers are now expected to have with the OSA framework, Ofcom has also indicated that it may be more likely to take action where providers are slow to comply with the children's safety duties. For example:

  • In the context of the similar 'Phase 1' safety duties relating to illegal content which came into force in March 2025, Ofcom stated that it would generally give providers "a reasonable opportunity to come into compliance". By contrast, Ofcom has stated in the context of children's safety that its Supervision Team is "establishing relationships with the largest and riskiest service providers to ensure they… come into compliance quickly".9
  • With respect to services which allow pornographic content, Ofcom has emphasised that it is "ready to enforce against any company which … does not comply with age-check requirements by the deadline" and that it "will be actively checking compliance" immediately upon the Children's Safety COPs coming into force. Ofcom has already announced that it is formally investigating four providers of pornography sites who may not have implemented highly effective age assurance.

Next steps for OSA implementation

In addition to enforcing the children's safety duties, Ofcom faces a busy remainder of 2025 as it seeks to achieve various milestones within 'Phase 3' of its OSA implementation, which focuses on 'categorised' services (i.e. user-to-user, search and messaging services with large numbers of UK users10). These milestones include publishing a 'register' designating services as categorised services, as well as transparency reporting guidance.

1 https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/roadmap-to-regulation.
2 Similar duties apply to search services under sections 28 and 29 OSA.
3 See https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/statement-protecting-children-from-harms-online/main-document/protection-of-children-code-of-practice-for-user-to-user-services.pdf?v=399754 and https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/statement-protecting-children-from-harms-online/main-document/protection-of-children-code-of-practice-for-search-services2.pdf?v=399753.
4 https://www.ofcom.org.uk/online-safety/protecting-children/call-for-evidence-second-phase-of-online-safety-regulation.
5 https://www.ofcom.org.uk/online-safety/protecting-children/protecting-children-from-harms-online.
6 See Ofcom's letter dated 24 April 2025 regarding mandatory age assurance requirements for services that allow pornographic content, at: https://www.ofcom.org.uk/online-safety/protecting-children/letter-to-part-3-services-that-allow-pornography-outlining-heaa-requirements.
7 See Ofcom's Guidance on highly effective age assurance for 'Part 3 services' at: https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/statement-age-assurance-and-childrens-access/part-3-guidance-on-highly-effective-age-assurance.pdf?v=395680.
8 https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/online-safety-industry-bulletins/online-safety-industry-bulletin-May-25.
9 https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/statement-protecting-children-from-harms-online/main-document/volume-1-overview-scope-and-regulatory-approach.pdf?v=396663.
10 As defined in the Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025.

White & Case means the international legal practice comprising White & Case LLP, a New York State registered limited liability partnership, White & Case LLP, a limited liability partnership incorporated under English law and all other affiliated partnerships, companies and entities.

This article is prepared for the general information of interested persons. It is not, and does not attempt to be, comprehensive in nature. Due to the general nature of its content, it should not be regarded as legal advice.

© 2025 White & Case LLP