Online platforms and search engines will be required to introduce new safety measures from 17 March 2025, following Ofcom's publication of its final illegal harms Codes this week.

Providers must assess the risk of illegal harms on their services by 16 March 2025. The Codes must first be laid before Parliament. Therefore, subject to the Codes completing the parliamentary process, from 17 March 2025 providers will need to implement the safety measures set out in the Codes, or use other effective measures, to protect users from illegal content and activity.

The Codes are aimed at facilitating the following changes by tech platforms: 

Putting the management of risk of harm at the heart of decision-making. Every site and app within the Act's scope will need to complete a "suitable and sufficient" risk assessment, understanding the risks that illegal harms pose to users on their service and considering how best to tackle them. To ensure strict accountability, each provider must name a senior person responsible for complying with the duties to combat illegal harms, such as terror, hate and fraud, among many others.

Better protections from the full range of illegal harms, including hate and terror. Providers will need to take down illegal content of all types, and maintain appropriately resourced and trained content moderation teams. Reporting and complaints functions must be easier to find and use, with appropriate action taken in response. Relevant providers will also need to improve the testing of their algorithms to make illegal content harder to disseminate.

Protecting children from abuse and exploitation online. The Codes include measures to tackle online grooming. This means that, by default, children's profiles and locations – as well as their friends and connections – will not be visible to other users, and non-connected accounts cannot send them direct messages. Children should also receive information to help them make informed decisions about the risks of sharing personal information, and they should not appear on lists of people users might wish to add to their network. This will make it harder for perpetrators of grooming to identify and contact vulnerable children. The Codes also set an expectation that high-risk providers use an automated tool called hash matching (sketched in outline below) to detect Child Sexual Abuse Material (CSAM). This aims to prevent the circulation of such material, disrupt offenders and enable services to report these offences. Ofcom has also expanded the scope of its CSAM hash matching measure to capture smaller file hosting and file storage services, which are at particularly high risk of being used to distribute CSAM.
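In broad terms, hash matching compares a digital fingerprint of uploaded content against a list of fingerprints of known illegal material. The Python sketch below is purely illustrative and is not part of Ofcom's Codes: real deployments use perceptual hashes (such as PhotoDNA) that still match resized or re-encoded copies, and hash lists curated by recognised bodies, rather than the plain SHA-256 and invented list shown here.

    import hashlib

    # Illustrative hash list only; in practice providers receive curated
    # hash sets from recognised bodies rather than compiling their own.
    KNOWN_HASHES = {
        # SHA-256 of an empty file, included so the example below matches.
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def fingerprint(data: bytes) -> str:
        # Plain SHA-256 matches only exact copies; production systems use
        # perceptual hashing so altered copies of known material still match.
        return hashlib.sha256(data).hexdigest()

    def should_block_and_report(data: bytes) -> bool:
        # An upload whose fingerprint appears on the list is blocked
        # and flagged for reporting.
        return fingerprint(data) in KNOWN_HASHES

    print(should_block_and_report(b""))  # True: matches the entry above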

Identifying fraud. Under the Codes, providers will establish a dedicated reporting channel for organisations with fraud expertise. This aims to help providers identify and act on fraudulent activity more quickly.

Protecting women and girls. Women and girls are disproportionately affected by online harms. The Codes include measures so that users can block and mute others who are harassing or stalking them. The Codes also require providers to take down intimate image abuse (or "revenge porn") material when they become aware of it. There is also guidance on how providers can identify and remove content posted by organised criminals coercing women into prostitution, and on how to identify illegal content such as intimate image abuse, sexual exploitation and cyberflashing.

Sanctions and enforcement. 

Ofcom can take enforcement action as soon as the duties come into effect, and it says that "while we will support providers to comply with their duties, we won't hesitate to take early action against deliberate or flagrant breaches". Ofcom has the power to impose penalties of up to £18m or 10% of a provider's qualifying worldwide revenue (whichever is greater) and, in the most serious cases, to seek a court order blocking access to a service for UK users.

Ofcom intends to consult further in Spring 2025 on expansions to the Codes. This will include proposals in the following areas: 

  • Banning the accounts of those found to have shared CSAM;
  • Crisis response protocols for emergency events (such as the riots in August 2024);
  • Use of hash matching to prevent the sharing of non-consensual intimate imagery and terrorist content; and 
  • Use of AI to tackle illegal harms, including CSAM. 

Ofcom will also be publishing further guidance in 2025:

  • January 2025: Final age assurance Guidance for publishers of pornographic content, and children's access assessments;
  • February 2025: Draft Guidance on wider protections for women and girls;
  • April 2025: Final Codes and Guidance on the Protection of Children; and
  • Spring 2025: Consultation on additional Codes of Practice measures. 

In addition, Ofcom will be publishing updates regarding categorised services:

  • Summer 2025: publish the register of categorised services;
  • Summer 2025: issue draft and final transparency notices to categorised services; and
  • Early 2026: publish draft proposals regarding additional duties on categorised services.

Ofcom has also published its register of risks and its illegal content judgements guidance, as well as record keeping and review guidance and the final enforcement guidance. 

If you think your service is caught by the Act, now is the time to start reading Ofcom's guidance – there is a lot of it!
