On 3 March 2025, the UK Information Commissioner's Office (ICO) announced a series of investigations into the use of children's personal information by three major social media and video sharing platforms: TikTok, Reddit and Imgur. The investigations form part of the ICO's ongoing efforts to ensure that children's personal information is protected online, focusing in particular on users aged 13 to 17. The ICO's primary concerns are how these platforms use children's data to deliver content recommendations and whether their age assurance measures are adequate.
Recommender systems
The TikTok investigation focuses on how the platform uses the personal information of 13–17-year-olds to generate content recommendations for their feeds. The ICO is concerned that TikTok's recommender systems (the algorithms that select the content users see) could expose child users to inappropriate or harmful content.
Age assurance
The investigations into Reddit and Imgur, by contrast, centre on their use of age assurance measures: tools that verify, or at least estimate, a user's age, allowing services to recognise child users and tailor content appropriately. The ICO is examining how Reddit and Imgur assess and verify the ages of their child users to ensure that services are tailored to children's needs and that access to age-inappropriate content is restricted.
Investigations only
The ICO has clarified that, at this stage, it is only investigating whether there have been any infringements of data protection legislation. If it finds evidence of infringement, the company in question will be able to make representations before any enforcement action is taken.
Children's data and the regulatory stance
These investigations come in the wake of a broader global regulatory shift towards the protection of children's data, with the UK's ICO actively working to improve children's online privacy since the introduction of the Children's Code (the Age Appropriate Design Code) in 2021. (For more information, see our article here.)
In addition, as various provisions of the Online Safety Act come into force, protections for children online will be further enhanced: online services will be under duties to conduct children's risk assessments, to prevent children from accessing harmful and age-inappropriate content, and to comply with specific codes of practice. (For more information, see our article here.)
So how did the ICO come to open these three investigations? Following an ICO review of the varying levels of adherence to the Children's Code by 34 social media and video sharing platforms, concerns were raised about default privacy settings, geolocation, age assurance and targeted advertising practices affecting children's data protection. On 2 August 2024, 11 platforms were called on to improve their children's privacy practices, with the ICO warning that platforms that do not comply with the law will face enforcement action. (For further information, see our article here.) Improvements were seen after the August intervention, but in these three cases, where more evidence was needed, investigations were opened.
Conclusion
"Safeguarding children's personal information is a key priority" for the ICO, and has been so for a number of years. With that in mind, companies operating online platforms, particularly those that are likely to be accessed by children, should ensure their compliance with UK data protection laws and the Children's Code. While a degree of complacency may have set in due to little visible enforcement action other than the Tik Tok fine for £12.7 million in April 2023, it seems something is in the air this year and more enforcement is expected.
With key provisions of the Online Safety Act starting to come into force (including the Protection of Children Codes of Practice in July 2025), it is interesting to compare Ofcom's rhetoric around enforcement, which is much stronger in tone than we have seen to date from the ICO. For example, Suzanne Cater, Enforcement Director at Ofcom, said: "Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that. But, make no mistake, any provider who fails to introduce the necessary protections can expect to face the full force of our enforcement action." Will 2025 be the year fines start to concentrate minds and move compliance with the Children's Code and Online Safety Act higher up the boardroom agenda? The stakes are high, as reputation and trust are also at play in this sector.
So what should I be doing?
It will be prudent for companies using recommender systems to ensure that their algorithms do not expose children to inappropriate or harmful content. Platforms should also limit the use of children's personal information for targeted advertising; indeed, the Children's Code says that profiling should be turned off by default unless there is a compelling reason to use it (and, as we know, the EU's Digital Services Act (DSA) prohibition on targeted advertising to children is now in force).
Companies must ensure that children's profiles are set to private by default and restrict the sharing of geolocation data. Clear options should be provided for children and their parents to control privacy settings, particularly for children under 13, who cannot consent to an online service processing their personal data.
All of these areas were highlighted in the ICO's Children's Code strategy and its recent update, so it is clear where the regulatory focus lies. Furthermore, understanding the latest thinking on age assurance from the ICO and the European Data Protection Board (EDPB) is essential for those operating in this sector.
With the ICO stating "Our work to drive further improvements in children's privacy will continue in 2025/26", it is vital for those in scope to get their house in order if they have not already done so.
If you have any questions, or if we can help in any way, please reach out to your usual LS contact, who will be happy to help.