Social media
The ICO has called on 11 social media and video sharing platforms to improve their children’s privacy practices, warning that where platforms do not comply with the law, they will face enforcement action.

ICO - Social media and video sharing platforms put on notice over poor children’s privacy practices

This follows the ICO’s ongoing review of social media platforms (“SMPs”) and video sharing platforms (“VSPs”) as part of the ICO’s Children’s Code Strategy. The ICO’s Tech Lab reviewed 34 SMPs and VSPs focusing on the process young people go through to sign up for accounts. Following its review, the ICO found that there were varying levels of adherence to the ICO’s Children’s Code (the “Code”), and concluded that some platforms were not doing enough to protect children’s privacy. 

What has the ICO said?

Following concerns raised by its review, the ICO is asking certain platforms about issues relating to default privacy settings, geolocation and age assurance, and asking those platforms to explain how their approach conforms with the Code. The ICO is also speaking to some platforms about targeted advertising in order to set out its expectations for changes to ensure their practices are in line with both UK data protection law and the Code, stating that it will “step in and take action” where organisations fail to adequately protect children’s information.

In addition, the ICO has identified areas where further evidence is needed to improve understanding of how these services are impacting children’s privacy.  To this end, the ICO is launching a call for interested stakeholders including online services, academics and civil society to share their views and evidence on two specific areas:

  • How children’s personal information is currently being used in recommender systems (algorithms that use people’s details to learn their interests and preferences in order to deliver content to them); and
  • Recent developments in the use of age assurance to identify children under 13 years old.

The ICO will then seek to use the evidence gathered to inform its ongoing work to secure further improvements in how SMPs and VSPs protect children’s privacy.  The ICO’s ambition is that its work “supports platforms to develop services that recognise and cater for the fact that children warrant special protection in how their personal information is used, whilst also offering plenty of opportunity to explore and develop online”. 

The ICO has set out further detail around the areas where it considers improvement is needed in a Children’s Code Strategy Update. The key areas identified by the ICO are set out below.

Children’s profiles being set to private by default

The ICO found that the majority of the SMPs and VSPs included in its review set children’s profiles to private in accordance with the Code. However, the review also found that some platforms make children’s profiles public by default and, in a small number of cases, users could only set up a private profile if they agreed to pay a fee or opt into a subscription service. In addition, the ICO found that a small number of platforms enabled children, by default, to receive “friend” or “follow” requests and direct messages from strangers. The ICO has concerns about platforms that do not set children’s profiles to private by default, particularly where they also allow contact from strangers by default. It has written to five platforms outlining those concerns and calling on them to change their practices within eight weeks, or otherwise face further investigation and potential enforcement action.
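To make the “private by default” expectation concrete, the following is a minimal sketch in Python of how a platform might apply protective defaults to child accounts. The settings model and field names here are hypothetical, invented for illustration; they are not drawn from the ICO’s guidance or from any platform’s actual code:

```python
from dataclasses import dataclass

@dataclass
class ProfileSettings:
    """Hypothetical per-account settings for an illustrative platform."""
    profile_public: bool
    follow_requests_from_strangers: bool
    direct_messages_from_strangers: bool

def default_settings_for(age: int) -> ProfileSettings:
    """Apply high-privacy defaults to child accounts.

    Under-18s start with a private profile and contact from strangers
    disabled; adults may start from a more open configuration.
    """
    if age < 18:
        return ProfileSettings(
            profile_public=False,
            follow_requests_from_strangers=False,
            direct_messages_from_strangers=False,
        )
    return ProfileSettings(
        profile_public=True,
        follow_requests_from_strangers=True,
        direct_messages_from_strangers=True,
    )
```

The design point is that a child’s account begins in the most protective state, and any relaxation requires a deliberate, age-appropriate choice, rather than a payment or subscription as the ICO’s review found in some cases.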

Default geolocation settings

The ICO considers that sharing geolocation data can benefit children (for example, letting families and carers track their device, or helping them connect with friends nearby) but that it also carries risks of physical, financial and psychological harm. For example, the ICO notes recent research highlighting children who received unwanted, distressing communications from other users after posting images tagged with geolocation data. The display of geolocation data could also encourage cyberbullying to escalate into offline bullying, especially if children forget, or are not aware, that their location is visible to others. The ICO also states that the granularity of geolocation data is an important factor in assessing the risks posed to children: the more precise the location displayed, the higher the potential risks.
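One common data-minimisation technique that addresses the granularity point is to coarsen coordinates before they are ever displayed. The sketch below is purely illustrative of that idea and is not a method prescribed by the ICO:

```python
def coarsen_location(lat: float, lon: float, decimals: int = 1) -> tuple[float, float]:
    """Round coordinates before display to reduce precision.

    One degree of latitude is roughly 111 km, so one decimal place
    localises a user only to an ~11 km cell, whereas four decimal
    places would pin them down to roughly 11 m.
    """
    return (round(lat, decimals), round(lon, decimals))

# A precise position is reduced to city-level granularity before display.
precise = (51.50138, -0.14189)
print(coarsen_location(*precise))  # (51.5, -0.1)
```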

The ICO’s review concluded that, while SMPs and VSPs typically switch precise location settings off by default for children and do not share children’s locations automatically with other users, some do appear to nudge children to switch geolocation settings on or encourage them to share their location with others through tagging or including location when posting content.

In addition, the ICO found that when making geolocation information public, some platforms appear to share more granular information than others, and noted that it was not always clear whether children could switch geolocation data sharing off if they no longer wished to share such data.

The ICO is therefore writing to four platforms to clarify their practices in this area and how their approach conforms with the Children’s Code.  The ICO has stated that it is “ready to escalate where necessary to ensure that practices are in line with the best interests of the child”, including by opening investigations with a view to potential enforcement action where the ICO considers this appropriate. 

Profiling children for targeted advertising

The ICO has noted that SMPs and VSPs often gather large quantities of information about their users, including children, drawing on the posts and content they view, people they interact with, accounts they follow, and groups they join.  This data may then be combined with personal information collected at account set-up stage or from third parties, and used for a wide range of purposes including targeted advertising. 

The ICO has noted that while the use of targeted advertising can give users a more personalised and relevant advertising experience, children may not be aware that their information is being gathered or understand how it is being used for advertising purposes.

The ICO considers that where platforms do not take sufficient measures to protect children’s personal information in relation to targeted advertisements, the potential harms may include:

  • a loss of autonomy and control by children over their personal information, which may result in unwarranted intrusion from third parties through push notifications and nudge techniques used to promote products;
  • harm to children’s wellbeing from targeted advertising that promotes negative health behaviours based on their personal information, for example content encouraging body dysmorphia, inaccurate nutritional information or lifestyle choices inappropriate for children; and
  • financial harm where targeted advertisements encourage in-service purchases and additional app access without adequate protections for children.

The ICO’s review found that some platforms use only limited data points to tailor advertising for children, such as age and high-level location data (for example, country), to help ensure that advertising is age-appropriate and jurisdiction-specific. A small number appear not to show advertisements to children at all.
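As a hedged illustration of that data-minimisation approach, the sketch below filters the signals available to ad selection down to age band and country for child accounts. All of the signal names and the filtering logic are invented for illustration and do not describe any specific platform:

```python
# Hypothetical signal filter: for child accounts, only coarse signals
# (age band and country) reach ad selection; behavioural signals are
# dropped entirely.
ALL_SIGNALS = {"age_band", "country", "interests",
               "watch_history", "social_graph", "precise_location"}
CHILD_SAFE_SIGNALS = {"age_band", "country"}

def targeting_signals(user_signals: dict, is_child: bool) -> dict:
    """Return only the signals permitted for this user's ad targeting."""
    allowed = CHILD_SAFE_SIGNALS if is_child else ALL_SIGNALS
    return {k: v for k, v in user_signals.items() if k in allowed}

profile = {"age_band": "13-15", "country": "GB",
           "interests": ["gaming"], "watch_history": ["..."]}
print(targeting_signals(profile, is_child=True))
# {'age_band': '13-15', 'country': 'GB'}
```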

However, the ICO also found that on other platforms it is not always clear what personal information is being collected from children or how it is being used for targeted advertising.  In addition, the ICO’s review found indications that some SMPs and VSPs may be profiling children for targeted advertising in a way that is not in the best interests of the child, for example, through potentially excessive data collection and not giving children options to control advertising preferences.

The ICO has stated that it has begun a “programme of active engagement” with SMPs and VSPs where it has identified potential concerns.  The ICO is currently verifying the approach of the relevant platforms to the use of children’s personal information for targeted advertising and setting out its expectations for changes to those platforms’ data processing practices to bring them into line with the Children’s Code and the UK GDPR.  In addition, the ICO has stated that it will follow up to secure improvements and where necessary consider “all regulatory options”, including investigations, for services that do not comply. 

Use of children’s information in recommender systems

Recommender systems are algorithmic processes that use personal information and profiling to learn the preferences and interests of the user to suggest or deliver content to them. They can take many forms and can use a range of data, including personal information provided directly by the user (or from third parties) and information from users’ interaction with content on the service. Recommender systems may also make recommendations based on content that a user’s online friends, followers or groups engage with.
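A minimal content-based example may make those mechanics concrete. The sketch below is illustrative only (real recommender systems are far more sophisticated): it builds an interest profile from a user’s viewing history and uses it to rank candidate content, with all item names and tags invented for the example:

```python
from collections import Counter

def build_profile(interactions: list[tuple[str, set[str]]]) -> Counter:
    """Learn a simple interest profile by counting the tags of items
    the user has engaged with."""
    profile = Counter()
    for _item_id, tags in interactions:
        profile.update(tags)
    return profile

def rank(candidates: dict[str, set[str]], profile: Counter) -> list[str]:
    """Score each candidate by how strongly its tags overlap with the
    learned profile, highest first."""
    def score(tags: set[str]) -> int:
        return sum(profile[t] for t in tags)
    return sorted(candidates, key=lambda item: score(candidates[item]),
                  reverse=True)

history = [("v1", {"football", "highlights"}), ("v2", {"football", "vlogs"})]
catalogue = {"v3": {"football"}, "v4": {"cooking"}, "v5": {"vlogs", "football"}}
print(rank(catalogue, build_profile(history)))  # ['v5', 'v3', 'v4']
```

Even in this toy form, the learned profile is itself personal information derived from behaviour, which is why the ICO’s interest centres on what data feeds these systems and how children’s information within them is protected.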

The ICO recognises that recommender systems can help users, including children, navigate the vast amount of information online to find content they wish to view or engage with. They can also remove or moderate inappropriate content that is served to children and help promote the visibility of positive content. 

However, the ICO also notes that there are a number of potential harms that could occur, including:

  • physical or psychological harm from seeing harmful content where platforms do not provide sufficient protections for children when recommending content based on personal information (such as interests and preferences);
  • physical or psychological harm, for example, sleep disruption, poor attention or addiction, resulting from platforms using children’s personal information to feed children recommendations in a way that promotes sustained viewing; and
  • a loss of autonomy where children are unable to make informed decisions about what information to share with a service because they do not understand how it would be used or are not given sufficient tools to control how it is used by recommender systems.

The ICO has also noted that, given the clear links here with online safety issues, it will continue to work closely with Ofcom (the regulator responsible for online safety) to protect children online. 

(See our article: EU and UK regulators align on online safety and ICO and Ofcom issue Joint Statement on Collaboration)

In its review, the ICO found that a range of personal information is used by recommender systems, including name, age, sex, location, interests and online activity.  However, the ICO noted that privacy notices are often unclear about the specifics of how platforms use personal information to make recommendations, or what measures they take to protect children’s privacy when doing so. Although the ICO’s review did not seek to assess the content shown to children, it does note that research has found that platforms using recommender systems may show children inappropriate and harmful content. 

To further build its understanding of the data processing elements of this issue, the ICO has opened a call for evidence from stakeholders on the use of children’s personal information in SMPs’ and VSPs’ recommender systems, which is open until 11 October 2024.  In addition, the ICO has stated that it will engage directly with platforms to better understand how their recommender systems use children’s personal information, and what steps those platforms take to mitigate the risk of harm to children from using that information in their recommender systems.  The ICO will then use this information and evidence to consider next steps. 

Use of information of children under 13 years old

The ICO highlights that both data protection law and the Children’s Code seek to protect the personal information of children of all ages within the digital world and to ensure an age-appropriate experience, with younger children likely to need a higher level of protection than older children because they are at an earlier stage of development.  This is reflected in the UK GDPR, which states that verified parental consent must be obtained for children under the age of 13 where consent is relied on as the lawful basis for processing personal information.

The ICO notes that if platforms are unclear about or do not know the age of their users, they risk processing children’s information inappropriately and unlawfully as if they were an older child or an adult.  This could lead to a range of data protection harms, including:

  • loss of control of personal information resulting from under-13s creating accounts on services designed for older children with limited awareness of the risks as to how their information may be used or of their information rights;
  • psychological harms from children accessing content they are too young to view, as a result of the platform processing inaccurate personal information; and
  • financial harms from engaging with in-app purchases or subscriptions without adequate parental oversight. 

The ICO’s review found that almost all platforms specified a minimum age of 13 in their Terms of Service.  Many also relied on consent for at least some of their data processing activities.  The ICO notes that these platforms should have effective age assurance processes in place to ensure that they are not processing the personal information of under-13s.

The ICO also found that while the majority of platforms reviewed did use some form of age assurance at account set-up stage, most relied on users’ self-declaration of their age.  However, the ICO has stated that, as set out in its Opinion on Age Assurance, self-declaration is unlikely to be appropriate when engaging in high-risk activities involving children’s data, such as: 

  • large scale profiling;
  • “invisible” processing;
  • tracking; or
  • targeting children for marketing purposes or to offer services directly to them.

The ICO has stated that platforms engaging in these activities should adopt an alternative age assurance method with an appropriate level of technical accuracy, and which operates in a fair way to users.  For example, this might include age verification or age estimation solutions, as set out in the ICO’s Opinion on Age Assurance.
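Putting the Article 8 UK GDPR rule and the ICO’s Opinion together, the decision logic might be sketched as follows. This is an illustrative reading of the position described above, not an ICO-endorsed implementation, and the function and parameter names are hypothetical:

```python
from enum import Enum

class AgeAssurance(Enum):
    SELF_DECLARED = "self-declaration"
    ESTIMATED = "age estimation"     # e.g. estimating age from a selfie
    VERIFIED = "age verification"    # e.g. checking a hard identifier

def may_rely_on_consent(age: int, assurance: AgeAssurance,
                        has_verified_parental_consent: bool,
                        high_risk_processing: bool) -> bool:
    """Illustrative gate for consent-based processing of a young
    user's data, reflecting the position summarised above."""
    # Self-declared age alone is unlikely to be appropriate for
    # high-risk activities such as large-scale profiling, "invisible"
    # processing, tracking, or targeting children for marketing.
    if high_risk_processing and assurance is AgeAssurance.SELF_DECLARED:
        return False
    # Under-13s cannot give valid consent themselves; a parent or
    # guardian must provide verified consent.
    if age < 13:
        return has_verified_parental_consent
    return True
```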

The ICO also found that a small number of SMPs and VSPs in its sample did not appear to use age assurance at the account set-up stage at all, and has stated that its initial priority in this area is to address this issue.  The ICO is therefore writing to four platforms to clarify the lawful bases they rely on to process children’s information and their approach to age assurance, and will decide on next steps based on their responses. 

In addition, the ICO is considering the practices of platforms that rely on consent as their lawful basis for processing the information of users who are not logged in (particularly those under 13 who would require verified parental consent) and is inviting evidence from stakeholders on developments in the use of age assurance by platforms to identify children under 13, which will inform the ICO’s future work in this area. The call for evidence will be open until 11 October 2024.

What are the implications?

The ICO’s announcement highlights its continued focus on children’s online privacy.  The ICO is clearly willing to engage with industry through its calls for evidence to help inform its ongoing work in this area and identify appropriate solutions, and appears to be giving platforms an opportunity to address the issues identified before taking any action.  Nevertheless, it is clear that the ICO will consider enforcement action against organisations that do not comply.  Online services likely to be accessed by children should therefore take note of the recommendations set out by the ICO in the Children’s Code Strategy Update and ensure they are complying with the Children’s Code.
