Ireland's supervisory authority has levied a record-breaking fine of €405 million against the social media platform Instagram, following an investigation into its handling of children's data privacy.

Background

The fine, the second-highest issued under the General Data Protection Regulation ("GDPR") (the first being a €746 million penalty against Amazon) and the third imposed on a Meta-owned company by the Irish Data Protection Commission ("DPC"), covers alleged violations of privacy laws arising from Instagram's default account settings for business account users, which allowed children's email addresses and phone numbers to be exposed.

The penalty follows an investigation by the DPC into the social media platform, which began in October 2020. The Irish regulator's preliminary decision was met with disapproval among concerned supervisory authorities and so was subject to the dispute resolution mechanism under Article 65 of the GDPR, which enables the European Data Protection Board ("EDPB") to adopt binding decisions where supervisory authorities cannot agree on the interpretation of certain elements of the GDPR.

What was the issue? 

Specifically, the EDPB upheld objections from other supervisory authorities requiring the DPC to amend its draft decision (as it then was) to include a finding of an infringement of Article 6(1) GDPR, and to reassess its proposed fine in light of that infringement. The DPC's final decision, adopted on 2 September 2022, included findings of infringements of:

  • Article 5(1)(a) - the requirement for data to be processed lawfully, fairly and in a transparent manner in relation to the data subject (the ‘lawfulness, fairness and transparency’ principle), on the basis that it was not sufficiently clear to children that their data would be made public, nor what the associated risks were;
  • Article 5(1)(c) - the requirement for data processing to be adequate, relevant and limited to what is necessary in relation to the purposes for which the data are processed (the ‘data minimisation’ principle), on the basis that it was excessive to make children's data, such as their email address, public when converting to a business account, particularly bearing in mind that there are other ways people can be contacted on Instagram, for example through the direct messaging function; 
  • Article 6(1) - the requirement for processing to rely on one of the lawful bases set out in this provision, on the basis that the EDPB (and, following its binding decision, the DPC) did not accept Meta's chosen lawful bases (see further below);
  • Article 12(1)/Article 13(1) - the requirement to provide data subjects with the relevant information in a concise, transparent, intelligible and easily accessible form, using clear and plain language, for the same reasons given above. The decision did note that Instagram updated this transparency wording in September 2019 so that it was sufficiently clear for the purposes of Articles 12 and 13, but this was still not enough to pass the legitimate interests test, given the additional hurdles around the necessity of the processing and the overall risk of harm; 
  • Article 24 - the requirement to implement appropriate technical and organisational measures, including appropriate data protection policies, on the basis that Instagram had not implemented proper policies bearing in mind the potential harm to children through the platform; 
  • Article 25(1) and (2) - the requirement to implement data protection by design and by default, on the basis that the safety and security measures taken by Instagram were not adequate to protect children's data; and 
  • Article 35(1) - the requirement to carry out a Data Protection Impact Assessment (“DPIA”), on the basis that an insufficient DPIA was carried out despite the processing involved being high risk.

No lawful basis 

While all of the infringements listed above were considered sufficiently serious by the EDPB (hence the large fine handed down), the one we consider particularly striking is the finding on whether Meta had a lawful basis for the processing.

Meta argued that it had a legal basis for the processing of children's data in connection with the provision of its services, relying on either:

  • Article 6(1)(b), processing necessary for the performance of a contract (in the EU Member States where children are allowed to enter into contracts by national law); or 
  • Article 6(1)(f), processing in the legitimate interest of the controller (for all other jurisdictions). 

In its draft decision, the DPC had accepted the legitimacy of Meta's reliance on both of the bases above, but it was overruled by the binding decision adopted by the EDPB. In its assessment of how reliance on an elected legal basis should be analysed, the Board took a strict approach on both counts, particularly in relation to contractual necessity, where it adopted a very narrow view of when data processing can be said to be "necessary" for the performance of a contract. 

As such, the DPC's final decision was altered to include findings that neither legal basis selected by Meta could be relied upon, and that the processing therefore infringed Article 6(1) for lack of an applicable legal basis. This is worrying for controllers dealing with children's data, as it effectively means that in a number of cases consent will be the only viable lawful basis - which is likely to be problematic, particularly where parental consent is required. 

Not just a fine 

In addition to the substantial fine, the DPC has also issued a reprimand and an order requiring Meta Platforms Ireland Limited to bring its processing into compliance by taking a range of specified remedial actions. 

Meta response 

A Meta spokesperson said, in response to the action taken: "This inquiry focused on old settings that we updated over a year ago, and we’ve since released many new features to help keep teens safe and their information private."

The spokesperson went on to say: "Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post, and adults can’t message teens who don’t follow them. We engaged fully with the DPC throughout their inquiry, and we’re carefully reviewing their final decision.” 

Meta has confirmed it intends to appeal the fine. 

What can we take from this? 

  1. Underlying importance of privacy by design and the growing need to age-gate 

The issue Meta faces is one that arises for many platforms that haven't observed the principle of data protection by design - in other words, they didn't start out with privacy "baked in". 

As part of its investigation into whether Meta had complied with its obligations to address and assess risks to child users, the DPC performed an extensive assessment of the nature, scope, context and purpose of the processing Meta was carrying out. It found that the processing posed high risks to the rights and freedoms of Instagram's child users; in particular, the risk of communications from dangerous individuals (impersonators, fraudsters), and the fact that the features associated with a business Instagram account, such as analytics around engagement with the user's profile, provided an incentive for children to switch from a personal to a business account. 

Under Meta's terms of service, children under the age of 13 are prohibited from setting up and holding an account on Instagram. However, there are no in-built age verification controls for this process; indeed, age verification is not typically required unless the user gives some indication that they are under the required age, e.g. by writing their age or school year in their Instagram bio. 

Instagram published a post on its announcements blog on 23 June 2022 introducing the new methods it is trialling to verify the age of its users. These range from asking mutual followers of the user to vouch for their age to a new partnership with Yoti, which offers anonymous age estimation technology. 

  2. What impact does this have for UK organisations?

Although this decision does not have any direct impact on the UK, it very much aligns with the Information Commissioner's Office's ("ICO") stance that the protection of children’s data is a key concern. In the UK, the Children's Code is the key piece of guidance when it comes to children's data, and one the ICO is now looking to actively enforce. Its main purpose is to require high privacy settings by default for children's data. In practice, the Code applies both to UK-based online service providers and to non-UK providers offering a service in the UK that is likely to be accessed by children, so it is wide-reaching.

While we haven’t seen many fines to date under the Children’s Code, the ICO is beginning to come alive in this area. See our article, Protecting children online: £27m intended fine signals ICO move from education to enforcement, Ali Vaziri (passle.net), which explains the ICO's intention to fine another social media platform £27 million for allegedly failing to protect children's privacy. The ICO is reportedly looking at over 50 other online services and their compliance with the Code, as well as launching a consultation to inform its review of the Code due in Autumn 2022, so it is clear that children’s data is an area of regulatory focus and will be for the foreseeable future.

With heavy enforcement on the way from both the EU and the UK, now is a good time, if you haven’t already done so, to evaluate whether you’re caught by the Children's Code and, if so, to conduct a child-specific data protection impact assessment, which will help you assess the risks and appropriate mitigations. For more information, see here.
