Case C-252/21.
Background
To use the platform’s online social network, users were required on sign-up to accept its general terms. Data provided on sign-up were linked to other data collected from users’ activity both on and off the social network. The off-network data included data about: (1) visits to third-party websites and apps that use the platform’s advertising technologies; and (2) use of other online services provided by the platform’s group (e.g. other social media and instant messaging services). The resulting profiles enabled detailed inferences to be drawn about users’ preferences and interests.
A competition authority in Germany prohibited the platform from making, in its general terms, the use of its social network conditional on the processing of users’ off-network data, and from processing those data without consent. It also required the platform to change its terms to make clear that those data would not be collected, linked with user accounts or used without user consent.
The authority emphasised that consent was not valid where it was a condition of using the social network. Its decision was based on the view that the processing did not comply with the GDPR and therefore constituted an abuse of the platform’s dominant position in the market for online social networks. The platform challenged the decision, and the appeal court referred various questions to the CJEU.
The decision
The main themes that relate to personalised advertising are as follows:
1. Competence – a competition authority can make findings about GDPR compliance in the context of examining abuse of a dominant position. However, the competition authority is bound by decisions of data protection authorities and must cooperate ‘sincerely’ with them.
2. Special category data – where users visit, or enter information into (e.g. when making purchases or registering on), websites or apps which relate to the special categories of data listed in Article 9(1) of the GDPR (e.g. “flirting apps, gay dating sites, political party websites or health-related websites”), data about those visits or that information are special category data. Therefore, where those data are collected through integrated interfaces, cookies or similar storage technologies and then linked to a user account, this will amount to processing of special category data, which is prohibited unless a derogation applies (e.g. data ‘manifestly made public’ under Article 9(2)(e) of the GDPR).
3. Manifestly made public – merely visiting such websites or apps does not mean that the user has manifestly made public those special categories of data related to that visit. Where a user enters information into those websites or apps, uses integrated ‘like’ or ‘share’ buttons, or logs on to those websites or apps using credentials linked to their social media account / telephone number / email address, this action can potentially mean that the user manifestly makes public special categories of data. But this will only be the case where the user has explicitly expressed their choice beforehand (e.g. through individual settings selected with full knowledge of the facts) to make their data publicly accessible to an unlimited number of people or, in the absence of such settings, with their explicit consent.
4. Contractual necessity – collecting off-network data and linking it to users’ accounts for subsequent use is only necessary for the performance of the contract with those users if that processing is objectively indispensable for achieving a purpose forming an integral part of the contractual service intended for those users. In other words, the main object of the contract must not be achievable in the absence of that processing. Personalisation of content might be useful, but in this case the Court considered that it did not appear necessary in order to offer the social network services in question.
5. Legitimate interests – recital 47 of the GDPR recognises that the processing of personal data for direct marketing may be carried out for a controller’s legitimate interests. However, those interests have to be balanced against the interests and fundamental rights of users, and the processing will not be lawful where those rights prevail. In that balancing exercise, particular attention must be paid where the data subject is a child, because recital 38 recognises that children merit specific protection – particularly when marketing to them, creating user profiles of them or offering services aimed directly at them. In this case the balance tipped in favour of the users given:
a. their reasonable expectations – although the social network is free of charge, users would not reasonably expect the platform to process their personal data without their consent for the purposes of personalised advertising;
b. the scale of the processing – it is particularly extensive as it relates to potentially unlimited data;
c. the impact on them – it has a significant impact on users given that a large part (if not almost all) of their online activities are monitored by the platform “which may give rise to the feeling that his or her private life is being continuously monitored”.
6. Consent – being in a dominant position does not automatically invalidate consent. It is, however, an important factor in assessing its validity, particularly as it is liable to affect users’ freedom of choice and to create a manifest imbalance between them and the platform. Users must be able to refuse individual data processing operations which are not necessary for the performance of the contract without being forced to stop using the social network altogether; if necessary, they should be offered an equivalent alternative, such as a paid-for version, which does not involve those processing operations. In respect of off-network data, given the expectations, scale and impact of the processing on users, separate consent is required.
Comment
Many of the issues at the heart of this decision will already be familiar to EU regulators such as the Irish Data Protection Commission (‘IDPC’). Earlier this year, the IDPC concluded two inquiries into the lawful basis for behavioural advertising. As the IDPC explains on its blog, it initially took the view that “personalised services that also feature personalised advertising” was a “reality … central to the bargain struck between users and their chosen service provider, and forms part of the contract concluded at the point at which users accept the Terms of Service.” However, after other regulators disagreed during the consultation process, the EDPB intervened. It determined, as a matter of principle, that the platform was not entitled to rely on contractual necessity as the legal basis for its processing of personal data for the purpose of behavioural advertising.
On the separate issue of so-called ‘forced consent’ (i.e. access to services being conditional on user acceptance of the terms), the outcome was ultimately a decision by the IDPC that: “the legal basis for processing of personal data under the Terms of Service […] does not, as a matter of law, have to be consent under Article 6(1)(a) GDPR […] ” (see para 3.26). That outcome is now clearly at odds with the CJEU’s decision. It is now hard to see how the Court could consider any lawful basis other than consent appropriate when it comes to behavioural advertising by platforms.
No surprise, then, that Norway’s Datatilsynet, perhaps buoyed by the CJEU’s decision, has acted on its previously expressed concerns by issuing a temporary three-month ban (running from August until October) on the platform’s processing of personal data of data subjects in Norway for behavioural advertising where the platform relies on contractual necessity or legitimate interests. Max Schrems’ noyb has described this as “an attempt to bypass the Irish DPC, which wasn’t enforcing its own decision … 5 years after noyb’s complaints”. Whether any of the other regulators that disagreed during the IDPC’s consultation process will follow suit remains to be seen. The Datatilsynet has also indicated that it may take the matter to the EDPB after the summer in order to seek an extension of the ban, and that the platform can meanwhile challenge the regulator’s decision in the local courts.
Another contentious issue touched on in the decision is the CJEU’s apparent endorsement of ‘pay or okay’ models, with non-consenting users “to be offered, if necessary for an appropriate fee, an equivalent alternative not accompanied by such data processing operations”. In other words, users should be able to choose between having their data used and buying a subscription. However, the appropriateness of any such fee is a vexed question, and one which continues to exercise noyb, whose complaint is as follows: “An increasing amount of websites asks their users to either agree to data being passed on to hundreds of tracking companies (which generates a few cents of revenue for the website) or take out a subscription (for up to € 80 per year). Can consent be considered “freely given” if the alternative is to pay 10, 20 or 100 times the market price of your data to keep it to yourself?” In a recent blog, Schrems suggests that this issue may ultimately need to go back up to the EU’s highest court.
So, without even touching on other issues such as the lawful basis for sharing data with law enforcement, there is still much to unpack in this decision as the case returns to the referring court.