European Data Protection Board’s Opinion on Pay-Or-Consent Model for Social Media and How the EDPB Undermines the CJEU’s Meta Judgment

On April 17, the European Data Protection Board (EDPB) issued an opinion on social media companies’ recent practice of offering consumers the option to pay their way out of seeing ads. To understand what this opinion says and why it matters, it is important to think back to the 2023 CJEU Meta judgment, which I have covered in this blog (see here). I will then summarise the EDPB’s main points and explain why I believe the Board is on the wrong track.

Update 1: The AG Opinion in C-446/21 Schrems: GDPR precludes the processing of personal data for the purposes of targeted advertising without restriction as to time or type of data. See here.

Update 2: The Commission has now opened proceedings on the legality of Meta’s pay-or-consent model under DMA. See here.

What Did the Meta Judgment Say?

The Meta case interprets Article 6 GDPR on the lawfulness of processing. The success of the social media companies’ advertising model – their main source of revenue – depends on a constant stream of users and the ability to keep them glued to the screen so that the right ads are served to the right audience. In order for that to happen, social media companies gather data: not only data that users directly give them access to through consent, but everything they can get their hands on, such as cookies left on PCs, data from other platforms (cf. DMA Article 5(2)(b)), etc. It is these practices that have been the subject of a number of national enforcement actions and ultimately resulted in the Meta judgment (for a detailed overview of the (il)legality of surveillance advertising in the EU see L. Zard’s article here). The main contribution of the case is therefore in clarifying when platforms can lawfully process data for targeted/behavioural advertising in cases where specific consent has not been given. The legal bases typically relied on in such cases are consent, legitimate interest and contractual performance. It is in the interpretation of the lawfulness of the last two that we find the case’s main contribution.

In relation to contractual performance, the Court’s key finding comes in paragraph 101: “personalised content does not appear to be necessary in order to offer that user the services of the online social network.” In other words, personalised content might be useful, but it is not part of the contractual relationship with the user, and contractual necessity therefore cannot serve as a valid legal basis for it.

In relation to legitimate interest, the key finding revolves around whether personal data is processed in circumstances where data subjects “do not reasonably expect such processing”. While “the processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest” as per GDPR Recital 47, this always has to be balanced against users’ fundamental rights. This cannot be the case where “data subjects do not reasonably expect such processing” (para 112). In other words, Facebook’s potentially legitimate interest under Recital 47 is trumped by the innocent and potentially deceived user’s fundamental right to privacy. The corollary is that a properly informed user may be subject to legitimate interest processing.

Crucially, the Court says:

117 (…) despite the fact that the services of an online social network such as Facebook are free of charge, the user of that network cannot reasonably expect that the operator of the social network will process that user’s personal data, without his or her consent, for the purposes of personalised advertising.

and further down:

150 (…) users must be free to refuse individually, in the context of the contractual process, to give their consent to particular data processing operations not necessary for the performance of the contract, without being obliged to refrain entirely from using the service offered by the online social network operator, which means that those users are to be offered, if necessary for an appropriate fee, an equivalent alternative not accompanied by such data processing operations.

The CJEU clearly requires that clear consent be the main basis for such processing and that the alternative cannot be the withdrawal of access to the service. In other words, a consumer cannot be forced to give up the service because he or she refuses to be subjected to unnecessary data processing. The Court’s suggestion that an “appropriate fee” might be the way out of this difficulty does not indicate that a payment model releases Meta from its GDPR obligations, but that Meta should explore different revenue models. Meta, in other words, still needs to be GDPR-compliant towards its non-paying customers.

Meta’s response has, indeed, been to introduce a pay-or-consent model, allowing users who do not wish to see ads to subscribe to an ad-free plan. The basic effect of the payment, though, is not the cessation of data collection for paying customers but merely the absence of ads on their screens. In other words, Meta continues to gather private information about both categories of users, using it for behavioural advertising in the case of non-paying users and for other purposes in the case of paying ones.

The Board was asked to opine on valid consent in light of the Meta judgment.

EDPB’s April Opinion

The Opinion is issued pursuant to Article 64(2) GDPR. The Board’s opinions are binding in the sense that national supervisory authorities must take “utmost account” of them in national proceedings. While this is true, a “case-by-case assessment of the [GDPR] criteria” remains necessary and the Opinion is only a “framework for controllers and SAs to assess the validity of consent in ‘consent or pay’ models” (paragraph 33).

The Board draws attention to a number of well-known principles of GDPR. In relation to purpose-binding: “Even if processing is consent-based, it does not justify collecting personal data beyond what is necessary for the specified purpose or in a manner that is unfair to the data subjects.” In relation to purpose limitation and data minimisation: could the purposes have been achieved by less intrusive means? In relation to data protection by design and default (Article 25): are the protective features built into the product?

A surprising and controversial statement is found in paragraph 130, where the Board says that “personal data cannot be considered as a tradeable commodity”. This statement makes little sense and runs contrary to EU law. Directive 2019/770 on digital content explicitly allows payment with data in Article 3. Furthermore, if personal data cannot be considered tradeable, why spend 40 pages explaining which trading models are GDPR-compliant? Since the GDPR does not explicitly cover this, and since other (civil) laws do, the only correct conclusion must be that personal data is a tradeable commodity insofar as such trade does not contravene the GDPR.

The key part comes in Section 4.2.1.1 on “a free alternative without behavioural advertising”. The Board opines that offering only a paid alternative is not enough for consent to the underlying data collection to be valid. A free alternative with a “different form of advertising involving the processing of less (or no) personal data, e.g. contextual or general advertising or advertising based on topics the data subject selected from a list of topics of interests” could also be in play. The Board’s key insight (cf. paragraph 77) is that companies are pushing for behavioural instead of alternative forms of advertising and that this coercive behaviour is what makes their practices unlawful. In other words, an impression is created that behavioural advertising is the only possible way of generating revenue.

There are two main contributions that the Board makes to the understanding of valid consent in ‘pay-or-consent’ models, and these explain the national authorities’ role in monitoring social platforms’ compliance. One relates to fee monitoring and the other to the requirements of valid consent.

Appropriate Fee

The main contribution on fee monitoring comes in paragraph 132:

[C]ontrollers should assess, on a case-by-case basis, both whether a fee is appropriate at all and what amount is appropriate in the given circumstances, bearing in mind the requirements of valid consent under the GDPR as well as the need of preventing the fundamental right to data protection from being transformed into a feature that data subjects have to pay to enjoy, or a premium feature reserved for the wealthy or the well-off.

It is here that the problems arise. The Board seems to believe that national data authorities should have the ultimate say on whether the payment is fair or not:

137 While it is for controllers to set the amount of a fee in itself, if supervisory authorities find that consent is not freely given or that the accountability principle has not been complied with, they can intervene and impose corrective measures. In this respect, they are competent to review or evaluate the assessment of appropriateness carried out by controllers.

This cannot be right. National authorities cannot possibly assess the fairness of a fee in the context of an unregulated free-market activity outside of an antitrust context. At best, they can assess whether the manner in which the fee is presented or administered affects the way in which data subjects give their consent or refuse to do so. It may very well be true that a deceitful presentation of a payment option (such as “only payment would ensure that your data is protected”) is unlawful. It may also be true that social media companies are deliberately ambiguous about whether payment changes the conditions under which data is collected. But none of this removes the main problem, which has been at the centre of this discussion from the very start and which the Board’s model does not address: Meta gathers too much data and would continue to do so under any payment model unless it is kept in check.

The Board itself pays lip service to the free market model in paragraph 136:

Businesses are free to set their own prices and choose how their revenue models are structured, but this right should be balanced with the fundamental right for individuals to protection of their personal data.

Some rudimentary and confusing ideas about how this assessment should proceed are then presented in an abbreviated form in paragraphs 137 and 138. The Board seems to be saying that national authorities should only react where they believe that fees have an adverse impact on consent. While the assessment of consent validity cannot be outsourced, authorities from other fields, such as consumer or competition authorities, may be consulted. It is not clear what these are supposed to contribute. If the matter falls within the competence of those authorities, why consult data protection authorities on violations outside the latter’s competence? If the matter lies with the data protection authorities, as the Board believes in paragraph 134, why involve competition authorities?

The guidelines for the assessment of fee fairness are lacking. “Fairness” essentially means “equal treatment”. This obscures the main problem, which is not whether different categories of users are treated differently depending on whether they can afford to pay, but whether Meta can collect more data than it needs and is entitled to.

Valid Consent

The Board makes a number of relevant observations on consent. The assessment of what constitutes valid consent is a matter for data protection law, not for business models or opinion. The Board also emphasises that payment cannot deter users from freely giving or withholding consent. Users cannot be threatened with the withdrawal of services if payment is not made, and the choice of whether or not to pay must be genuine. These are valid observations. Furthermore, the Board’s observations on “granularity” are also important:

When presented with a ‘consent or pay’ model, the data subject should be free to choose the individual purpose(s) they accept, rather than having to consent to a bundle of processing purposes.

Not only is this one of the main points of the Meta judgment, but it is also supported by Recital 43 GDPR.

The remainder of the discussion on consent (informed consent, transparency, specific consent…) brings no surprises but must be seen as a reiteration of well-known, albeit important, GDPR points.

The key controversy here seems to be the Board’s view that “it will not be possible for large online platforms to comply with the requirements for valid consent if they confront users only with a binary choice between consenting to processing of personal data for behavioural advertising purposes and paying a fee” (paragraph 179). This seems to suggest that the validity of consent depends on whether the choice is binary or not. It does not. Meta can be fully GDPR-compliant and still maintain a binary model. Moreover, this creates the false impression that the mere presence of different consent and payment models (only one of which is full payment) in itself creates compliance. It does not.

Concluding Remarks

The Meta judgment introduced some valuable points about how social media companies can obtain valid consent. Unfortunately, the Board has managed to get almost everything else wrong.

First, in concentrating on how the fees should be monitored, the Board misses the main point, which is clearly stated in the Meta judgment: no matter what revenue model a company relies on, it has to be GDPR-compliant. It is furthermore not the national authorities’ task to conduct a fairness assessment. This is not just because they are not suited for the job (we do not even need a law & economics specialist to explain why this would be inefficient) but because such an assessment is neither needed nor called for by the GDPR. What is necessary is to indicate to Meta which of its data collection practices are unlawful under whichever payment model it chooses.

Second, the Board’s idea that personal data is not tradeable is plain nonsense. While its insight that the fundamental right to data protection cannot be transformed into a feature that data subjects have to pay to enjoy is welcome (albeit obvious), this is not the same as saying that fully informed adult data subjects cannot trade their personal information. Not only do they already do so, but the Board itself gives plenty of guidance in the Opinion on how to do it better.

Third, the “equivalent alternative” to payment line of reasoning may very well be sound and desirable, but it is not based in the GDPR, which remains silent on this topic. To insist, furthermore, that valid consent depends on such an alternative being available goes beyond creative interpretation. The GDPR gives no basis for such reasoning.

Fourth, it is already being discussed whether the EDPB may have overstepped its Article 64(2) GDPR competences. It may very well have. The Board’s opinions do have binding force on national authorities, the confusing way in which this one is formulated makes matters worse, and its introduction of the “equivalent alternative” theory has no basis in the GDPR.

On a more positive note, the Board’s main conclusions about pay-or-consent models are useful and are what one would expect:

  • consent in ‘pay-or-consent’ models must be valid, and the GDPR contains all that we need to know about what constitutes valid consent
  • obtaining consent does not absolve large online platforms from complying with the other rules and principles provided by the GDPR

That Meta and other social media companies have created business models that are both harmful and addictive is beyond debate. That Meta and others subvert data protection mechanisms (on scam adverts see here and here) for financial gain is also true. I have personally sent dozens of requests for the removal of scam ads based on fake news. In every single case Facebook refused to remove them, even though there was no doubt about the nature of the ads. You cannot be a company with 135 billion in revenue and claim that you have difficulties removing fraudulent ads, but hopefully DSA enforcement will soon be able to help. What is at stake, however, is the role the GDPR can play in solving some of these problems and the extent to which GDPR interpretation, as opposed to drafting new laws, is the right answer. The Meta judgment charted a way forward in saying that clear consent is the only way of lawfully processing such multitudes of data. The Board’s confused theories on fee monitoring, illegal data trading, and multiple payment and consent models will only make matters worse. A review of the GDPR may very well be in order.
