
Freshfields TQ

Technology quotient - the ability of an individual, team or organization to harness the power of technology


Digital Fairness Fitness Check Part 3: Personalisation - why is it considered problematic and what countermeasures may companies be facing?

Our blog series focuses on the European Commission’s Digital Fairness Fitness Check and a potential future Digital Fairness Act which may be on the horizon. In the second part of our series, we took a closer look at the results of the Digital Fairness Fitness Check concerning ‘Addictive Design and Gaming’. 

Another practice which was assessed by the Digital Fairness Fitness Check and which may be addressed by future EU legislation is ‘Personalisation’. This is good reason to unpack what is meant by ‘Personalisation’, why it is considered problematic in some cases, what the results of the Fitness Check are, and which countermeasures from the EU legislature companies may be facing soon.

What is Personalisation?

The Fitness Check report understands ‘Personalisation’ as customising online experiences based on consumer data, which includes practices such as targeted advertising, rankings and recommendations adjusted to individual preferences and behaviour, as well as customer-specific pricing. 

As a starting point, the European Commission (EC) expressly recognises that personalisation can benefit consumers by offering product suggestions and discounts suited to their interests. The EC notes, however, that some practices may also raise concerns about data privacy, the potential exploitation of consumer vulnerabilities, and a lack of transparency about how user data is used. 

For example, some personalisation practices have been found to hide how data is collected, used and shared. Additionally, personalisation may limit consumers' choices by showing them a narrow selection of products at tailored prices, rather than presenting the most competitive options.

What are the main results of the Digital Fairness Fitness Check?

The Digital Fairness Fitness Check revealed disagreement between stakeholders as to whether personalisation benefitted consumers overall and whether the current legal framework was sufficient to address potential issues.

As we have discussed in our blog on Dark Patterns, the evidence gathered as part of the Digital Fairness Fitness Check may sometimes not be sufficiently robust to draw definite conclusions. This particularly applies to the assessment of ‘Personalisation’: 

While the report concludes that personalised advertising, ranking, and recommendations are widely used, the description of issues with these practices remains anecdotal in many aspects. With regard to the specific practice of personalised pricing, the report finds that ‘evidence is still emerging’ and that the EC’s 2018 and 2022 studies did not find consistent and systematic evidence of personalised pricing. 

The report notes, however, that many consumers are concerned about certain personalisation practices. According to a 2023 survey cited by the EC, 70% of respondents expressed concern about the extent of their data’s use for personalisation, a notable increase from previous years. Additionally, 74% believed their data was misused to personalise offers, with many reporting to be unaware of the extent to which their online activity affected content and pricing recommendations.

The consumer survey carried out specifically for the Digital Fairness Fitness Check yielded comparable results. It found that 41% of consumers considered it difficult to understand how their personal data was used, while 37% had the impression that companies had knowledge about their vulnerabilities and used it for commercial purposes. Transparency was also an issue for consumers: 34% of survey participants stated that they had no option to opt out of personalisation, and a similar proportion found it challenging to understand or adjust their data preferences.

Existing legal framework and perceived shortcomings

The EC’s report correctly notes that a variety of personalisation practices are already covered by existing legislation such as the General Data Protection Regulation (GDPR), the ePrivacy Directive, the Digital Service Act (DSA), the Digital Markets Act (DMA), the AI Act, the Unfair Commercial Practice Directive (UCP Directive), the Consumer Rights Directive (CRD), or the Audiovisual and Media Services Directive (AVMSD). 

  • The GDPR, for instance, contains the principles of fairness and transparency, which may be breached by manipulative or opaque personalisation practices. Moreover, the GDPR restricts certain forms of automated decision-making and profiling absent the data subject’s explicit consent. 
  • The DSA strengthens the required level of transparency towards consumers and prohibits presenting personalised advertising based on profiling that uses special categories of personal data under the GDPR, as well as profiling-based advertising directed at minors. Additionally, the DSA obliges platforms to provide increased transparency in their T&Cs by including information on the main parameters on which their recommender system is based. This information obligation includes transparency on the options consumers have to modify or influence those parameters. 
  • The CRD requires traders to disclose the presence of personalised pricing, although without a detailed explanation of how the personalisation is applied specifically.
  • The AI Act also prohibits certain techniques in personalisation that exploit consumers' vulnerabilities but does not cover all forms of personalisation.

The Fitness Check report finds, however, that “[d]espite the concerns that consumers expressed about profiling and data use, B2C personalisation practices are not per se unfair or illegal” as far as the existing legal framework is complied with. In light of the existing legal framework, this finding is indeed not surprising. Moreover, given the lack of evidence that personalisation is detrimental to consumers per se, the finding that personalisation is not illegal per se also does not seem concerning. 

Still, the EC’s report concludes that “in its current form, EU consumer law cannot be considered sufficiently effective or clear in addressing the multifaceted concerns regarding commercial personalisation.” It concedes, however, that an effective response “would require further assessment under both consumer protection and data protection frameworks.”

Taking action: A potential pool of countermeasures against unfair personalisation practices

Given the inconclusive outcome of the report, it is not clear which additional specific steps will be considered by the EC regarding ‘Personalisation’. As a starting point, the report describes a number of potential measures which were proposed by different stakeholders:

  • Since certain obligations – such as the prohibition of targeted advertising towards minors or based on sensitive data under the DSA – do not apply universally, some stakeholders called for applying similar obligations to all traders;
  • An amendment of the UCP Directive’s blacklist of practices which are illegal per se, to include practices using psychographic profiling or similar techniques that create pressure and exploit personal vulnerabilities, was also discussed;
  • Moreover, a more explicit option to receive non-personalised commercial offers instead of personalised ones was proposed;
  • The suggestions on personalised pricing were particularly extensive. While BEUC, an umbrella organisation for European consumer protection associations, called for a prohibition of personalised pricing based on behavioural predictions (with narrow exceptions), a 2022 EP study suggested measures such as the prohibition of personalised price increases (while allowing personalised discounts), prohibitions related to certain industries or certain criteria (beyond anti-discrimination law), as well as reinforced transparency obligations in cases of permissible personalised pricing.

It remains to be seen which (if any) of these measures can prove feasible in practice and which of them (if any) may become part of a potential Digital Fairness Act. However, it is likely that personalised commercial practices will continue to gain relevance in the digital sphere, not least because they are becoming cheaper and easier to implement. This will likely lead to calls for additional regulation, at least regarding certain boundaries and transparency obligations. It should also be remembered that the DSA, for example, is up for review in 2025, and there may be opportunities to amend the existing rulebook to tackle some aspects of personalisation as raised in the Fitness Check report. This also comes at a time when enforcement of the already existing rules is very high on the agenda of the new Commission, and industry is increasingly advocating for heightened attention to simplification and the reduction of regulatory burdens and overlaps. In any event, stakeholders are well-advised to closely monitor the developments and align their own business practices with existing legislation and possible reforms. 

From ‘Personalisation’ to ‘Social media commerce and influencer marketing’: Stay tuned for the next instalment of our blog series!

Tags

eu digital fairness series, eu digital strategy