When algorithms decide – New EU data rules for digital labour platforms

After intense debates among EU legislators, the EU Parliament formally adopted the ‘Directive on improving working conditions in platform work’ (the Directive) on 24 April 2024. The Directive aims to regulate ‘platform work’, where an online platform serves as an intermediary between workers and paying clients. Platform work is a rapidly increasing segment of the EU economy, employing more than 28 million people in 2021 and projected to reach 43 million by 2025. 

We gave an overview of the Directive, and its general obligations, in a previous blog post. In this blog, we focus specifically on the Directive’s data-related rules concerning transparency, human oversight, safety and accountability for organisations using algorithms to manage labour on so-called ‘digital labour platforms’. We also explain the close connections between the Directive and other key EU data and tech laws, such as the EU’s General Data Protection Regulation (GDPR) and the (soon to be in force) AI Act.

Digital labour platforms regulated by the Directive

In various sectors, digital labour platforms have reshaped how work is organised, and automated systems often (partly) decide key matters such as working hours, promotions and terminations. The Directive aims to establish harmonised rules across the EU on the use of such automated systems.

Digital labour platforms regulated by the Directive are services which meet all of the following criteria:

  • they are provided, at least partly, at a distance through electronic means (eg a website or mobile app);
  • they are provided at the request of a recipient of the service;
  • they involve, as a necessary and essential component, the organisation of work performed by individuals in return for payment (irrespective of whether the work is performed online or in a certain location); and
  • they involve the use of automated monitoring or decision-making systems.

The Directive therefore targets platforms which organise ‘gig work’ (such as driving or food delivery services) as well as those which organise ‘crowdwork’, such as programming or design services. Platforms that primarily exploit or share assets (eg scooters, bicycles and short-term rental of accommodation), or that enable individuals who are not professionals to resell goods, are out of the Directive’s scope. Whether a particular platform is subject to the Directive will require a case-by-case analysis.

Framework for algorithmic management

The Directive establishes a comprehensive framework for regulating algorithmic management by digital labour platforms, a framework that is closely connected to other data- and tech-related legislation. This applies in particular to the AI Act, which refers to the Directive and expressly aims to strengthen the effectiveness of the rights and remedies granted under it.

Furthermore, the Directive’s rules relating to personal data are intended to be lex specialis to the GDPR (ie more specialist rules that override the more general rules in the GDPR in the case of conflict) by providing more specific safeguards for platform work. 

The most important takeaways from the Directive’s provisions on algorithmic management, and from their overlap with the GDPR and the AI Act, are:

  • Limitations on processing certain data categories: The processing of certain categories of personal data by means of automated monitoring or decision-making systems is prohibited – including the processing of: (1) biometric data; (2) data collected while the worker is not performing services; and (3) data used to infer racial or ethnic origin, migration status, political opinions, religious or philosophical beliefs, and sexuality. The Directive’s catalogue of prohibited data categories was expanded considerably in the course of the legislative process. The approach also differs from that of the GDPR, which grants specific protections to a smaller list of data categories and does not ultimately prohibit their processing if certain criteria are met.
  • Processing based on consent: According to a Recital of the Directive, platforms ‘should not’ process the personal data of platform workers on the basis of consent. The rationale is that platform workers are assumed to be unable to give consent freely, given the potential imbalance of power between the worker and the digital labour platform. The Directive thereby reflects the position taken by EU data protection authorities in their guidelines on consent as a lawful basis. However, it is important to note that this does not completely exclude consent as a lawful basis for processing in the employment relationship.
  • Transparency: The Directive aims to ensure transparency regarding matters such as the aim of monitoring systems or the main parameters of decision-making systems (eg how the behaviour of a worker influences decisions). Platforms must therefore inform workers, their representatives and, upon request, certain authorities about the use of automated systems. 
  • New portability rights: Workers will have a specific right to portability of their personal data generated through their performance of work in the context of a digital labour platform’s automated systems (eg ratings and reviews). The Directive’s right of portability is in addition to the right to data portability under the GDPR.
  • Right to explanation and human review: The Directive grants workers the right to obtain an explanation for any decisions taken or supported by an automated decision-making system. The platform must provide access to a contact person with whom workers can discuss and clarify facts, circumstances and reasons which led to a specific decision.
  • Impact assessments: The Directive highlights that automated monitoring and decision-making systems used by digital labour platforms are likely to result in high risks for platform workers and that such platforms must generally carry out data protection impact assessments (DPIAs) in line with the GDPR. Platforms are required to seek the views of workers and potentially their representatives in connection with DPIAs. In addition, platforms must perform periodic assessments of the impact of decisions taken or supported by such automated systems – for example on the safety and health of workers.
  • Human oversight: Platforms must ensure sufficient human oversight of automated systems. In particular, the Directive requires that any decision that restricts, suspends or terminates the contractual relationship or the account of a worker, or any other decision of equivalent detriment, must be taken by a human being (an illustrative sketch of such an escalation gate follows this list).
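To make the human-oversight requirement more concrete, the following is a minimal sketch, in Python, of how a platform’s decision pipeline might route detrimental automated recommendations to a human reviewer rather than executing them automatically. It is illustrative only: the class names, decision types and escalation logic are assumptions made for this example and are not taken from the Directive’s text.

```python
# Illustrative sketch only: one way a decision pipeline could escalate detrimental
# automated recommendations to a human reviewer, reflecting the Directive's rule that
# suspensions, terminations and decisions of equivalent detriment must be taken by a
# human being. All names and types below are hypothetical assumptions.
from dataclasses import dataclass
from enum import Enum, auto


class DecisionType(Enum):
    TASK_ASSIGNMENT = auto()
    PAY_ADJUSTMENT = auto()
    ACCOUNT_SUSPENSION = auto()
    ACCOUNT_TERMINATION = auto()


# Decision types treated as 'detrimental' and therefore reserved for a human decision-maker
DETRIMENTAL_DECISIONS = {DecisionType.ACCOUNT_SUSPENSION, DecisionType.ACCOUNT_TERMINATION}


@dataclass
class AutomatedRecommendation:
    worker_id: str
    decision_type: DecisionType
    rationale: str  # recorded so the decision can later be explained to the worker


def process_recommendation(rec: AutomatedRecommendation) -> str:
    """Execute routine recommendations automatically; escalate detrimental ones to a human."""
    if rec.decision_type in DETRIMENTAL_DECISIONS:
        # Never auto-execute: queue for a named human reviewer together with the rationale
        return f"ESCALATED to human review: {rec.decision_type.name} for {rec.worker_id}"
    return f"AUTO-EXECUTED: {rec.decision_type.name} for {rec.worker_id}"


if __name__ == "__main__":
    print(process_recommendation(AutomatedRecommendation(
        "worker-42", DecisionType.ACCOUNT_SUSPENSION,
        "repeated late deliveries flagged by the monitoring system")))
```

In practice, the escalation step would also need to preserve the facts, circumstances and reasons behind the recommendation, so that the contact person required by the Directive can discuss and explain the decision to the affected worker.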

Private and public enforcement mechanisms

The Directive contains various private enforcement mechanisms. Workers will have rights to redress, including compensation for damages in certain instances. The Directive also includes measures to protect individuals from adverse treatment or consequences for enforcing compliance with the Directive. Workers may not be dismissed, have their employment agreements terminated or be made subject to any equivalent action on the grounds that they have exercised their rights under the Directive. Digital labour platforms must also refrain from accessing or monitoring certain communication channels for workers.

In addition, the Directive establishes public enforcement mechanisms. Courts and authorities will be able to gain access to relevant evidence from digital labour platforms. As far as data protection matters are concerned, data protection authorities are responsible for monitoring and enforcing the framework for algorithmic management. In this respect, the procedural framework of the GDPR will apply, carrying the risk of administrative fines of up to €20m or, in the case of an undertaking, up to 4% of global annual turnover.

Next steps for affected companies

The Directive is expected to be formally adopted by the Council shortly and published in the Official Journal of the EU.

EU Member States will then have two years to transpose the provisions of the Directive into their national law. Companies should therefore begin examining whether they fall within the scope of the Directive, and thus within its stringent algorithmic management framework, and determine how to implement the relevant rules.

As most of the obligations under the new algorithmic management framework have interdependencies with obligations under the GDPR, impacted organisations should review and, where appropriate, begin to update their existing GDPR compliance mechanisms with the Directive in mind.

In light of potential overlaps (eg with HR-related AI applications), the Directive should also be borne in mind where relevant during implementation projects for the EU’s AI Act.

Tags

data, data protection, eu ai act, eu digital strategy, gdpr, innovation, platforms