The EU Directive on improving working conditions in platform work, Directive (EU) 2024/2831 (Directive), was published in the Official Journal of the EU on 11 November 2024, and entered into force on 1 December 2024.
In our previous blog post, we explored the key aspects of the Directive, such as the presumption of employment, collective bargaining, and algorithmic monitoring and transparency, as well as their implications. Below, we offer some further practical insights on these points. In this blog post, we also analyse the potential spill-over effects of the Directive for employees and employers, and how its implementation will affect and interrelate with the rules set out in the EU AI Act.
Key aspects of the Directive
Determination of employment status
The Directive introduces a presumption of employment, according to which the contractual relationship of a person performing platform work is presumed to be an employment relationship. While the initial draft of the Directive and the agreement reached between the EU institutions in December 2023 included a list of criteria that would indicate control (and, in turn, an employment relationship), the adopted Directive leaves it to the Member States to determine whether or not an employment relationship exists. The Directive is overall a step forward because it harmonises certain aspects of platform work, but when it comes to the employment status of individuals performing platform work, it is a step back compared to the initial proposal: it still leaves a fragmented landscape for digital platforms, which will have to keep track of the different classification criteria that apply across Member States.
Moreover, the presumption of employment is not good news for platforms, as it reverses the burden of proof: platforms must now provide evidence to support their argument that their workers are self-employed.
Algorithmic management and transparency
The Directive establishes information requirements regarding the use of automated monitoring and decision-making systems. It also requires human oversight of automated decisions, to avoid discrimination against workers resulting from factors that the algorithm could not take into account, such as family circumstances, age or religion. Case law in recent years has illustrated this risk, as outlined in our previous blog post. This requirement is consistent with the approach taken in other pieces of EU legislation, such as the AI Act and the GDPR; you can read more about this aspect of the Directive in our previous blog post. It is also worth noting that complying with the Directive is likely to be particularly challenging for small platforms that lack the resources to fulfil such requirements. Furthermore, complying with all transparency requirements is likely to be challenging for all platforms, given that their business models rely on algorithms they have developed themselves and disclosing certain details of such systems might reveal trade secrets.
In addition, platform workers have the right to an explanation of decisions made by algorithmic systems and the right to request that the platform review the decision. Moreover, the Directive protects the persons charged by platforms with overseeing automated monitoring systems and automated decision-making systems against adverse treatment and dismissal. While this is good news for the persons who will hold this role, it requires companies to consider how to address breaches by such protected persons.
Involvement of workers’ representatives
The Directive requires the Member States to promote the role of the social partners and to encourage collective bargaining rights for platform workers. This is in line with the right of solo self-employed workers to bargain collectively without being considered in breach of competition law, as recognised in the European Commission's guidelines published in September 2022. Even though the latter are non-binding, the Commission is counting on the cooperation of national competition authorities.
With respect to the right to bargain collectively, the current landscape is quite fragmented. In France, for instance, representatives of platform workers, unions and courier associations have entered into a number of collective agreements with the platforms. Germany grants limited collective bargaining rights to non-employees, whereas in Belgium collective bargaining is restricted to employers and employees. In Spain, the rights of works councils in relation to automated decisions have been strengthened with the adoption of the Rider Law. Going forward, platforms should monitor to what extent the Directive and the guidelines will change these national systems and enhance collective bargaining. It remains to be seen to what extent labour unions will influence the legislative process and what impact they will have.
Catch-all clause
In cases where there is a contractual relationship between the person performing platform work and an intermediary (and not the platform itself), the Member States are to establish appropriate measures to ensure that the individuals concerned enjoy the same protection under the Directive as those who are in a direct contractual relationship with a platform. Joint and several liability may also be considered here.
Platform Workers Directive vis-à-vis the AI Act
Although the AI Act is directly applicable and does not require transposition into national law, it allows national legislators to create more favourable provisions for the protection of employees' rights.
When platform workers are classified as employees, the above-mentioned possibility will play an important role in the transposition of the Directive into national law, as some of its requirements go beyond the requirements of the AI Act. For example, the Directive contains specific prohibitions and restrictions on the processing of personal data by algorithmic management and decision-making systems that are stricter than the so-called "prohibited AI practices" under the AI Act.
This is noteworthy because it results in platform workers being afforded more protection than (other) employees under the AI Act. In practice, it will be necessary to keep an eye on whether this Directive will have a spill-over effect, and whether Member States will implement these rules only for platform workers or for employees generally.
Spill-over effect of the Directive
It is likely that certain provisions of the Directive will be incorporated by Member States into general labour law, thus changing the applicable rules for all employers. A few examples:
- Legal presumption of employment applicable in other cases: This procedure could also be introduced for non-platform contractual relationships, avoiding uncertainty about the applicability of the Directive and providing protection for employees overall. In Germany, for example, there is already a procedure for determining employee status for social security purposes (in a legally non-binding form). It will be interesting to see in which (other) areas this procedure will be applied and what criteria will be used for classification. The Directive provides that the legal presumption applies in all relevant administrative or judicial proceedings in which the determination of the correct employment status of a person performing platform work is at issue, i.e. in particular in employment law proceedings.
- Joint and several liability: Where the contractual relationship involves an intermediary, there is a risk that joint and several liability for certain obligations will also be created for employers who are not themselves platforms.
- Data protection: It is possible that Member States will extend the scope of transparency obligations regarding the use of automated monitoring systems or automated decision-making systems to other categories of workers.
What’s next?
Platforms, and more generally all employers, should closely monitor the implementation of the Directive into national laws. This will involve observing any additional requirements imposed by Member States, which may extend beyond the AI Act's provisions. Employers and platforms must be prepared to adjust their practices to comply with the new rules, particularly in terms of data protection and transparency, joint and several liability, and the legal presumption of employment. As mentioned above, the role of trade unions and collective bargaining, where platform workers are classified as solo self-employed workers, should also be considered. Staying informed about these changes will be crucial for employers to ensure compliance.