
Robot Recruitment under German employment law – Personnel selection with the help of artificial intelligence

The importance of artificial intelligence (AI) in the employment relationship is on the rise. So-called robot recruitment tools are already frequently used in practice; they are intended to simplify and improve the resource-intensive processes of screening and selecting applicants.

The possible uses of AI in the recruitment process are diverse: searching for suitable applicants on the Internet, sorting and preparing application documents, checking candidates against the formal requirements of the position, evaluating publicly available information about applicants on (job-related) social networks and predicting suitability for a particular job. AI can create personality profiles (people analytics), and chatbots can conduct initial telephone interviews. Finally, AI can make autonomous decisions, selecting the most suitable applicant and generating the employment contract as well as automated rejections to the remaining applicants. However, the use of AI in the recruitment process entails legal challenges, particularly concerning the risk of discrimination and data protection breaches.
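
To make one of these uses concrete, here is a minimal, hypothetical Python sketch of an automated check against the formal requirements of a position. The data structure, field names and requirement values are invented for illustration only; real systems would be far more elaborate.

```python
# Hypothetical sketch: screen applicants against the formal requirements of a
# position. All field names and thresholds are invented examples.

REQUIREMENTS = {"degree": "nursing", "min_years_experience": 3}

def meets_formal_requirements(applicant: dict) -> bool:
    """Return True if the applicant fulfils the formal criteria of the post."""
    return (applicant.get("degree") == REQUIREMENTS["degree"]
            and applicant.get("years_experience", 0) >= REQUIREMENTS["min_years_experience"])

applicants = [
    {"name": "Applicant 1", "degree": "nursing", "years_experience": 5},
    {"name": "Applicant 2", "degree": "nursing", "years_experience": 1},
]
shortlist = [a["name"] for a in applicants if meets_formal_requirements(a)]
print(shortlist)  # ['Applicant 1']
```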

1. Risk of discrimination

The use of AI systems, although intended to make objective decisions, does not eliminate the risk of direct or indirect discrimination against applicants based on characteristics protected under the German General Equal Treatment Act (Allgemeines Gleichbehandlungsgesetz), such as ethnic origin, gender or age. Typical examples are:

  • An AI system may draw logical but discriminatory conclusions as part of its self-learning process due to flawed programming or an inadequate data basis, even if this was not the operator's intention (e.g., an algorithm trained predominantly with data from male applicants can discriminate against female applicants; see the sketch after this list).
  • Prejudiced mindsets of programmers or users can make the systems susceptible to discrimination.
  • In the case of automated language analysis with AI systems, persons with a speech disability or non-native speakers may not be appropriately analyzed by the system (due to differences, e.g., in speech rhythm or sentence structure).
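
The first example can be made more tangible with a small, hypothetical Python sketch: it compares the selection rates that a screening model produces across a protected characteristic such as gender. All data and field names are invented, and a rate gap is only a warning signal that the training data or features should be reviewed, not proof of discrimination under the Allgemeines Gleichbehandlungsgesetz.

```python
# Hypothetical monitoring sketch (not a legal compliance tool): compare the
# shortlisting rates of a screening model across a protected characteristic.
from collections import defaultdict

# Toy decisions produced by a hypothetical screening model.
# Each tuple: (applicant_id, gender, shortlisted_by_model)
model_decisions = [
    ("A1", "female", False), ("A2", "female", True),  ("A3", "female", False),
    ("A4", "male",   True),  ("A5", "male",   True),  ("A6", "male",   False),
    ("A7", "female", False), ("A8", "male",   True),
]

def selection_rates(decisions):
    """Return the share of shortlisted applicants per group."""
    totals, selected = defaultdict(int), defaultdict(int)
    for _, group, shortlisted in decisions:
        totals[group] += 1
        if shortlisted:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

rates = selection_rates(model_decisions)
print(rates)  # {'female': 0.25, 'male': 0.75}

# A large gap between groups does not prove unlawful discrimination, but it is
# a signal that the training data or model features should be reviewed.
ratio = min(rates.values()) / max(rates.values())
print(f"selection-rate ratio: {ratio:.2f}")
```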

2. Data protection

Following a decision by the ECJ (30 March 2023, C-34/21) on the partial invalidity of the employment-related data protection provision in the German Federal Data Protection Act (Sec. 26 BDSG), employers should instead base the justification of data processing in the application process on Art. 6 et seqq. GDPR, which set out the potential legal bases and requirements for the processing of personal data.

2.1 Analysis of social media profiles

Career-related social networks have become an important recruitment tool. The collection and use of personal data from publicly accessible career-related social media profiles for application purposes is considered permissible under German data protection law. According to the prevailing opinion, such data processing is also permissible if the evaluation is carried out not by a human but by an AI system.
In the case of leisure-oriented social networks, the prevailing opinion is that the collection and use of such personal data is not permissible under data protection law. This applies even if the profile owner has made the information publicly available. The fact that private content is increasingly being mixed with professional interests has not (yet) led to a change in this legal opinion.

2.2 People analytics

The use of AI as part of the personality analysis of applicants must be limited to what is necessary for the position, which means that the creation of comprehensive personality profiles is not allowed. Information on, for example, the qualifications and professional suitability of the applicant may also be processed by an AI system. However, the use of AI systems must not result in generally impermissible total surveillance. In addition, certain types of surveillance may be excluded altogether: for example, according to the press release of the European Commission on the AI Act (see also below), emotion recognition in the workplace has been identified as an unacceptable risk that violates fundamental rights and will therefore be banned.
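
The data-minimisation idea behind this limitation can be sketched in a few lines of Python. The list of permitted fields is an invented example, not legal guidance; the point is merely that only job-relevant attributes reach the AI-based analysis, while comprehensive profiling data is filtered out beforehand.

```python
# Hypothetical data-minimisation sketch: only fields necessary for assessing
# suitability for the specific position are passed to the analysis step.
# The field list is an invented example.

ALLOWED_FIELDS = {"qualifications", "work_experience", "language_skills"}

def minimise(applicant_record: dict) -> dict:
    """Keep only job-relevant attributes before any AI-based analysis."""
    return {k: v for k, v in applicant_record.items() if k in ALLOWED_FIELDS}

raw_record = {
    "qualifications": ["M.Sc. Computer Science"],
    "work_experience": "5 years software development",
    "language_skills": ["German", "English"],
    "social_media_sentiment": 0.7,   # comprehensive profiling: excluded
    "emotion_analysis": "confident", # emotion recognition: excluded
}
print(minimise(raw_record))
```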

2.3 Automated decisions

Pursuant to Art. 22 para. 1 GDPR, applicants or employees have the right not to be subject to a decision based solely on automated processing. An automated decision exists if the AI algorithm evaluates applicants according to a scoring system and the applicants are informed directly of the system's decision. It is, however, permitted for the AI system to prepare a decision, e.g. by transmitting the score values in the form of a ranking to a human decision-maker. In a recent decision on the SCHUFA score (a person's credit score from the German credit rating agency SCHUFA), the ECJ ruled that such scoring can constitute automated decision-making within the meaning of Art. 22 para. 1 GDPR if a third party relies on the credit score. The Hamburg Commissioner for Data Protection has taken the position that the ECJ's reasoning in the SCHUFA decision also applies to AI systems, especially if AI is used to pre-sort candidates' applications. According to the Commissioner, the human decision-maker needs expertise and sufficient time to be able to scrutinise the machine-generated preliminary decision. This not only increases the company's burden of proof with regard to the required expertise of the decision-maker in disputes over AI-based pre-screening of applicants; it also raises the question of whether the use of AI can still achieve the hoped-for (considerable) time and cost savings if a comprehensive review of the machine-generated preliminary decision is required.
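
The distinction between a prohibited automated decision and mere decision preparation can be illustrated with a brief, hypothetical Python sketch. The class and function names are assumptions made for illustration only; the point is simply that the AI system produces nothing more than a ranking, while the accept or reject decision is recorded by a named human reviewer with a documented rationale.

```python
# Hypothetical sketch of "decision preparation": the AI system only ranks
# candidates; the final decision is taken and recorded by a human reviewer.
from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    ai_score: float           # score produced by the screening model
    human_decision: str = ""  # filled in only by the human decision-maker

def prepare_ranking(applicants):
    """AI step: rank candidates by score, but do not decide anything."""
    return sorted(applicants, key=lambda a: a.ai_score, reverse=True)

def record_human_decision(applicant, reviewer, decision, rationale):
    """Human step: the reviewer, not the model, takes the final decision."""
    applicant.human_decision = f"{decision} by {reviewer}: {rationale}"
    return applicant

ranked = prepare_ranking([
    Applicant("Candidate A", 0.91),
    Applicant("Candidate B", 0.78),
])
# An automatic rejection issued at this point, without human involvement,
# would be a decision based solely on automated processing (Art. 22 GDPR).
record_human_decision(ranked[0], reviewer="HR manager",
                      decision="invite to interview",
                      rationale="score checked against CV and references")
print(ranked[0].human_decision)
```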

An exception to the prohibition of automated decision-making applies if the automated decision is necessary for the conclusion or performance of an employment contract (Art. 22 para. 2 lit. a GDPR). This may apply if HR departments reach the limits of capacity and are forced to reject individual applications to cope with a permanently very high number of applicants. In the absence of court decisions, it is not yet clear whether this interpretation will be confirmed.

3. Co-determination

The works council's co-determination rights must be observed when introducing and using AI in application procedures. According to the German Works Constitution Act, the works council must, inter alia, be informed of the use of AI in good time and receive the necessary documents. Furthermore, there is a right to consultation regarding the use of AI in the context of work procedures and work processes. Approval requirements may apply to the use of AI-based systems.

4. Outlook

Regulatory requirements in this area will continue to increase. This spring, German federal ministries announced an initiative for a law on employee data protection, including the regulation of the use of AI in this context, but there is still no draft law. At European level, the published drafts of the AI Act and the Platform Work Directive also contain provisions on the use of AI systems. After the EU institutions reached a political agreement on the AI Act on 9 December 2023, they are currently negotiating its final wording. According to the press release of the European Commission, AI systems used for the recruitment of people have been identified as high-risk AI systems. Developers of such AI systems will be required to comply with strict requirements, including risk-mitigation systems, high-quality data sets, detailed documentation, clear user information, human oversight, and a high level of robustness, accuracy and cybersecurity. Obligations for deployers and users of AI systems for the recruitment of people are likely to be included in the final wording of the AI Act.