

Draft EU AI Act’s potential impact on the financial sector

In April 2021, the EU Commission published a draft regulation on artificial intelligence. Since then, numerous opinions have been issued, amendments have been tabled and the Slovenian and French presidencies of the EU Council recently published compromise proposals.

The final Act, whose adoption cannot realistically be expected before the end of 2023, will introduce substantial new obligations for providers and operators of AI-based technologies.

The AI Act will affect companies across all industries. In particular, financial sector players have been following the legislative process closely to determine whether their business models (e.g. algorithmic trading) may be affected by the Act.

Limited impact on core business?

As the draft AI Act stands, most financial institutions’ core business will not be affected because the AI Act’s most onerous provisions do not apply to every AI-based system, but only to those which the Commission has identified as being “high-risk”.

Currently, as regards the financial sector, only AI systems used for credit scoring may be categorised as high-risk (although the debate is ongoing). If the draft remains unchanged, only credit bureaus and credit reference agencies will have to implement the stringent AI Act requirements (e.g. conducting a risk assessment and verifying the quality of test data) when putting new AI systems on the market or making substantial changes to existing models.

An expanded scope?

However, the upcoming AI Act’s scope is not yet fixed: financial institutions cannot currently be certain they will not be affected. New use cases may be added to the “high-risk” category during the legislative process. And the Commission will likely be given the delegated power to extend the respective list after the AI Act has entered into force. The only prerequisites for such an extension are (i) that the relevant AI system poses a potential threat to fundamental rights and (ii) a certain similarity to systems that are already regulated. These prerequisites leave the Commission with a considerable margin of discretion.

It cannot be ruled out that the Commission will make use of its delegated power to the detriment of the financial sector, for example by including algorithmic or high-frequency trading in the “high-risk” classification. In that respect, it should be noted that the draft AI Act’s definition of artificial intelligence is broad enough to cover a wide range of technologies, ranging from state-of-the-art deep learning systems to traditional statistical approaches and simple Bayesian estimation.

AI Act: best practice

Irrespective of whether a specific AI system falls under the scope of the draft AI Act, financial institutions should consider implementing at least some of the AI Act’s requirements if they use AI systems for the processing of personal data. European data protection authorities will likely use the AI Act as a yardstick when assessing AI systems and/or automated decision-making in general. Therefore, even in cases where the AI Act is not directly applicable, it may apply indirectly as a “best practice”.

Tags

ai, financial institutions, data, fintech, europe, eu digital strategy, eu ai act