
EU AI Act unpacked #9: Who are the regulators to enforce the AI Act?

While previous posts in our blog series have explored the different obligations that apply under the EU AI Act (AI Act), we will now take a deeper dive into which regulators are responsible for enforcing these obligations and how they are supposed to collaborate.

Enforcement at EU level

In general, the AI Act is enforced at EU level by the European Commission, mainly in the form of its AI Office with regard to general-purpose AI (GPAI), and otherwise at the level of the EU Member States. Other institutions are also involved in the enforcement of the AI Act, such as the European Artificial Intelligence Board (AI Board), the scientific panel established under the AI Act and, for EU institutions and bodies that fall within the scope of the AI Act in respect of their use of AI, the European Data Protection Supervisor.

Role and setup of the AI Office

The AI Office, established within the European Commission’s Directorate-General for Communications Networks, Content and Technology (DG CNECT), is intended not only to play a major role in the enforcement of the AI Act, but also to promote an innovative ecosystem of trustworthy AI by ensuring a coherent European approach to AI at international level.

Regarding the enforcement of the AI Act, the European Commission, acting through its AI Office, has exclusive powers to supervise compliance with, and enforce, the provisions that apply to GPAI (including in respect of sanctions for non-compliance). It will also support coordination and cooperation of enforcement activity at Member State level, in particular with regard to joint investigations in cross-border scenarios, which may be carried out between national market surveillance authorities of different Member States or together with the Commission (Article 74(11) AI Act). In certain cases, it can also overrule measures taken by national market surveillance authorities, namely where a market surveillance authority of another Member State objects to the measure or where the Commission considers the measure to be contrary to Union law (‘Union safeguard procedure’).

With the recently unveiled structure of the AI Office, the European Commission aims to ensure continuity and coherence in the implementation of the AI Act, with several staff from the existing DG CNECT AI Directorate having been deployed to the AI Office. The AI Office is headed by Lucilla Sioli, former DG CNECT Director for AI, who will work under the guidance of a lead scientific adviser, who still needs to be appointed, and an adviser for international affairs, who will represent the AI Office in global conversations on convergence towards common approaches. The latter position is held by Juha Heikkilä, former DG CNECT Adviser for International Affairs.

Within the AI Office, five different units have been set up covering the following key aspects: (i) compliance, (ii) safety, (iii) excellence, (iv) international engagement and (v) innovation. It is worth flagging that key EU policymakers who were involved in the negotiations of the AI Act will head some of these units, such as Kilian Gross, who will be in charge of ensuring uniform application and enforcement as well as contributing to investigations of possible infringements. A dedicated unit focused on the risks of very capable GPAI models and in charge of developing the codes of practice required under the AI Act will be headed by one of the lead negotiators from the European Parliament, Dragos Tudorache.

Enforcement at EU Member State level

Apart from the AI Office, which holds the exclusive powers regarding the GPAI rules (see above), each EU Member State has to appoint or establish at least one market surveillance authority and one notifying authority in order to supervise the application and implementation of the AI Act (Article 70(1) AI Act). Of these two types of authorities, the market surveillance authority in particular is responsible for enforcing the AI Act, while the notifying authority is responsible for tasks relating to conformity assessment, in particular the assessment, designation and notification of conformity assessment bodies.

The market surveillance authorities have extensive competences in the areas of surveillance, investigation and enforcement (see Article 74 AI Act in conjunction with Article 14 of the Market Surveillance Regulation (EU) 2019/1020). This includes, inter alia, the power to require providers, deployers and importers of AI systems to provide relevant data for assessing the AI system’s compliance (including, under certain conditions, access to the source code of high-risk AI systems) and to require them to take appropriate action to bring instances of non-compliance to an end or, where that fails, to take appropriate corrective action itself (including the power to prohibit or restrict the making available of an AI system on the market).

EU Member States will still have to enact implementing rules on penalties and other enforcement measures (which may also include warnings and non-monetary measures) within the framework provided by the AI Act and shall take all measures necessary to ensure that those rules are properly and effectively implemented.

AI Board and scientific panel

Not to be confused with the AI Office, the AI Board, composed of one representative per Member State, is supposed to advise and assist the European Commission/the AI Office and the Member States in order to facilitate the consistent and effective application of the AI Act. With regard to enforcement, this applies in particular to the rules on GPAI. To ensure the involvement of the AI Board, the AI Act provides for various information, notification and consultation obligations between the AI Office and/or the national market surveillance authorities and the AI Board (see, for example, Articles 75(2) and 90(2) AI Act).

Similar to the AI Board, a scientific panel of independent experts to be established under the AI Act also has a supporting function vis-à-vis the AI Office and the national market surveillance authorities regarding enforcement activities under the AI Act and can, in particular, issue so-called ‘qualified alerts’ to the AI Office where it identifies systemic risks in GPAI models.

What’s next?

Having taken a closer look at the AI Office and the other key actors involved in the enforcement of the AI Act, in the next post of our blog series we will explore one of the best-known standards that already exist in relation to AI: ISO 42001 on AI management systems.
