Freshfields TQ

EU AI Act unpacked #10: ISO 42001 - a tool to achieve AI Act compliance?

In the tenth part of our EU AI Act unpacked blog series, we take a look at one of the best-known existing AI-related standards: ISO 42001 on AI management systems.

The EU’s Artificial Intelligence Act (AI Act) is expected to enter into force in August 2024. Once the (staggered) implementation period expires, providers of AI systems - especially those that are considered ‘high-risk AI systems’ - will have to comply with a wide range of legal requirements. One of the key instruments for implementing these requirements will be standards. This blog post explores whether and how ISO 42001 may be utilised to meet some of the key AI Act requirements.

In sum, while the ISO 42001 standard requires a number of measures similar to those prescribed by the AI Act, the latter is often more specific. Adhering to the ISO standard will therefore often help to ensure compliance with laws including the AI Act (and in some cases might actually suffice), but competent authorities may nevertheless take the view that ISO compliance does not equal AI Act compliance, even where the requirements overlap.

1. Overlapping requirements, different focus 

The ISO standard differs from the AI Act both conceptually and in its legal nature: whereas ISO 42001 is a non-binding standard focused on AI management and governance within an organisation, the AI Act is a binding regulation focused on product safety. Nonetheless, ISO 42001 can be a helpful starting point for approaching AI compliance.

Taking the practical example of record-keeping, the AI Act requires that high-risk AI systems allow for the automatic recording of events (logs) over their lifetime to ensure the traceability of the system’s functioning. Similarly, under ISO 42001, organisations should ensure that the AI systems they deploy automatically collect and record logs of certain events that occur during operation. Under ISO 42001, this can include, among other things, information on the time and date of the system’s use, which the AI Act also requires for certain AI systems.

However, the focus of the logging requirement differs between the two instruments, and ISO 42001 is less detailed than the AI Act. In particular, the AI Act specifically requires logging capabilities that enable the identification of situations in which a high-risk AI system may present a risk to health, safety or fundamental rights, and that facilitate post-market monitoring. The AI Act requirement is mandatory for high-risk AI systems throughout the lifetime of the system, whereas under ISO 42001, organisations may themselves determine at which phases of the AI system life cycle record-keeping should be enabled, provided it covers at least the phase in which the system is in use. In addition, while both the AI Act and ISO 42001 require logs to be kept for as long as the intended use (or purpose) of the AI system requires, only the AI Act requires that logs be kept for at least six months.
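To make the retention point more concrete, the short sketch below is a purely hypothetical illustration (not taken from the AI Act, ISO 42001 or this post): it assumes an invented event-record structure with made-up field names and approximates the six-month minimum retention period as 183 days.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Purely illustrative: the record fields and retention check below are assumptions
# made for this sketch, not a schema prescribed by the AI Act or ISO 42001.
RETENTION_MINIMUM = timedelta(days=183)  # rough stand-in for "at least six months"


@dataclass
class AIEventLogRecord:
    """One automatically recorded event from the operation of a (hypothetical) high-risk AI system."""
    event_type: str  # e.g. "inference", "input_rejected" (hypothetical event names)
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


def must_still_be_retained(record: AIEventLogRecord, now: datetime) -> bool:
    """True while the record is within the minimum retention window; in practice logs
    are kept longer where the intended purpose of the system requires it."""
    return now - record.recorded_at < RETENTION_MINIMUM


# Usage example: a freshly recorded event is still within the minimum retention window.
record = AIEventLogRecord(event_type="inference")
print(must_still_be_retained(record, datetime.now(timezone.utc)))  # True
```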

The same holds true for other AI Act requirements, such as risk management systems, technical documentation, human oversight, and transparency obligations, for which ISO 42001 also provides some foundation but further steps will be needed to achieve AI Act compliance.

2. ISO 42001 does not cover everything

Several requirements of the AI Act are not addressed by ISO 42001. This concerns, among others, issues specific to European product safety law. For example, ISO 42001 does not require producing and retaining a signed EU Declaration of Conformity or affixing the CE marking of conformity to a high-risk AI system in a clear and legible manner. Nor does ISO 42001 include specific rules on reporting to and cooperating with European authorities, such as the European Commission or national authorities. Most notably, however, only the AI Act prohibits certain AI practices, such as social scoring or emotion recognition in the workplace. In contrast, ISO 42001 merely requires organisations to determine whether such prohibitions exist under other regulations; it does not itself prohibit any AI practices.

3. Additional requirements under ISO 42001

Some provisions of ISO 42001 have no equivalent in the AI Act. These mostly concern broader governance aspects. For example, under ISO 42001, top management is required to establish an AI policy that is appropriate to the purpose of the organisation and to demonstrate leadership and commitment with respect to the AI management system.

4. Road ahead: Harmonised standards

Beyond ISO 42001, there are further (non-ISO) standards on the horizon. In May 2023, the European Commission requested the European Committee for Standardisation and the European Committee for Electrotechnical Standardisation to draft and publish several European standards by 30 April 2025. These standards are expected to give further guidance on AI Act obligations such as risk management systems, record-keeping, or human oversight. If the European Commission adopts these standards, high-risk AI systems that are in conformity with them will be presumed to comply with the corresponding AI Act requirements, which will likely make the standards highly relevant for AI Act compliance. ISO 42001 will therefore not remain the only standard that can help to meet AI Act requirements, and the emergence of other standards will need to be closely monitored.

5. Key takeaways

  • ISO 42001 is a non-binding and flexible standard, which organisations can adapt and modify according to their needs. This flexibility might limit its usefulness for achieving compliance with binding laws and calls for caution and a case-by-case analysis when working with ISO 42001 to achieve AI Act compliance.
  • Regardless, ISO 42001 can provide a valuable starting point for companies to approach compliance with some of the requirements that are also part of the AI Act.
  • Several AI Act obligations are not covered by ISO 42001, such as conformity assessments and CE marking, reporting to and cooperation with authorities, and prohibited AI practices; ISO 42001 provides no sufficient guidance in these respects.
  • ISO 42001 sets forth some broader governance requirements that have no equivalent under the AI Act, including that top management demonstrates leadership and commitment with respect to the AI management system.
  • Beyond the ISO framework, the European Commission has requested that several European standards on key AI Act requirements be drafted by 30 April 2025; these will provide further guidance on AI Act compliance and will need to be monitored.

In our next blog, we will take a closer look at the sanctions that can be imposed under the AI Act. 

Tags

ai, eu ai act, eu ai act series