[You can find all episodes of our EU AI Act unpacked blog series by clicking here.]
This is the third part of a three-part series. Please follow the links for parts one and two.
Besides the indirect impact of the EU AI Act in shaping regulatory approaches to AI in Asia, the Act may apply directly to businesses outside the EU in certain scenarios through its extra-territorial provisions.
Direct extra-territorial application
The EU AI Act will be directly applicable to entities outside of the EU that:
- place their AI systems or GPAI models on the EU market under their own trade mark (including AI systems that are integrated into their other products)
- put AI systems into service under their own trade mark in the EU (including AI systems that are integrated into their other products, and including when put into service by a third party deploying entity in the EU)
- provide or deploy AI systems outside of the EU where the output (eg, predictions, content, recommendations or decisions) generated by the system is intended to be used in the EU. The EU Commission is expected to clarify this provision in guidance due in February 2025.
In these scenarios, the non-EU based entity will be subject to the same obligations as a ‘provider’ of the system, or, where the jurisdiction of the Act is based on the use of an AI output in the EU, the obligations of a ‘deployer’. These terms and the related obligations are described in this earlier blog in the series.
Non-EU providers of ‘high-risk’ AI systems must appoint an authorised representative who is established in the EU (including where the system is placed on the EU market through an importer). The representative will act as the point of contact for EU authorities and will also be responsible for verifying the compliance of the ‘high-risk’ AI system with the Act.
For example, the authorised representative will need to verify the necessary technical documentation demonstrating the system’s compliance with the Act (and to be used as the basis for the conformity assessment) and ensure that the ‘high-risk’ system and the system provider are duly registered with EU authorities before being placed on the market.
Importers
EU-based importers of ‘high-risk’ AI systems placed onto the EU market (ie, importing entities that are themselves established or located in the EU) have obligations of their own under the EU AI Act to ensure that those systems comply.
Among other things, importers are required to ensure that providers have carried out a conformity assessment, prepared the necessary technical documentation and appointed an authorised representative. In addition, importers must ensure that a ‘high-risk’ system bears the required CE marking and is accompanied by a declaration of conformity and usage instructions. Importers are also subject to a 10-year document retention requirement covering the certificate issued by a conformity assessment body (where a third-party conformity assessment is required), the usage instructions and the EU declaration of conformity. This documentation must be produced to EU authorities on request.
The importer’s details will also need to be recorded on packaging and user instructions for those ‘high-risk’ AI systems.
Distributors
An entity that makes available in the EU (ie, re-sells) an AI system that has already been imported into and placed on the market in the EU (either directly by the provider or through an importer) will be designated a distributor of that system. Accordingly, non-EU entities could either constitute distributors themselves or appoint distributors that have obligations under the Act.
Distributors’ obligations likewise apply only to ‘high-risk’ AI systems. A distributor is required to ensure, among other things, that the system is properly labelled (ie, to identify the provider of the system and any importer, as well as CE marking), and is accompanied by a declaration of conformity and the necessary usage instructions. Distributors are also responsible for confirming that the provider of a ‘high-risk’ system has set up the required quality management system.
Examples
- If a vehicle OEM based outside the EU were to sell vehicles on the EU market under its own trade mark that incorporate an AI system also sold under the OEM’s own brand, the OEM would be considered to be the provider of the system. Accordingly, it would have to comply with all of the requirements of the EU AI Act applicable to a provider, even if the AI system or model is sourced or licensed from a third party. The importer of the vehicle may separately be subject to obligations under the Act if the AI system qualifies as a ‘high-risk’ AI system, as outlined above.
- In the scenario outlined above, an Asian-based provider of an AI system incorporated into the vehicle sold by the importer on the EU market would constitute a provider in its own right if the AI system carries (or is marketed under) that company’s brand, and not the brand of the vehicle OEM. Conversely, if the AI system were to be sold only under the OEM’s brand, the OEM alone would assume the role of the provider and be solely responsible for ensuring compliance with the Act.
- If an Asia-based online shopping platform were to resell an AI system on the EU market under the provider’s trade mark, its EU affiliate or other importer of record might constitute the importer of the system (potentially depending on how exactly the sale and delivery to the EU-based customer is structured).
- If an Asia-based recruitment company uses AI to analyse candidates and prepare a report for a customer based in the EU, that report would be considered output used in the EU. The recruitment company is likely to be subject to the EU AI Act since the output was intended to be used in the EU (ie, by the customer there).
Importers and distributors ought to carry out specific compliance checks to ensure they do not make non-compliant ‘high-risk’ AI systems available in the EU. In addition, both importers and distributors will be required to cooperate with the market surveillance authorities, who may carry out risk-based checks on the characteristics of AI systems by means of documentary checks or physical or laboratory tests.
Importers and distributors will naturally seek contractual protections from providers of AI systems, even where that provider is not itself subject to the requirements of the EU AI Act. However, any contractual arrangements between the parties will not formally shift the allocation of responsibilities under the Act, except to the extent that the parties agree which party’s trade mark the system will be sold under.
Lastly, providers of GPAI models will have to consider EU copyright law even where the training of the AI model is conducted outside of the EU. GPAI model providers are required to adopt a policy to comply with EU copyright law, in particular to identify and comply with opt-outs declared by rightsholders in respect of text and data mining.
For further information, see our earlier briefing on the personal and territorial scope of the EU AI Act.