Navigating the EU AI Act
A Compliance FAQ for Medical Device Manufacturers
The EU AI Act introduces sweeping new requirements for artificial intelligence across industries, including healthcare. For medical device manufacturers, this regulation impacts how AI-driven devices and software are designed, validated, and placed on the EU market. Understanding the Act’s scope, timelines, and compliance obligations is critical to maintaining EU MDR / IVDR compliance and ensuring continued market access.
Below, we answer some of the most common questions medical device companies have about the EU AI Act and its impact on regulatory strategy.
Q1. What is the AI Act, and why is it important for medical device companies?
The EU AI Act (Regulation (EU) 2024/1689) is a new regulation, approved by the European Parliament on March 13, 2024. It is a binding legislative act applicable in all EU Member States, and it is the same type of legal instrument, with the same legal standing, as the EU Medical Device Regulation (MDR) and In Vitro Diagnostic Regulation (IVDR).
Its importance lies in its core objectives:
- Ensuring the safety of AI systems and their respect for fundamental rights.
- Promoting AI innovation and uptake within the EU.
- Creating legal certainty to facilitate investment and innovation in AI.
- Addressing risks associated with specific uses of AI, particularly those posing high risks to fundamental rights.
Q2. When do we need to comply with the EU AI Act?
For regulated digital medical products subject to the EU MDR/IVDR (listed in Annex I of the AI Act), the obligations for their AI systems will apply 36 months after the Act's entry into force, i.e., from August 2, 2027.
More generally, this 36-month timeline applies to high-risk AI systems that are a safety component of a product, or are themselves the product, and that require a third-party conformity assessment under existing EU laws.
Q3. What are four major areas manufacturers need to address for compliance?

Risk Management
Risk management is fundamental, requiring identification, evaluation, and mitigation of 'reasonably foreseeable risks' that high-risk AI systems may pose to health, safety, or fundamental rights. Determining the risk classification of the device's AI component is key, as certain risk levels can render a system prohibited.
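As a rough illustration of how a risk register for an AI component might be captured alongside existing risk management files, the sketch below scores hypothetical hazards by severity and probability and flags those needing documented mitigation. The hazard names, scoring scale, and threshold are illustrative assumptions, not values prescribed by the AI Act or MDR.

```python
from dataclasses import dataclass

@dataclass
class AIRisk:
    """One entry in an illustrative AI risk register (hypothetical fields)."""
    hazard: str          # reasonably foreseeable hazard
    severity: int        # 1 (negligible) .. 5 (catastrophic) -- assumed scale
    probability: int     # 1 (rare) .. 5 (frequent) -- assumed scale
    mitigation: str = ""

    @property
    def risk_score(self) -> int:
        return self.severity * self.probability

# Hypothetical examples of AI-specific hazards for a medical device
register = [
    AIRisk("Model drift degrades diagnostic sensitivity", severity=5, probability=3),
    AIRisk("Training data under-represents a patient subgroup", severity=4, probability=3),
    AIRisk("User over-relies on AI output (automation bias)", severity=4, probability=2),
]

ACTION_THRESHOLD = 8  # assumed cut-off for requiring documented mitigation

for risk in sorted(register, key=lambda r: r.risk_score, reverse=True):
    flag = "MITIGATE" if risk.risk_score >= ACTION_THRESHOLD else "monitor"
    print(f"[{flag}] (score {risk.risk_score:2d}) {risk.hazard}")
```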

Data Governance
This extends GDPR alignment with AI-specific requirements for training, validation, and testing data sets. These include assessing data suitability, examining potential biases, and considering the specific contextual settings in which the data will be used.
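As a minimal sketch of what "examining potential biases" can look like in practice, the snippet below compares subgroup proportions across training, validation, and test splits and flags splits that deviate sharply from the training distribution. The column names, subgroup labels, and tolerance are assumptions chosen for illustration only.

```python
import pandas as pd

def subgroup_balance(df: pd.DataFrame, split_col: str, group_col: str) -> pd.DataFrame:
    """Share of each subgroup within each data split (rows sum to 1.0)."""
    return pd.crosstab(df[split_col], df[group_col], normalize="index")

# Hypothetical dataset: 'split' marks train/val/test, 'sex' is the subgroup of interest
data = pd.DataFrame({
    "split": ["train"] * 6 + ["val"] * 3 + ["test"] * 3,
    "sex":   ["F", "F", "M", "M", "M", "M", "F", "M", "M", "M", "M", "M"],
})

balance = subgroup_balance(data, "split", "sex")
print(balance.round(2))

# Flag splits where any subgroup share deviates from the training split
# by more than an assumed tolerance of 10 percentage points.
TOLERANCE = 0.10
deviation = (balance - balance.loc["train"]).abs()
flagged = deviation[deviation.gt(TOLERANCE)].dropna(how="all")

if flagged.empty:
    print("No large subgroup imbalances found.")
else:
    print("Splits needing review (deviation from training proportions):")
    print(flagged.round(2))
```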

Providers & Developers
Compliance responsibility is shared: providers, as well as developers, must ensure that high-risk systems are compliant. A specific requirement is that providers must implement an AI quality management system.

Transparency & Oversight
This involves requirements such as labeling for the intended user and other parties; transparency information must clearly define the intended use to mitigate the risk of unintended use. Oversight by a competent individual is also needed as models develop over time and require input, retraining, and potential alteration.
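As a minimal sketch of how such oversight of model updates might be documented, the record below captures who reviewed a retraining event before release. The field names and workflow are illustrative assumptions, not a format prescribed by the AI Act.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RetrainingRecord:
    """Illustrative audit entry for a model update (hypothetical fields)."""
    model_version: str
    trigger: str                 # e.g. drift detected, new data available
    performance_summary: str
    reviewed_by: str             # competent individual responsible for oversight
    approved: bool = False
    review_date: date = field(default_factory=date.today)

record = RetrainingRecord(
    model_version="2.4.0",
    trigger="Performance drift observed in post-market monitoring",
    performance_summary="Sensitivity 0.91 -> 0.94 on held-out test set",
    reviewed_by="Clinical AI lead",
    approved=True,
)
print(f"v{record.model_version} approved={record.approved} "
      f"by {record.reviewed_by} on {record.review_date}")
```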
Q4. How does the AI Act's risk classification relate to MDR/IVDR risk management?
Risk management under the EU AI Act is interrelated with the EU MDR risk management system. It is used to identify, evaluate, and mitigate the 'reasonably foreseeable risks' that high-risk AI systems can pose to health, safety, or fundamental rights.
The determination of the AI system's risk classification is crucial, as it dictates the level of documentation required. The classification determines whether a system is prohibited (the highest risk level), high-risk (subject to the highest level of scrutiny and documentation), or associated with a systemic risk (subject to less extensive documentation).
Q5. What is the recommended strategy for planning and implementation?
To best prepare for compliance with the EU AI Act, we recommend that medical device companies:
- Develop a formal quality plan to update their QMS and regulatory documentation.
- Conduct a gap analysis against AI Act requirements.
- Prepare for remediation activities, including updates to design files, labeling, and risk management.
- Train cross-functional teams to ensure AI Act requirements are embedded into ongoing operations.
Supporting Your Compliance Journey
The EU AI Act will fundamentally change how AI in medical devices is regulated across Europe. Companies that plan early and integrate compliance into their existing MDR/IVDR processes will gain a competitive advantage in the market.
Rook Quality Systems helps medical device manufacturers prepare for the EU AI Act by providing:
- Quality planning and strategy development
- Gap analysis reviews
- Redlines and documentation updates
- Remediation support for QMS and technical files
With expert guidance, your team can confidently navigate this regulatory transition and maintain access to the EU market. Click the button below to get started.