2025-03-12 expert contribution

AI Act Compliance Project

Bringing AI-based medical devices to market safely and efficiently with the right strategy.

Contact
Dr. Thorsten Prinz

Regulation (EU) 2024/1689 on artificial intelligence (Artificial Intelligence Act, AI Act) entered into force on August 1, 2024, and its provisions apply in stages. Medical device manufacturers must comply with the requirements of the AI Act if they wish to place an AI system (as defined in Art. 3 (1) AI Act) on the market in the EU. 

If the AI system is a stand-alone product, or the safety component of a product, within the scope of the European Regulations 2017/745 on medical devices (MDR) or 2017/746 on in vitro diagnostic medical devices (IVDR), and the product must undergo a third-party conformity assessment procedure, it is classified as a high-risk AI system in accordance with Art. 6 (1) AI Act. 

The comprehensive obligations of providers of high-risk AI systems are summarized in Art. 16 AI Act; for high-risk AI systems according to Annex I AI Act, they apply from August 2, 2027. Medical device manufacturers are therefore well advised to start an AI Act compliance project as early as possible. 

8 steps towards a successful AI Act compliance project 

The necessary steps, some of which can be carried out in parallel, are explained below. 

1. Establish a cross-organizational project team 

The project team should include at least representatives from Regulatory Affairs, Quality Management, Legal, Development/Production, and Management. The responsibilities and tasks of the project members and, where applicable, of external service providers are assigned. All other activities (e.g. time and milestone planning, resource planning, and risk management) follow proven project management practices. 

2. Thoroughly understand the AI Act 

The key provisions and definitions of the AI Act relevant to medical devices are identified and interpreted. The results are summarized in an internal guideline document. 

3. Classify AI-based medical devices in relation to the AI Act 

Based on the intended purpose, it is determined whether the medical device is an AI system as defined in Art. 3 (1) AI Act and whether it is a stand-alone product or the safety component of a product. Finally, the AI-based medical device is assigned to one of the AI Act risk categories; this classification should be discussed with the responsible notified body. Most AI-based medical devices in MDR risk classes IIa-III and in vitro diagnostics (IVDs) in IVDR risk classes B-D are likely to be classified as high-risk AI systems. 
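To illustrate the decision path, the following minimal sketch (in Python) encodes a simplified reading of Art. 6 (1) AI Act as yes/no questions. The class and field names are illustrative assumptions, not terms defined by the regulation, and the actual classification always requires a case-by-case assessment together with the notified body.

from dataclasses import dataclass

@dataclass
class MedicalDeviceAI:
    # All fields are illustrative assumptions, not terms defined by the AI Act.
    is_ai_system: bool            # meets the AI system definition in Art. 3 (1) AI Act
    is_standalone_product: bool   # the AI system is itself the medical device
    is_safety_component: bool     # the AI system is a safety component of a device
    third_party_assessment: bool  # MDR/IVDR requires notified body involvement

def is_high_risk(device: MedicalDeviceAI) -> bool:
    # Simplified reading of Art. 6 (1) AI Act for products under MDR/IVDR:
    # high-risk if the AI system is the product itself or its safety component
    # and the product undergoes third-party conformity assessment.
    if not device.is_ai_system:
        return False
    covered = device.is_standalone_product or device.is_safety_component
    return covered and device.third_party_assessment

# Example: stand-alone diagnostic software in MDR class IIa (notified body involved)
software = MedicalDeviceAI(True, True, False, True)
print(is_high_risk(software))  # True -> discuss the classification with the notified body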

4. Perform a gap analysis regarding the technical documentation and the quality management system 

The existing technical documentation and the established quality management system are subjected to a gap analysis based on the AI Act requirements for the respective risk category. 

The obligations of providers of high-risk AI systems relate both to new requirements and to requirements that are already known from the MDR and IVDR (cf. Challenges for AI-based medical devices due to the EU Artificial Intelligence Act). 

The identification of AI-specific standards, guidelines and best practices is crucial for the implementation of the new requirements (cf. Approval of AI-based medical devices in Europe). 
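As an illustration of how the results of such a gap analysis might be recorded, the following sketch lists a few obligations in a simple table-like structure. The selection of articles, status values and actions are assumptions for demonstration only, not a complete requirements list.

# Illustrative only: the obligation list is abridged and the status values and
# actions are assumptions; a real gap analysis must cover all applicable requirements.
gap_analysis = [
    # (AI Act reference, requirement, covered by existing MDR/IVDR QMS?, action)
    ("Art. 9",  "Risk management system",              "partly", "extend risk management with AI-specific risks"),
    ("Art. 10", "Data and data governance",            "no",     "define a data governance process"),
    ("Art. 12", "Record-keeping / logging",            "no",     "specify automatic event logging as a design input"),
    ("Art. 14", "Human oversight",                     "partly", "document oversight measures"),
    ("Art. 15", "Accuracy, robustness, cybersecurity", "partly", "define metrics and acceptance criteria"),
]

# Print the open items that require new or adjusted processes
for ref, requirement, covered, action in gap_analysis:
    if covered != "yes":
        print(f"{ref}: {requirement} -> {action}")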

5. Further develop technical documentation and the quality management system 

The results of the gap analysis are translated into new quality management processes or adjustments to existing ones. This can be done by integrating them into the established quality management system (Art. 17 (3) AI Act). At present, it also makes sense to consider the requirements of the ISO/IEC 42001 standard. 

The existing technical documentation is also expanded to include AI-specific parts. 

In some places, the AI Act expressly requires a compliance-by-design approach (Art. 13-15 AI Act), i.e. the regulatory requirements must be taken into account at an early stage of AI system development. 
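As a sketch of what compliance-by-design can mean in practice, the example below captures hypothetical design inputs derived from Art. 13-15 AI Act. The topics correspond to the articles, but the concrete requirements and thresholds are illustrative assumptions and must be defined per product.

# Hypothetical design inputs derived from Art. 13-15 AI Act; the topics are from the
# regulation, but the concrete requirements and thresholds are illustrative assumptions.
design_inputs = {
    "Transparency (Art. 13)": [
        "instructions for use describe capabilities, limitations and intended purpose",
    ],
    "Human oversight (Art. 14)": [
        "the clinical user can review and override every AI-generated result",
    ],
    "Accuracy, robustness, cybersecurity (Art. 15)": [
        "sensitivity >= 0.90 on the defined validation data set (example threshold)",
        "performance is monitored for data drift after deployment",
    ],
}

for topic, requirements in design_inputs.items():
    print(topic)
    for requirement in requirements:
        print(" -", requirement)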

6. Perform internal training 

Staff are trained internally on the new content of the quality management system and the technical documentation. Furthermore, adequate AI literacy in accordance with Art. 4 AI Act is built up. 

7. Prepare and carry out the AI Act conformity assessment 

According to Art. 43 (3) AI Act, AI-based medical devices and IVDs are subject to conformity assessment by a notified body in accordance with the MDR and IVDR. A structured dialog with the notified body on the AI-specific further development of the quality management system and the technical documentation prepares the actual AI Act conformity assessment procedure. 

8. Monitoring the regulatory development of the AI Act 

In accordance with Art. 96 AI Act, the Commission draws up guidelines for the practical implementation of the AI Act, and Art. 41 AI Act authorizes the Commission to adopt common specifications. In addition, the European standardization organizations are expected to deliver new standards covering individual AI Act requirements by 2026. Adjustments to the text of the regulation are also possible. The regulatory development of the AI Act must therefore be monitored continuously. 

What support we offer 

Please get in touch if you need support with the implementation of your AI Act compliance project. We offer a free, non-binding initial consultation.