Artificial intelligence (AI) systems in medicine that have a medical purpose are subject to regulation under the European Medical Device Regulation (MDR) 2017/745 and the In Vitro Diagnostic Medical Devices Regulation (IVDR) 2017/746 in the same way as classical software. Furthermore, with regard to the lifecycle of medical AI systems, applying regulatory processes in accordance with established standards for medical technology is recommended. In this blog post, we take a closer look at which other requirements currently apply to these products and where manufacturers can find compliance assistance. The focus here is on medical devices within the scope of the MDR.
Current legal and normative requirements
As with any software, it must first be clarified whether a medical AI system is a medical device within the meaning of the MDR. This so-called "qualification as a medical device" is carried out in relation to the intended purpose of the respective medical AI system, according to the criteria described in document MDCG 2019-11. The second step is the risk classification, which is also carried out according to the procedure in MDCG 2019-11. For software, the application of Rule 11 in Annex VIII of the MDR is particularly relevant here. The website "AI for Radiology" lists numerous examples of medical AI systems in the field of radiology, with information on their risk classification according to the MDR and the FDA, certificates, and product specifications. In principle, manufacturers of medical AI systems are subject to the requirements of Art. 10 MDR like other medical technology manufacturers. In this blog post, the previously mentioned points regarding software as a medical device are summarized again in detail.
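To make the decision logic of Rule 11 more tangible, the following is a minimal, simplified sketch in Python. The function and enum names are our own illustrative choices, the inputs are deliberately coarse yes/no questions, and a real classification must of course follow the full wording of Annex VIII MDR and the MDCG 2019-11 guidance rather than this sketch:

```python
from enum import Enum

class Impact(Enum):
    """Worst-case impact of a decision informed by the software (illustrative, not MDR wording)."""
    DEATH_OR_IRREVERSIBLE = "death or irreversible deterioration of health"
    SERIOUS_OR_SURGICAL = "serious deterioration of health or surgical intervention"
    OTHER = "other"

def classify_rule_11(informs_diagnosis_or_therapy: bool,
                     decision_impact: Impact,
                     monitors_physiological_processes: bool,
                     vital_parameters_immediate_danger: bool) -> str:
    """Simplified decision logic of Rule 11, Annex VIII MDR.

    Returns the risk class as a string: 'I', 'IIa', 'IIb', or 'III'.
    """
    # Software providing information used for diagnostic or therapeutic decisions
    if informs_diagnosis_or_therapy:
        if decision_impact is Impact.DEATH_OR_IRREVERSIBLE:
            return "III"
        if decision_impact is Impact.SERIOUS_OR_SURGICAL:
            return "IIb"
        return "IIa"
    # Software monitoring physiological processes
    if monitors_physiological_processes:
        # Monitoring vital parameters whose variation could put the patient
        # in immediate danger raises the class to IIb
        return "IIb" if vital_parameters_immediate_danger else "IIa"
    # All other software
    return "I"

# Example: AI software suggesting a diagnosis whose error could lead to a
# serious deterioration of health
print(classify_rule_11(True, Impact.SERIOUS_OR_SURGICAL, False, False))  # IIb
```

As the sketch makes visible, under Rule 11 most decision-supporting medical AI software lands in class IIa or higher; class I remains only for the residual "all other software" case.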
Specific requirements for software as a medical device can be found in Sections 14.2(d), 14.5, and 17.1 to 17.4 of Annex I, Chapter II of the MDR; these are also relevant for medical AI systems. With regard to regulatory processes over the software lifecycle, the EN 62304 standard is applied to medical AI systems. For stand-alone AI-based software that is not part of a hardware medical device, the EN 82304-1 standard is used as a supplement. Neither the MDR nor the aforementioned standards contain AI-specific requirements regarding safety and performance.
In order to provide manufacturers with guidance on which AI-specific content should be included in the technical documentation, the Association of Notified Bodies for Medical Devices in Germany (IG-NB) published the document "Questionnaire Artificial Intelligence (AI) in medical devices". It contains numerous questions for the manufacturer regarding responsibilities, competencies, intended purpose, software requirements, data management, AI model development, product development, and post-market surveillance. The document "Good practices for health applications of machine learning: Considerations for manufacturers and regulators" from the International Telecommunication Union (ITU) is very similar to the IG-NB document and can be used to explain individual requirements.
Insofar as personal data is processed during the development and use of AI systems, the European General Data Protection Regulation (GDPR) applies; it, too, has a significant impact on AI systems. A few important aspects are discussed below. Manufacturers and users must apply the data processing principles set out in Art. 5 GDPR. Furthermore, the manufacturer must implement technical and organizational measures for data protection as early as the development phase (Art. 25 GDPR). Since Art. 22 GDPR essentially prohibits automated individual decision-making, including profiling, manufacturers of autonomous AI systems must take the measures required by the law. For new technologies such as AI that pose a high risk to the rights and freedoms of natural persons, a data protection impact assessment is required as a precautionary measure (Art. 35(1) GDPR). A detailed legal discussion of the topic was published by Schreitmüller.

Beyond that, the international standards organizations ISO (International Organization for Standardization) and IEC (International Electrotechnical Commission), as well as the Association for the Advancement of Medical Instrumentation (AAMI) and the Institute of Electrical and Electronics Engineers (IEEE), have published standards related to AI technologies. Standards Australia has produced a good overview of the ISO/IEC standards. Even though, at this stage, most of these documents do not contain concrete product requirements and do not relate specifically to medical devices, manufacturers can find important information in them for setting up their regulatory processes.
CE conformity assessment of AI systems
The IG-NB comments in Section A of its document on the certifiability of continuously learning AI systems as follows: "Static AI (AI that has learned and operates in a learned state) is in principle certifiable. Dynamic AI (AI that continues to learn in the field) is not certifiable in principle, as the system must be verified and validated (among other things, the functionality must be validated against the intended use)." Thus, according to the IG-NB, there must be a defined development state at the time of the verification and validation activities. Accordingly, continuously learning AI systems in medicine that do not have a defined technical development status are generally not certifiable.
However, even for static (non-continuously learning) AI systems with black-box behavior (the system does not explain how it arrived at a certain result), a successful CE conformity assessment may not be possible in individual cases, as the IG-NB also states in Section A of its document. In this regard, it refers to Articles 22 and 35 of the GDPR, Section 17.2 of Annex I MDR and Section 16.2 of Annex I IVDR, and document MDCG 2020-1.
Future European Artificial Intelligence Act (AIA)
To create a uniform horizontal legal framework for the development, marketing, and use of artificial intelligence, the EU Commission published a draft of the Artificial Intelligence Act (AIA) on April 21, 2021. Medical devices will fall within the scope of the AIA as so-called high-risk products. High-risk products must meet the requirements of Articles 8 to 15 in Chapter 2, "Requirements for high-risk AI systems." These include, for example, the obligations to establish a risk management system and to prepare technical documentation, which manufacturers already know from the MDR. However, the AIA draft also contains new AI-specific requirements, for example with regard to the data used and human oversight. In Chapter 3, "Obligations of providers and users of high-risk AI systems and other parties," Articles 16 to 29 formulate both requirements known from the MDR, such as the establishment of a quality management system, and new requirements, such as the obligation to retain automatically generated logs. The legislative process is currently ongoing. Numerous changes have already been made to the original draft, and further changes cannot be ruled out.
The BAIM approach to meeting regulatory requirements
VDE has developed the regulatory approach "BAIM – Boost AI to Market" to help manufacturers meet the complex legal and normative requirements. In essence, BAIM adds AI-specific aspects to a manufacturer's existing processes for the software lifecycle, risk management, usability engineering, clinical evaluation/follow-up, and post-market surveillance/vigilance.
As the technical and regulatory state of the art for medical AI systems is continuously evolving, manufacturers need to monitor these changes and integrate them into their regulatory processes in a timely manner. With our numerous events and blog posts, we make sure you don't miss any significant regulatory developments!