2026-02-11 expert contribution

Conformity assessment of high‑risk AI systems under the EU AI Act: what providers need to know

The EU Artificial Intelligence Act (AI Act) establishes a comprehensive regulatory framework for high‑risk AI systems. Art. 43 AI Act outlines the mandatory conformity assessment procedures required before high‑risk AI systems may be placed on the EU market or put into service.  
This article provides a structured overview of the conformity assessment requirements, the systems in scope, and the applicable procedures under Art. 43 AI Act.  
 


What is a conformity assessment of an AI system?

A conformity assessment under the AI Act is the formal process of demonstrating that a high‑risk AI system complies with the mandatory requirements laid down in Chapter III, Section 2 of the Regulation (Art. 3 (20) AI Act).  

Key elements assessed in the conformity assessment: 

  • Quality management system (QMS): Verification that the QMS is compliant with the requirements of Art. 17 AI Act and is effectively implemented. 
  • Risk Management System: Verification that risks are systematically identified, analyzed, mitigated, and monitored across the AI system lifecycle. 
  • Data Governance: Assessment of the quality, representativeness, bias mitigation, and suitability of training, validation and testing data. 
  • Technical Documentation: Evaluation of whether documentation is complete, up‑to‑date, and sufficient for assessing compliance according to Annex IV AI Act. 
  • Logging and Traceability: Ensuring that the system automatically records events throughout its lifecycle. 
  • Transparency & Human Oversight Measures: Assessing design features that enable appropriate system understanding and oversight. 
  • Accuracy, Robustness, Cybersecurity: Confirming that the AI system meets performance and security requirements throughout its lifecycle. 
  • Harmonized Standards / Common Specifications: Verification that harmonized standards or common specifications have been applied to demonstrate conformity 
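To make the logging and traceability element above more concrete, the following is a minimal sketch of automatic lifecycle event recording. The class names and event fields are illustrative assumptions; the AI Act requires automatic recording of events (Art. 12) but does not prescribe any particular schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class LifecycleEvent:
    # Minimal event record; the exact fields are an assumption for
    # illustration, not a format prescribed by the AI Act.
    timestamp: str
    phase: str        # e.g. "training", "inference", "monitoring"
    description: str

@dataclass
class EventLog:
    events: List[LifecycleEvent] = field(default_factory=list)

    def record(self, phase: str, description: str) -> LifecycleEvent:
        # Events are timestamped automatically at the moment they occur,
        # supporting later traceability across the system lifecycle.
        event = LifecycleEvent(
            timestamp=datetime.now(timezone.utc).isoformat(),
            phase=phase,
            description=description,
        )
        self.events.append(event)
        return event

log = EventLog()
log.record("inference", "input outside training data distribution")
```

In practice, such records would feed the technical documentation and post‑market monitoring evidence reviewed during the conformity assessment.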

Which AI systems must undergo conformity assessments? 

According to Art. 43 (1) and (2) AI Act, a conformity assessment is required for high‑risk AI systems as listed in Annex III. 

For high-risk AI systems covered by the Union harmonization legislation listed in Section A of Annex I of the AI Act, such as medical devices or machinery incorporating an AI component, providers must follow the conformity assessment procedure under the relevant EU harmonization legislation and ensure that the AI-specific requirements are assessed within that procedure (Art. 43 (3) AI Act). Notified bodies under the respective sectoral legislation may also assess these requirements. 

By means of a delegated act, the European Commission can amend the AI Act so that high-risk AI systems from other critical areas must also undergo a third-party conformity assessment if no harmonized standards or common specifications are applied (Art. 43 (6) AI Act). 

Substantial modifications 

Whenever a high‑risk AI system undergoes a substantial modification as defined in Art. 3 (23) AI Act, it must undergo a new conformity assessment, except where the modifications are pre‑determined and documented by the provider (Art. 43 (4) AI Act). 

What conformity assessment procedures are foreseen in the AI Act? 

Art. 43 AI Act provides two main procedures, depending on whether harmonized standards or common specifications have been applied. 

Procedure A – Internal Control (Annex VI) 

Internal control applies when the provider of a biometric AI system (Annex III, No. 1) has applied relevant harmonized standards (Art. 40 AI Act) or common specifications (Art. 41 AI Act) and, in general, to AI systems listed in Annex III, No. 2-8. 

Internal control does not involve a Notified Body; instead, the provider performs the following tasks: 

  • Performing internal checks and documentation reviews 
  • Verifying compliance with all high‑risk requirements 
  • Maintaining complete technical documentation 
  • Ensuring lifecycle‑wide quality management processes 

Procedure B – Third‑Party Assessment (Annex VII) 

Required when relevant harmonized standards or common specifications are not available or have not been applied by the provider of an AI system listed in Annex III, No. 1. 
This procedure involves: 

  • Assessment of the provider’s quality management system 
  • Assessment of the technical documentation 
  • A detailed evaluation by a Notified Body 
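The choice between the two procedures described above can be sketched as a simple decision function. This is a deliberate simplification of Art. 43 (1)-(2) for illustration only; determining the applicable procedure for a real system requires legal analysis, and the function name and parameters are assumptions, not terms from the Regulation.

```python
def applicable_procedure(annex_iii_point: int, standards_applied: bool) -> str:
    """Simplified selection of the Art. 43 conformity assessment procedure.

    annex_iii_point: the Annex III item number of the high-risk system
        (1 = biometrics; 2-8 = other high-risk areas).
    standards_applied: whether relevant harmonized standards (Art. 40)
        or common specifications (Art. 41) have been applied.
    """
    if annex_iii_point == 1:
        if standards_applied:
            # Biometric systems with standards applied: internal control.
            return "Annex VI (internal control)"
        # Biometric systems without applied standards: Notified Body required.
        return "Annex VII (third-party assessment by a Notified Body)"
    # Annex III, No. 2-8: in general, internal control
    # (subject to possible future delegated acts under Art. 43 (6)).
    return "Annex VI (internal control)"
```

The edge cases, such as systems also covered by sectoral harmonization legislation under Art. 43 (3), fall outside this sketch.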

Role of prEN 18285 

The forthcoming standard prEN 18285 (AI Conformity Assessment Framework) supports the third‑party procedure by providing a structured framework for assessing not only technical documentation but also QMS maturity, process controls, and lifecycle compliance mechanisms. It helps Notified Bodies operationalize the Annex VII requirements and achieve consistency across assessments. It includes: 

  • Assessment phases and checkpoints 
  • Required evidence types for each AI Act requirement 
  • Roles and responsibilities between provider and assessment body 
  • Guidance for Notified Bodies on audit depth and sampling 
  • Templates for conformity assessment reports and documentation reviews 

Conclusion 

The conformity assessment mechanisms under Art. 43 AI Act establish a robust assurance framework for high‑risk AI systems. Providers must carefully determine which procedure applies to their system, implement all necessary technical and organizational safeguards, and prepare comprehensive documentation aligned with the AI Act and prEN 18285. 

Get in touch with us!


We offer consulting projects and in-house workshops and would be happy to provide more information about our services and answer any questions.

AI Projects & Services: aips@vde.com