EU AI Act Article 17 QMS template (free download)
A quality management system template for high-risk AI systems under EU AI Act Article 17 — 13 mandatory elements, mapping to ISO 9001, lawful-basis notes.
- EU AI Act Art. 17
- ISO 9001
The EU Artificial Intelligence Act (Regulation (EU) 2024/1689) imposes a quality management system obligation on providers of high-risk AI systems under Article 17. The QMS provisions, alongside the bulk of the high-risk system rules, become applicable on 2 August 2026 (see Article 113(c) for the staged application timetable). This template walks you through the 13 mandatory elements, the mapping to ISO 9001:2015, and the documented information you need to produce.
Who must implement Article 17
Article 17 applies to providers of high-risk AI systems within the meaning of Article 6 (read with Annexes I and III), with limited transitional provisions. “Provider” means the natural or legal person that develops or has an AI system developed and places it on the market or puts it into service under its own name or trade mark, whether for payment or free of charge.
If you are an importer or distributor, your obligations sit elsewhere in Chapter III (Articles 23 and 24). Article 17 itself binds the provider.
If you are a deployer (termed “user” in earlier drafts), your obligations are set out separately in Article 26, not Article 17.
The 13 mandatory elements
Article 17(1) sets out the elements your QMS must include. We summarise each, indicate the ISO 9001 mapping, and list the typical documented information.
1. Strategy for regulatory compliance
A documented strategy demonstrating how the provider achieves and maintains compliance with the AI Act, including conformity assessment procedures and management of substantial modifications. Maps to ISO 9001 clauses 4 (context), 5.1 (leadership), and 6.1 (risk and opportunity).
2. Techniques, procedures and systematic actions for design
Documented procedures for design, design control, design verification. Covers requirements engineering, design reviews, prototyping, internal sign-off. Maps to ISO 9001 clause 8.3 design and development.
3. Techniques, procedures and systematic actions for development
Procedures covering implementation, integration, configuration management, build pipelines, branch policies, code review, change control. Maps to ISO 9001 clauses 8.5 production and 7.5 documented information.
4. Quality control and quality assurance
Test strategy, test cases, sampling, acceptance criteria, defect classification. For AI systems specifically: data validation, model evaluation harness, fairness and robustness testing. Maps to ISO 9001 clauses 8.6 release and 9.1 monitoring and measurement.
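To make the quality-control idea concrete, here is a minimal sketch of an acceptance-gate check such as a model evaluation harness might run before release. The gate names, operators, and thresholds are hypothetical illustrations, not criteria taken from the Act or ISO 9001; your QMS would document its own.

```python
# Illustrative sketch only: compare measured metrics against documented
# acceptance criteria and produce a record for the quality-control file.
# Gate names and thresholds below are hypothetical, not mandated anywhere.

def evaluate_release(metrics: dict, gates: dict) -> dict:
    """metrics: measured values, e.g. {"accuracy": 0.93}
    gates:   criteria as (operator, threshold), e.g. {"accuracy": (">=", 0.90)}
    """
    results = {}
    for name, (op, threshold) in gates.items():
        value = metrics.get(name)
        if value is None:
            # A metric that was never measured fails the release gate.
            results[name] = {"status": "missing", "threshold": threshold}
            continue
        passed = value >= threshold if op == ">=" else value <= threshold
        results[name] = {"value": value, "threshold": threshold,
                         "status": "pass" if passed else "fail"}
    overall = all(r.get("status") == "pass" for r in results.values())
    return {"overall": "pass" if overall else "fail", "gates": results}
```

The output dictionary is deliberately flat so it can be filed as-is as documented information against clause 8.6.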
5. Examination, test and validation procedures
Procedures applicable before, during, and after development, including the frequency with which they are run. Encompasses model evaluation (performance, safety, robustness), data drift detection, integration testing. Maps to ISO 9001 clauses 9.1 and 8.6.
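One way to operationalise the data-drift detection mentioned above is the Population Stability Index, a common industry convention (nothing in the Act prescribes it). A stdlib-only sketch, with the usual informal thresholds:

```python
import math

def psi(reference, current, bins=10):
    """Population Stability Index between a reference sample and a current
    sample, using equal-width bins over the reference range.
    Conventional (informal) reading: < 0.1 stable, 0.1-0.25 investigate,
    > 0.25 significant drift. These thresholds are industry habit, not law.
    """
    lo, hi = min(reference), max(reference)
    width = (hi - lo) / bins or 1.0  # guard against a constant reference
    def proportions(sample):
        counts = [0] * bins
        for x in sample:
            i = min(int((x - lo) / width), bins - 1)
            counts[max(i, 0)] += 1  # clamp out-of-range values to edge bins
        # small epsilon avoids log(0) for empty bins
        return [(c + 1e-6) / (len(sample) + bins * 1e-6) for c in counts]
    ref_p, cur_p = proportions(reference), proportions(current)
    return sum((c - r) * math.log(c / r) for r, c in zip(ref_p, cur_p))
```

The documented procedure would state the run frequency (e.g. per batch, weekly) and the escalation path when the index crosses the documented threshold.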
6. Technical specifications including standards
Identification of the harmonised standards applied (when published in the Official Journal under Article 40), the technical specifications applied where harmonised standards are not available, the means used to ensure the AI system complies with Chapter III requirements. Maps to ISO 9001 clause 7.5 documented information and 8.4 externally provided processes.
7. Systems and procedures for data management
Data acquisition, collection, analysis, labelling, storage, filtration, mining, aggregation, and retention. Article 10 data governance obligations flow through here: relevance, representativeness, accuracy, and freedom from bias to the extent feasible. Maps to ISO 9001 clauses 7.5 and 8.5.4 preservation.
8. Risk management system
The risk management system referred to in Article 9, iterative, documented, and updated. Identifies foreseeable risks; estimates and evaluates risks during normal use and reasonably foreseeable misuse; adopts risk-management measures. Maps to ISO 9001 clause 6.1; ISO 14971 is a useful template if your AI system overlaps with medical devices.
9. Setting-up, implementation and maintenance of post-market monitoring
The post-market monitoring system per Article 72. Collects, documents, and analyses data on performance after placement on the market, to evaluate continued compliance. Maps to ISO 9001 clauses 9.1 monitoring and 9.3 management review.
10. Procedures related to reporting of serious incidents
Per Article 73, the provider must report serious incidents to the relevant national market surveillance authority. The QMS must cover the detection, evaluation, and reporting workflow. Maps to ISO 9001 clauses 8.7 nonconforming output and 10.2 corrective action.
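A deadline helper is a useful anchor for the reporting workflow. The day counts below reflect our reading of Article 73(2)-(4); verify them against the Official Journal text before embedding them in a QMS procedure, and treat the class names as illustrative.

```python
from datetime import date, timedelta

# Sketch of an Article 73 reporting-deadline helper. Day counts are our
# reading of Art. 73(2)-(4) and must be verified against the final text.
DEADLINE_DAYS = {
    "serious_incident": 15,        # Art. 73(2): default deadline after awareness
    "widespread_infringement": 2,  # Art. 73(3): incl. critical-infrastructure disruption
    "death": 10,                   # Art. 73(4)
}

def report_due(awareness_date: date, incident_class: str) -> date:
    """Latest date to notify the market surveillance authority."""
    try:
        days = DEADLINE_DAYS[incident_class]
    except KeyError:
        raise ValueError(f"unknown incident class: {incident_class!r}")
    return awareness_date + timedelta(days=days)
```

In practice the detection and evaluation steps feed this calculation: the clock starts when the provider establishes the link between the AI system and the incident, or becomes aware of it.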
11. Handling of communication with national authorities
Procedures for communication with competent authorities, notified bodies, other operators, customers, and end-users. Maps to ISO 9001 clause 7.4 communication.
12. Systems and procedures for record-keeping
All documentation and information required by Articles 11 (technical documentation), 18 (record-keeping), 50 (transparency obligations), and 72 (post-market monitoring). Maps to ISO 9001 clause 7.5.
13. Resource management, including security-of-supply measures
Resources for the AI system: people, infrastructure, suppliers, data suppliers, security of supply. Maps to ISO 9001 clauses 7.1 resources and 8.4 externally provided processes.
Mapping to ISO 9001:2015, at a glance
| Article 17 element | ISO 9001 clause | Notes |
|---|---|---|
| 1 Strategy | 4, 5.1, 6.1 | Your AI Act strategy is a context document. |
| 2 Design | 8.3 | Includes verification and validation. |
| 3 Development | 8.5, 7.5 | Pipeline + change control. |
| 4 Quality control | 8.6, 9.1 | Includes model evaluation. |
| 5 Examination, test, validation | 9.1, 8.6 | Frequency stated. |
| 6 Technical specifications | 7.5, 8.4 | Harmonised standards when published. |
| 7 Data management | 7.5, 8.5.4 | Article 10 data governance. |
| 8 Risk management | 6.1 | Article 9 specific. |
| 9 Post-market monitoring | 9.1, 9.3 | Article 72. |
| 10 Incident reporting | 8.7, 10.2 | Article 73. |
| 11 Authority communication | 7.4 | Authority points of contact. |
| 12 Record-keeping | 7.5 | Article 11, 18, 50, 72 records. |
| 13 Resources | 7.1, 8.4 | Includes security of supply. |
Lawful basis where personal data is processed
Article 17 does not itself create a lawful basis for processing personal data. Where your AI system processes personal data (at training, at inference, or in monitoring), the GDPR continues to apply. The most common lawful-basis questions:
- Training data. Article 6(1) GDPR. Article 6(1)(f) (legitimate interest) is the most-tested basis for non-special-category training data; document the legitimate interest assessment. Article 9 special-category data needs a separate condition under Article 9(2).
- Inference at runtime. Often Article 6(1)(b) (contract) for the user being served. Other bases per context.
- Post-market monitoring with personal data. Assess compatibility with the original purpose under Article 6(4) GDPR and apply specific safeguards.
Records you must keep ready
- Quality management policy and objectives.
- Procedure documents covering each of the 13 elements.
- Technical documentation per Article 11 and Annex IV.
- Logs per Article 12.
- Risk management file per Article 9.
- Post-market monitoring plan and reports per Article 72.
- Conformity assessment evidence per Article 43 (or Annex VI / VII as applicable).
- EU declaration of conformity per Article 47.
- Authorised representative records (if non-EU provider).
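The records above are easiest to keep audit-ready in a register. A minimal sketch of such a structure, with field names that are our own illustration rather than anything prescribed by the Act or ISO 9001:

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record-register structure; field names are illustrative.

@dataclass
class QmsRecord:
    name: str
    legal_basis: str     # e.g. "AI Act Art. 11 / Annex IV"
    owner: str
    last_reviewed: date
    location: str        # path or URL in your document-management system

@dataclass
class RecordRegister:
    records: list = field(default_factory=list)

    def add(self, record: QmsRecord) -> None:
        self.records.append(record)

    def overdue(self, today: date, max_age_days: int = 365) -> list:
        """Records whose last review is older than the review cycle."""
        return [r for r in self.records
                if (today - r.last_reviewed).days > max_age_days]
```

The `overdue` query is the piece an auditor will ask about first: it demonstrates that record-keeping is maintained, not merely created.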
Download the DOCX
The downloadable template ships with a section per Article 17 element, the ISO 9001 cross-reference matrix, the post-market monitoring plan skeleton, the risk management file outline, and a record register. Adapt to your AI system’s risk classification and use case.