AI Regulatory Compliance Checklist
A cross-regulation compliance checklist covering GDPR, EU AI Act, NIS2, DORA, and key standards for organizations deploying AI systems in Europe.
Organizations deploying AI in the EU face overlapping regulatory requirements. This checklist maps common obligations across GDPR, the EU AI Act, NIS2, and DORA to help compliance teams identify gaps.
Governance and Accountability
- Designated responsible person for AI compliance (EU AI Act, GDPR)
- Management body oversight of AI risk (DORA, NIS2)
- AI policy approved by senior management (ISO 42001, NIST AI RMF)
- Data Protection Officer appointed where required (GDPR)
- Documented roles and responsibilities for AI systems (all regulations)
- Regular management reporting on AI risk posture (DORA, NIS2)
- Staff training on AI-specific regulatory obligations (NIS2, DORA)
Risk Assessment
- AI system risk classification completed (EU AI Act)
- Data Protection Impact Assessment for personal data processing (GDPR)
- Fundamental rights impact assessment for high-risk AI (EU AI Act)
- ICT risk assessment including AI components (NIS2, DORA)
- Supply chain risk assessment for AI providers (NIS2, DORA)
- Bias and fairness assessment (EU AI Act, GDPR)
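Risk classification is the gating step for most EU AI Act obligations. A minimal triage helper can be sketched as below; the use-case labels and tier mapping are illustrative assumptions only, and a real determination must follow Annexes I and III of the Act.

```python
# Illustrative EU AI Act risk-triage sketch. The use-case-to-tier
# mapping is an assumption for demonstration, not a legal mapping.
PROHIBITED = {"social_scoring", "subliminal_manipulation"}
HIGH_RISK = {"cv_screening", "credit_scoring", "biometric_identification"}
LIMITED_RISK = {"chatbot", "content_generation"}

def classify_ai_use_case(use_case: str) -> str:
    """Return an indicative risk tier for a use-case label."""
    if use_case in PROHIBITED:
        return "prohibited"
    if use_case in HIGH_RISK:
        return "high"
    if use_case in LIMITED_RISK:
        return "limited"
    return "minimal"  # everything unmatched defaults to minimal risk
```

A helper like this is only useful as a first-pass filter that routes systems to human legal review; the output should feed the fundamental rights impact assessment and DPIA items above.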
Technical Documentation
- Technical documentation per EU AI Act Annex IV (EU AI Act)
- Records of processing activities (GDPR)
- ICT asset inventory including AI systems (DORA, NIS2)
- Training data documentation and provenance (EU AI Act, GDPR)
- Model performance metrics and evaluation results (EU AI Act)
- System architecture and data flow documentation (all regulations)
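Several of the documentation items above overlap: one machine-readable record per AI system can serve the Annex IV file, the GDPR record of processing, and the DORA/NIS2 asset inventory. A minimal sketch, with field names that are illustrative assumptions rather than the Annex IV wording:

```python
# Sketch of a machine-readable AI system inventory record, loosely
# inspired by EU AI Act Annex IV fields. Field names are assumptions.
from dataclasses import dataclass, field, asdict

@dataclass
class AISystemRecord:
    name: str
    intended_purpose: str
    risk_tier: str                                   # e.g. "high", "limited"
    training_data_sources: list = field(default_factory=list)
    personal_data_processed: bool = False            # links to GDPR RoPA
    evaluation_metrics: dict = field(default_factory=dict)

record = AISystemRecord(
    name="resume-screener-v2",
    intended_purpose="Pre-rank job applications for human review",
    risk_tier="high",
    training_data_sources=["internal-ats-2020-2023"],
    personal_data_processed=True,
    evaluation_metrics={"accuracy": 0.91, "demographic_parity_gap": 0.04},
)
inventory_entry = asdict(record)  # serializable for the asset register
```

Keeping this record in version control alongside the model gives an audit trail for the "documented roles and responsibilities" item as well.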
Data Protection
- Lawful basis established for each processing activity (GDPR)
- Data minimization implemented in training and inference (GDPR)
- Data subject rights processes operational (GDPR)
- Cross-border transfer mechanisms in place (GDPR)
- Data retention policies defined and enforced (GDPR)
- Special category data handling safeguards (GDPR)
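The retention item above is one of the few that can be mechanically enforced. A minimal sketch of storage-limitation enforcement, assuming illustrative retention windows (the actual periods must come from your retention policy):

```python
# Retention-policy enforcement sketch (GDPR storage limitation).
# The windows below are illustrative assumptions, not GDPR figures.
from datetime import datetime, timedelta, timezone

RETENTION = {
    "inference_logs": timedelta(days=90),
    "training_snapshots": timedelta(days=365),
}

def purge(records, category, now=None):
    """Keep only records still inside the retention window for `category`."""
    now = now or datetime.now(timezone.utc)
    limit = RETENTION[category]
    return [r for r in records if now - r["created_at"] <= limit]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
logs = [
    {"id": 1, "created_at": now - timedelta(days=120)},  # past the 90-day window
    {"id": 2, "created_at": now - timedelta(days=10)},
]
kept = purge(logs, "inference_logs", now)  # only record 2 survives
```

Running a job like this on a schedule, and logging what it deleted, also produces evidence that the policy is "enforced", not just "defined".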
Security
- Encryption at rest and in transit (NIS2, DORA, GDPR)
- Access control and authentication for AI systems (NIS2, DORA)
- Vulnerability management for AI infrastructure (NIS2, DORA)
- Adversarial robustness testing (EU AI Act)
- Network security for AI endpoints (NIS2)
- Regular security testing including AI components (DORA, NIS2)
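The access-control item can be approached as default-deny, role-based authorization on every AI endpoint. A minimal sketch, where the endpoint paths and role names are illustrative assumptions:

```python
# Role-based access control sketch for AI endpoints (NIS2/DORA
# access-control expectations). Endpoints and roles are assumptions.
ENDPOINT_ROLES = {
    "/model/predict": {"analyst", "service"},
    "/model/retrain": {"ml_engineer"},
    "/model/export":  {"ml_engineer", "auditor"},
}

def authorize(user_roles: set, endpoint: str) -> bool:
    """Allow a call only if the user holds a role permitted for the endpoint."""
    allowed = ENDPOINT_ROLES.get(endpoint, set())  # unknown endpoint: deny
    return bool(user_roles & allowed)
```

The default-deny behavior for unlisted endpoints is the important design choice: a new endpoint stays closed until it is explicitly added to the permission table.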
Transparency and Explainability
- Users informed when interacting with AI (EU AI Act)
- Meaningful information about automated decision logic (GDPR)
- AI system registration in EU database (EU AI Act, high-risk)
- Instructions for use supplied to deployers (EU AI Act)
- Explanation mechanisms for individual decisions (GDPR)
Incident Management
- AI incident detection and response procedures (NIS2, DORA)
- Data breach notification within 72 hours (GDPR)
- Early warning for significant incidents within 24 hours (NIS2); initial major ICT incident report within DORA's deadlines (DORA)
- Serious incident reporting for high-risk AI (EU AI Act)
- Post-incident review and improvement process (DORA)
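Because one incident can trigger several reporting clocks at once, a deadline helper is useful during response. The sketch below covers the two fixed deadlines named above, GDPR Art. 33 (72 hours to the supervisory authority) and the NIS2 24-hour early warning; it is a planning aid, not legal advice, and DORA's tiered reports are omitted.

```python
# Notification-deadline sketch for overlapping incident reporting
# duties (GDPR 72 h, NIS2 24 h early warning). Planning aid only.
from datetime import datetime, timedelta, timezone

def notification_deadlines(detected_at, personal_data_breach, significant_incident):
    """Return the applicable notification deadlines for an incident."""
    deadlines = {}
    if personal_data_breach:
        deadlines["gdpr_supervisory_authority"] = detected_at + timedelta(hours=72)
    if significant_incident:
        deadlines["nis2_early_warning"] = detected_at + timedelta(hours=24)
    return deadlines

t0 = datetime(2024, 6, 1, 9, 0, tzinfo=timezone.utc)
d = notification_deadlines(t0, personal_data_breach=True, significant_incident=True)
```

Wiring this into the incident-response runbook makes the "within 72 hours" and "within 24 hours" items above testable rather than aspirational.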
Third-Party Management
- Data processing agreements with all processors (GDPR)
- ICT third-party risk register (DORA)
- Security requirements in AI vendor contracts (NIS2, DORA)
- Exit strategies for critical AI providers (DORA)
- Sub-processor authorization and monitoring (GDPR)
Ongoing Compliance
- Post-market monitoring system for high-risk AI (EU AI Act)
- Quality management system (EU AI Act, ISO 42001)
- Regular DPIA reviews (GDPR)
- Continuous security posture monitoring (NIS2, DORA)
- Model performance monitoring and drift detection (EU AI Act)
- Annual compliance audit (recommended for all regulations)
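Drift detection for the post-market monitoring items can start very simply, for example with the population stability index (PSI) between a reference score distribution and the live one. In this sketch the 0.2 alert threshold is a common industry rule of thumb, assumed here, not a regulatory figure:

```python
# Minimal drift-detection sketch: population stability index (PSI)
# over pre-bucketed score distributions. Threshold 0.2 is a common
# rule of thumb, assumed for illustration.
import math

def psi(expected, actual):
    """PSI between two lists of bin proportions (same bucketing)."""
    eps = 1e-6  # avoid log(0) on empty bins
    return sum((a - e) * math.log((a + eps) / (e + eps))
               for e, a in zip(expected, actual))

reference = [0.25, 0.25, 0.25, 0.25]  # proportions per score bucket
live      = [0.10, 0.20, 0.30, 0.40]
drifted = psi(reference, live) > 0.2   # flag for investigation
```

A scheduled check like this, with its alerts logged, doubles as evidence for both the EU AI Act post-market monitoring item and the continuous-monitoring items under NIS2/DORA.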
Conformity and Certification
- Conformity assessment completed for high-risk AI (EU AI Act)
- CE marking affixed where required (EU AI Act)
- Declaration of conformity maintained (EU AI Act)
- Consider ISO 42001 certification (voluntary, supports compliance)
- Consider ISO 27001 certification (supports NIS2, DORA)