The GDPR and the EU AI Act were designed as complementary frameworks, but they were drafted by different teams over different years, and the interactions between them require careful navigation. For most AI systems that process personal data in the EU, both apply simultaneously.
The Relationship Between the Two Laws
The EU AI Act explicitly states that it does not replace the GDPR — both apply in parallel. Article 2(7) of the AI Act states that it “shall be without prejudice” to GDPR. The EU AI Act can impose additional obligations beyond what GDPR requires, but it cannot override GDPR protections.
In practice:
- Any AI system that processes personal data must comply with GDPR regardless of AI Act classification
- High-risk AI systems under the AI Act have additional obligations on top of GDPR
- GDPR’s automated decision-making article (Article 22) has significant overlap with AI Act high-risk requirements
GDPR Article 22: The Original AI Regulation
Article 22 of the GDPR is, in effect, the EU's original AI regulation: it gives individuals the right not to be subject to decisions based solely on automated processing (including profiling) that produce legal or similarly significant effects.
Article 22 applies when:
- A decision is made by automated means without human involvement, AND
- The decision produces legal effects or similarly significant effects on the individual
Obligations under Article 22:
- Inform individuals that automated decision-making occurs
- Provide meaningful information about the logic involved
- Allow individuals to request human review
- Allow individuals to contest the decision
Exceptions: Automated decisions are permitted (with safeguards) if they are necessary for entering into or performing a contract, authorized by EU or member state law, or based on explicit consent.
The overlap with the AI Act is significant: high-risk AI systems in employment, credit, and benefits are exactly the type of systems Article 22 was designed to cover.
Where GDPR and the AI Act Overlap
Transparency Obligations
GDPR: Articles 13 and 14 (information to be provided to data subjects) + Article 22 (meaningful information about the logic of automated decisions, often described as a "right to explanation")
AI Act: Article 13 (transparency and provision of information to deployers) + Article 14 (human oversight) + Article 50 (transparency obligations toward affected persons for certain AI systems)
Overlap: Both require telling people that AI is being used in decisions about them, and what the AI is doing. A comprehensive transparency disclosure can satisfy both.
Documentation Requirements
GDPR: Article 30 requires Records of Processing Activities (ROPA); Data Protection Impact Assessments (DPIA) required for high-risk processing
AI Act: Detailed technical documentation required for high-risk AI systems; logs must be maintained
Overlap: A DPIA and an AI Act impact assessment often cover similar ground. Companies doing DPIAs for AI processing are partially building their AI Act documentation.
Data Governance
GDPR: Data minimization, purpose limitation, accuracy, storage limitation
AI Act: Training data must be relevant, sufficiently representative, and, to the best extent possible, free of errors and complete; data governance practices must be documented
Overlap: Good GDPR data governance supports AI Act data requirements. A system compliant with GDPR data quality principles is better positioned for AI Act data governance requirements.
Human Oversight
GDPR: Article 22 requires human review option for automated decisions
AI Act: Human oversight measures must be built into high-risk AI systems; operators must be able to override
Overlap: Both require that humans can intervene. The AI Act is more prescriptive about what "oversight" must mean technically.
Where They Diverge
Technical documentation: The AI Act requires detailed technical documentation about the AI system itself (architecture, training methodology, testing) that goes far beyond what GDPR requires about processing activities.
Conformity assessment: The AI Act requires a conformity assessment before market entry for high-risk systems. GDPR has no equivalent pre-market clearance.
Prohibited practices: The AI Act outright bans certain AI applications. GDPR restricts (but doesn’t categorically ban) automated decision-making.
Scope: GDPR is limited to personal data. The AI Act covers AI systems broadly, including those that don’t process personal data (though most do).
Practical Integration Strategy
Don’t run parallel programs. A combined GDPR + AI Act compliance program is more efficient than treating them separately.
Start with your DPIA process. If you have high-risk AI processing, you likely need both a DPIA and an AI Act risk assessment. Design your DPIA template to capture the information needed for both.
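As an illustrative sketch of that combined template, the following captures typical DPIA questions alongside AI Act risk-assessment fields in a single record. All field names here are hypothetical examples for illustration, not terms prescribed by either regulation:

```python
# Illustrative combined DPIA + AI Act assessment template.
# Field names are hypothetical, not regulatory terms of art.
COMBINED_ASSESSMENT_TEMPLATE = {
    # Ground typically covered by a GDPR DPIA (Art. 35 GDPR)
    "processing_purpose": None,
    "data_categories": None,
    "necessity_and_proportionality": None,
    "risks_to_data_subjects": None,
    "mitigation_measures": None,
    # Additional ground for an AI Act risk assessment
    "ai_act_risk_classification": None,   # e.g. "high-risk", "limited", "minimal"
    "training_data_governance": None,
    "human_oversight_measures": None,
    "accuracy_and_robustness_testing": None,
}


def unanswered_fields(assessment: dict) -> list[str]:
    """Return the template fields not yet answered."""
    return [f for f, answer in assessment.items() if answer is None]
```

Running one template through both reviews then becomes a matter of filling every field once, rather than maintaining two overlapping questionnaires.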
Map your Article 22 inventory to the AI Act. Systems subject to Article 22 automated decision-making restrictions are strong candidates for high-risk classification under the AI Act.
Use your Records of Processing Activities (ROPA). Your ROPA should already capture AI processing activities. Extend it to capture AI Act classification and compliance status.
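A minimal sketch of that extension, assuming a hypothetical in-house ROPA record (the field names are illustrative, not prescribed by either law):

```python
from dataclasses import dataclass


@dataclass
class RopaEntry:
    """One Record of Processing Activities entry (Art. 30 GDPR),
    extended with illustrative AI Act fields. Names are hypothetical."""
    # Core GDPR Art. 30 record fields
    processing_activity: str
    purpose: str
    data_categories: list[str]
    retention_period: str
    # Illustrative AI Act extension
    uses_ai_system: bool = False
    ai_act_classification: str = "unclassified"  # e.g. "high-risk", "minimal"
    conformity_assessment_done: bool = False


def high_risk_gaps(entries: list[RopaEntry]) -> list[str]:
    """List high-risk AI activities still missing a conformity assessment."""
    return [
        e.processing_activity
        for e in entries
        if e.ai_act_classification == "high-risk"
        and not e.conformity_assessment_done
    ]
```

Because the AI Act fields live on the same record as the GDPR fields, one inventory review surfaces gaps under both frameworks at once.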
One transparency notice, two frameworks. Draft a single transparency disclosure that satisfies GDPR information obligations and AI Act transparency requirements. Regulators in both frameworks expect plain-language explanation.
For most AI deployments in the EU, GDPR and the AI Act are not alternatives — they’re concurrent obligations. Building a joint compliance framework from the start is the practical path.
This article is for informational purposes only and does not constitute legal advice. Always consult qualified counsel before making compliance decisions.
