The Ledger · Thursday, 08 January 2026Issue № 21All issues →

AI Compliance Hub · newsroom

Enforcement Updates · 6 min read

Clearview AI GDPR Fines Across Europe: What the Enforcement Pattern Tells Us

Clearview AI has faced GDPR enforcement actions across the EU totaling roughly €100 million in fines. Here’s what the cases reveal about how regulators are approaching AI and biometric data.


Clearview AI’s facial recognition database has become the most prominent GDPR enforcement target in AI history. The company built a database of billions of facial images scraped from the internet without consent and licensed it to law enforcement. European regulators responded with coordinated enforcement that provides a roadmap for how GDPR applies to AI systems built on scraped data.


The Clearview Business Model

Clearview AI scrapes publicly available images from websites and social media platforms, extracts facial feature data (a biometric identifier under GDPR), and creates a searchable database. Law enforcement agencies can submit a photo and find matches across billions of images.

The model triggered immediate privacy concerns:

  • Biometric data was collected without consent
  • Individuals had no notice their images were in the database
  • No lawful basis under GDPR was established
  • The opt-out mechanism Clearview offered was inadequate under EU standards

The Enforcement Actions

Italy (2022): €20 million fine. Italian DPA ordered Clearview to stop processing Italian residents’ data and delete existing data.

France (2022): €20 million fine. The French DPA (CNIL) found Clearview violated the GDPR’s lawful-basis requirement, failed to respond to access requests, and failed to comply with deletion requests. In 2023 the CNIL imposed a further €5.2 million penalty for non-compliance with its order.

Greece (2022): €20 million fine. Hellenic DPA issued the maximum fine available.

UK (2022): £7.5 million fine under the UK GDPR (post-Brexit). The ICO found the same violations. The fine was overturned on appeal in 2023, when the First-tier Tribunal held that Clearview’s processing on behalf of foreign law enforcement clients fell outside the UK GDPR’s scope.

Netherlands (2024): €30.5 million fine. The Dutch DPA also warned that use of Clearview’s services by Dutch organizations is itself prohibited.

Austria, Belgium, and others: Additional enforcement actions and compliance orders.

The total regulatory exposure across Europe approached €100 million in fines, plus compliance orders requiring data deletion and cessation of processing.


What the Cases Established

Scraping public images is not consent. The fact that an image is publicly accessible does not mean the person consented to biometric processing. The GDPR’s lawful basis for biometric data is strict — explicit consent or a limited set of statutory exceptions.

Geographic scope is extraterritorial. Clearview is a US company that has never had an EU establishment. Regulators applied the GDPR under Article 3(2) because the company monitored the behavior of people in the EU. The GDPR reaches any organization that offers services to, or monitors, people in the EU, regardless of where the controller is based.

Biometric data gets the highest protection. GDPR Article 9 places biometric data in the “special categories” of sensitive data that require explicit consent or a statutory basis. Generic “legitimate interests” claims don’t work.

Ignoring regulator correspondence is expensive. In multiple cases, Clearview’s failure to respond to subject access requests and regulator inquiries aggravated the penalties.


What This Means for AI Companies

The Clearview precedent applies broadly. Any AI system that processes biometric data of EU residents without adequate lawful basis faces the same theory of liability. This includes:

  • AI systems trained on scraped images
  • Facial recognition for marketing or analytics
  • Emotion detection from video

Scraping for AI training is legally contested. The Clearview cases are specifically about using scraped biometric data operationally. Training data scraping is a related but distinct legal question currently being litigated across multiple jurisdictions.

Enforcement coordination works. The European Data Protection Board coordinated the Clearview enforcement across member states. The AI Act creates a similar cross-border enforcement mechanism for high-risk AI.


The Broader Pattern

Clearview is the most visible case, but it’s part of a broader enforcement pattern:

  • Facial recognition in retail: Several EU retailers have been investigated for using facial recognition for loss prevention
  • Emotion recognition: Tools that infer emotional states from video are receiving increasing scrutiny
  • Biometric marketing: Tools that target advertising based on physical characteristics face legal challenges

The Clearview cases make the legal position clear: biometric AI that processes EU residents’ data without explicit consent or another valid lawful basis violates the GDPR by design.

Tagged regulations
GDPR · Clearview AI · Biometrics · Enforcement · Europe
AI Compliance Hub editors
The editorial desk covers AI and cyber regulation across the US, EU, and UK. Tips? editors@aicompliancehub.com
Not legal advice

This article is for informational purposes only and does not constitute legal advice. Always consult qualified counsel before making compliance decisions.
