The Ledger · Tuesday, 10 March 2026 · Issue № 29

AI Compliance Hub · newsroom

Regulation Analysis · 8 min read

CCPA ADMT Final Rules: What AI Teams Need to Know

California’s Automated Decision-Making Technology rules took effect in 2026. Here’s what the final rules require, who they cover, and what your AI team needs to do to comply.


After years of rulemaking, California’s Automated Decision-Making Technology (ADMT) regulations — part of the California Consumer Privacy Act framework — are now in force. The rules give California residents new rights regarding AI systems that make or significantly influence decisions about them.


What the ADMT Rules Regulate

The ADMT rules apply to “automated decision-making technology”: any system that uses computation to make or significantly contribute to decisions about people. This includes:

  • Credit and insurance decisioning systems
  • Hiring and HR AI systems
  • Healthcare treatment recommendation systems
  • Personalization systems that affect access to services or pricing
  • Content moderation systems

The critical phrase is “significantly contribute to.” Even if a human makes the final call, an AI system may be covered if it meaningfully shapes the options the human considers or recommends a specific course of action.


The New Consumer Rights

The ADMT rules create three new rights for California residents:

Right to Opt Out

Consumers have the right to opt out of their personal information being used for ADMT. This applies when:

  • The ADMT is used for “significant decisions” about the consumer, OR
  • The ADMT is used for “extensive profiling” of the consumer

Significant decisions include decisions regarding employment (hiring, promotion, termination), credit (eligibility, terms), housing, insurance, healthcare, and access to services.

Opt-out mechanics: You must provide a clear and conspicuous opt-out mechanism. A link in your privacy policy footer is not sufficient. The opt-out must be obvious and usable.

Consequences of opt-out: When a consumer opts out, you cannot use their personal information in the ADMT for covered purposes. If you can’t serve them without ADMT, you must offer an alternative or inform them they can’t receive the service.

Right to Access

Consumers have the right to know:

  • Whether ADMT is being used to make decisions about them
  • What the ADMT does (general explanation, not proprietary model details)
  • What logic is used in a general sense
  • What information is used as input

Right to Correction

If ADMT uses inaccurate information to make decisions, consumers have the right to correct that information, which may require re-running the ADMT with corrected inputs.
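To make the re-run step concrete, here is a minimal Python sketch of a correction workflow. The record fields (`annual_income`, `open_accounts`) and the decision rule are hypothetical stand-ins, not anything prescribed by the rules:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ConsumerRecord:
    consumer_id: str
    annual_income: int   # hypothetical input feature
    open_accounts: int   # hypothetical input feature

def admt_decision(record: ConsumerRecord) -> str:
    """Stand-in for the covered ADMT: a deliberately trivial eligibility rule."""
    return "approved" if record.annual_income >= 50_000 else "denied"

def handle_correction(record: ConsumerRecord, corrections: dict) -> tuple:
    """Apply the consumer's corrections, then re-run the ADMT on the fixed inputs."""
    corrected = replace(record, **corrections)
    return corrected, admt_decision(corrected)
```

The point of the sketch: the correction right is not satisfied by merely updating the stored data — the decision made on the inaccurate data may need to be re-evaluated.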


What Businesses Must Do

Update Privacy Notices

Your privacy notice must disclose:

  • Whether you use ADMT for significant decisions about consumers
  • What types of decisions ADMT is used for
  • How consumers can exercise their rights

Build Opt-Out Mechanisms

For covered ADMT uses, implement:

  • A clear opt-out mechanism accessible before the ADMT decision is made
  • A process for honoring opt-outs promptly
  • A process for documenting opt-outs and ensuring ADMT systems respect them
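The mechanics above amount to a gate in front of every covered ADMT call. The sketch below is illustrative only — the registry, function names, and routing logic are assumptions, not a prescribed design:

```python
class OptOutRegistry:
    """Hypothetical store of consumers who have opted out of ADMT."""

    def __init__(self):
        self._opted_out = set()

    def record_opt_out(self, consumer_id: str) -> None:
        self._opted_out.add(consumer_id)

    def has_opted_out(self, consumer_id: str) -> bool:
        return consumer_id in self._opted_out

def decide(consumer_id, registry, admt_fn, alternative_fn):
    """Route to the ADMT only if the consumer has NOT opted out.

    `alternative_fn` stands in for the required non-ADMT path
    (human review, or a notice that the service is unavailable).
    """
    if registry.has_opted_out(consumer_id):
        return alternative_fn(consumer_id)
    return admt_fn(consumer_id)
```

The design point is that the opt-out check happens before the ADMT runs, and every decision path is routed through a single gate so an opt-out cannot be silently bypassed.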

Conduct Pre-Use Risk Assessments

For “high-risk” ADMT uses (significant decisions + sensitive data), businesses must conduct risk assessments before implementing the ADMT. The assessment must evaluate:

  • The purpose and necessity of the ADMT
  • The risks to consumers, including bias risks
  • Mitigations implemented

The assessment must be maintained and made available to the CPPA on request.
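One way to make the assessment auditable is to capture it as a structured record that can be produced on request. This is a speculative sketch; the field names are assumptions, not terms from the regulations:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskAssessment:
    """Illustrative record of a pre-use ADMT risk assessment."""
    system_name: str
    purpose: str            # why the ADMT is used
    necessity: str          # why a non-ADMT alternative is insufficient
    consumer_risks: list = field(default_factory=list)   # including bias risks
    mitigations: list = field(default_factory=list)
    completed_on: date = None

    def is_pre_use(self, deployment_date: date) -> bool:
        """The assessment must be finished before the ADMT goes live."""
        return self.completed_on is not None and self.completed_on < deployment_date
```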

Train Staff

Staff involved in ADMT use, oversight, and consumer rights handling must be trained on the new requirements.


The Significant Decisions Threshold

Not all AI is covered — only ADMT used for significant decisions or extensive profiling. This scope question is where most compliance ambiguity lives.

Clearly in scope:

  • Credit scoring used in lending decisions
  • AI resume screening used in hiring
  • Insurance underwriting models
  • Predictive health risk scoring used in treatment decisions

Potentially in scope (context-dependent):

  • Content recommendation systems that affect what information consumers see
  • Dynamic pricing systems that affect what consumers pay
  • Customer risk scoring used in service access decisions

Likely out of scope:

  • Fraud detection AI where the consumer is protected (not harmed) by the decision
  • Purely internal analytics not used in consumer-facing decisions
  • Aggregate analytics without individual decision-making

Key Dates

The final ADMT rules were adopted July 2025 and approved by OAL September 2025. General ADMT rules took effect January 1, 2026. Specific obligations for high-risk ADMT uses take full effect April 1, 2027, giving businesses additional time to implement risk assessments.


What to Do Now

  1. Inventory your AI systems and identify which qualify as ADMT used for significant decisions
  2. Update privacy notices to disclose ADMT use
  3. Design and implement opt-out mechanisms for covered ADMT
  4. Begin risk assessment process for high-risk ADMT uses (due April 2027, but start now)
  5. Review vendor agreements for ADMT services you receive from third parties
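Step 1 above — the inventory — can start as something as simple as tagging each system with its decision area and automation status. The labels below are illustrative assumptions for triage, not regulatory definitions, and any "in scope" call still needs counsel's review:

```python
# Decision areas the article lists as "significant decisions" under the rules.
SIGNIFICANT_DECISION_AREAS = {
    "employment", "credit", "housing",
    "insurance", "healthcare", "service_access",
}

def classify(system: dict) -> str:
    """Roughly bucket an inventory entry; a human must confirm the label."""
    if not system.get("automated"):
        return "out_of_scope"
    if system.get("decision_area") in SIGNIFICANT_DECISION_AREAS:
        return "in_scope_significant_decision"
    return "review_needed"  # e.g. recommendations, dynamic pricing
```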
Tagged: CCPA, ADMT, California, Automated Decisions, Privacy
AI Compliance Hub editors
The editorial desk covers AI and cyber regulation across the US, EU, and UK. Tips? editors@aicompliancehub.com
Not legal advice

This article is for informational purposes only and does not constitute legal advice. Always consult qualified counsel before making compliance decisions.
