Compliance Guides · 10 min read

Colorado AI Act 60-Day Compliance Checklist (SB 24-205)

75 days until enforcement. Use this step-by-step checklist to get your business compliant with the Colorado AI Act before the June 30, 2026 deadline.


The Colorado AI Act (SB 24-205) takes effect June 30, 2026 — 75 days from now. If your business deploys AI systems that make consequential decisions about Colorado consumers, you have legal obligations to meet before that date. Penalties reach $20,000 per violation, per consumer.

This Colorado AI Act compliance checklist breaks the work into five phases you can execute over the next 75 days. Work through each section in order. Check off each item as you complete it.


What Is the Colorado AI Act (SB 24-205)?

Colorado SB 24-205 — signed by Governor Polis on May 17, 2024 — is the first comprehensive US state AI law to take effect. It requires businesses that deploy high-risk AI systems affecting Colorado consumers to use reasonable care to prevent algorithmic discrimination.

The law covers deployers (businesses using AI to make decisions) and developers (businesses building or selling AI systems). Most compliance obligations fall on deployers.

Enforcement begins June 30, 2026. The Colorado Attorney General has exclusive enforcement authority — there is no private right of action. However, the AG must issue a 60-day cure notice before initiating formal enforcement, so businesses that have made a documented good-faith compliance effort have a meaningful opportunity to correct issues.


Who Must Comply?

Deployers — You Are Covered If:

You use an AI system that makes, or is a substantial factor in making, a consequential decision affecting a Colorado consumer. Consequential decisions include:

  • Employment: Hiring, promotion, termination, compensation, scheduling
  • Credit and Lending: Loan approvals, credit limits, interest rates, lease decisions
  • Education: Admissions, financial aid, academic evaluation, credentialing
  • Healthcare: Diagnosis, treatment recommendations, medication decisions
  • Housing: Rental applications, purchase decisions, pricing
  • Insurance: Applications, underwriting, claims decisions, pricing
  • Legal Services: Legal representation, referrals, bail, sentencing (if applicable)
  • Government Services: Access to essential government benefits or services

You do not need to be headquartered in Colorado. If your AI affects Colorado residents in these domains, you are covered.

Developers — You Are Covered If:

You build, train, or sell a high-risk AI system to deployers. Your obligations are narrower: primarily documentation and disclosures to deployers.

Small Business Exemption (Narrow)

The law includes scaled obligations for small businesses, but the thresholds are narrow. Review the exemption carefully with counsel — do not assume you qualify.


The Colorado AI Act 60-Day Compliance Checklist

Phase 1: Inventory and Scoping (Days 1–10)

  • Identify every AI system your organization uses that makes or influences decisions about individuals
  • For each system, determine: does it make consequential decisions (employment, credit, housing, healthcare, education, insurance, legal, or government services)?
  • Determine whether any affected individuals are Colorado consumers (residents — this is broader than Colorado employees)
  • Classify each covered system as high-risk or not, and document your reasoning
  • Build a simple AI inventory spreadsheet: system name, vendor, domain, risk classification, compliance owner
  • Assign a compliance owner for each high-risk AI system
  • Brief your legal, HR, engineering, and product teams on the law's requirements
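
The inventory spreadsheet suggested above can be generated programmatically as a starting point. A minimal sketch using Python's csv module; the column names, the example system, and the vendor are illustrative placeholders to adapt to your organization:

```python
import csv
import io

# Columns mirror the inventory fields suggested above; rename to fit your org.
FIELDS = ["system_name", "vendor", "domain", "risk_classification", "compliance_owner"]

def build_inventory(rows):
    """Render the AI inventory as CSV text, ready to open as a spreadsheet."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

inventory = build_inventory([
    {
        "system_name": "ResumeRank",         # hypothetical resume-screening tool
        "vendor": "Acme HR Tech",            # hypothetical vendor
        "domain": "employment",
        "risk_classification": "high-risk",  # substantial factor in hiring decisions
        "compliance_owner": "VP People Ops",
    },
])
```

One row per AI system keeps the scoping decision (high-risk or not) and its owner auditable in a single place.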

Phase 2: Documentation and Vendor Due Diligence (Days 5–20)

  • For each high-risk AI system you use, request the following from your AI vendor in writing:
      • General description of the system's purpose and capabilities
      • Known risks of algorithmic discrimination and how they are mitigated
      • Training data sources and validation methodology
      • How the vendor supports deployer compliance (documentation, audit trail, explanation APIs)
  • Review your contracts with AI vendors — add representations about AI Act compliance if missing
  • Confirm whether your vendor has conducted or can provide bias testing results for the system
  • If a vendor cannot provide adequate documentation, escalate to legal — this is a compliance gap
  • For internally built AI systems, document the above directly from your engineering and data science teams
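
One lightweight way to track the vendor requests above is a simple gap check per vendor. A sketch; the document labels below are illustrative shorthand, not terms from the statute:

```python
# Vendor artifacts requested in the checklist above (illustrative labels).
REQUIRED_DOCS = {
    "system_description",
    "discrimination_risk_summary",
    "training_data_methodology",
    "deployer_support_materials",
    "bias_testing_results",
}

def documentation_gaps(received):
    """Return the requested vendor documents still outstanding.

    Each remaining gap is a compliance risk to escalate to legal.
    """
    return REQUIRED_DOCS - set(received)

# Example: vendor has sent two of the five requested artifacts.
gaps = documentation_gaps({"system_description", "bias_testing_results"})
```

Running this per vendor each week gives you a dated record of outstanding requests, which also supports the good-faith documentation discussed in Phase 5.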

Phase 3: Impact Assessments (Days 10–40)

  • Complete a written impact assessment for each high-risk AI system. Each assessment must document:
      • The intended purpose of the system and its reasonably foreseeable uses
      • Known and reasonably foreseeable risks of algorithmic discrimination
      • Categories of training data used and how data quality was ensured
      • How the system was evaluated for discriminatory outcomes (bias testing methodology)
      • What transparency and explainability measures are in place
      • How human oversight is implemented before consequential decisions are final
      • How the organization will monitor for disparate impact in production
      • Any mitigation controls in place to reduce discrimination risk
  • Have your impact assessment reviewed by legal counsel before the June 30 deadline
  • Set a calendar reminder to update impact assessments annually and within 90 days of any material system change
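
The required assessment sections above can be enforced with a small completeness check before a draft goes to legal review. A sketch; the section keys and the draft content are illustrative:

```python
# Sections required in each impact assessment, mirroring the checklist above.
REQUIRED_SECTIONS = [
    "intended_purpose",
    "discrimination_risks",
    "training_data_categories",
    "bias_testing_methodology",
    "transparency_measures",
    "human_oversight",
    "production_monitoring",
    "mitigation_controls",
]

def missing_sections(assessment):
    """List required sections that are absent or left blank.

    The assessment is not ready for legal review until this list is empty.
    """
    return [s for s in REQUIRED_SECTIONS if not assessment.get(s)]

# Hypothetical partially drafted assessment.
draft = {
    "intended_purpose": "Screen rental applications for a property manager.",
    "discrimination_risks": "Proxy discrimination via zip-code-derived features.",
}
todo = missing_sections(draft)
```

Wiring this into your review workflow makes the annual and 90-day update reminders actionable rather than aspirational.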

Phase 4: Consumer-Facing Compliance (Days 25–50)

  • Draft consumer notification language disclosing that a high-risk AI system was used in a consequential decision
  • Draft a plain-language explanation describing how the AI contributed to the decision
  • For adverse decisions (denial, rejection, unfavorable outcome), prepare language explaining which factors led to the adverse result
  • Design or document an opt-out mechanism for consumers who do not want AI-assisted decisions
  • Design or document an appeal or human review process for consumers who wish to contest AI-influenced decisions
  • Have consumer-facing notification and explanation templates reviewed by legal before launch
  • Implement notification delivery in your systems (email, in-app, letter, or other appropriate channel for your use case)
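
An adverse-decision notice can start from a fill-in template like the sketch below, built on Python's string.Template. The wording, factors, and appeal URL are placeholders; have counsel approve the actual language before anything ships:

```python
from string import Template

# Hypothetical plain-language adverse-decision notice. Placeholder wording only;
# the real text must be reviewed by legal counsel.
ADVERSE_NOTICE = Template(
    "Your application for $decision_type was not approved. "
    "An automated system was a substantial factor in this decision. "
    "The principal factors were: $factors. "
    "You may request human review or appeal this decision at $appeal_url."
)

notice = ADVERSE_NOTICE.substitute(
    decision_type="a rental unit",                       # illustrative decision
    factors="rent-to-income ratio; prior eviction record",  # illustrative factors
    appeal_url="https://example.com/appeal",             # placeholder URL
)
```

Keeping the factors as a template variable forces each delivery channel (email, in-app, letter) to supply decision-specific reasons rather than boilerplate.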

Phase 5: Governance and Ongoing Program (Days 40–60)

  • Establish a written AI risk management policy that includes:
      • Scope (which AI systems are covered)
      • Principles for responsible AI use
      • Approval process for new AI tools
      • Vendor due diligence requirements
      • Monitoring and incident response procedures
      • Consumer rights and appeal process
  • Assign executive ownership for the AI compliance program
  • Set up a quarterly monitoring process for each high-risk system (check for model drift, disparate impact, and output anomalies)
  • Train relevant employees on the policy and their obligations
  • Document your compliance program — a written, dated record of your good-faith effort is your most important protection if the AG initiates a review
  • Consider adopting the NIST AI RMF as your governance framework — the Colorado AG has pointed to it as a best-practice reference, and conformity with a recognized risk management framework supports an affirmative defense under the law
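
For the quarterly disparate-impact check, one common screen is the four-fifths (80%) rule from US employment-selection guidance: compare favorable-decision rates across groups and flag any ratio below 0.8. This is a screening heuristic, not a legal test, and the group names and counts below are made up:

```python
def selection_rates(outcomes):
    """outcomes maps group -> (favorable_decisions, total_decisions)."""
    return {group: fav / total for group, (fav, total) in outcomes.items()}

def adverse_impact_ratio(outcomes):
    """Ratio of the lowest favorable-decision rate to the highest."""
    rates = selection_rates(outcomes)
    return min(rates.values()) / max(rates.values())

# Hypothetical quarterly numbers for one high-risk system.
quarterly = {"group_a": (40, 100), "group_b": (24, 100)}
ratio = adverse_impact_ratio(quarterly)  # 0.24 / 0.40 = 0.6
flag = ratio < 0.8  # below the four-fifths threshold: investigate and document
```

A flagged quarter does not prove discrimination, but logging the ratio, the investigation, and any remediation is exactly the kind of dated record that demonstrates good faith.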

Key Definitions

High-Risk AI System: An AI system that makes, or is a substantial factor in making, a consequential decision. The key phrase is "substantial factor" — a system that significantly influences the decision (even if a human approves it) may still qualify as high-risk.

Algorithmic Discrimination: Any condition in which the use of an AI system results in unlawful differential treatment or impact that disfavors an individual based on a protected characteristic including age, color, disability, ethnicity, genetic information, limited English proficiency, national origin, pregnancy, race, religion, sex, veteran status, or other protected class.

Consequential Decision: A decision that has a material legal or similarly significant effect on an individual's access to or the terms of a specific set of opportunities or services, including the categories listed above.

Deployer: A person doing business in Colorado that deploys a high-risk AI system in a product or service.

Developer: A person doing business in Colorado that develops or substantially modifies a high-risk AI system for deployment.


Penalties for Non-Compliance

  • Maximum civil penalty: $20,000 per violation
  • How violations are counted: Per consumer or per transaction — penalties can compound rapidly across large user bases
  • Enforcement authority: Colorado Attorney General (exclusive — no private right of action)
  • Cure period: AG must provide 60 days' written notice before initiating enforcement
  • Good-faith defense: Compliance with a recognized AI risk management framework (e.g., NIST AI RMF) is an affirmative defense

The cure period means early enforcement will likely target businesses that have made no compliance effort at all, rather than those with documented programs that have minor gaps.


Frequently Asked Questions

Does this apply to us if we're not in Colorado?

Yes. The law applies based on where your consumers are located, not where your business is headquartered. If your AI system makes consequential decisions affecting Colorado residents, you are covered regardless of your business location.

Our AI vendor says they're compliant — does that cover us?

No. Under the Colorado AI Act, deployers are responsible for compliance — not the vendor. You must conduct your own impact assessments and implement consumer notification and opt-out mechanisms. Vendor documentation helps you complete your assessment, but it does not substitute for it.

What if a human makes the final decision, not the AI?

It depends. If the AI output is a "substantial factor" in the human's decision — for example, a resume screening score that determines which candidates are reviewed — the system is likely still high-risk. If the AI is purely advisory and the human independently reviews all candidates regardless, the analysis is different. Get legal guidance on your specific workflow.

We're a small business — are we exempt?

Possibly, but the exemption is narrow. Review it carefully with counsel. Do not assume you qualify based on headcount or revenue alone.


How regulome.io Can Help

Not sure if the Colorado AI Act applies to your business? Use our free compliance checker — answer four questions, get an instant analysis of which AI laws apply to you and what you need to do.

Check My Compliance — Free →

Need a full compliance roadmap? Our Pro Report ($49) gives you a personalized analysis of every AI law applicable to your business, prioritized by deadline and risk level — including the Colorado AI Act, EU AI Act, NYC Local Law 144, and emerging state laws.

Get Your Pro Report →

Need an AI governance consultant or bias auditor? Browse our directory of verified providers, filterable by jurisdiction and service type.

Browse the Directory →



Not legal advice. This article is for informational purposes only. Always consult a qualified attorney for compliance decisions.