ASAP
Colorado Amends its Artificial Intelligence Law, Substantially Reducing Obligations on Employers
At a Glance
- Colorado’s amended AI law (S.B. 26-189) significantly scales back employer obligations from the prior version and delays the effective date to January 1, 2027.
- The law applies to automated decision-making that materially influences major employment decisions and requires three main actions—clear notice to individuals, a structured adverse action and human review process, and retention of relevant records for at least three years.
- The Act limits liability based on fault between developers and employers, voids certain indemnification clauses, and is enforceable only by the Colorado attorney general (with no private lawsuits).
Colorado’s governor has signed an amendment (S.B. 26-189) to Colorado’s artificial intelligence law, substantially reducing the compliance burden on employers. The Colorado General Assembly passed the bill the day before the end of the 2026 legislative session, and the governor signed it less than two months before the previously enacted law was due to take effect on June 30, 2026. Passage of the amendment ends a nail-biting period for employers, which did not know whether to proceed with compliance implementation for the previously enacted law. The new law (the “CO AI Act” or the “Act”) is based closely on the bill proposed by the governor’s AI Policy Working Group and will go into effect on January 1, 2027. This is still a relatively short period for employers to implement the Act’s three requirements: pre-use notice; an adverse action process; and record retention.
When the CO AI Act applies to employers
The CO AI Act regulates the use of automated decision-making technology to “materially influence” a “consequential decision,” which includes “access to, eligibility for, selection for, or compensation for” employment (“Covered ADMT”).1 This standard appears to cover both internal and external hiring, as well as decisions about eligibility for hire. However, the definition of “consequential decisions” explicitly excludes identity verification, activities related to cybersecurity and sanctions compliance, and low-stakes or routine decisions, actions and business processes, including routine scheduling, administrative routing, customer service, communication of decisions, and workflow management. The Act also does not apply to decisions about independent contractors or to applicants and employees who are not residents of Colorado. Only employers that “do business in Colorado” need comply.2
Unlike many of the new laws in this area, including Colorado’s previous law, the CO AI Act is not limited to artificial intelligence, which is generally defined to include some inferential step. Rather, “automated decision-making technology” is defined to include any “technology that processes personal data and uses computation to generate output….”3 This definition covers a broad range of computational technologies, many of which have been used by employers long before the recent explosion of artificial intelligence tools.
However, the new law explicitly carves out a wide range of clerical tools by excluding from the definition calculators, databases, spreadsheets that require human analysis, and tools used “solely to summarize, organize, translate, … or present information for human review or administrative processing.”4 These exclusions likely cover transcription and note-taking tools, which employers often use to take notes on applicant interviews. The definition also excludes natural language processing and other ChatGPT-like tools, as long as they are not intended to be used for consequential decisions and are subject to an acceptable use policy that prohibits such use.
What the CO AI Act requires from employers
In short, the CO AI Act requires that employers (1) provide notice before using Covered ADMT, (2) implement a robust adverse action process, including notice, a right to correct, and a right to meaningful human review, and (3) retain records about the use of Covered ADMT for three years.
Prior Notice:
Prior to using Covered ADMT, the employer must provide a “clear and conspicuous” notice that the employer uses automated decision-making technology to materially influence a consequential decision, as well as how the individual may obtain additional information.5 The Act provides no further requirements for the content of this pre-use notice, although the Colorado attorney general’s implementing regulations may do so.
By contrast, the Act does define the “clear and conspicuous” standard, which requires a “prominent public notice that is reasonably accessible at points of consumer interaction.”6 A “link or post that is reasonably proximate to the interaction or transaction in which a consequential decision may occur” is one example of “clear and conspicuous” notice. An employer likely could satisfy this standard by, for example, providing the pre-use notice as part of the online job application process. For consequential decisions in the context of the employment relationship, the public posting requirement makes little sense, so employers likely will need to supplement any public posting with a notice that is closer in time to the decision but not public—for example, at the point where employees submit a self-evaluation in the performance evaluation process.
Adverse Action Process:
Within 30 days of making an adverse decision materially influenced by a Covered ADMT’s output, the employer must provide a notice that includes:
- A plain language description of the consequential decision and the role the Covered ADMT played in that decision;
- Instructions and a simple process to request additional information about the Covered ADMT and the inputs, including the name of the Covered ADMT, the Covered ADMT version number, if applicable, the Covered ADMT developer, and the types, categories, and sources of personal data used; and
- An explanation of the right to correct and review, as well as how to exercise these rights.
In addition, in response to the individual’s request, the employer must provide:
- Instructions for requesting personal data and correcting factually incorrect or materially inaccurate personal data used in the consequential decision; and
- An opportunity for meaningful human review and reconsideration of the consequential decision.
Of these, the most burdensome requirement for employers may be the right to human review and reconsideration. However, this right is substantially limited by the caveat that the employer need only comply “to the extent commercially reasonable.”7 The scope of this “commercially reasonable” exception will be a key issue for employers. In many cases, implementing an automated decision-making technology may not be worth the cost where the employer must also set up a parallel process for meaningful human review and reconsideration. In that case, employers may have a reasonable argument that compliance with the review right is not “commercially reasonable.”
Record Retention:
The CO AI Act requires employers to retain for at least three years after making a consequential decision (and longer if required by some other state or federal law) records “reasonably necessary to demonstrate compliance” with the Act.8 The Act’s examples of records to be retained include “covered ADMT version identifiers, changelogs, and documentation of material mitigation changes,” as applicable.9 Other relevant records most likely would include the relevant version of the pre-use notice, a record of how the pre-use notice was communicated before the adverse decision was made, the adverse action notice provided to the individual, and any records of the decision-making process.
How the CO AI Act addresses liability and indemnification
The CO AI Act splits liability for any potential algorithmic discrimination claim between developers and deployers. The CO AI Act clarifies that both the developer and the deployer may be held liable for unlawful discrimination, but only to the extent of their relative fault—in other words, if a deployer uses a Covered ADMT in the way it was “intended, documented, marketed, advertised, configured or contracted” to be used by the developer and the results are still discriminatory, then the developer would be liable. If a deployer uses a Covered ADMT in a way that it was not “intended, documented, marketed, advertised, configured or contracted” to be used by the developer, then the developer would not be liable, but the deployer would be. The CO AI Act does not create joint and several liability, except to the extent permitted under existing law.
The CO AI Act also invalidates contractual indemnification provisions against liability for the developer’s or deployer’s own acts or omissions related to the use of Covered ADMT in violation of Colorado’s discrimination laws. In essence, indemnification clauses that purport to shift a party’s liability for unlawful algorithmic discrimination in consequential decisions onto the other are void as against public policy. This restriction does not apply if the Covered ADMT was used in an unintended manner and the developer complied with its documentation obligations.
In light of these changes, developers and deployers should review their existing contracts, with a critical eye towards any language that may run afoul of the new limitations on indemnification provisions.
How the CO AI Act is enforced
Crucially, the CO AI Act does not create a new private right of action. As with the previous Colorado AI law, only the Colorado attorney general can enforce this law. Violations constitute unfair and deceptive trade practices, which carry potential civil penalties of up to $20,000 for each violation.10 Before initiating an enforcement action, the attorney general must provide a 60-day notice of violation and opportunity to cure, but only if the attorney general deems a cure possible.11
Looking ahead, the Act contemplates additional regulatory development. The Colorado attorney general is required to promulgate implementing regulations by January 1, 2027, signaling that the current statutory framework is likely to evolve as enforcement priorities and practical considerations take shape.
Comparison to other AI laws and the previous Colorado AI law
The CO AI Act eliminates many of the most onerous obligations of Colorado’s previous artificial intelligence law. Specifically, the CO AI Act removes the following mandates that the old law imposed on employers using AI as a substantial factor in employment decisions:
- Reporting findings of discriminatory outcomes to the Colorado attorney general;
- Conducting impact assessments;
- Implementing a risk management policy and program;
- Conducting annual reviews of AI tools;
- Posting or updating privacy policies to describe the use of AI tools;
- Providing notice when interacting with an AI system; and
- Affirmatively avoiding algorithmic discrimination (instead the CO AI Act relies on a prohibition on violating existing state and federal anti-discrimination laws).
The CO AI Act is the third comprehensive AI law applicable to employers in the United States, joining the California and New York City laws in combining notice, rights, and anti-discrimination provisions. Other laws, such as Illinois’s AI provisions in its human rights law, only require notice for the use of AI in employment decisions. In addition, a growing number of state statutes simply clarify that anti-discrimination laws apply to employers when they rely on AI tools that result in discrimination against applicants or employees.
Meanwhile, dozens of additional AI-related bills remain under active consideration in state legislatures nationwide. Given the lack of legislative action at the federal level, employers should expect more state laws and increased complexity in the area of AI.