Information contained in this publication is intended for informational purposes only and does not constitute legal advice or opinion, nor is it a substitute for the professional judgment of an attorney.
Since the California Fair Employment & Housing Council released draft revisions to the state’s employment non-discrimination regulations on March 15, 2022, two noteworthy developments have occurred relating to the nascent law surrounding the use of artificial intelligence, machine learning, and other data-driven statistical processes to automate decision-making in the employment context.
First, on March 16, 2022, in Raines v. U.S. Healthworks Medical Group,1 the United States Court of Appeals for the Ninth Circuit certified to the Supreme Court of California the following question: Does California’s Fair Employment and Housing Act, which defines employer to include “any person acting as an agent of an employer,”2 permit a business entity acting as an agent of an employer to be held directly liable for employment discrimination?
The plaintiffs in Raines consist of a class of current and former job applicants who seek to hold defendants, who were providers of pre-employment medical screenings, liable for asking allegedly invasive and impermissible questions during medical screening exams. “The crucial question of state law is whether the Fair Employment and Housing Act (FEHA) allows employees to hold a business entity directly liable for unlawful conduct when the business entity acted only as the agent of an employer, rather than as an employer itself.”3 The district court ruled that plaintiffs had adequately alleged that defendants were the agents of prospective employers, but that FEHA does not impose direct liability on agents.4
As the Ninth Circuit aptly explained, “[w]hether FEHA’s definition of the term ‘employer’ includes a business entity acting as an employer’s agent is an unresolved question of law with significant public policy implications. California has millions of employees who could be impacted by a decision defining the scope of liability for business entities acting as agents of their employers.”5 In sum, the outcome of this appeal could bear directly upon the sweep of liability for third-party agents that use automated-decision systems to assist employers in their decision-making processes.
Second, on March 25, 2022, the Fair Employment and Housing Council conducted a remote public workshop and review session, which included the working draft of employment regulations regarding automated-decision systems. The Council indicated during the session that the purpose of the draft regulations is to address the demonstrated potential of artificial intelligence, including algorithms, to unlawfully discriminate in at least the housing and employment contexts. Among other highlights, the Council made clear that it intends the regulations to encompass claims that could be brought under both intentional discrimination and disparate impact theories. The Council also indicated that standard defenses to an allegation of employment discrimination, such as business necessity, would still apply under any revised regulations, though it is uncertain at this early juncture whether any new defenses would be recognized. Finally, the Council declined to set forth a timeframe for adopting the draft regulations, emphasizing instead that they are part of a pre-rulemaking “brainstorming” effort, that they are not fully formed, and that their implementation is not imminent.
Thus, while California largely remains in a “wait and see” posture pending the outcome in Raines and the Council’s finalization of its proposed regulations, it is clear that employers will soon need to prepare for heightened regulation of their use of artificial intelligence and machine learning—and potentially of the vendors who deploy such tools—in their employment decision-making processes.
1 Raines v. U.S. Healthworks Medical Group, No. 21-55229 (9th Cir. Mar. 16, 2022).
2 Cal. Gov’t Code § 12926(d).
3 Raines, slip op. at 5.
4 Id. at 8.