State Authority to Regulate Artificial Intelligence in Employment Decisions
Summary
This opinion from the California Attorney General examines the state's authority to regulate the use of artificial intelligence and automated decision-making tools in employment contexts. It analyzes existing antidiscrimination statutes, the California Consumer Privacy Act, and the state's Fair Employment and Housing Act as applied to AI-driven hiring and promotion decisions.
The opinion discusses the legal theories under which employers may be held liable for discriminatory outcomes produced by AI tools, including disparate impact analysis and the duty to validate selection procedures. It examines federal preemption considerations and the relationship between state regulation and federal antidiscrimination law.
The opinion concludes that California has substantial authority to regulate AI in employment, recommending that employers conduct regular bias audits, provide notice to applicants and employees when AI tools are used, and maintain records sufficient to demonstrate compliance with antidiscrimination requirements.
Full Opinion Analysis
Background
The use of artificial intelligence and automated decision-making systems in employment has expanded rapidly. Employers and their vendors deploy AI tools for resume screening, candidate assessment, interview analysis, performance evaluation, and promotion recommendations. These systems promise greater efficiency and objectivity but carry significant risks of perpetuating or amplifying existing biases. Studies have documented instances where AI hiring tools discriminated against women, older workers, individuals with disabilities, and racial minorities, often because the training data reflected historical patterns of discrimination.
California has been at the forefront of regulating both artificial intelligence and employment practices. The state's Fair Employment and Housing Act (FEHA) is one of the most comprehensive antidiscrimination statutes in the nation, covering a broader range of protected characteristics and providing stronger remedies than federal law. The California Consumer Privacy Act (CCPA), as amended by the California Privacy Rights Act (CPRA), grants consumers rights regarding automated decision-making technology. This opinion examines how these existing legal frameworks apply to AI in employment and what additional regulation may be warranted.
Legal Analysis
Under FEHA, employers are liable for employment practices that have a disparate impact on protected groups unless the practice is job-related and consistent with business necessity. This framework applies regardless of whether the discriminatory outcome is produced by a human decision-maker or an automated system. An employer whose AI screening tool disproportionately excludes women or minorities from the applicant pool bears the same legal liability as if a human recruiter had applied a discriminatory criterion. The employer cannot escape liability by claiming ignorance of the algorithm's decision-making process or by attributing the outcome to a third-party vendor's tool.
The duty to validate selection procedures is established under both the federal EEOC's Uniform Guidelines on Employee Selection Procedures and California's own regulations. When an employment test or selection procedure produces adverse impact, the employer must demonstrate that the procedure is valid, meaning it accurately predicts job performance and is not more exclusionary than alternative procedures that would serve the same purpose. AI hiring tools are subject to these validation requirements, and employers must be prepared to demonstrate that their AI systems have been validated for the specific positions and populations to which they are applied. The opacity of many AI systems, often referred to as the "black box" problem, makes validation particularly challenging but does not excuse the employer from compliance.
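The adverse-impact threshold most commonly used in practice is the "four-fifths" rule of thumb from the EEOC's Uniform Guidelines: a group's selection rate below 80% of the highest group's rate is treated as evidence of adverse impact. The following is a minimal sketch of that calculation; the group labels and applicant-flow figures are illustrative assumptions, not data from the opinion.

```python
# Sketch of the four-fifths adverse-impact check from the EEOC Uniform
# Guidelines. Group names and counts below are hypothetical examples.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who were selected."""
    return selected / applicants

def adverse_impact_ratios(groups: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Compare each group's selection rate to the highest-rate group.

    `groups` maps a group label to (selected, applicants). A ratio below
    0.8 (four-fifths) is the Guidelines' rule of thumb for adverse impact.
    """
    rates = {g: selection_rate(s, a) for g, (s, a) in groups.items()}
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

# Illustrative applicant-flow data for an AI resume screen.
data = {"group_a": (48, 100), "group_b": (30, 100)}
ratios = adverse_impact_ratios(data)
flagged = {g: r for g, r in ratios.items() if r < 0.8}
print(flagged)  # group_b: rate 0.30 vs 0.48 -> ratio 0.625, below 0.8
```

A ratio below 0.8 does not end the inquiry; it shifts the burden to the employer to show validity, as the paragraph above describes. Statistical-significance tests are also used alongside the four-fifths heuristic in litigation.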
The CCPA and CPRA provide additional protections specific to automated decision-making. Consumers have the right to opt out of automated decision-making technology that produces legal or similarly significant effects, and businesses must provide meaningful information about the logic involved in such decision-making processes. In the employment context, these provisions intersect with FEHA's antidiscrimination protections, creating a layered regulatory framework. The opinion addresses federal preemption concerns, concluding that state regulation of AI in employment is not preempted by federal antidiscrimination law, which establishes a floor rather than a ceiling for worker protections.
Conclusion
California has substantial authority to regulate the use of AI in employment decisions under existing state law. Employers that deploy AI tools bear the same antidiscrimination obligations as those that rely on human decision-makers, including the duty to validate selection procedures and to remedy disparate impact. The state should exercise its regulatory authority to require bias audits, transparency in AI-assisted decision-making, and adequate record-keeping to facilitate enforcement of antidiscrimination protections.
Practical Impact
This opinion puts California employers on notice that AI hiring and promotion tools are subject to the same antidiscrimination scrutiny as traditional employment practices. Companies should conduct regular bias audits of their AI systems, maintain documentation of validation studies, and ensure that applicants and employees are informed when AI tools play a significant role in employment decisions. AI vendors serving the California market should expect increased scrutiny and should be prepared to demonstrate the fairness and validity of their products. Employment attorneys should advise clients to treat AI system deployment as a compliance-sensitive activity requiring legal review.
Disclaimer: This is a summary of an Attorney General opinion provided for informational purposes, and is legal information, not legal advice. AG opinions represent the legal interpretation of the issuing office and do not constitute binding judicial precedent. Laws vary by jurisdiction and change frequently; always verify current law with official sources and consult a licensed attorney in your jurisdiction for advice on your specific situation.