The California Privacy Protection Agency released proposed regulations in November 2024 that, if finalized, would create significant new hurdles for employers using artificial intelligence to assist with a variety of employment decisions.
New regulatory burdens would apply to businesses that deploy “automated decisionmaking technology” (known as “ADMT”) to either make decisions or “substantially facilitate human decisionmaking” with respect to “significant decisions” about job applicants, employees, or even independent contractors. The proposed regulations broadly define “significant decisions” for employees to include the following:
- Hiring
- Allocation or assignment of work
- Decisions regarding salary, hourly, or per-assignment compensation, incentive compensation such as bonuses, or other benefits
- Promotions
- Demotions, suspensions, and terminations.
ADMT is likewise broadly defined to include “any technology that processes personal information and uses computation to execute a decision, replace human decisionmaking, or substantially facilitate human decisionmaking.” ADMT can include machine learning systems, artificial intelligence, or even statistics. For example, a business may use a spreadsheet to track managers’ personal information. If it then runs a regression analysis to identify the characteristics those managers have in common, and looks for co-occurrences of those characteristics among junior employees to decide which of them to promote, that workflow would be an automated decisionmaking technology covered by the regulations.
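The example can be surprisingly mundane in practice. Below is a minimal sketch, using invented column names and data, of a regression fit on managers’ attributes and then applied to score junior employees for promotion; under the proposed definition, such a workflow processes personal information and uses computation to substantially facilitate a significant decision, and so would be in scope as an ADMT.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical spreadsheet of employee attributes (personal information);
# the column names and values are invented for illustration only.
employees = pd.DataFrame({
    "tenure_years":   [8, 10, 7, 2, 3, 4, 1, 5],
    "training_hours": [40, 60, 55, 10, 20, 35, 5, 25],
    "is_manager":     [1, 1, 1, 0, 0, 0, 0, 0],
})
features = ["tenure_years", "training_hours"]

# Step 1: a regression over current managers' data to learn which
# characteristics co-occur with holding a manager role today.
model = LogisticRegression().fit(employees[features], employees["is_manager"])

# Step 2: score the junior employees with the fitted model and shortlist
# the highest-scoring candidates for promotion.
juniors = employees[employees["is_manager"] == 0].copy()
juniors["promotion_score"] = model.predict_proba(juniors[features])[:, 1]
shortlist = juniors.nlargest(3, "promotion_score")
print(shortlist)
```

Nothing in this sketch resembles cutting-edge AI, which is the point: even routine statistical tooling built on employee data can fall within the proposed definition.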
Risk assessments
Under Section 7150 of the proposed regulations, businesses that use ADMT for significant decisions about their job applicants, employees, or independent contractors must conduct a risk assessment before they begin processing. Risk assessments include many of the items that companies have come to adopt in their privacy impact assessments, including information about the nature, purpose, and risks of the processing. For employers, however, the CPPA’s proposed regulations go a step further by requiring a business to identify potential “[d]iscrimination upon the basis of protected classes that would violate federal or state antidiscrimination law.” The agency appears motivated by reports that some companies’ AI and machine learning resume-screening systems began to select candidates based on protected characteristics, or on proxies for those characteristics.
Employers familiar with New York Local Law 144 regarding automated employment decision tools may recall that Local Law 144 requires employers to use independent auditors to conduct a bias audit and to publish the results on their website before using such tools to screen job applicants. Although the language of the proposed California regulations does not appear to require the same level of auditing, employers should explore the various ways that artificial intelligence is being used in their organizations for employment decisions and weigh whether audits or other statistical assessments would help to validate that the systems do not have any disparate impact under federal or state anti-discrimination laws.
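By way of illustration, one widely used screening heuristic is the EEOC’s “four-fifths” rule of thumb for adverse impact, which compares each group’s selection rate to the highest group’s rate. The proposed regulations do not prescribe any particular methodology, and the group labels and figures below are hypothetical; the sketch only shows the kind of selection-rate comparison an employer might run against its screening tool’s outcomes.

```python
# Hypothetical adverse impact check using the "four-fifths" rule of thumb.
def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants

# Hypothetical screening outcomes by group (labels are placeholders).
rates = {
    "group_a": selection_rate(selected=48, applicants=100),
    "group_b": selection_rate(selected=30, applicants=100),
}
reference = max(rates.values())  # highest selection rate as the benchmark

for group, rate in rates.items():
    impact_ratio = rate / reference
    flag = "review for disparate impact" if impact_ratio < 0.8 else "within rule of thumb"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} -> {flag}")
```

An impact ratio below 0.8 is a screening signal, not a legal conclusion; any finding would still need to be evaluated under the applicable federal or state anti-discrimination standards.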
Pre-use notice
Section 7220 of the California proposed regulations would require businesses to provide a written notice to applicants, employees, and independent contractors before using an ADMT to process their data for a significant decision. The pre-use notice must include the following:
- A plain-language explanation of the specific purpose for which the business uses ADMT
- A description of the employee’s right to opt out of use of the ADMT with respect to decisions about them, as well as how to submit a request to opt out
- Information on the right to make a request about the ADMT’s use (“requests to access ADMT”)
- A notice that the business is prohibited from retaliating against the employee for exercising their rights.
Businesses should evaluate whether to include specific information about the use of AI or ADMTs in their recruiting and employee management processes.
Access rights to ADMT
Under proposed Section 7222, businesses using ADMT would also be required to provide “access” to the ADMT, meaning that consumers can request information about:
- The output of the ADMT in the applicant or employee’s case,
- How the business used the output with respect to the applicant or employee, and
- How the ADMT worked with respect to the applicant or employee, including information such as how the system’s logic was applied to the employee, and the key parameters that affected the output with respect to the applicant or employee personally.
This requirement underscores the growing emphasis on transparency and accountability in the use of AI systems. However, parsing how AI and machine learning systems render output for individual cases has been difficult, given that they rely on exceedingly complex interactions between many variables. If the regulations go into effect in their current form, employers and software developers are likely to struggle to provide the system logic and the key parameters about individual decisions. Employers should work with their vendors to assess the underlying ability of their software to explain the output of artificial intelligence systems.
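To illustrate the gap, the sketch below assumes a simple linear scoring model, where the “key parameters” behind one applicant’s output can be read directly from each feature’s contribution to the score; the function and field names are hypothetical. Real-world systems built on more complex models generally cannot be decomposed this cleanly, which is why dedicated explanation tooling from the vendor may be needed.

```python
from typing import Dict, List, Tuple

def key_parameters(weights: Dict[str, float],
                   applicant: Dict[str, float],
                   top_n: int = 3) -> List[Tuple[str, float]]:
    """Return the features that contributed most to this applicant's score."""
    contributions = {name: weights[name] * applicant.get(name, 0.0) for name in weights}
    return sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)[:top_n]

# Hypothetical linear model weights and one applicant's (scaled) feature values.
weights = {"years_experience": 0.6, "skills_match": 1.1, "assessment_score": 0.9}
applicant = {"years_experience": 0.4, "skills_match": 0.8, "assessment_score": 0.2}

score = sum(weights[f] * applicant[f] for f in weights)
print(f"score for this applicant: {score:.2f}")
for feature, contribution in key_parameters(weights, applicant):
    print(f"{feature}: contribution {contribution:+.2f}")
```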
Next steps
The CPPA's proposed regulations signal a significant shift in regulatory expectations for businesses using artificial intelligence in hiring and employment decisionmaking processes. Government agency rulemaking, including that of the CPPA, typically involves multiple stages, often incorporating feedback from public comments before rules can be finalized. Thus, these proposed regulations may evolve significantly before they become final.
Employers should continue to closely monitor the developments, conduct a system inventory to identify any technology that they currently use that might fall under the scope of the ADMT regulations for employment decisions, and assess whether they may in the future need to update employee privacy policies, evaluate vendors' technology and compliance, conduct bias impact audits, or implement new processes for responding to applicant and employee questions about the use of AI for employment decisions.
The CPPA is accepting written comments until February 19. Comments may be mailed to the agency’s office or emailed to regulations@cppa.ca.gov. Note that any comments submitted are public records and will be published on the CPPA’s website.