California Previews Draft Regulations for Automated Decision-Making Technology, Promising More to Come in 2024
On November 27, 2023, the California Privacy Protection Agency (CPPA) released draft regulations for automated decision-making technology (ADMT). While the CPPA has not officially begun the formal rulemaking process, the CPPA’s board will provide feedback on the proposed rules at the December 8, 2023 board meeting. The CPPA has stated that the agency expects to begin the formal rulemaking process next year. The CPPA also released draft regulations for risk assessments, which include provisions relevant to the use of ADMT.
The draft regulations follow growing federal and state responses to the explosive development of artificial intelligence (AI). California is often the first mover when it comes to novel state regulation of technology, and these draft regulations are likely the first ripples of more extensive state efforts to address AI technologies. A high-level summary is provided below. It will be important for businesses to continue to monitor developments in California, as these proposed regulations are far from final.
The draft regulations define ADMT as “any system, software, or process—including one derived from machine-learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking. Automated decisionmaking technology includes profiling.” In turn, “profiling” means “any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”
Overall, these definitions are fairly broad and would cover AI technology as well as, potentially, a wide range of other technologies used to facilitate human decisions.
The companion draft risk assessment regulations propose to require a business to conduct a risk assessment if ADMT is used: (1) for a decision that produces a legal or similarly significant effect for a consumer; (2) for profiling a consumer acting as an employee, independent contractor, job applicant, or student; (3) for profiling a consumer who is in a public place; or (4) for profiling for behavioral advertising.
The draft risk assessment regulations provide another option, which would expand the requirement to conduct a risk assessment to cases where personal information is used to train ADMT: (1) for the four purposes listed above; (2) to establish identity based on biometrics; (3) for facial, speech, or emotion detection; (4) to create deepfakes; or (5) to operate generative models, such as large language models (LLMs).
The draft regulations would require businesses using ADMT to provide consumers with a “Pre-use Notice” of the consumer’s right to: (1) opt out of, and (2) access information about, the business’s use of ADMT. Notably, the draft regulations would require the Pre-use Notice to describe the business’s purpose for the use of ADMT. The draft regulations state that the purpose shall not be described in “generic terms,” such as “to improve our services.”
The draft regulations would give consumers the right to opt out and the right to access information about a business’s use of ADMT in: (1) decisions that produce legal or similarly significant effects concerning a consumer; (2) profiling a consumer acting as an employee, independent contractor, job applicant, or student; and (3) profiling consumers in publicly accessible places. A “decision that produces legal or similarly significant effects concerning a consumer” means “a decision that results in access to, or the provision or denial of, financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or independent contracting opportunities or compensation, healthcare services, or essential goods or services.”
The draft regulations also list additional options for board discussion, which include: profiling for behavioral advertising, profiling of consumers where the business has actual knowledge that the consumer is under the age of 16, and processing personal information to train ADMT.
Additionally, if use of ADMT results in a “decision that produces legal or similarly significant effects concerning a consumer,” then the business would be required to notify the affected consumer that the decision was made and remind the consumer of their right to access. The business would also have to notify affected consumers that they can file a complaint with the CPPA and the Attorney General.
Additional Opt-Out Requirements: Businesses would be required to provide at least two designated opt-out methods. Consumers may use authorized agents to submit requests to opt out.
Additional Right-to-Access Requirements: Businesses would be required to provide potentially extensive information in the responses to requests to access, including:
- The purpose for which the business uses ADMT,
- The output of the ADMT,
- How the business used the output to make a decision regarding the consumer,
- Information related to business plans to use the output to make a decision regarding the consumer,
- How the ADMT worked with respect to the consumer,
- “A simple and easy-to-use method by which the consumer can obtain the range of possible outputs, which may include aggregate output statistics”,
- Instructions for how the consumer can exercise their other California Consumer Privacy Act (CCPA) rights, and
- Instructions for submitting complaints to the business or the CPPA and the Attorney General.
Service providers and contractors would also have obligations to provide assistance to businesses in responding to right-to-access requests.
Opt-Out: Businesses would not be required to provide consumers opportunities to opt out of ADMT if the technology is compliant with section 7002 of the CPPA regulations, which restricts businesses’ collection and use of personal information, and the use is both necessary and solely used for: security purposes, prevention of fraudulent or illegal actions, or protection of the life and physical safety of consumers. Additionally, businesses would not be required to offer an opt-out if the technology is used to provide a good or service specifically requested by the consumer. However, to qualify for this exception, a business must have no reasonable alternative method of processing the information.
Access: Businesses would not be required to respond to access requests if the use of ADMT falls under the same security and safety exceptions that apply to the opt-out right.
Opt-In to Profiling for Behavioral Advertising for Minors
Parental Opt-In for Consumers Under 13: Businesses that have actual knowledge that they profile consumers younger than 13 for behavioral advertising would be required to establish and comply with a reasonable method for a parent or guardian to opt in to the use of profiling for behavioral advertising. This consent would be in addition to obligations codified in the Children’s Online Privacy Protection Act (COPPA).
Opt-In for Consumers between 13 and 16: Businesses with actual knowledge that they profile consumers older than 13 but younger than 16 for behavioral advertising would be required to establish and comply with a reasonable process for allowing those consumers to opt in to such profiling. Upon receiving an opt-in request, the business would be required to inform the consumer of their right to opt out of this profiling at any time.
Wiley’s Privacy, Cyber & Data Governance and Artificial Intelligence (AI) teams assist clients with government advocacy, as well as compliance and risk management approaches to privacy, AI technology, and algorithmic decision-making, including compliance and risk management issues for generative AI. Please reach out to any of the authors with questions.