
Technology Law


California Releases Draft Comprehensive Regs Governing Artificial Intelligence and Automated Decisionmaking Technology

On November 27, the California Privacy Protection Agency (CPPA) released its long-anticipated draft regulations governing the use of automated decisionmaking technology (ADMT). The CPPA is set to discuss the draft ADMT regs at its next board meeting on December 8. As predicted, the draft ADMT regs are extremely comprehensive, and arguably create the first U.S. state framework governing the use of artificial intelligence (AI). Below are some quick takeaways from my initial read of the draft ADMT regs. This list is not comprehensive, and you should speak with legal counsel regarding potential implications. If you have any questions or thoughts, please send me a message. 

An Important Read

The draft ADMT regs provide the first in-depth insight into how the CPPA intends to regulate ADMT and AI. At 17 pages, the draft ADMT regs lay out comprehensive obligations regarding ADMT. Along with the draft ADMT regs, the CPPA published a presentation regarding ADMT, which is worth a read. The CPPA also published revised drafts of its cybersecurity audit regs and risk assessment regs, both of which were updated to further address ADMT and better correspond with the ADMT regs. Notably, the draft ADMT regs reference AI only once; however, the risk assessment regs continue to include a definition of AI and express obligations around it. The interplay between these documents strongly suggests that the ADMT regs are intended to cover AI, which, as we all know, is on the minds of regulators and consumers.

As with the draft cybersecurity audit regs and risk assessment regs, the draft ADMT regs come with a disclaimer that they were prepared by a subcommittee and that the CPPA has not yet started its formal rulemaking process. Despite the disclaimer, I expect the draft regs to closely resemble the eventual final version. The CPPA has historically stuck with most of its proposed language. Further, the updated versions of the draft cybersecurity audit regs and risk assessment regs carry over most of the proposed language from the initial drafts. For takeaways on the revised draft cybersecurity audit regs, see our prior post here.

ADMT Framework: Three Requirements

The ADMT regs primarily focus on a new proposed framework governing the use of ADMT. There are three main requirements: 

(1) Pre-Use Notice Requirement: Under this requirement, any business that uses a covered ADMT must provide consumers with certain disclosures regarding use of the ADMT.

(2) Opt-Out Requirement: Under this requirement, any business that uses a covered ADMT must provide consumers with the ability to opt out of their personal information being processed using the ADMT.

(3) Access Right Requirement: Under this requirement, any business that uses a covered ADMT must provide consumers with the ability to request details about the business’s use of the ADMT to process their personal information.

I expect the CPPA to spend significant time at its next board meeting discussing the ADMT framework. As California often leads US regulatory compliance, the final version of this ADMT framework may serve as a model for other states as they implement their own ADMT and AI laws.

Threshold for Requirements

The good news from a business compliance perspective is that the ADMT framework only applies to certain covered processing operations:

First, there must be an ADMT involved. This is not a high bar to meet. Pursuant to the ADMT regs, an ADMT is “any system, software, or process—including one derived from machine-learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking.”

Second, the ADMT must be involved in at least one of the following uses (pursuant to Section 7030(b)): 

(1) A decision that produces legal or similarly significant effects concerning a consumer (such as decisions to provide or deny employment opportunities)

(2) Profiling a consumer who is acting in their capacity as an employee, independent contractor, job applicant, or student

(3) Profiling a consumer while they are in a publicly available space

(4) Profiling a consumer for behavioral advertising 

(5) Profiling a consumer that the business has actual knowledge is under the age of 16

(6) Processing the personal information of consumers to train ADMT

If you have been following privacy law over the past decade, none of these should be surprising. All six of these processing activities are expressly identified in the draft risk assessment regs as “present[ing] significant risk to consumers’ privacy,” which triggers the requirement to conduct a risk assessment. Item (6), in particular, feels like a catch-all for AI.

Notably, the ADMT regs specify that (4), (5), and (6) are options for board discussion. Again, given that the CPPA has historically stuck with most of its proposed language, I expect this language (or similar language) to be in the final version. 
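
To make that threshold concrete for product and engineering teams, below is a minimal TypeScript sketch of how a business might tag its processing activities against the six covered uses listed above. The enum values and the `frameworkApplies` helper are my own illustrative shorthand, not terms or logic taken from the draft regs.

```typescript
// Hypothetical illustration of the Section 7030(b) threshold check.
// The enum values paraphrase the six covered uses from the draft regs;
// none of these identifiers come from the regulation text itself.
enum CoveredUse {
  SignificantDecision = "decision with legal or similarly significant effects",
  WorkOrSchoolProfiling = "profiling an employee, contractor, applicant, or student",
  PublicSpaceProfiling = "profiling in a publicly available space",
  BehavioralAdvertisingProfiling = "profiling for behavioral advertising",
  Under16Profiling = "profiling a consumer known to be under 16",
  AdmtTraining = "processing personal information to train ADMT",
}

interface ProcessingActivity {
  name: string;
  usesAdmt: boolean;              // a computational system that makes, executes, or facilitates a decision
  processesPersonalInfo: boolean;
  uses: CoveredUse[];
}

// The notice, opt-out, and access obligations attach only when an ADMT
// processes personal information for at least one covered use.
function frameworkApplies(activity: ProcessingActivity): boolean {
  return activity.usesAdmt && activity.processesPersonalInfo && activity.uses.length > 0;
}

// Example: a resume-screening tool used in hiring decisions.
const resumeScreener: ProcessingActivity = {
  name: "resume-screening model",
  usesAdmt: true,
  processesPersonalInfo: true,
  uses: [CoveredUse.SignificantDecision, CoveredUse.WorkOrSchoolProfiling],
};

console.log(frameworkApplies(resumeScreener)); // true
```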

Detailed Requirements

Below are some relevant details for each of the requirements under the ADMT framework. 

Pre-Use Notice Requirement

The pre-use notice requirement is similar to the “notice at collection” requirement under the original CPRA regs. Under the pre-use notice requirement, any business that uses a covered ADMT must inform consumers of the following:

  • that the business uses the ADMT,
  • the purpose of the use of the ADMT (which must not be in generic terms), 
  • that consumers have the right to opt out of the use of the ADMT, and
  • that consumers have the right to access information about the use of the ADMT

While the pre-use notice requirement itself may not be difficult to comply with, a related obligation may prove harder: a business that uses a covered ADMT must also provide a resource (such as a layered notice or hyperlink) where consumers can obtain additional information about the business’s general use of ADMT, including an explanation of:

  • the logic used by ADMT,
  • the intended output of ADMT,
  • how the business intends to use the output of ADMT (including any human involvement), and 
  • whether the business’s use of ADMT has been evaluated (and the outcome of any such evaluation)

This looks like a public bias audit or risk assessment, which is similar to transparency obligations found in other laws (such as New York City’s AI Bias law). 
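
For illustration only, the basic disclosures and the “additional information” resource described above could be modeled as a single structure that a business renders into its notice. The field names below are my own shorthand, not terms from the draft regs, and the sample values are invented.

```typescript
// Hypothetical shape of a pre-use notice, combining the basic disclosures
// with the linked "additional information" resource. Field names are
// illustrative shorthand, not regulatory terms.
interface PreUseNotice {
  admtInUse: boolean;             // that the business uses the ADMT
  purpose: string;                // specific purpose; generic terms are not allowed
  optOutRightExplained: boolean;  // the right to opt out of the use of the ADMT
  accessRightExplained: boolean;  // the right to access information about the use of the ADMT
  additionalInfo: {
    logicDescription: string;     // the logic used by the ADMT
    intendedOutput: string;       // the intended output of the ADMT
    outputUse: string;            // how the output is used, including any human involvement
    evaluation?: {                // whether the use has been evaluated, and the outcome
      evaluated: boolean;
      outcomeSummary?: string;
    };
  };
}

// Invented example for a resume-screening tool.
const resumeScreenerNotice: PreUseNotice = {
  admtInUse: true,
  purpose: "Rank applications for the warehouse associate role based on listed experience",
  optOutRightExplained: true,
  accessRightExplained: true,
  additionalInfo: {
    logicDescription: "A scoring model compares applications against historical hiring outcomes",
    intendedOutput: "A suitability score from 0 to 100",
    outputUse: "Recruiters review every score before any candidate is rejected",
    evaluation: { evaluated: true, outcomeSummary: "Annual bias evaluation completed" },
  },
};
```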

Opt-Out Requirement

The ADMT opt-out requirement shares much in common with the Do Not Sell/Share opt-out requirement under the original CPRA regs. Under the ADMT opt-out requirement, any business that uses a covered ADMT must provide consumers with the ability to opt out of the use of the ADMT.

Where a consumer opts out, the business must cease processing the consumer’s personal information using that ADMT within 15 business days, and notify all downstream recipients of the personal information to comply with the opt-out with respect to the ADMT (a rough sketch of these two steps follows the list below).

The method for submitting ADMT opt-outs appears to be a combination of Do Not Sell/Share and verifiable consumer request methods. Some notable aspects: 

  • A business must offer an interactive form as well as at least one other method for the opt-out
  • A business may require verification if the business determines and documents that consumers are more likely than not to be negatively impacted absent verification; however, a business may not require verification for opt-outs of profiling for behavioral advertising
  • A business must provide a means by which consumers can confirm the business processed their requests
  • A business must respond to authorized agent requests if the authorized agent provides written permission signed by the consumer
  • A business must offer an opt-out method specific to ADMT; relying on cookie banners or cookie controls alone is not sufficient to address this right
  • There is no express obligation to respond to preference signals for ADMT use, such as GPC signals
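
For engineers wiring this up, here is a minimal sketch of the two operational steps described above: stopping processing within 15 business days and notifying downstream recipients. The function names are placeholders I made up; the draft regs do not prescribe any particular implementation.

```typescript
// Hypothetical opt-out workflow. stopAdmtProcessing and notifyDownstreamRecipients
// are placeholders for a business's own systems; they are not real APIs.

function addBusinessDays(start: Date, days: number): Date {
  const result = new Date(start);
  let added = 0;
  while (added < days) {
    result.setDate(result.getDate() + 1);
    const day = result.getDay();
    if (day !== 0 && day !== 6) added++; // skip Sundays (0) and Saturdays (6)
  }
  return result;
}

interface OptOutRequest {
  consumerId: string;
  admtId: string;
  receivedAt: Date;
}

async function handleAdmtOptOut(request: OptOutRequest): Promise<void> {
  // The draft regs require processing with the ADMT to stop within 15 business days.
  const deadline = addBusinessDays(request.receivedAt, 15);
  await stopAdmtProcessing(request.consumerId, request.admtId, deadline);

  // All downstream recipients of the personal information must be told to honor the opt-out too.
  await notifyDownstreamRecipients(request.consumerId, request.admtId);
}

// Stub implementations so the sketch compiles; a real business would wire these
// into its data pipelines and vendor notification process.
async function stopAdmtProcessing(consumerId: string, admtId: string, by: Date): Promise<void> {}
async function notifyDownstreamRecipients(consumerId: string, admtId: string): Promise<void> {}
```

Note that this sketch does not attempt to model the verification rules or the interactive-form requirement from the bullets above; those are process and UX questions more than code.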

Access Right Requirement

The access right requirement is similar to the “right to know” requirement under the original CPRA regs. Under the access right requirement, any business that uses a covered ADMT must provide consumers with the ability to request information about the business’s use of ADMT with respect to their personal information. Consumers must verify their identities, and businesses must address verifiable consumer requests within 45 days.

Where a consumer exercises their right, the business shall provide the following (see the sketch after this list):

  • The purpose for which the business used the ADMT,
  • The output of the ADMT with respect to the consumer,
  • How the business used the output with respect to the consumer,
  • If the business plans to use the output to make a decision, a specific explanation regarding that decision,
  • How the ADMT worked with respect to the consumer,
  • A method by which the consumer can obtain a range of possible outputs, which may include aggregate output statistics, 
  • Instructions for how the consumer can exercise their other CPRA rights, and
  • Instructions regarding methods by which the consumer can submit a complaint to the business, the CPPA, and the AG’s Office regarding ADMT
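
Purely for illustration, a business might assemble its response to a verified request as a single payload along these lines; the field names are my own shorthand, not language from the draft regs.

```typescript
// Hypothetical payload for responding to a verified ADMT access request,
// mirroring the items listed above. Field names are illustrative only.
interface AdmtAccessResponse {
  purpose: string;                      // why the business used the ADMT
  output: string;                       // the ADMT's output with respect to the consumer
  outputUse: string;                    // how the business used that output
  decisionExplanation?: string;         // required if the output will be used to make a decision
  howItWorked: string;                  // how the ADMT worked with respect to the consumer
  possibleOutputsMethod: string;        // e.g., a link to aggregate output statistics
  otherCpraRightsInstructions: string;  // how to exercise other CPRA rights
  complaintInstructions: {              // how to complain to the business, the CPPA, and the AG's Office
    business: string;
    cppa: string;
    attorneyGeneral: string;
  };
}
```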

Similar to the ADMT explanation in the pre-use notice requirement, this access right requirement may be difficult for businesses to address. Also, some may argue that this level of required detail goes far beyond the CPRA’s statutory text and existing privacy law. I anticipate that there may be legal challenges to some of these requirements.

Exceptions – Section 7030(m)

Section 7030(m) is one of the most important provisions of the new ADMT regs. This section sets out exceptions to the pre-use notice, opt-out, and access rights requirements. Pursuant to this section, a business is not required to provide consumers with pre-use notice, opt-out, or access rights where the business’s ADMT use is necessary to achieve, and is solely for, the following purposes:

(1) Security: To prevent, detect, and investigate security incidents

(2) Fraud prevention: To resist malicious, deceptive, fraudulent, or illegal actions

(3) Safety: To protect the life and physical safety of consumers

(4) Requested Good or Service: To provide the good or perform the service specifically requested by the consumer, provided that the business has no reasonable alternative method of processing. There is a rebuttable presumption that the business has a reasonable alternative method of processing. 

Notably, there is some ambiguity around the exceptions. For example, Section 7030 states that the opt-out right exception only applies where the ADMT complies with Section 7002 (the reasonable expectation test) while Section 70301 states that the access right exception only applies where the response would compromise the processing for purposes (1)-(3). It is not clear why (or if) these rights and corresponding exceptions should be treated differently. I expect the exceptions to be a topic of discussion during the board meeting. 
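
Setting that ambiguity aside, the core analysis for any single purpose could be recorded roughly as in the sketch below. The important caveat is that “necessary to achieve, and solely for” is a legal determination, not something code can decide; the enum values are my paraphrases of the draft regs and the structure is my own.

```typescript
// Hypothetical record of a Section 7030(m) exception analysis. The enum
// paraphrases the four purposes; the boolean inputs stand in for legal
// determinations that a program cannot actually make on its own.
enum ExceptionPurpose {
  Security = "prevent, detect, and investigate security incidents",
  FraudPrevention = "resist malicious, deceptive, fraudulent, or illegal actions",
  Safety = "protect the life and physical safety of consumers",
  RequestedGoodOrService = "provide the good or service specifically requested",
}

interface ExceptionAnalysis {
  purpose: ExceptionPurpose;
  necessaryAndSolelyForPurpose: boolean; // the use is necessary to achieve, and solely for, the purpose
  // Only relevant to the requested-good-or-service purpose: the business must
  // overcome a rebuttable presumption that a reasonable alternative exists.
  noReasonableAlternativeShown?: boolean;
}

function exceptionApplies(a: ExceptionAnalysis): boolean {
  if (!a.necessaryAndSolelyForPurpose) return false;
  if (a.purpose === ExceptionPurpose.RequestedGoodOrService) {
    return a.noReasonableAlternativeShown === true;
  }
  return true;
}
```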

Limited Obligations for Service Providers

There is only a single line imposing obligations on service providers. Under the draft ADMT regs, a service provider must provide assistance to the business in responding to verifiable consumer access requests. Of course, other parts of the CPRA regs impose specific obligations on service providers, but it is interesting that we did not see more here.

Special Rules for Children Under 16

The draft ADMT regs also set out some specific rules for children under 16. Where a business has actual knowledge that the consumer it profiles is under 16, it must obtain opt-in consent. For children under 13, that consent must come from a parent and must be separate from the verifiable parental consent required under COPPA. I am surprised that the ADMT regs allow for any profiling of children under 16 (given the robust obligations under the pending Age Appropriate Design Code), and I could see that changing in the final version.

Submission of Risk Assessments to the CPPA

While not part of the ADMT regs, the revised draft risk assessment regs include new language regarding the process for submitting risk assessments to the CPPA. Per the regs, the first submission is due 24 months from the effective date of the regs, and subsequent risk assessments are due annually thereafter. Risk assessments must be submitted through the CPPA’s website. From a practical perspective, I can’t see how the CPPA will be able to (or will want to) review all of these risk assessments. Rather than requiring proactive submission, it seems more workable for risk assessments to be provided upon request.

Tags

privacy, california, cppa, admt, ai, automated, cpra, ccpa