CPPA Proposes Amendments to Draft Regulations on ADMTs
May 13, 2025
By Aaron Charfoos, Michelle A. Reed and Kimia Favagehi
On May 1, the California Privacy Protection Agency (CPPA) Board held a meeting to discuss proposed amendments to the CPPA draft regulations on cybersecurity audits, risk assessments and automated decision-making technology (ADMTs). The CPPA has since published these amendments, which include changes to key definitions and requirements. This blog post focuses on the key changes as they apply to ADMTs.
Definitions
The amendments removed the definition of artificial intelligence altogether and narrowed the definition of ADMT to focus more squarely on the replacement of human decision-making.
Automated Decision-Making Technology

Previous definition: Any technology that processes personal information and uses computation to execute a decision, replace human decision-making or substantially facilitate human decision-making.

Revised definition: Any technology that processes personal information and uses computation to replace human decision-making or substantially replace human decision-making.
The revised definition appears to share certain similarities with the Colorado AI Act, which defines “High-Risk Artificial Intelligence System” as
“any artificial intelligence system that, when deployed, makes, or is a substantial factor in making, a consequential decision”
and the New York City AI Ordinance, which defines “Automated Employment Decision Tool” as
“any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.”
However, the revised ADMT definition in the CPPA draft regulations remains distinct, focusing primarily on the replacement of human decision-making.
Pre-Use Notice
The amendments also clarified the pre-use notice requirements for businesses that use ADMTs. Previously, businesses were required to provide consumers with a pre-use notice informing them of the business's use of ADMTs. The amendments now clarify that the pre-use notice may be provided together with the notice at collection. However, pre-use notices must clearly state the specific purpose for which the business plans to use ADMTs and may not rely on generic terms. Pre-use notices must also provide additional information about how an ADMT makes a significant decision and how a significant decision would be made if a consumer opts out of the ADMT.
Request to Access
The amendments also clarify what must be provided in response to a request to access ADMT. As a general matter, a business using an ADMT to make significant decisions must provide the consumer with information about its use of the ADMT when responding to such a request. For example, the business must provide information about the logic of the ADMT, how the consumer's personal information may have been used to generate an output and the outcome of the decision-making process (e.g., whether the output was the sole factor in the decision or whether other factors played a role).
Looking Ahead
The public comment period has now reopened, and the regulations may go into effect as early as the end of this year. However, businesses can start preparing for the regulations now by doing the following:
- Establish AI governance frameworks and internal policies regarding the deployment and development (if applicable) of AI tools;
- Implement accountability measures that emphasize transparency and ethical use of AI, such as establishing a documented approval process for new AI tools;
- Ensure that there are systems and procedures in place to meet the draft regulations’ requirements on pre-use notice and consumer requests, among other things;
- Ensure that any data used in AI tools and systems meets the requirements of data privacy laws, such as the CCPA and other state and international laws; and
- Provide regular employee training on the ethical use of AI and on the potential for bias resulting from the misuse of AI.
The Paul Hastings Data Privacy and Cybersecurity practice is closely monitoring the CPPA draft regulations, as well as other AI updates. If you have any questions, please do not hesitate to contact any member of our team.