CLASS Network Students File Comprehensive and Compelling Comment with the California Privacy Protection Agency

February 18, 2025


The Center congratulates the 32 law and undergraduate students from seven Consumer Law Advocates, Scholars, and Students (CLASS) Network chapters around the country who contributed to a wide-ranging public comment to the California Privacy Protection Agency (CPPA) in the Agency's pending rulemaking on cybersecurity and automated decisionmaking technologies (ADMTs). Drawing on their collective expertise in public policy, AI research and development, and cybersecurity, the commenters broadly endorsed the CPPA's groundbreaking rulemaking while offering sophisticated suggestions for specific language changes to make the final Rule as effective as possible.

Specifically, the commenters – hailing from the law schools at UC Berkeley, the University of Michigan, George Washington University, Georgetown, New York University, Fordham, and the University of Maryland, along with UC Berkeley undergraduates – offered the following suggestions in support of the CPPA's overall goal of maximally protecting the privacy rights of all consumers who do business in California:

  • Cybersecurity audits should be performed by fully independent auditors and rooted in nationally recognized cybersecurity standards, much as financial audits are generally performed by external auditors following generally accepted accounting principles (GAAP). The proposed rule allowed auditors to be internal to the organizations being audited, which simply does not provide the level of independence required to conduct a rigorous audit.
  • Cybersecurity audits should result in the tabulation and disclosure of numerical key performance indicators for the most common classes of cybersecurity failures. The proposed rule did not expressly require any hard metrics to be collected or disclosed to the CPPA – a known problem with cybersecurity auditing, which too often yields vague narrative reports and little real accountability.
  • Consumers should be shown clear, up-front disclosures concerning the use of ADMTs and their rights concerning such use before being asked to enter any personal information into a system that incorporates ADMTs. The proposed rule did not require comprehensive disclosure of all uses of ADMTs, and did not require that the disclosure be prominently displayed. Given the long history of companies burying onerous terms deep within their service contracts and Terms of Service, the proposed rule's disclosure provision was inadequate.
  • Consumers should be given the unconditional right to opt out of having ADMTs process their personal information, whether for decisionmaking or for model training. The proposed rule created an opt-out regime riddled with exceptions; for example, as initially proposed, the rule did not give consumers the ability to unconditionally stop a company from feeding their information into models for training purposes. Such exemptions are fundamentally at odds with both the stated purpose of the rulemaking and the European Union's General Data Protection Regulation (GDPR), with which the proposed Rule is otherwise largely compatible. The commenters pointed out that modest changes would make the proposed Rule similar enough to the GDPR that existing GDPR compliance regimes would automatically satisfy most CPPA compliance requirements, which would both minimize regulated entities' compliance burdens and significantly strengthen consumer privacy protections.
  • Companies that deploy ADMTs should be required to perform ongoing parity testing to ensure that ADMTs are not producing outcomes substantially different from those of the human decisionmaking processes they are supposed to replicate. The proposed rule recognizes the potential risks of algorithmic bias and discrimination, but does not propose any overarching mechanism for monitoring the emergence of those dangerous behaviors. Parity testing is a well-established practice in the tech industry, and would allow companies to realize the efficiencies of automation without wholly abandoning the vital safeguards of human judgment.

In addition to these refinements, the comment offered a powerful analysis of how every dimension of this rulemaking falls squarely within the CPPA's delegated authority, and underscored the need for the Rule by assembling an extensive list of case studies of consumer harm stemming from data breaches and sloppy implementations of ADMTs. The commenters assembled a dozen detailed examples of high-profile data breaches that have compromised the highly sensitive personal information of millions of consumers over the past five years, and two dozen examples of ADMTs currently in use across multiple industries that fail to provide reasonable consumer notice, meaningful ability to opt out, or both.

This submission is a major milestone for the CLASS Network – both because of its substantive importance and because it is the Network's largest joint undertaking to date. Given the impressive results, the Center is eager to get more projects like this underway in the near future. (Ideas are welcome.)