Sweeping regulations on AI, risk assessments, and cybersecurity audits are about to reshape compliance playbooks across industries.
Context
California has once again set the national pace for AI and privacy regulation. On July 24, the state's privacy watchdog unanimously approved new rules that tighten how companies use automated decision-making tools, mandate privacy risk assessments, and require cybersecurity audits. These regulations, adopted under the California Consumer Privacy Act (CCPA), will affect nearly every business handling personal data in California, and they are expected to be approved by the state's Office of Administrative Law in short order.
Why It Matters
California isn't just regulating AI; it's operationalizing it. From HR to lending to healthcare, any business relying on algorithms to make "significant decisions" will need to disclose, justify, and audit its use. The ripple effects go far beyond compliance: these rules will influence national standards for AI transparency, data governance, and consumer trust. Businesses that act early will be better positioned to adapt, and to use compliance as a competitive advantage.
Core Idea
AI is no longer a compliance afterthought; it is a regulated business function.
The new California regulations formalize a three-part framework:
- Automated Decision-Making Technology (ADMT): You must notify and, in some cases, allow consumers or employees to opt out when algorithms influence key life decisions.
- Risk Assessments: You must evaluate and document privacy risks before using sensitive data or deploying high-impact AI systems.
- Cybersecurity Audits: You must conduct annual, evidence-based audits if your data processing presents "significant risk" to consumers' security.
Key Requirements for Businesses
1. Automated Decision-Making Technology (ADMT)
- Definition: Any technology that replaces or substantially replaces human decision-making using personal data. This includes tools for hiring, promotions, credit decisions, healthcare access, and education admissions.
- Notice Requirements: Before using ADMT, businesses must issue a clear pre-use notice explaining purpose, opt-out rights, data access options, and how human review works.
- Timeline:
- Existing ADMT use: notices due by January 1, 2027.
- New ADMT deployment: notice must precede use.
- Action Items:
- Inventory all ADMT systems across departments.
- Draft tailored notices, not generic templates.
- Establish processes to handle opt-outs and appeals.
2. Risk Assessments
- When Required: Before engaging in activities that pose a significant risk to consumer privacy, such as:
- Selling or sharing personal data.
- Processing sensitive information.
- Using ADMT for significant decisions.
- Profiling based on behavior or location data.
- Key Rule: If you must give a pre-use ADMT notice, you likely need a risk assessment.
- Deadlines:
- Current activities: assessment due by December 31, 2027.
- Ongoing activities: review every three years or within 45 days of a material change.
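The review cadence above can be sketched as a small date calculation: a risk assessment must be revisited every three years, or within 45 days of a material change, whichever comes first. The function below is an illustrative helper, not a regulatory tool; its name and parameters are assumptions for the sketch.

```python
from datetime import date, timedelta
from typing import Optional

def next_review_due(last_assessment: date,
                    material_change: Optional[date] = None) -> date:
    """Earlier of: three years after the last assessment, or
    45 days after a material change to the processing activity.
    (Illustrative sketch; names are not regulatory terms.)"""
    # Standing cadence: review every three years.
    periodic = last_assessment.replace(year=last_assessment.year + 3)
    if material_change is not None:
        # A material change pulls the deadline forward to change + 45 days.
        return min(periodic, material_change + timedelta(days=45))
    return periodic
```

A team tracking assessments in a register could run this per activity to surface upcoming review deadlines.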
- Action Items:
- Map all high-risk data processing.
- Engage legal counsel to ensure assessments meet CPPA standards.
- Submit assessments by April 1, 2028, and annually thereafter as required.
3. Cybersecurity Audits
- Who's Covered: Businesses posing a significant security risk or meeting certain data and revenue thresholds under the CCPA.
- Audit Triggers:
- More than 50% of annual revenue from selling/sharing personal data.
- Processing 250,000+ consumers' data or 50,000+ sensitive records.
- Timeline:
- Revenue over $100M in 2026: first audit due April 1, 2028.
- Revenue between $50M and $100M in 2027: first audit due April 1, 2029.
- Revenue under $50M in 2028: first audit due April 1, 2030.
- Frequency: Annually while thresholds are met.
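The triggers and phased deadlines above lend themselves to a quick screening sketch. The functions below are illustrative only, assuming hypothetical field names (they are not regulatory terms), and boundary treatment of exactly $100M or $50M should be confirmed against the final rule text.

```python
from datetime import date

def audit_required(pct_revenue_from_data_sales: float,
                   consumers_processed: int,
                   sensitive_records: int) -> bool:
    """True if any audit trigger listed above is met.
    (Illustrative sketch; field names are assumptions.)"""
    return (pct_revenue_from_data_sales > 0.50
            or consumers_processed >= 250_000
            or sensitive_records >= 50_000)

def first_audit_due(annual_revenue: float) -> date:
    """Map annual revenue to the first-audit deadline under the
    phased schedule above. Each tier is measured against a
    specific year's revenue (2026, 2027, or 2028)."""
    if annual_revenue > 100_000_000:
        return date(2028, 4, 1)   # over $100M, measured in 2026
    if annual_revenue >= 50_000_000:
        return date(2029, 4, 1)   # $50M to $100M, measured in 2027
    return date(2030, 4, 1)       # under $50M, measured in 2028
```

A compliance team could run such a screen across business units to decide which entities need a dry run before their first deadline.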
- Action Items:
- Determine audit eligibility and prepare early.
- Conduct a dry-run audit to identify documentation gaps.
- Retain or train qualified auditors to ensure independence.
Closing Thought
California just codified the next era of AI accountability. These rules aren't just about privacy; they're about governance, transparency, and consumer confidence. Businesses that build compliance into their AI strategy now will not only avoid penalties but also lead the market in responsible innovation.