
CCPA and AI: Data Protection for Generative AI

California's privacy law applies to AI systems processing personal data. Learn what CCPA compliance requires for generative AI tools.

Tenlines Team · 10 min read

CCPA Meets Generative AI

When the California Consumer Privacy Act (CCPA) was enacted in 2018, generative AI wasn't on anyone's radar. ChatGPT wouldn't launch for another four years. But the law's broad definitions of "personal information" and "sale" create significant compliance obligations for organizations using AI tools today.

California has nearly 40 million residents. If your AI systems process their data — whether for customer service, hiring, marketing, or internal operations — CCPA applies. And the California Privacy Protection Agency (CPPA) has been actively developing AI-specific regulations that will clarify and expand these obligations.

How CCPA Applies to AI

Personal Information in AI Prompts

CCPA defines "personal information" broadly: any information that identifies, relates to, describes, or could reasonably be linked to a particular consumer or household.

When employees paste customer data into AI prompts, that's personal information processing. Common examples:

  • Customer names and contact details in support queries
  • Employee information in HR analyses
  • Transaction data in financial summaries
  • User behavior data in marketing contexts

Research analyzing real-world AI prompts has found that 8.5% contained potentially sensitive data, including customer information, legal documents, and proprietary code.

Every one of those prompts containing California resident data is a CCPA compliance event.

The "Sale" and "Sharing" Problem

CCPA regulates not just the collection of personal information but its "sale" and "sharing." These terms have technical definitions that may capture AI interactions:

Sale: Exchanging personal information for monetary or other valuable consideration

Sharing: Disclosing personal information for cross-context behavioral advertising

When you send customer data to an AI service, does that constitute sharing? If the AI provider uses your inputs for model training, does that constitute sale?

These questions don't have definitive answers yet, but the safest approach treats AI data flows as potential disclosures requiring consumer notice and opt-out rights.

Consumer Rights and AI

CCPA grants California residents specific rights that intersect with AI:

Right to Know: Consumers can request what personal information you've collected and how it's been used. If their data was sent to AI systems, that's disclosable.

Right to Delete: Consumers can request deletion of their personal information. Can you delete data that's been incorporated into AI model weights? Practically, no — which creates compliance complexity.

Right to Opt-Out: Consumers can opt out of the sale or sharing of their information. If AI data flows count as sharing, you need opt-out mechanisms.

Right to Correct: Consumers can request correction of inaccurate information. AI systems that make decisions based on incorrect data may need human review mechanisms.

AI-Specific CCPA Developments

CPPA's Proposed Regulations

The California Privacy Protection Agency has been developing regulations specifically addressing automated decision-making technology (ADMT). Draft rules include:

Access rights for AI profiling: Consumers may request information about how automated systems are used to evaluate them

Opt-out rights for consequential decisions: Consumers may opt out of AI-driven decisions affecting housing, employment, credit, insurance, or education

Pre-use notice requirements: Businesses may need to inform consumers before using ADMT for significant decisions

Risk assessments: High-risk AI uses may require documented risk assessments and submission to regulators

These regulations aren't final, but they signal California's direction: more transparency, more consumer control, more compliance burden.

Enforcement Priorities

The CPPA has indicated AI is an enforcement priority. Key risk areas:

  • Using AI for decisions that significantly affect consumers
  • Processing sensitive personal information without consent
  • Failing to disclose AI use in privacy notices
  • Not honoring opt-out requests for AI processing

Compliance Checklist for AI Under CCPA

1. Map Your AI Data Flows

Document every AI system that processes personal information:

  • What data enters the system?
  • Who are the data subjects?
  • What happens to the data inside the system?
  • Is data retained by the AI provider?
  • Is data used for model training?

Shadow AI is a compliance gap. If employees are using consumer AI tools without approval, you have undocumented personal information processing that violates CCPA's transparency requirements.

An estimated 67% of AI usage happens via unmanaged personal accounts, completely outside organizational visibility.
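A data-flow inventory can start as simple structured records that you can query for gaps. The sketch below is illustrative only: the field names and the example system are assumptions, not CCPA-mandated terminology.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class AIDataFlow:
    """One AI system that touches personal information."""
    system: str                # e.g. "ChatGPT (web)", "internal RAG bot"
    data_categories: list      # what data enters the system
    data_subjects: list        # whose data it is
    provider_retains: bool     # does the vendor store inputs?
    used_for_training: bool    # can inputs train the vendor's models?
    approved: bool = False     # sanctioned tool vs. shadow AI

flows = [
    AIDataFlow(
        system="ChatGPT (personal account)",
        data_categories=["customer names", "support ticket text"],
        data_subjects=["California consumers"],
        provider_retains=True,
        used_for_training=True,  # typical default for consumer accounts
    ),
]

# Flag shadow AI: unapproved flows where the vendor retains or trains on inputs
shadow = [f for f in flows if not f.approved
          and (f.provider_retains or f.used_for_training)]
print(json.dumps([asdict(f) for f in shadow], indent=2))
```

Even a spreadsheet capturing these same fields answers the checklist questions above; the point is that every AI system processing personal information has a documented record.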

2. Update Privacy Notices

Your privacy notice must accurately describe personal information processing. If you're using AI:

Disclose AI categories: "We use artificial intelligence and automated decision-making technologies for [purposes]"

Explain data flows: Where does AI-processed data go? Is it shared with AI service providers?

Address training data: If your inputs might be used to train AI models, disclose this

Note automated decisions: If AI makes or influences significant decisions about consumers, explain this

3. Implement Opt-Out Mechanisms

If AI data flows constitute "sharing," consumers have opt-out rights:

  • "Do Not Share My Personal Information" links
  • Mechanisms to exclude individuals from AI processing
  • Processes to honor Global Privacy Control signals
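The Global Privacy Control signal arrives as an HTTP request header, `Sec-GPC: 1`, per the GPC specification. A minimal server-side check might look like this; how you then exclude the consumer from AI data flows is up to your architecture, and the handling shown is a placeholder:

```python
def honors_gpc(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control opt-out.

    Per the GPC specification, user agents send `Sec-GPC: 1` when the
    user has enabled the signal.
    """
    # HTTP header names are case-insensitive; normalize before lookup
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"

# Example: a request from a browser with GPC enabled
request_headers = {"User-Agent": "Mozilla/5.0", "Sec-GPC": "1"}

if honors_gpc(request_headers):
    # Treat as a valid "Do Not Share" request: exclude this consumer's
    # data from any AI data flows that could constitute "sharing"
    print("GPC opt-out detected")
```

California treats a valid GPC signal as an exercise of the opt-out right, so the check needs to feed whatever mechanism suppresses downstream sharing, including sharing with AI providers.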

4. Establish Data Subject Request Processes

When consumers exercise CCPA rights, you need processes to:

Access requests: Identify all AI systems that processed their data; compile records of AI interactions

Deletion requests: Remove data from AI input systems; document limitations on deletion from model weights

Correction requests: Update source data; flag for human review in AI-driven decisions
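Responding to an access request is straightforward if AI interactions were logged at the time they happened. A hypothetical sketch, assuming an audit log with one record per prompt sent (the log schema and system names are invented for illustration):

```python
# Hypothetical audit log of AI interactions: one record per prompt sent
audit_log = [
    {"timestamp": "2025-03-02T14:11:00Z", "consumer_id": "C-1042",
     "ai_system": "support-drafting-bot", "data_categories": ["name", "email"]},
    {"timestamp": "2025-03-05T09:30:00Z", "consumer_id": "C-2210",
     "ai_system": "marketing-copilot", "data_categories": ["purchase history"]},
]

def compile_access_response(consumer_id: str, log: list) -> dict:
    """Gather every AI system that processed a consumer's data, plus the
    data categories involved, to support a Right to Know response."""
    records = [r for r in log if r["consumer_id"] == consumer_id]
    return {
        "consumer_id": consumer_id,
        "ai_systems": sorted({r["ai_system"] for r in records}),
        "categories": sorted({c for r in records for c in r["data_categories"]}),
        "interactions": records,
    }

print(compile_access_response("C-1042", audit_log))
```

Without such a log there is nothing to compile: if shadow AI usage was never recorded, you cannot truthfully answer what AI systems processed a consumer's data.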

5. Conduct Risk Assessments

For high-risk AI uses (employment, credit, insurance), document:

  • Purpose and necessity of AI use
  • Data minimization measures
  • Bias testing and mitigation
  • Consumer impact analysis
  • Safeguards implemented

6. Negotiate AI Vendor Contracts

Your AI service providers are "service providers" under CCPA, requiring specific contractual terms:

Processing limitations: Vendors can only process data for specified purposes

No sale or sharing: Vendors cannot use your data for their own purposes

Compliance assistance: Vendors must help you respond to consumer requests

Security requirements: Vendors must implement appropriate safeguards

Training data opt-outs: Ensure your inputs aren't used for model training without consent

The Shadow AI CCPA Problem

Here's where shadow AI creates serious compliance risk:

Scenario: An employee pastes customer support tickets containing California resident information into ChatGPT to draft response templates.

CCPA implications:

  • Personal information was collected and used (disclosure required)
  • Personal information was shared with a third party (OpenAI)
  • Consumer wasn't notified of this use
  • No opt-out was offered
  • Your privacy notice doesn't cover this processing
  • You have no records for access requests
  • OpenAI's terms may allow training data use

This single employee action creates multiple CCPA violations — and you might never know it happened.

The solution isn't banning AI. As we've established, bans push usage underground and eliminate visibility. Instead:

  1. Get visibility: Know what AI tools employees are using
  2. Protect data at egress: Strip personal information before it reaches AI services
  3. Maintain audit trails: Log what was sent to AI and when
  4. Enable safe alternatives: Give employees approved AI tools with appropriate protections
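Protecting data at egress (step 2) can be approximated with pattern-based redaction applied before a prompt leaves your environment. This is a deliberately simple sketch: the patterns shown catch only a few obvious identifier formats, and production systems typically combine patterns with named-entity recognition to catch names and other free-text PII.

```python
import re

# Illustrative patterns only; real deployments need far broader coverage
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scrub(prompt: str) -> str:
    """Replace likely PII with typed placeholders before the prompt
    is sent to an external AI service."""
    for label, pattern in PII_PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

raw = "Customer Jane Roe (jane.roe@example.com, 415-555-0132) reports a billing issue."
print(scrub(raw))
# Note: the name "Jane Roe" still passes through; regexes alone
# cannot reliably catch names, which is why NER is usually layered on top
```

Logging the scrubbed prompt alongside what was redacted (step 3) gives you the audit trail needed for access requests without re-exposing the personal information itself.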

How Tenlines Supports CCPA Compliance

Tenlines addresses the CCPA-AI intersection directly:

Data minimization: PII is scrubbed from prompts before reaching AI services, minimizing personal information disclosure

Audit capability: Comprehensive logs support access request responses and compliance documentation

Shadow AI visibility: See what employees are actually sending to AI services

Reduced disclosure scope: When personal information never reaches the AI provider, it's not a "sharing" event requiring opt-out

Vendor simplification: With data scrubbed before transmission, AI vendor contracts are simpler — they're not receiving personal information

Key Takeaways

  1. CCPA applies to AI processing of California residents' data. Broad definitions of personal information capture most business uses.

  2. AI data flows may constitute "sharing." If AI providers receive personal information, opt-out rights may apply.

  3. California is developing AI-specific regulations. Expect expanded requirements for automated decision-making.

  4. Shadow AI creates hidden CCPA violations. Unauthorized AI use means unauthorized personal information processing.

  5. Data minimization is the safest approach. If personal information doesn't reach AI systems, many compliance concerns are eliminated.

Stop data leakage before it starts

Tenlines sits between your team and AI providers, scrubbing sensitive data before it leaves your environment. No workflow changes required.

Join the Waitlist