Table of Contents

California’s companion-chatbot bills cleared back-to-back committee hearings during the week of April 20, 2026, advancing the first US framework to require annual independent audits of minor-facing AI companions filed directly with the state Attorney General. If both bills pass and are signed, shipping a Replika- or Character.AI-style product to California minors will carry mandatory third-party review, hard session limits, and per-child civil liability before AB 2023’s risk-assessment deadline even arrives.

What Just Happened: The Committee Votes

[SB 1119 (Padilla)]1 cleared the Senate Privacy Committee 7-0 and the Senate Judiciary Committee 13-0. [AB 2023 (Wicks)]2 cleared the Assembly Privacy and Consumer Protection Committee 13-2. Both now advance to floor votes in their respective chambers, according to [coverage from the Sierra Sun Times]3.

Maria Raine testified that her son Adam had died following interactions with a chatbot that, [per the same reporting]3, mentioned suicide 1,275 times across their conversations. The bills extend [SB 243 (Padilla, 2025)]4, which was the first US statute to mandate chatbot safety protocols for minors at all.

What the Bills Actually Require: Audits, Caps, and Parental Controls

The two bills are companion legislation, not duplicates, and conflating them creates compliance blind spots.

[SB 1119]1 owns the audit-and-oversight framework. Operators must commission annual independent audits assessing child-safety compliance; the auditor then files an AI child safety audit report with the Attorney General within 90 days. Reports are confidential by default, with carve-outs for enforcement, qualified researchers, and child-safety organizations.

The same bill encodes the default minor-mode settings. Out of the box, a minor’s account must operate in ephemeral mode. Persistent memory requires parental consent. Push notifications are blocked from midnight to 6 a.m. and from 8 a.m. to 3 p.m. on weekdays. Single conversations cap at one hour; daily use caps at two hours. If a substantial risk to a minor is detected, parents must be notified within 24 hours.

[AB 2023]2 carries the money. It bans targeted advertising and data sales to users under 18 and sets the risk-assessment implementation deadline at July 1, 2027. The civil penalty schedule lives here, not in SB 1119. Penalties are $5,000 per child for negligent violations and $15,000 per child for intentional violations2.

The partition is not clean, however: AB 2023 also contains mandatory independent audits and identical time-limit provisions. A vendor treating the bills as separate compliance checklists will miss overlapping obligations.

The Enforcement Stick: AG Reports, Civil Penalties, and Private Lawsuits

It also creates a private right of action: children or parents can sue directly for actual damages, punitive damages, and attorney fees.

A product with tens of thousands of minor users in California that runs afoul of the advertising prohibition doesn’t face a single fine; it faces liability calculated against each affected user. That arithmetic changes the risk calculus for any vendor that has not already removed minors from its product entirely.

Industry Pushback: CalChamber and TechNet Objections

[The California Chamber of Commerce, TechNet, and a coalition of trade groups]4 opposed the bills, urging further clarification and warning that vague standards and broad enforcement could raise compliance costs and chill innovation. Their objections cluster around three arguments.

The standards are vague in practice. The bills do define terms like “psychological harm” and “excessively sycophantic,” but a vendor trying to spec an annual audit against those definitions has no tested baseline for what passes. Industry opponents are not wrong that enforcement risk is real while enforcement contours remain untested.

Compliance costs scale badly for smaller operators. Third-party audits are not cheap, and an annual cadence filed with the AG creates recurring legal and operational overhead that enterprise players can absorb but startups cannot. A product category built on low-friction consumer subscription pricing will need to reprice or exit the minor demographic before the AG receives the first report.

Potential overlap with [SB 243]4 is the third argument. Operators now face three overlapping minor-facing chatbot regimes before any of them has been tested in enforcement. Whether the legislative sponsors will harmonize the bills before the floor vote is unclear.

What It Means for Vendors: Compliance Cost and Geofencing Reality

Character.AI announced in October 2025 that it would bar users under 18, with the restriction taking effect November 25, 20255. That decision now looks less like a values statement and more like a compliance hedge. A product that has already exited the minor demographic is not the target. A product still serving minors is squarely in scope, including any general-purpose companion platform that permits minor account creation.

The geofencing problem is the harder one. California has roughly 39 million residents including several million minors, and “California users” is not a clean filter for most mobile products. Consumer apps cannot easily verify age at signup with sufficient reliability to establish a legal defense. The practical choice is binary: build the SB 1119 and AB 2023 compliance stack and apply it nationally, or wall off minors entirely. Both carry cost; neither is free.

Procurement of a qualified independent auditor is not a weekend task. There is no established market of AI child-safety auditors the way there is a market of SOC 2 or PCI auditors. Vendors who want first-mover advantage will be competing for a small pool of firms with appropriate expertise, and scarcity will inflate costs for whoever waits for the floor vote before starting conversations.

The Federal Gap: State Law Filling a Vacuum Left by Washington

With federal AI safety guidance rolled back under the current administration, Congress has not passed a federal framework for AI and minors; what exists at the federal level is COPPA, which governs data collection from under-13 users but says nothing about AI companion interactions, crisis-response protocols, or audit requirements.

This produces the 50-state patchwork problem in practice. A vendor claiming nationwide coverage must track whatever other states advance companion-chatbot legislation, because the absence of federal preemption means each state can impose its own audit standards, its own penalty schedules, and its own private right of action. California’s AG-audit model is one architecture; another state could mandate public disclosure, a different auditor accreditation regime, or a stricter age cutoff.

California’s market weight makes compliance with SB 1119 and AB 2023 effectively mandatory for any product that wants a meaningful US user base. A vendor that engineers full compliance will likely apply it nationally rather than maintain California-specific product logic. That produces a de facto national standard by cost-of-differentiation rather than by federal law, which is how California has historically exported its regulatory floor on everything from vehicle emissions to data privacy.

Frequently Asked Questions

How do SB 1119’s audit requirements go beyond what SB 243 already mandated?

SB 243 (2025) required operators to implement safety protocols for minor-facing chatbots but provided no mechanism for the state to verify compliance. SB 1119 layers a mandatory annual independent audit filed with the Attorney General on top — turning self-certification into third-party inspection. SB 243 set the rules; SB 1119 creates the enforcement eyes.

Does a platform that already complies with COPPA need to do anything different?

Yes, because the two regimes cover different age bands and different harms. COPPA, enforced by the FTC, governs data collection from children under 13 and requires verifiable parental consent — but it says nothing about chatbot interaction design, session limits, or crisis-response protocols. SB 1119 and AB 2023 cover users under 18 and target companion-AI behavior, not just data handling. A vendor must satisfy both simultaneously, with COPPA’s parental-consent requirements for under-13 data layered underneath the new bills’ audit and session-cap obligations for the full under-18 population.

How would an auditor test for ‘excessively sycophantic’ chatbot behavior?

The bills define the term but specify no measurement methodology, and no industry standard exists for quantifying AI sycophancy. An auditor would need to construct adversarial prompt suites — scenarios designed to elicit agreeable or affirming responses — and benchmark the model’s replies against the statutory language, effectively inventing the compliance test while performing it. This is why even proponents acknowledge the first audit cycle will be more precedent-setting than routine.

Can a vendor that blocks under-18 signups still face SB 1119 audit obligations?

Potentially, if the Attorney General deems the vendor’s age-gating mechanism insufficient. The bills do not define what constitutes adequate age verification, so a platform relying on self-reported dates of birth — a method known to be circumventable — could be treated as minor-facing and pulled into the audit regime. Character.AI’s hard ban on under-18 accounts, rather than a softer age gate, now reads as a deliberate legal hedge against exactly this exposure.

What’s the earliest a vendor could face both the SB 1119 audit deadline and AB 2023’s risk-assessment deadline simultaneously?

SB 1119’s audit clock starts 180 days after the AG finalizes implementing regulations under Section 22615, while AB 2023’s risk-assessment deadline is fixed at July 1, 2027. If the bills are signed by early fall 2026 and the AG moves on a standard regulatory timeline, both deadlines could land in mid-2027 — meaning a vendor could owe its first filed audit and its first completed risk assessment in the same quarter, with two independent compliance teams competing for the same engineering bandwidth.

Footnotes

  1. SB 1119 (Padilla) 2 3

  2. AB 2023 (Wicks) 2 3

  3. Sierra Sun Times coverage 2

  4. Sen. Padilla seeks new rules for child-facing chatbots after testimony on deadly interactions 2 3

  5. Character.ai - Wikipedia

Sources

  1. SB 1119 (Padilla)primaryaccessed 2026-04-29
  2. AB 2023 (Wicks)primaryaccessed 2026-04-29
  3. Sierra Sun Times coverageprimaryaccessed 2026-04-29
  4. Sen. Padilla seeks new rules for child-facing chatbots after testimony on deadly interactionsanalysisaccessed 2026-04-29

Enjoyed this article?

Stay updated with our latest insights on AI and technology.