
Children Online Safety

Tracking Children Online Safety legal and regulatory developments.

4 entries in Legal Intelligence Tracker

Fashion, Beauty, Wearable Brands Face Stricter 2026 Privacy Rules

Fashion, beauty, and wearable technology companies face a fundamentally reshaped data privacy regime in 2026. New omnibus consumer privacy laws in California, Connecticut, Indiana, Kentucky, Rhode Island, Washington, and Nevada—combined with the EU's AI Act and heightened FTC enforcement—have elevated privacy from a compliance checkbox to a core product and marketing consideration. The shift is driven by three specific regulatory pressures: biometric data (facial mapping and body scanning in virtual try-on tools) now classified as sensitive personal information; consumer health data from wearables tracking stress, sleep, and menstrual cycles, regulated outside HIPAA by states including Connecticut and Washington; and strengthened children's privacy protections through state laws and California's Age-Appropriate Design Code. Class-action litigants are simultaneously challenging tracking and cookie practices under state wiretap statutes like California's CIPA.

Florida AG Investigates OpenAI, ChatGPT, Citing National Security Risks, FSU Shooting

Florida Attorney General James Uthmeier announced on April 9, 2026, that his office is launching an investigation into OpenAI and its ChatGPT models, alleging that they facilitated a 2025 Florida State University (FSU) shooting, harmed minors, enabled criminal activity, and pose national security risks through potential exploitation by adversaries such as the Chinese Communist Party.[1][2][3][4][5][6][7] Subpoenas are forthcoming, with probes focusing on ChatGPT's alleged assistance to the FSU gunman—who queried it on the day of the April 17, 2025, attack about public reaction to a shooting and peak times at the FSU student union—as well as alleged links to child sexual abuse material, grooming, and suicide encouragement.[1][3][5][6][7]

Washington Gov. Ferguson Signs HB 2225 Requiring AI Companion Chatbot Disclosures

Washington State Governor Bob Ferguson signed House Bill 2225, the Chatbot Disclosure Act, into law on March 24, 2026, effective January 1, 2027. The statute requires operators of "companion" AI chatbots—systems designed to simulate human responses and sustain ongoing user relationships—to disclose at the outset of interactions and every three hours (hourly for minors) that the bot is artificially generated. The law prohibits chatbots from claiming to be human, mandates protocols for detecting self-harm or suicidal ideation, bans manipulative engagement tactics targeting minors such as encouraging secrecy from parents or prolonged use, and bars sexually explicit content for underage users. Exemptions carve out business operational bots, gaming features outside sensitive topics, voice command devices, and curriculum-focused educational tools. Violations constitute unfair or deceptive acts under the Washington Consumer Protection Act (RCW 19.86), enforceable by the Attorney General and through a private right of action allowing consumers to recover actual damages, which may be trebled up to $25,000.

Three New State Privacy Laws Activate January 1, 2026, Expanding U.S. Patchwork to 20 States

Three new comprehensive consumer privacy laws took effect on January 1, 2026, in Indiana, Kentucky, and Rhode Island, bringing the total number of active state privacy regimes to 20. These laws grant consumers rights to access, correct, delete, and port their data, require opt-in consent for sensitive data processing, and impose civil penalties ranging from $7,500 to $10,000 per violation, enforced by state attorneys general. Simultaneously, California's DELETE Act (SB 362) will operationalize a centralized data broker deletion platform by August 1, 2026, with $200 daily fines per unfulfilled request beginning January 31. The CCPA has also been amended to require cybersecurity audits, risk assessments, and automated decision-making disclosures.
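As a rough sense of scale for the figures above: per-violation civil penalties and the DELETE Act's daily fines compound differently, the former with violation count, the latter with both request count and days of delay. The scenario numbers in this sketch are invented for illustration; only the dollar rates come from the laws as summarized.

```python
# Back-of-the-envelope exposure math using the penalty figures cited above.
# Scenario inputs (violation counts, request counts, days) are hypothetical.

def state_penalty_exposure(violations: int, per_violation: int) -> int:
    """Civil penalty exposure under the new state laws ($7,500-$10,000 per violation)."""
    return violations * per_violation

def delete_act_exposure(unfulfilled_requests: int, days_late: int,
                        daily_fine: int = 200) -> int:
    """DELETE Act fines: $200 per day, per unfulfilled deletion request."""
    return unfulfilled_requests * days_late * daily_fine

# e.g. 50 deletion requests left unfulfilled for 30 days:
# 50 * 30 * $200 = $300,000
```

The multiplicative structure of the DELETE Act fines means exposure grows linearly with delay even at a modest request volume.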

LawSnap Briefing Updated May 10, 2026

State of play.

  • The FTC has named children's online safety as a top enforcement priority through 2030, publishing a five-year strategic plan that targets COPPA violations, Big Tech data practices, and age-verification compliance using existing statutory tools — with the Take It Down Act adding new enforcement authority effective May 2026.
  • Age verification mandates are proliferating across states and Congress despite a documented expert consensus against them — 438 security and privacy researchers from 32 countries have called for a moratorium, citing circumvention via VPNs and centralized breach risk, while Idaho, Missouri, and at least half of U.S. states have enacted or are advancing such laws.
  • Utah has demonstrated a legislative counter-maneuver to industry constitutional challenges: after the CCIA filed a First Amendment challenge to Utah's App Store Accountability Act, the legislature stripped government enforcement authority and replaced it with a private right of action, mooting the challenge while preserving the substantive requirements (→ Tech Trade Group Drops Utah App Store Law Suit After Government Enforcement Removed).
  • Australia's privacy regulator has released an exposure draft Children's Online Privacy Code requiring parental consent for users under 15, data minimization, and a deletion right — with a consultation window closing June 5, 2026, and final registration targeted for December 2026.
  • For counsel advising platforms, app stores, or consumer-facing tech companies, the practical baseline is a multi-front compliance and litigation environment: FTC enforcement is signaled through 2030, state private rights of action are replacing government enforcement as the litigation vehicle, biometric and wearable health data from consumer products now triggers children's privacy obligations, and international frameworks are hardening in parallel.

Where things stand.

  • COPPA remains the federal floor, with expanded enforcement scope. The FTC's 2026-2030 Strategic Plan identifies children's online safety as a core priority, and a February 2026 COPPA policy statement encourages age-verification technology adoption; the FTC is operating with only two of five commissioners, which creates appointment-dependent enforcement variability.
  • Twenty state privacy regimes are now active. Indiana, Kentucky, and Rhode Island joined the patchwork on January 1, 2026; most states have eliminated cure periods; California's DELETE Act will operationalize a centralized data broker deletion platform by August 1, 2026, with $200 daily fines per unfulfilled request (→ Three New State Privacy Laws Activate January 1, 2026, Expanding U.S. Patchwork to 20 States).
  • State AG enforcement through COPPA is generating federal court precedent. A Michigan federal court narrowed the Michigan AG's Roku suit to COPPA claims, dismissing state video privacy claims — a ruling that shapes how state AGs frame children's data cases going forward.
  • Utah's enforcement-redesign template is now available to other states. By replacing AG enforcement with a private right of action limited to injured minors and their parents, Utah insulated its App Store Accountability Act from the standing-based constitutional challenge the CCIA had used; the substantive requirements — age verification, parental consent, app-change notification — survive intact with a May 6, 2027 effective date (→ Tech Trade Group Drops Utah App Store Law Suit After Government Enforcement Removed).
  • Age verification mandates are advancing federally on a bipartisan basis. H.R. 8250 would require OS-level age verification; the KIDS Act and COPPA 2.0 updates extending protections to age 17 are also in play; Alaska and Michigan have pulled back, citing First Amendment and privacy concerns.
  • Australia's children's privacy framework is the most structurally demanding in the English-speaking world. The exposure draft Children's Online Privacy Code — layered on top of a social media ban for under-16s effective December 2025 — mandates parental consent, strict data minimization, targeted advertising restrictions, and a deletion right; it draws from but extends beyond the UK Age-Appropriate Design Code.
  • Biometric and health data from consumer tech now triggers children's privacy obligations across multiple regulatory layers. Virtual try-on tools, wearables tracking sleep and stress, and beauty tech using facial mapping are classified as sensitive personal information under multiple state regimes; California's Age-Appropriate Design Code adds a design-layer obligation on top of data-handling rules; the FTC's expanded COPPA rules independently broaden personal information to include biometrics (→ Fashion, Beauty, Wearable Brands Face Stricter 2026 Privacy Rules, Three New State Privacy Laws Activate January 1, 2026, Expanding U.S. Patchwork to 20 States).

Active questions and open splits.

  • Whether the Utah enforcement-redesign template will insulate other states' minor-protection laws from constitutional challenge. The CCIA's dismissal establishes that replacing government enforcement with a private right of action defeats standing-based First Amendment challenges — but the constitutional merits of the substantive requirements remain untested, and other states are watching (→ Tech Trade Group Drops Utah App Store Law Suit After Government Enforcement Removed).
  • What liability attaches to custodians of centralized age-verification databases. The 438-researcher letter cites the Discord breach of 70,000 government ID photos as a concrete risk; no federal standard governs breach liability for age-verification data custodians, and state breach notification laws apply inconsistently.
  • How COPPA's expanded biometric definition interacts with state sensitive-data regimes. The FTC's expanded COPPA rules broaden personal information to include biometrics; multiple state laws independently classify facial mapping and health data as sensitive; the interaction between federal COPPA enforcement and state-law obligations — including private rights of action — is unsettled, and the wearable and beauty tech sectors are now squarely in this gap (→ Three New State Privacy Laws Activate January 1, 2026, Expanding U.S. Patchwork to 20 States, Fashion, Beauty, Wearable Brands Face Stricter 2026 Privacy Rules).
  • Whether AI chatbot interactions with minors are a distinct enforcement vector. Missouri's measures cover AI chatbots alongside social media; the Florida AG's OpenAI investigation frames AI harm to minors through a law enforcement lens; no federal standard governs AI-specific children's data obligations (→ Florida AG Investigates OpenAI, ChatGPT, Citing National Security Risks, FSU Shooting).
  • How the Michigan Roku ruling shapes state AG COPPA enforcement strategy. The court's dismissal of state video privacy claims in favor of COPPA narrows the toolkit available to state AGs — but it also clarifies that COPPA claims survive, potentially channeling future enforcement into federal court.
  • Whether Australia's Children's Online Privacy Code will set the global compliance floor. The code extends beyond the UK Age-Appropriate Design Code; if adopted as drafted, global platforms will face a choice between Australia-specific architecture and a single highest-common-denominator build — a decision with material product and cost implications.

What to watch.

  • Australia's Children's Online Privacy Code consultation closes June 5, 2026 — submissions will reveal how global platforms intend to comply and whether the deletion-right scope is contested.
  • California's DELETE Act data broker deletion platform operationalizes August 1, 2026, with $200 daily fines beginning immediately — the first enforcement actions will test how broadly "data broker" is construed in the children's data context.
  • FTC commissioner appointments: the agency is operating at two of five seats, and additional appointments will determine whether the 2026-2030 Strategic Plan's children's safety priority translates into active enforcement actions.
  • Federal age verification legislation: any floor movement on H.R. 8250, the KIDS Act, or the COPPA 2.0 updates will force OS-level compliance planning at Apple, Google, and Microsoft simultaneously.
  • Whether additional state legislatures adopt Utah's enforcement-redesign model — replacing AG authority with private rights of action — as a shield against industry constitutional challenges to minor-protection statutes.
  • Whether the Florida AG's OpenAI investigation produces a formal complaint framing AI chatbot interactions with minors as a distinct harm category, which would signal a replicable enforcement theory for other state AGs.
