States Push Children's Privacy Laws Forward, Even as Courts Keep Narrowing the Lines
Children's online privacy remains one of the busiest fronts in state regulation. What's changing in 2026 is not the pace of bills but the precision courts are demanding when those bills regulate product design, data use and access to content.
States continue to advance children's online privacy and "age-appropriate design" laws even as industry plaintiffs repeatedly challenge them on First Amendment, vagueness and federal preemption grounds. Most of the major challenges now cluster around two recurring theories: 1) First Amendment arguments that "duty of care," "best interests" or harm-prevention mandates operate as content-based speech restrictions triggering strict scrutiny, and 2) vagueness arguments that terms such as "compulsive usage," "materially detrimental" and "well-being" fail to provide fair notice to businesses.
Recent developments – including South Carolina's Social Media Regulation Act (House Bill 3431), the U.S. Court of Appeals for the Ninth Circuit's mixed ruling narrowing the injunction against California's Age-Appropriate Design Code (AADC) Act, and the continued spread of App Store Accountability Acts amid immediate legal headwinds – illustrate the central tension facing companies: They must plan for more state activity and more uncertainty about which provisions will ultimately stick.
South Carolina's Social Media Regulation Act: An Aggressive Design Code Variant Meets Immediate Litigation
South Carolina is the latest state to enact an age-appropriate design code law. Although the title targets "social media," the statute reaches a broad set of covered online services and introduces several features that materially increase compliance and enforcement risk: a "reasonable care" duty to prevent specified harms to minors, potential personal exposure for certain officers and employees, mandatory third-party audits with public reporting and an immediate effective date with no compliance runway.
At its core, South Carolina's Social Media Regulation Act (House Bill 3431) requires covered services to exercise "reasonable care" when using a minor's personal data and designing product experiences to prevent specified harms. The statute's listed harms include compulsive usage, severe psychological harm, identity theft, discrimination, and material financial or physical injury.
The law also targets specific "covered design features," including infinite scroll, autoplay, gamification mechanics, visible engagement metrics, push notifications, in-game purchases and appearance-altering filters. Covered services must provide tools to disable certain features; for known minors, the statute requires default settings that maximize privacy and safety.
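For product and engineering teams, one way to operationalize this kind of mandate is to model the statute's enumerated design features as explicit, per-user settings rather than hard-coded behavior. The TypeScript sketch below is illustrative only; the interface, field names and defaultSettingsFor helper are assumptions for this article, not statutory terms of art or any platform's actual API.

```typescript
// Illustrative only: these names model the statute's enumerated design
// features for this article; they are not statutory terms of art or any
// platform's real API.
interface DesignFeatureSettings {
  infiniteScroll: boolean;
  autoplay: boolean;
  gamification: boolean;             // streaks, badges and similar mechanics
  visibleEngagementMetrics: boolean; // like and view counts shown to users
  pushNotifications: boolean;
  inAppPurchases: boolean;
  appearanceFilters: boolean;
}

// For a known minor, every covered feature defaults to its most
// protective setting; other users get the service's ordinary defaults.
function defaultSettingsFor(isKnownMinor: boolean): DesignFeatureSettings {
  if (isKnownMinor) {
    // Most protective posture: all covered features off until the user
    // (or a parent, where required) affirmatively enables them.
    return {
      infiniteScroll: false,
      autoplay: false,
      gamification: false,
      visibleEngagementMetrics: false,
      pushNotifications: false,
      inAppPurchases: false,
      appearanceFilters: false,
    };
  }
  // Ordinary defaults for users not known to be minors.
  return {
    infiniteScroll: true,
    autoplay: true,
    gamification: true,
    visibleEngagementMetrics: true,
    pushNotifications: true,
    inAppPurchases: true,
    appearanceFilters: true,
  };
}
```

Centralizing the toggles this way also makes it straightforward to surface the user-facing controls the statute requires and to audit which defaults applied to which accounts.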
Litigation followed immediately. On February 9, 2026, NetChoice filed suit in the U.S. District Court for the District of South Carolina seeking declaratory and injunctive relief, and on March 9, 2026, it moved for a preliminary injunction. The complaint challenges nearly every major feature of the Social Media Regulation Act, arguing that the "reasonable care" standard functions as a content-based speech restriction under the First Amendment and that terms such as "compulsive usage" and "severe emotional distress" are unconstitutionally vague. As of this writing, the case remains pending, and the court has not ruled on the preliminary-injunction motion.
The Ninth Circuit's Message in NetChoice v. Bonta: Courts Will Scrutinize "Best Interests" and Vague Standards
On March 12, 2026, the U.S. Court of Appeals for the Ninth Circuit issued a second decision in NetChoice v. Bonta addressing the California AADC Act. Applying the U.S. Supreme Court's framework in Moody v. NetChoice, the court vacated much of the district court's preliminary injunction, reasoning that the challengers had not adequately accounted for the full range of the law's covered applications beyond publishers and social media platforms. The court also analyzed the act's age-estimation provision separately from its speech restrictions, concluding that the provision is not facially content-based and therefore does not automatically trigger heightened First Amendment scrutiny.
But the panel left in place an injunction against the act's data use restrictions and dark pattern prohibition on vagueness grounds. In particular, the court found that standards framed around what is "materially detrimental," in a child's "best interests" or tied to "well-being" did not provide constitutionally adequate notice of what conduct is prohibited. For companies that have struggled to operationalize these inherently subjective requirements, that aspect of the decision is welcome: It reinforces that states must articulate clear, administrable rules before imposing liability for design and data use choices.
App Store Accountability Acts Keep Spreading Despite Legal Challenges
A separate trend is the spread of "app store accountability" bills that place age verification and parental consent obligations on app store providers. On February 17, 2026, Alabama enacted House Bill 161, becoming the fourth state to adopt this model (after Texas, Utah and Louisiana). Effective January 1, 2027, the law requires app store providers to verify users' ages, link minors' accounts to verified parent accounts and obtain parental consent before certain downloads or purchases.
App developers, in turn, must be prepared to receive and act on the "age signal" (and any parental-consent status) provided by the app store. And, as a practical matter, even where these statutes offer developers a good-faith liability shield based on app-store-provided age data, developers may still have independent obligations once they know (or have reason to know) a user is a child.
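A minimal sketch of the developer-side handling might look like the following, assuming a hypothetical signal payload; the AgeSignal shape, its field values and the resolveMinorStatus helper are invented for illustration, since actual formats will vary by statute and by each app store's implementation.

```typescript
// Hypothetical shape of an app-store-provided age signal; real formats
// will vary by statute and by platform implementation.
interface AgeSignal {
  ageCategory: "under13" | "13to15" | "16to17" | "adult" | "unknown";
  parentalConsent: "granted" | "denied" | "notApplicable" | "unknown";
}

// Developer-side resolution: rely on the store signal where a statutory
// good-faith shield applies, but never ignore the service's own knowledge.
function resolveMinorStatus(
  signal: AgeSignal | undefined,
  independentlyKnownMinor: boolean, // e.g., from a self-declared birth date
): boolean {
  // Independent knowledge trumps the store signal: duties can attach once
  // a service knows (or has reason to know) a user is a child.
  if (independentlyKnownMinor) return true;
  // No signal received; fall back to the service's existing age policy.
  if (signal === undefined) return false;
  // Conservative choice: treat anything not affirmatively "adult" as a minor.
  return signal.ageCategory !== "adult";
}
```

Treating anything not affirmatively marked "adult" as a minor is a deliberately conservative design choice; a service could loosen it, but at the cost of leaning more heavily on the statutory good-faith shield.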
Courts and challengers have focused on how these laws condition access to content on identity and age verification. For example, a federal court blocked Texas' version in December 2025, concluding the statute likely triggers strict scrutiny because it treats apps differently based on content and imposes a verification gate before any user can download covered apps. Utah's law faces a similar constitutional challenge filed in February 2026, raising First Amendment and vagueness arguments.
What Companies Should Do Differently in 2026
In the absence of a uniform federal framework, states will continue to press ahead with children's privacy and data rules. Companies operating consumer-facing digital services should assume ongoing churn and build programs that manage risk across states without overfitting to any single state's policy.
- Treat Children's Privacy as a Product Design Issue, Not Just a Policy Update. The common thread across these laws – from AADCs to app store accountability – is a focus on how products are built. Companies' risk exposure begins at the moment a product is conceived, making it essential to integrate legal and compliance considerations from the earliest stages of development.
- Expect Litigation Volatility and Build for Flexibility. Companies should resist building bespoke compliance programs around individual laws that may not survive judicial review, focusing instead on flexible, privacy-by-design architectures (a minimal configuration sketch follows this list). Across states, legislative activity signals a broader trend toward transparency and audit mandates, and companies that can demonstrate documented attention to children's safety will be better positioned regardless of how the litigation resolves.
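One pattern consistent with that advice is to externalize jurisdiction-specific obligations into configuration data rather than code paths, so that an injunction or amendment becomes a data change instead of a re-architecture. The sketch below is a simplified illustration; the rule names, baseline values and the single override shown are placeholders, not a compliance map of any state's law.

```typescript
// Sketch: per-jurisdiction obligations expressed as data, so an enjoined
// or amended provision is a configuration change, not a code change.
// Rule names and values are placeholders, not legal conclusions.
interface MinorSafetyRules {
  protectiveDefaultsForMinors: boolean; // default covered features off
  honorAppStoreAgeSignal: boolean;      // consume store-provided age signals
  auditReportingRequired: boolean;      // third-party audit obligations
}

// Baseline: the strictest posture the business is willing to run everywhere.
const BASELINE: MinorSafetyRules = {
  protectiveDefaultsForMinors: true,
  honorAppStoreAgeSignal: true,
  auditReportingRequired: false,
};

// Per-jurisdiction overrides layered on the baseline; removing an entry
// here is how an injunction gets reflected, without touching product code.
const OVERRIDES: Record<string, Partial<MinorSafetyRules>> = {
  "US-SC": { auditReportingRequired: true }, // illustrative placeholder
};

function rulesFor(jurisdiction: string): MinorSafetyRules {
  return { ...BASELINE, ...(OVERRIDES[jurisdiction] ?? {}) };
}
```

Running the strictest common denominator as the baseline keeps most of the product identical across states, while the override table absorbs the churn as injunctions, amendments and appellate decisions land.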
Children's privacy will remain a state-led arena in the near term, and the litigation trend suggests courts are more likely to trim vague or open-ended standards than to eliminate regulation altogether. The companies best positioned for 2026 and beyond will treat minors' privacy and safety as a cross-functional product discipline, pairing privacy-by-design controls with flexible implementation choices that can adapt as injunctions, amendments and appellate decisions reshape the rules.