
Assembly Bill 656, Nevada v. TikTok, and the FTC's new enforcement philosophy: The compliance landscape transformed

Published on November 14, 2025
California Assembly Bill 656: Eliminating dark patterns in account deletion

Effective January 1, 2026, California's Assembly Bill 656 introduces mandatory "delete account" functionality for social media platforms generating over $100 million annually, directly addressing dark patterns that obstruct user choice.

Building on the California Consumer Privacy Act (CCPA) and California Privacy Rights Act (CPRA), AB 656 responds to University of Chicago research documenting that over one-third of account deletion attempts fail due to deliberately confusing interfaces, buried options, and misleading "deactivation" alternatives.

Compliance requirements for covered platforms:

Covered businesses must redesign user interfaces to ensure the "Delete Account" button is clearly visible and accessible across mobile, web, and desktop environments. Upon deletion request, platforms must erase all personal information associated with the account, including usage data. The legislation prohibits dark patterns such as multi-step obstacles, confusing terminology, or extended processing delays.
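For engineering teams, the core obligation above (a single deletion request must erase the account and all associated personal information, including usage data, and leave evidence for verification) can be sketched as a minimal deletion handler. This is an illustrative sketch only, not legal guidance; the store names (`ACCOUNTS`, `USAGE_DATA`, `AUDIT_LOG`) and the `delete_account` function are hypothetical stand-ins for a platform's real data stores.

```python
from datetime import datetime, timezone

# Hypothetical in-memory stores standing in for a platform's databases.
ACCOUNTS = {"u42": {"email": "user@example.com"}}
USAGE_DATA = {"u42": ["viewed:clip1", "liked:clip2"]}
AUDIT_LOG: list[dict] = []


def delete_account(user_id: str) -> bool:
    """Erase the account and all associated personal data in one step,
    recording an audit entry so the deletion can be verified later."""
    if user_id not in ACCOUNTS:
        return False
    ACCOUNTS.pop(user_id)
    USAGE_DATA.pop(user_id, None)  # usage data must be erased as well
    AUDIT_LOG.append({
        "user_id": user_id,
        "action": "account_deleted",
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return True
```

The key design point is that erasure of usage data happens inside the same operation as the account deletion, rather than as a separate user-initiated step, which is precisely the kind of multi-step obstacle the legislation prohibits.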

Enforcement and business impact:

The California Privacy Protection Agency (CPPA) will enforce violations under existing privacy law penalties. Non-compliance carries significant regulatory and reputational risks as California continues setting national standards for digital consumer rights. Organizations should conduct immediate gap assessments of current deletion workflows, eliminate manipulative design elements, and implement comprehensive data deletion verification systems to ensure full compliance before the January deadline.

Nevada v. TikTok: A watershed moment for platform design accountability

On November 6, 2025, the Nevada Supreme Court issued a groundbreaking decision that fundamentally shifts how courts evaluate social media platform liability. The ruling allows Nevada's lawsuit against TikTok to proceed, rejecting the company's claims of Section 230 immunity and establishing that platforms can be held accountable for harmful product design—not just third-party content.

The case centers on addictive design features:

Nevada alleges TikTok deliberately engineered its platform to maximize youth addiction through "low-friction variable rewards" (endless scroll, autoplay), social manipulation tools (quantified popularity metrics), ephemeral content, intrusive push notifications, and misleading parental controls. With 49% of Nevadans active on the platform and $2 billion in annual revenue from young US users alone, the stakes are substantial.

The court distinguished between content moderation (protected) and product design choices (actionable), citing dangerous outcomes including the viral "Blackout Challenge" that encouraged users to asphyxiate themselves on camera.

This precedent confirms that companies face legal exposure for design patterns that exploit user vulnerabilities—particularly among minors. Automated dark pattern detection and proactive compliance audits are no longer optional risk management strategies; they're essential safeguards against regulatory action that could reshape your entire platform.

FTC Commissioner Holyoak outlines 2025 enforcement priorities: What compliance teams need to know

At the November 2025 ANA Masters of Advertising Law Conference, FTC Commissioner Melissa Holyoak signaled a strategic shift in enforcement approach, emphasizing three core priorities: price transparency, children's online privacy (COPPA), and truthful Made in USA claims.

Key compliance takeaways:

The FTC is moving away from ideological labels like "dark patterns" toward fact-based, harm-focused enforcement. Commissioner Holyoak stressed that regulatory action will center on demonstrable consumer harm rather than penalizing data collection practices alone—marking a significant departure from the previous administration's approach.

Critical risk areas:

  • Subscription traps: ROSCA enforcement remains aggressive, targeting unclear cancellation processes and hidden autorenewal terms
  • Hidden fees: The new Unfair or Deceptive Fee Rule expands beyond tickets and lodging—all pricing practices face Section 5 scrutiny
  • Children's data: COPPA violations, especially unauthorized geolocation collection, face heightened enforcement

With enforcement focused on proving actual consumer harm, companies that can demonstrate good-faith compliance efforts through automated detection and documentation are better positioned to avoid penalties. The burden of proof increasingly falls on businesses to show transparent practices before investigations begin—making AI-powered compliance monitoring essential for modern digital products.

The message across jurisdictions is unified: demonstrate proactive compliance before enforcement arrives. From California's delete mandates to Nevada's design liability precedent and FTC's harm-focused approach, regulators reward documented good-faith efforts.

FairPatterns provides the compliance infrastructure this new era demands: automated dark pattern detection across your digital properties, real-time monitoring with audit trails, and documented remediation—proving proactive harm prevention before investigations begin. Transform regulatory risk into competitive advantage.
