California youth protection battle, gaming addiction lawsuit, X's DSA investigation, and New Zealand consumer report

Fair Monday is FairPatterns' weekly analysis of regulatory developments, enforcement actions, and dark pattern cases affecting digital trust and consumer protection. Every Monday, we break down complex legal actions to help businesses understand how to build ethical digital experiences.
We deliver the latest developments in regulatory enforcement, class action lawsuits, and industry accountability, tracking how major platforms are being held responsible for deceptive practices that manipulate user behavior, exploit consumer trust, and undermine digital rights. Whether you're a legal professional, UX designer, compliance officer, or simply a consumer who wants to understand how digital deception works, Fair Monday provides the insights, case analysis, and precedent-setting developments you need to navigate the evolving landscape of digital fairness.
Tech giants challenge California's algorithmic feed restrictions for minors
On November 13, 2025, Meta, TikTok, Google, and YouTube filed coordinated legal challenges against California's Protecting Our Kids from Social Media Addiction Act, marking a critical test for platform regulation and youth digital safety.
The legislation prohibits social media platforms from providing personalized algorithmic feeds to users aged 13-17 without explicit parental consent. The companies argue these restrictions violate First Amendment protections, claiming algorithmic curation constitutes editorial speech protected under the Supreme Court's Moody v. NetChoice precedent.
Key allegations by Meta, TikTok, Google, and YouTube:
- The law restricts protected speech: The companies argue that personalized feeds are created through editorial decisions which are a form of protected speech under the First Amendment, especially after the Supreme Court’s Moody v. NetChoice decision, which recognized that platforms’ ranking and curation of user content can be expressive activity. By blocking platforms from giving teens aged 13–17 algorithmically curated feeds unless parents consent, the state is allegedly forcing companies to change their editorial choices in a way that violates strict First Amendment standards.
- Minors’ access to information is limited: Google and YouTube argue that the law also burdens teenagers’ rights by preventing them from freely accessing lawful content online unless a parent steps in to approve it.
- There are less restrictive ways to protect kids: The companies point out that parents already have tools to supervise and limit their teens’ online activity. Because these options exist, they claim the state’s across-the-board restrictions are unnecessary and not narrowly tailored.
California's Attorney General counters that the law targets addictive design patterns, including manipulative algorithmic feeds and persistent notifications, that platforms deploy to maximize youth engagement for commercial gain. The state also emphasizes that the 9th Circuit has signaled that challenges to addictive-feed provisions face significant hurdles.
Compliance implications:
This litigation highlights the growing regulatory scrutiny of dark patterns in youth-targeting digital products. Organizations operating platforms accessible to minors should evaluate their algorithmic recommendation systems, notification strategies, and engagement mechanisms against emerging youth protection standards. The case's outcome will likely influence enforcement approaches across multiple jurisdictions and shape compliance requirements for personalized content delivery systems targeting or accessible to younger users.
Major gaming lawsuit targets "weaponized" dark patterns
On November 7, 2025, a lawsuit filed in the Northern District of California accused the developers of Roblox and Minecraft of employing manipulative design patterns that exploit minors. The complaint alleges these platforms function as "weaponized behavioral systems" rather than traditional games, using sophisticated dark patterns including:
- Variable reward loops engineered to trigger dopamine responses similar to gambling
- Time-limited offers creating artificial urgency and FOMO
- Algorithmic targeting identifying vulnerable users for personalized monetization
- Social pressure mechanics compelling continued engagement
- Deceptive marketing presenting addictive systems as family-friendly entertainment
The lawsuit claims developers hired behavioral scientists to refine these techniques, fully aware of children's heightened susceptibility. Both games have faced mounting scrutiny, appearing in a growing number of video game addiction lawsuits.

The takeaway: This case underscores the urgent need for proactive dark pattern detection in digital products. Regulatory bodies worldwide are intensifying enforcement against manipulative design, particularly where minors are involved. Organizations must implement compliance frameworks that identify and remediate problematic patterns before they trigger legal action.
Ireland investigates X for DSA violations: Internal complaint systems under scrutiny
On November 12, 2025, Ireland's media regulator Coimisiún na Meán launched a formal investigation into X (formerly Twitter) for potential violations of Article 20 of the Digital Services Act. The investigation focuses on whether X provides users with legally required mechanisms to challenge content moderation decisions.
Core compliance issues under investigation:
- Appeal rights: Whether users can effectively contest decisions not to remove content that appears to violate X's terms of service
- Transparency obligations: Whether users receive clear notifications about report outcomes and their right to appeal
- Internal complaint systems: Whether X maintains an accessible, user-friendly complaints-handling mechanism as required for Very Large Online Platforms
Regulators expressed "serious concerns" that X entirely lacks the required effective internal complaint mechanism. If non-compliance is confirmed, X faces fines of up to 6% of global turnover or mandatory compliance agreements.
This investigation follows the European Commission's 2024 preliminary findings that X breached DSA obligations related to advertising transparency, data access, and alleged deployment of dark patterns designed to manipulate user behavior.
The takeaway: Internal complaint-handling systems, transparent user notifications, and accessible appeal mechanisms are non-negotiable DSA requirements. Platforms must implement robust compliance frameworks before regulatory intervention forces corrective action.
New Zealand report exposes widespread dark pattern manipulation costing consumers millions
Consumer NZ's November 2025 report "Invisible Influence" exposes how dark patterns have infiltrated New Zealand's digital landscape, systematically manipulating consumer behavior and costing millions in unintended spending.
The comprehensive study examined 10 manipulative design tactics across New Zealand's most popular websites, revealing a disturbing prevalence: 93% of consumers encounter scarcity cues, 76% face hidden fees at checkout, and 73% are exposed to disguised advertisements designed to mislead.
Financial and privacy costs:
The financial impact is substantial. One in three New Zealanders spent more than intended due to dark patterns, with average overspending reaching $42 per incident. Nearly half (44%) found it difficult to cancel or undo purchases, while subscription traps—epitomized by HelloFresh's five-step cancellation maze—kept 23% locked into unwanted services.
Privacy violations are equally concerning. One in four consumers shared more personal information than they were comfortable with, while two in five unintentionally agreed to cookies or marketing emails through deceptive interface design.
Unlike the EU, UK, US, and Australia, all of which are implementing robust frameworks against unfair digital practices, New Zealand lacks sufficient provisions in its Fair Trading Act and Privacy Act to combat dark patterns effectively. The Commerce Commission has pursued minimal enforcement, leaving consumers vulnerable.
Path forward
Consumer NZ recommends urgent regulatory reform including a general ban on unfair trading practices, strengthened privacy protections with civil penalty regimes, increased Fair Trading Act penalties, and clear Commerce Commission guidance for businesses.
With 53% of New Zealanders supporting stricter regulation and $6 billion spent online annually, the message is clear: proactive dark pattern compliance isn't optional—it's an urgent business and regulatory imperative.
These cases demonstrate that dark patterns inflict real financial and emotional harm on consumers, and that once consumer trust is broken, the reputational damage is lasting.
🔍 Request a confidential dark pattern audit – Receive a comprehensive assessment of your compliance vulnerabilities with a prioritized remediation roadmap.
References:
- https://www.bloomberglaw.com/public/desktop/document/MetaPlatformsIncvBontaDocketNo325cv09792NDCalNov132025CourtDocket?doc_id=X5AH4UINN3N82V9ESOBNJOOPUE7
- https://www.bloomberglaw.com/public/desktop/document/TikTokIncvBontaDocketNo325cv09789NDCalNov132025CourtDocket?doc_id=X32JQTVG5KT881RIE2EQDMO7KU8
- https://www.bloomberglaw.com/public/desktop/document/GoogleLLCetalvBontaDocketNo525cv09795NDCalNov132025CourtDocket?doc_id=X4DBBH3CMGG8K3ARVGBKS5R86E3
- https://www.aboutlawsuits.com/roblox-minecraft-lawsuit-developers-weaponized-games-drive-addiction-excessive-spending/
- https://www.aboutlawsuits.com/wp-content/uploads/20251107_GobleComplaint.pdf
- https://www.cnam.ie/coimisiun-na-mean-investigation-into-x/
- https://www.euronews.com/next/2025/11/13/ireland-launches-investigation-into-elon-musks-x-on-content-moderation
- https://www.consumer.org.nz/articles/online-dark-designs-have-cost-new-zealanders-millions

