Legal Updates
5 min read

Meta faces legal scrutiny in California, Ireland investigates TikTok & LinkedIn, Hawaii's legal action against TikTok, Massachusetts v. Meta, and BEUC's call to ban dark patterns and addictive design

Published on
December 12, 2025

Fair Monday is FairPatterns' weekly analysis of regulatory developments, enforcement actions, and dark pattern cases affecting digital trust and consumer protection. Every Monday, we break down complex legal actions to help businesses understand how to build ethical digital experiences. 

We deliver the latest developments in regulatory enforcement, class action lawsuits, and industry accountability, tracking how major platforms are being held responsible for deceptive practices that manipulate user behavior, exploit consumer trust, and undermine digital rights. Whether you're a legal professional, UX designer, compliance officer, or simply a consumer who wants to understand how digital deception works, Fair Monday provides the insights, case analysis, and precedent-setting developments you need to navigate the evolving landscape of digital fairness.

Meta faces legal scrutiny over suppressed mental health research

On November 22, 2025, unsealed court documents in the Northern District of California (Case 4:22-md-03047) revealed that Meta allegedly discontinued internal research after finding causal evidence linking Facebook use to mental health harm. The disclosures concern Project Mercury, a 2020 experimental study that found users who deactivated Facebook for one week reported measurably lower levels of depression, anxiety, loneliness, and social comparison behaviors.

What was Project Mercury? 

Designed in late 2019 and launched in 2020, Project Mercury was an experimental deactivation study conducted in partnership with Nielsen. Meta randomly assigned participants to either stop using Facebook and Instagram for one week or continue normal usage, measuring impacts through surveys, platform log data, and smartphone usage tracking. The company invested significant resources—staffing the project exclusively with PhD-level researchers—to address what it viewed as methodological shortcomings in external academic research.

According to the plaintiffs (school districts suing Meta alongside TikTok, Google, and Snap), the company dismissed these findings as biased by negative media coverage, despite internal researchers defending the study's validity. Critically, Meta later testified to Congress that it lacked the capability to quantify mental health impacts on teenage users, contradicting its own internal evidence.

The filings further allege that Meta knowingly deprioritized youth safety features that could reduce engagement metrics, allowed accounts to be flagged as many as 17 times for attempted sex trafficking before removing them, and that CEO Mark Zuckerberg indicated child safety was not a top priority compared to metaverse development.

This case demonstrates the escalating legal and reputational risks of manipulative design practices. With trials scheduled for 2026, companies face increasing pressure to implement transparent, user-protective design frameworks before regulatory enforcement intensifies.

Hearing scheduled: January 26, 2026

Ireland investigates TikTok and LinkedIn for dark patterns in illegal content reporting

On December 2, 2025, Ireland's media regulator, Coimisiún na Meán, opened formal Digital Services Act (DSA) investigations into TikTok and LinkedIn over concerns that their illegal content reporting mechanisms fail to meet regulatory standards.

The compliance gap: The regulator is examining whether both platforms provide accessible, user-friendly tools for reporting illegal material—specifically, whether users can anonymously report suspected child sexual abuse material (CSAM) as mandated by the DSA. Current systems may fail this fundamental requirement.

The dark pattern allegation: Investigators are assessing whether interface design deliberately misleads users into believing they are reporting illegal content when they are actually only flagging Terms-of-Service violations. This confusion could systematically undermine users' ability to report genuinely unlawful material to authorities, creating a critical gap in child safety protections.

Digital Services Commissioner John Evans stated: "There is reason to suspect that their illegal content reporting mechanisms are not easy to access or user-friendly, do not allow people to report child sexual abuse material anonymously, as required by the DSA, and that the design of their interfaces may deter people from reporting content as illegal."

This marks Ireland's second major DSA enforcement action following last month's investigation into X (formerly Twitter).

Hawaii takes legal action against TikTok: Landmark case for digital compliance

On December 3, 2025, the State of Hawai'i filed a comprehensive lawsuit against ByteDance and TikTok under the Unfair or Deceptive Acts or Practices (UDAP) statute, alleging the platform deliberately designed addictive features to exploit children while misleading families about safety risks.

The complaint documents TikTok's "coercive design tactics" that manipulate dopamine-driven engagement loops, particularly targeting users aged 10-15, when ongoing prefrontal cortex development leaves children vulnerable to compulsive behavior. Internal documents reveal employees acknowledged minors "do not have executive function to control their screen time," yet the company prioritized "key growth metrics" over safety measures.

Critical violations:

TikTok faces allegations of systematic COPPA violations—its second federal enforcement action—with inadequate age verification allowing children under 13 unfettered platform access. The TikTok LIVE feature allegedly facilitated sexual exploitation through a virtual gift economy, with internal research showing over 100,000 monthly streams hosted by users under 15.

Hawaii seeks civil penalties of $500 to $10,000 per violation per day, plus treble damages. With multiple violations spanning years, potential liability could reach hundreds of millions.
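To make the exposure math concrete, the back-of-the-envelope sketch below tallies penalties under the alleged statutory range. The violation count and duration are purely hypothetical illustrations, not figures from the complaint:

```python
# Back-of-the-envelope UDAP exposure estimate. The per-day penalty range
# and treble multiplier reflect the complaint as described above; the
# violation count and duration below are hypothetical illustrations.

PENALTY_MIN = 500      # alleged minimum civil penalty per violation per day (USD)
PENALTY_MAX = 10_000   # alleged maximum civil penalty per violation per day (USD)
TREBLE = 3             # treble damages multiplier

def exposure(violations: int, days: int, per_day: int, treble: bool = False) -> int:
    """Total liability for `violations` violations accruing daily over `days` days."""
    total = violations * days * per_day
    return total * TREBLE if treble else total

# Hypothetical scenario: 100 distinct violations accruing over three years.
days = 3 * 365
print(f"Low end:  ${exposure(100, days, PENALTY_MIN):,}")                # $54,750,000
print(f"High end: ${exposure(100, days, PENALTY_MAX, treble=True):,}")   # $3,285,000,000
```

Even this modest hypothetical lands in the tens of millions at the statutory floor and in the billions at the ceiling with treble damages, which is why per-day accrual provisions dominate the liability analysis.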

This case joins the federal MDL against Meta, TikTok, Snap, and YouTube, signaling unprecedented regulatory enforcement of digital consumer protection and platform accountability for design-induced harm.

Massachusetts v. Meta at Supreme Judicial Court signals shift in product design liability

On December 5, 2025, the Massachusetts Supreme Judicial Court heard arguments that could redefine liability for digital product design, with justices expressing skepticism toward Meta's constitutional defenses.

Attorney General Andrea Campbell argues Meta deliberately engineered addictive features—incessant notifications, infinite scroll, FOMO-inducing ephemeral content—while concealing internal research showing harm to 320,000+ Massachusetts teen users. Meta claims these design choices constitute protected speech under the First Amendment and are immunized by Section 230.

The court emphasized that features designed to "attract users regardless of content" may fall outside speech protections. The Commonwealth has framed its allegations around interface tools, not content moderation or editorial decisions, arguing that "tens of thousands of Massachusetts’ youth and children have been, and continue to be, mentally and physically harmed by Meta’s intentional, profit-motivated exploitation and manipulation of youth’s vulnerabilities to induce addictive overuse of its Instagram platform."

Despite internal studies showing that "Project Daisy" (hiding like counts) reduced harmful social comparison, Meta refused to approve full implementation on Instagram or any of its other platforms and retained the visible “likes” feature for young users. Meta's own research warned that "teen brains are much more sensitive to dopamine," yet the company deployed intermittent variable rewards—the same mechanism that powers slot machine psychology.

The court's focus on design mechanics rather than content creates a framework in which AI-powered dark pattern detection and proactive compliance auditing become essential tools for managing legal risk, in a regulatory environment where product design choices may no longer be shielded by constitutional immunity.
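To illustrate what the most basic form of such auditing could look like, here is a minimal heuristic sketch: a keyword scan of UI copy for phrasing commonly associated with dark patterns. The categories and patterns are our own illustrative assumptions, not an established taxonomy, and real detection tools rely on far richer signals than string matching:

```python
import re

# Purely illustrative heuristics: a handful of phrase patterns commonly
# cited as dark-pattern signals in UI copy (false urgency, confirmshaming,
# obstruction). These lists are illustrative assumptions, not a standard.
SIGNALS = {
    "false urgency": [r"only \d+ left", r"offer ends (soon|in)", r"hurry"],
    "confirmshaming": [r"no thanks, i (like|prefer|want)", r"i don't want to save"],
    "obstruction": [r"call .+ to cancel", r"contact support to (cancel|delete)"],
}

def audit_copy(ui_strings: list[str]) -> list[tuple[str, str]]:
    """Return (category, offending string) pairs for every heuristic hit."""
    findings = []
    for text in ui_strings:
        for category, patterns in SIGNALS.items():
            if any(re.search(p, text, re.IGNORECASE) for p in patterns):
                findings.append((category, text))
    return findings

sample = [
    "Hurry! Only 3 left in stock.",
    "No thanks, I like paying full price.",
    "Please call 1-800-555-0100 to cancel your subscription.",
]
for category, text in audit_copy(sample):
    print(f"[{category}] {text}")
```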

BEUC Report: EU Digital Fairness Act to ban dark patterns and addictive design

On December 2, 2025, the European Consumer Organisation (BEUC) published a comprehensive report revealing that current EU law fails to clearly prohibit manipulative design practices, creating significant enforcement gaps that allow systematic consumer exploitation across digital marketplaces.

BEUC's analysis identifies critical weaknesses in the Unfair Commercial Practices Directive (UCPD), which currently lists only a few specific dark patterns in Annex I while leaving most manipulative interfaces unregulated. The report calls for fundamental reforms including:

General dark patterns prohibition: Expanding UCPD Annex I with comprehensive dark pattern categories, mirroring the Digital Services Act's approach to manipulative interfaces.

Addictive design restrictions: Mandating default-off settings for infinite scroll, autoplay, push notifications, and engagement-based recommendations—particularly for minor users (illustrated in the sketch after this list). BEUC emphasizes these features exploit dopamine-driven feedback loops without adequate legal constraints.

Virtual currency and in-app purchase bans: Prohibiting premium virtual currencies (gems, coins, tokens), paid loot boxes, and pay-to-win mechanics that obscure real pricing and encourage overspending.

Personalization limits: Restricting personalised pricing and requiring default-off offer personalisation to prevent tracking-based consumer exploitation.

Influencer marketing: Because influencer content often obscures or hides its advertising nature, especially when directed at children or used to promote risky products, the report advocates banning it for certain categories, such as unhealthy foods marketed to minors.

Unfair pricing: Price obscuration and manipulative price presentation harm consumers alongside personalised pricing; in addition to the personalisation limits above, BEUC calls for full price transparency.

Subscription and contract protections: Mandatory cancellation buttons, right to human contact, and AI contracting safeguards to address automated system manipulation.

Online reviews: Fake or incentivised reviews mislead consumers, so incentivised reviews should be prohibited to maintain trust in digital marketplaces.

Resale of event tickets: Excessive markups in secondary ticket markets harm consumers, and BEUC calls for banning the resale of event tickets above face value. 
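Returning to the default-off principle flagged in the addictive design item above, the sketch below shows one way engagement features could be declared off-by-default in application code. The field names and minor-handling logic are hypothetical illustrations, not requirements drawn from BEUC's report:

```python
from dataclasses import dataclass

@dataclass
class EngagementSettings:
    """Hypothetical engagement-feature flags, defaulting to off in line
    with BEUC's default-off recommendation. Nothing is enabled silently;
    users would have to opt in explicitly."""
    infinite_scroll: bool = False
    autoplay: bool = False
    push_notifications: bool = False
    engagement_based_recommendations: bool = False

def settings_for_user(is_minor: bool, requested: EngagementSettings) -> EngagementSettings:
    """Honor explicit opt-ins for adults; force everything off for minors,
    mirroring BEUC's call for stricter treatment of minor users."""
    if is_minor:
        return EngagementSettings()  # all defaults, i.e. everything off
    return requested

adult = settings_for_user(False, EngagementSettings(autoplay=True))
minor = settings_for_user(True, EngagementSettings(autoplay=True))
print(adult.autoplay, minor.autoplay)  # True False
```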

With the Digital Fairness Act expected in 2026, organizations operating in EU markets face compliance obligations extending beyond GDPR and DSA. BEUC's report establishes the regulatory framework that will govern product design, pricing strategies, and user interface decisions—making proactive dark pattern detection and design audits essential for managing legal risk in Europe's evolving digital consumer protection landscape.

Design liability is escalating globally. Proactive compliance audits transform from optional safeguards into essential risk management for digital platforms.

Don't wait for enforcement. Audit your platform for dark patterns today.

Schedule a FairPatterns compliance assessment to identify design risks before regulators do. Protect your users, your brand, and your bottom line.

Book Your Dark Pattern Audit: https://www.fairpatterns.com/solutions/fairaudit-ai 
