Legal Updates

EU Directive 2023/2673 Targets Dark Patterns in Financial Services, $500K Privacy Settlement in California, CDT Europe Flags Gaps Ahead of the Digital Fairness Act

Published on November 7, 2025

Fair Monday is FairPatterns' weekly analysis of regulatory developments, enforcement actions, and dark pattern cases affecting digital trust and consumer protection. Every Monday, we break down complex legal actions to help businesses understand how to build ethical digital experiences. 

We deliver the latest developments in regulatory enforcement, class action lawsuits, and industry accountability, tracking how major platforms are being held responsible for deceptive practices that manipulate user behavior, exploit consumer trust, and undermine digital rights. Whether you're a legal professional, UX designer, compliance officer, or simply a consumer who wants to understand how digital deception works, Fair Monday provides the insights, case analysis, and precedent-setting developments you need to navigate the evolving landscape of digital fairness.

EU Directive 2023/2673: New dark pattern regulations for financial services

The European Union has adopted Directive 2023/2673, fundamentally transforming consumer protection requirements for distance financial services contracts. This legislation addresses digital manipulation, dark patterns, and withdrawal rights across all online and remote financial transactions in the EU.

Enforcement Date: June 19, 2026

Mandatory withdrawal function requirement

All businesses offering online financial services—including banking, insurance, credit, investments, pensions, and payment services—must implement a clearly visible withdrawal button on their websites and applications. The button must (a minimal implementation sketch follows the list below):

  • Use unambiguous language ("Cancel my contract here")
  • Be permanently accessible during the 14-day withdrawal period
  • Allow completion with minimal consumer information
  • Generate automatic acknowledgment receipts
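
To make these requirements concrete, here is a minimal TypeScript sketch of what such a withdrawal control could look like in a web front end. It is an illustration under stated assumptions, not a prescribed implementation: the element setup, the "/api/withdrawal" endpoint, and the receipt format are hypothetical, since the directive specifies the outcome (a visible button, unambiguous wording, minimal data, an automatic acknowledgment) rather than the technology.

    // Minimal sketch of a withdrawal control in the spirit of Directive 2023/2673.
    // The "/api/withdrawal" endpoint, payload shape, and receipt format are
    // illustrative assumptions, not requirements of the directive.

    const withdrawalButton = document.createElement("button");
    withdrawalButton.id = "withdrawal-button";
    withdrawalButton.textContent = "Cancel my contract here"; // unambiguous wording

    // Keep the control permanently visible rather than buried in sub-menus.
    document.body.prepend(withdrawalButton);

    withdrawalButton.addEventListener("click", async () => {
      // Ask only for the minimum information needed to identify the contract.
      const contractId = window.prompt("Enter your contract reference:");
      if (!contractId) {
        return;
      }

      const response = await fetch("/api/withdrawal", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ contractId, requestedAt: new Date().toISOString() }),
      });

      // Surface an automatic acknowledgment so the consumer has proof of the request.
      const receipt: { reference: string } = await response.json();
      window.alert(`Withdrawal request received. Reference: ${receipt.reference}`);
    });

The structural point is symmetry: exercising withdrawal rights should take no more steps or personal data than entering the contract did.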

Dark pattern prohibition

The directive explicitly bans deceptive design practices in financial service interfaces, including:

  • Manipulative choice architecture that steers decisions
  • Repetitive confirmation requests and disruptive pop-ups
  • Asymmetric user experiences (easy sign-up, difficult cancellation)
  • Hidden or obfuscated contract termination processes

Human intervention rights

Consumers interacting with automated systems (chatbots, robo-advisors, AI tools) gain the right to request human assistance during pre-contractual phases and, in justified cases, post-contract.

Two-phase implementation timeline

Phase 1 - National Transposition:

  • December 19, 2025: EU Member States must adopt and publish national laws, regulations, and administrative provisions to comply with the directive
  • Member States must immediately communicate these measures to the European Commission

Phase 2 - Application and Enforcement:

  • June 19, 2026: All transposed measures become applicable and enforceable
  • Organizations must be fully compliant with all requirements

This regulation affects UX designers, product managers, compliance teams, and legal departments across fintech, insurtech, banking platforms, and payment processors. Organizations must audit existing user journeys, redesign cancellation flows, and eliminate manipulative interface elements before June 2026.

$500K privacy settlement in California: When regulators sanction dark patterns without naming them

On October 30, 2025, Sling TV reached a $500,000 settlement with the California Attorney General's office over how it handled consumer opt-out requests. In Attorney General Rob Bonta's words, the company was "not providing consumers an easy way to opt out of the sale of their personal data."

Without expressly using the term "dark patterns," the decision describes and sanctions a very well-known manipulative mechanism: deliberately making privacy rights difficult to exercise to suppress opt-out rates.

This matters because companies can be sanctioned for the outcome—not the label. Regulators care whether users find rights exercise "easy," regardless of what they call the obstruction.

Attorney General Bonta emphasized the importance of the settlement, stating, “Californians have critical privacy rights. Sling TV was not providing consumers an easy way to opt out of the sale of their personal data as required. My office is committed to continued enforcement of the CCPA — every Californian has the right to their online privacy, especially in the comfort of their living room.”

The settlement reveals several manipulative design practices that made exercising privacy rights unnecessarily difficult:

1. Misdirection and irrelevant choices

Sling TV directed consumers seeking to opt out of data sales to cookie preference settings on their website—even when users were accessing the service through the Sling TV app. These cookie controls had no bearing on the app user's actual opt-out request, creating a confusing maze that failed to honor user intent.

2. Unnecessary friction through data collection

Even for logged-in customers whose identities were already known to the company, Sling TV required users to fill out a webform requesting personal information including name, address, phone number, and email address. This created an artificial barrier that discouraged users from exercising their rights—particularly problematic since the company already possessed this information.
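
For contrast, a compliant opt-out for an already-authenticated customer can be honored directly from the existing session, with no webform and no additional data collection. The TypeScript sketch below is purely illustrative: the "/api/privacy/opt-out" endpoint, the link selector, and the session handling are hypothetical assumptions, not a description of Sling TV's systems or of any mandated design.

    // Illustrative sketch only: a one-step "do not sell or share" opt-out for a logged-in user.
    // The endpoint and element selector below are hypothetical placeholders.

    async function optOutOfSale(): Promise<void> {
      // The user is already authenticated, so the session alone identifies
      // whose preference is being recorded; no extra personal data is requested.
      const response = await fetch("/api/privacy/opt-out", {
        method: "POST",
        credentials: "include", // reuse the existing session
      });

      if (!response.ok) {
        throw new Error(`Opt-out request failed with status ${response.status}`);
      }
    }

    document
      .querySelector("#do-not-sell-link") // e.g. a "Do Not Sell or Share My Personal Information" link
      ?.addEventListener("click", () => {
        void optOutOfSale();
      });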

3. Platform-specific obstacles

Perhaps most egregiously, Sling TV failed to provide any opt-out mechanism within its living-room device applications (Roku, Apple TV, Amazon Firestick, etc.). This meant consumers watching on their primary viewing platform had no way to exercise their privacy rights without switching to a different device—a clear example of making privacy choices harder than they need to be.

4. Insufficient protection for minors

The investigation found that parental controls were inadequate to protect children and minors (under 16) from targeted advertising and the sale of their personal information. This is particularly concerning given the CCPA's enhanced protections for minors.

CDT Europe exposes critical gaps in dark pattern regulation ahead of Digital Fairness Act

Leading digital rights organization calls for a comprehensive framework to address manipulation across AI systems, interfaces, and personalization practices.

On November 4, 2025, the Centre for Democracy and Technology Europe submitted a detailed response to the European Commission's consultation on the forthcoming Digital Fairness Act (DFA), highlighting significant regulatory blind spots that leave consumers vulnerable to manipulative design practices.

The regulatory patchwork problem

CDT Europe's analysis reveals that existing EU legislation only partially addresses dark patterns:

  • EU AI Act: Sets too high a threshold requiring "strong evidence of manipulation and high degree of harm," limiting enforcement against subtle psychological manipulation
  • Digital Services Act (DSA): Prohibits dark patterns but only covers visual interface design, missing AI-driven conversational manipulation
  • General Data Protection Regulation (GDPR): Effective when personal data is involved, but struggles with grey areas where data processing boundaries are unclear
  • Unfair Commercial Practices Directive (UCPD): Contains a narrow blacklist that doesn't cover emerging manipulative techniques

AI-powered manipulation is evolving faster than regulation. CDT Europe specifically warned about dark patterns emerging through:

  • Chatbots that inject emotional pressure tactics
  • Disguised advertising in conversational AI
  • Personalized pricing exploiting individual vulnerabilities
  • Recommendation algorithms driving addictive engagement

Automated compliance scanning is no longer a "nice-to-have"—it's the only scalable way to document good faith efforts before investigations begin. The question isn't whether your organization will face scrutiny. It's whether you'll have documentation ready when it arrives.

FairPatterns bridges this gap with automated detection of manipulative design elements across interfaces and AI interactions, so compliance teams can build that documentation before regulators ask for it.

Learn more about our solutions at https://www.fairpatterns.com/solutions
