EU DSA Violations, Microsoft's $160M Australian Case, France's Digital Fairness Proposals, and India's AI Oversight

Welcome to Fair Monday: your weekly briefing on dark patterns, digital consumer protection, and the fight for ethical design.
Every Monday, we deliver the latest developments in regulatory enforcement, class action lawsuits, and industry accountability, tracking how major platforms from Amazon to Meta are being held responsible for deceptive practices that manipulate user behavior, exploit consumer trust, and undermine digital rights. Whether you're a legal professional, UX designer, compliance officer, or simply a consumer who wants to understand how digital deception works, Fair Monday provides the insights, case analysis, and precedent-setting developments you need to navigate the evolving landscape of digital fairness.
The European Commission targets dark patterns: Meta and TikTok face DSA violations
On October 24, 2025, the European Commission issued preliminary findings that TikTok and Meta (Facebook and Instagram) breached transparency and user protection obligations under the Digital Services Act (DSA), marking a significant enforcement action against deceptive interface design practices.
Key violations identified:
1. Researcher data access restrictions: The Commission found all three platforms imposed burdensome procedures that left researchers with incomplete or unreliable public data. This obstruction prevents independent assessment of how users (particularly minors) are exposed to illegal or harmful content, undermining the DSA's core transparency mandate.
2. Dark patterns in notice and action mechanisms: Meta's Facebook and Instagram platforms were specifically cited for using "dark patterns" (deceptive interface designs) in their content reporting tools. These mechanisms impose unnecessary procedural steps that confuse and discourage users from flagging illegal content, including child sexual abuse material and terrorist content. Such practices render Meta's notice and action systems ineffective and violate DSA requirements for accessible, user-friendly reporting tools.
3. Ineffective content moderation appeals: The Commission found that Meta's appeal systems fail to allow users to provide explanations or supporting evidence when challenging content removal or account suspensions. This limitation directly contradicts DSA protections guaranteeing users' rights to contest moderation decisions.
Enforcement and penalties: If confirmed, these preliminary findings could result in non-compliance decisions triggering fines up to 6% of total worldwide annual turnover (potentially exceeding €6 billion for Meta) plus periodic penalty payments until full compliance is achieved.
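To make the scale of the 6% ceiling concrete, here is a back-of-the-envelope sketch. The turnover figure used below is illustrative only, not Meta's actual revenue; the point is simply that any worldwide annual turnover above EUR 100 billion puts the ceiling above the EUR 6 billion cited above.

```python
def dsa_fine_ceiling(worldwide_annual_turnover_eur: float) -> float:
    """Maximum DSA non-compliance fine: 6% of total worldwide annual turnover."""
    return worldwide_annual_turnover_eur * 6 / 100

# Illustrative turnover figure, not Meta's actual revenue.
print(dsa_fine_ceiling(150e9) / 1e9)  # 9.0 (billion EUR)
```

Periodic penalty payments for continued non-compliance would come on top of any such fine.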
Compliance implications: This case establishes that dark patterns in transparency mechanisms constitute actionable DSA violations. Platforms must implement genuinely accessible reporting and appeal systems free from deceptive design elements. The Commission's explicit identification of "dark patterns" as compliance failures signals intensified regulatory scrutiny of interface design choices that impede user rights or research access.
ACCC v Microsoft: The $160M lesson in subscription transparency
On October 27, 2025, the Australian Competition and Consumer Commission filed Federal Court proceedings against Microsoft Corporation and Microsoft Australia, alleging misleading conduct affecting approximately 2.7 million subscribers.
When Microsoft integrated its Copilot AI assistant into Microsoft 365 Personal and Family plans in October 2024, subscription prices rose sharply: 45% for Personal plans (AU$109 to AU$159) and 29% for Family plans (AU$139 to AU$179). Microsoft allegedly presented subscribers with only two options: accept the price increase or cancel. However, a third option existed: "Classic" plans that retained the original features and pricing without Copilot integration, allegedly concealed within the cancellation flow.
Regulatory allegations:
The ACCC alleges Microsoft violated sections 18 and 29 of the Australian Consumer Law through misleading representations about pricing necessity, service requirements, and available options. "Following a detailed investigation, we will allege in Court that Microsoft deliberately omitted reference to the Classic plans in its communications and concealed their existence until after subscribers initiated the cancellation process to increase the number of consumers on more expensive Copilot-integrated plans," ACCC Chair Gina Cass-Gottlieb stated. The Classic plans were only revealed after subscribers initiated cancellation, constituting what consumer protection experts identify as deceptive choice architecture.
Compliance implications:
This case demonstrates increasing regulatory scrutiny of subscription practices, hidden options, and dark patterns in digital interfaces. Organizations face substantial penalties for non-transparent pricing communications: under the Australian Consumer Law, the maximum penalty per breach is the greater of AU$50 million, three times the benefit obtained, or 30% of adjusted turnover during the breach period. Modern compliance requires proactive detection of potentially misleading user flows before regulatory enforcement.
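The choice-architecture problem alleged here lends itself to automated auditing. The following Python sketch is purely illustrative (the screen names and data model are hypothetical, not Microsoft's actual flow): it flags any plan option that surfaces only after a user initiates cancellation, the pattern at the heart of the ACCC's allegations.

```python
from dataclasses import dataclass

@dataclass
class Screen:
    """One step of a subscription user flow and the plans it displays."""
    name: str
    plans_shown: set
    in_cancellation_flow: bool = False

def hidden_plan_options(screens: list) -> set:
    """Return plans that appear ONLY inside the cancellation flow.

    A plan surfaced exclusively after cancellation is initiated is a
    red flag for the deceptive choice architecture alleged in this case.
    """
    upfront = set().union(*(s.plans_shown for s in screens
                            if not s.in_cancellation_flow))
    cancel_only = set().union(*(s.plans_shown for s in screens
                                if s.in_cancellation_flow))
    return cancel_only - upfront

# Hypothetical, simplified model of the alleged flow
flow = [
    Screen("renewal_notice", {"Personal + Copilot", "Family + Copilot"}),
    Screen("cancel_step_2", {"Personal Classic"}, in_cancellation_flow=True),
]
print(hidden_plan_options(flow))  # {'Personal Classic'}
```

A real audit would walk actual UI states rather than a hand-built list, but the invariant is the same: every purchasable option should be discoverable before, not only during, cancellation.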
Arcep proposes binding dark pattern regulations in EU Digital Fairness Act
In October 2025, France's communications regulator Arcep submitted comprehensive recommendations to the European Commission's Digital Fairness Act consultation, establishing unprecedented links between dark patterns, consumer protection, and environmental sustainability in digital services.
Combating addictive design through mandatory controls
Arcep proposed legally binding requirements targeting manipulative interface designs that harm user wellbeing and drive overconsumption. Key measures include disabling autoplay by default for all video, audio, and animated content; regulating infinite scrolling mechanisms and automatic content preloading; and requiring video platforms to offer listening-only modes. These proposals draw from France's General Policy Framework for the Eco-design of Digital Services (RGESN), demonstrating how attention economy regulation directly reduces environmental impact.
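If adopted, requirements like these would be straightforward to express as a machine-checkable baseline. The sketch below is illustrative only; the setting names are hypothetical, and the defaults merely paraphrase Arcep's proposals rather than quote any binding text.

```python
# Hypothetical defaults paraphrasing Arcep's proposed requirements
ARCEP_STYLE_DEFAULTS = {
    "autoplay": False,          # autoplay disabled by default for all media
    "infinite_scroll": False,   # no unbounded feeds without explicit opt-in
    "preload_next": False,      # no automatic content preloading
    "audio_only_mode": True,    # a listening-only mode must be offered
}

def non_compliant_settings(config: dict) -> list:
    """List settings whose values diverge from the required defaults."""
    return [name for name, required in ARCEP_STYLE_DEFAULTS.items()
            if config.get(name) != required]

# A hypothetical player configuration audited against the baseline
player = {"autoplay": True, "infinite_scroll": False,
          "preload_next": True, "audio_only_mode": False}
print(non_compliant_settings(player))  # ['autoplay', 'preload_next', 'audio_only_mode']
```

In practice such a check would run against shipped product configuration in CI, so that a regression toward attention-maximizing defaults is caught before release.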
Environmental sustainability as consumer protection
Arcep reframed digital sustainability as a consumer welfare issue, proposing eco-design mandates including 7-10 year operating system compatibility requirements to combat software obsolescence, energy-efficient codec requirements for video streaming, and device-adaptive video quality settings. The regulator called for expanded public authority powers to collect environmental data and establish a public EU database tracking the environmental footprint of digital services—empowering consumers to make informed choices.
Preserving internet openness in the AI era
Addressing emerging risks from generative AI, Arcep warned that AI systems delivering single-answer responses may reduce content diversity and user agency compared to traditional search. The regulator is conducting an assessment of AI's impact on internet openness and called for transparency requirements, fair access to AI resources for market entrants, and robust application of the Data Act and Data Governance Act.
Compliance implications: Arcep's proposals signal a regulatory shift treating dark patterns, environmental impact, and market concentration as interconnected consumer protection issues. Organizations should prepare for mandatory eco-design standards and AI transparency requirements as core compliance obligations under future EU digital fairness legislation.
India's ASCI targets dark patterns and AI in digital advertising oversight
As part of its 40th-anniversary initiatives in 2025, the Advertising Standards Council of India (ASCI) outlined its strategic roadmap to strengthen digital advertising oversight under Chairman Sudhanshu Vats. Key priorities include combating dark patterns in gaming interfaces, addressing AI-generated deepfake content in advertising, and enforcing influencer marketing compliance. ASCI is developing an "AI for ASCI" monitoring system incorporating proprietary technology to detect and flag problematic content at scale. The council is expanding its ASCI Academy to educate corporates, agencies, and practitioners on ethical advertising standards. This initiative positions India as a proactive market in regulating deceptive design practices, AI-generated content, and influencer transparency—critical compliance areas for brands operating in digital ecosystems.
References:
- https://ec.europa.eu/commission/presscorner/detail/en/ip_25_2503
- https://www.accc.gov.au/media-release/microsoft-in-court-for-allegedly-misleading-millions-of-australians-over-microsoft-365-subscriptions
- https://www.arcep.fr/uploads/tx_gspublication/Arcep-contribution-EC-public-consultation-DFA_oct2025.pdf
- https://www.financialexpress.com/business/brandwagon-ai-influencers-and-dark-patterns-ascis-roadmap-for-the-next-phase-of-advertising-oversight-4019933/

