EU Digital Services Act: The Dark Pattern Rules That Already Apply to Every Online Platform
The EU Digital Services Act isn't coming. It's already here, already being enforced, and already producing fines. Article 25 of the DSA explicitly bans dark patterns on online platforms operating in the European Union, and the European Commission is treating it as a priority. In July 2025, the Commission fined X (formerly Twitter) approximately €550 million — the first major penalty under the DSA's dark pattern provisions. TikTok, AliExpress, Meta, Temu, and Shein are all under formal investigation.
If your platform serves EU users — regardless of where your company is headquartered — the Digital Services Act (Regulation (EU) 2022/2065) applies to you. And unlike many regulatory frameworks, Article 25 spells out exactly what it prohibits: deceptive and manipulative interface design.
Article 25: The EU's Explicit Dark Pattern Ban
Article 25 of the DSA is one of the most direct pieces of dark pattern legislation ever enacted. It doesn't rely on broad consumer protection concepts or require regulators to stretch existing definitions. It targets interface design specifically and by name.
Article 25(1) states: “Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.”
That single sentence covers three distinct categories of harm:
- Deception — interfaces that create false impressions or mislead users about the nature, purpose, or consequences of their actions
- Manipulation — interfaces that exploit cognitive biases, emotional vulnerabilities, or behavioural patterns to steer users toward choices they wouldn't otherwise make
- Material distortion — interfaces that impair users' ability to make free and informed decisions, even without technically deceiving them
Three features make this provision especially powerful. First, it covers the full interface lifecycle — design, organisation, and operation. You cannot defend a manipulative pattern by saying it wasn't “designed” that way if it operates that way in practice. Second, it applies to all online platforms, not just the largest ones. Third, the test is user impact, not platform intent. If the interface materially distorts a user's ability to make free decisions, it violates Article 25 regardless of whether the distortion was deliberate.
Article 25(3) empowers the Commission to issue guidelines on how the prohibition applies to specific practices, naming three by example: giving more prominence to certain choices, repeatedly requesting a choice the user has already made, and making it harder to terminate a service than to subscribe to it. In practice, enforcement also draws on the European Data Protection Board's (EDPB) Guidelines 3/2022 on deceptive design patterns, which give platforms and regulators a shared classification framework for identifying violations.
Article 25(2) carves out practices already covered by the Unfair Commercial Practices Directive or the GDPR; those continue to be enforced under their own regimes. The carve-out does not mean a GDPR-compliant interface is in the clear. A platform cannot argue that its cookie consent banner satisfies the GDPR and is therefore beyond scrutiny: the GDPR governs the consent mechanics, while the DSA reaches the design choices the GDPR does not, including how choices are visually presented, how options are framed, and how the overall flow steers user behaviour.
This layered structure means platforms face a higher compliance bar than many initially assumed. Getting GDPR consent right is necessary but not sufficient. The DSA demands that the interface design itself be fair, transparent, and non-manipulative.
EDPB Guidelines 3/2022 Dark Pattern Taxonomy
The European Data Protection Board published Guidelines 3/2022 on deceptive design patterns, establishing a six-category taxonomy that now serves as the definitional framework for DSA Article 25 enforcement. When the European Commission investigates a platform, its technical teams map interface elements to these categories. Understanding them is essential for compliance.
1. Overloading
Bombarding users with requests, information, or options to push them toward sharing more data or consenting to broader processing than they intended. Examples include continuous prompts to complete a profile, repeated pop-ups requesting consent to additional data uses, and presenting an overwhelming number of granular choices that exhaust the user into accepting the default. The cumulative effect is decision fatigue that benefits the platform.
2. Skipping
Designing interfaces so that users skip past privacy-protective options without realising it. Pre-selecting the most data-invasive settings, defaulting to “Accept All,” or structuring flows so that the privacy-friendly choice requires extra steps while the data-invasive choice is the path of least resistance. Users who don't actively intervene end up sharing far more than they intended.
3. Stirring
Using emotional appeals or visual nudges to influence users' decisions. This includes confirm-shaming — making the opt-out option feel like a bad choice (“No thanks, I don't want to save money”) — using alarming language about what users will “miss out on,” and employing colour psychology to make the data-sharing option feel positive while the protective option feels negative or punitive.
4. Hindering
Making privacy-protective actions unreasonably difficult. Classic examples: requiring 15 clicks to disable a feature that took one click to enable, burying account deletion behind multiple customer service interactions, or requiring users to navigate labyrinthine settings menus to adjust preferences that were set by a single default during sign-up. The asymmetry between opting in and opting out is the defining characteristic.
5. Fickle
Inconsistent and unclear interface design that prevents users from understanding what controls actually do. Moving the location of privacy settings between visits, using ambiguous toggles where “on” and “off” states aren't visually distinct, or changing the interface frequently enough that users cannot build reliable mental models of how to manage their settings.
6. Left in the Dark
Hiding information, providing incomplete explanations, or using language that obscures what is actually happening. Vague privacy policies, euphemisms for tracking (“personalisation” instead of “behavioural profiling”), and failing to inform users about the consequences of their choices all fall into this category. If a user cannot understand what they are agreeing to, they cannot make a free and informed decision.
These six categories are not academic abstractions. They are the operational framework the European Commission and national Digital Services Coordinators use when assessing whether a platform's interface design violates Article 25. Every investigation begins with mapping interface elements to this taxonomy.
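For teams building that mapping into an audit process, the taxonomy reduces to a small controlled vocabulary. A minimal sketch in TypeScript: only the six category labels come from the EDPB guidelines, while the record shape and field names are illustrative.

```typescript
// Sketch: encoding the EDPB taxonomy as an audit vocabulary.
// Only the six category labels come from Guidelines 3/2022;
// the record shape and field names are illustrative.
type EdpbCategory =
  | "overloading"       // decision fatigue via repeated prompts or excess options
  | "skipping"          // invasive defaults users pass without noticing
  | "stirring"          // emotional steering, confirm-shaming, colour nudges
  | "hindering"         // asymmetric friction on privacy-protective actions
  | "fickle"            // inconsistent, unstable controls
  | "left-in-the-dark"; // hidden or obscured information

interface AuditFinding {
  flow: string;           // e.g. "cookie-consent", "account-deletion"
  element: string;        // CSS selector or screen identifier
  category: EdpbCategory;
  evidence: string;       // why the element maps to the category
}

const finding: AuditFinding = {
  flow: "cookie-consent",
  element: "#banner .accept-all",
  category: "skipping",
  evidence: "Accept All preselected; granular controls two screens deep",
};
```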
X/Twitter: The First Major DSA Dark Pattern Enforcement
In December 2024, the European Commission issued preliminary findings that X (formerly Twitter) had breached the DSA, including Article 25's dark pattern prohibition. In July 2025, the Commission followed through with a fine of approximately €550 million — the first major penalty under the DSA and a landmark moment for dark pattern enforcement globally.
The central dark pattern issue was X's blue checkmark verification system. Before Elon Musk's acquisition, the blue checkmark indicated that Twitter had independently verified a user's identity. The account was who it claimed to be. After the acquisition, X converted the blue checkmark into a paid subscription feature (X Premium). Anyone willing to pay could obtain a blue checkmark, regardless of whether their identity had been verified.
The Commission found that this constituted a deceptive design pattern. The blue checkmark retained its visual association with verified identity and trustworthiness, but it no longer indicated either. Users encountering a blue-checkmarked account were misled into believing the account had been independently authenticated when it had merely paid for a subscription. This materially distorted users' ability to assess the credibility of information on the platform — precisely the harm Article 25 was designed to prevent.
The significance of this enforcement action extends well beyond X:
- Legacy design associations matter. You cannot repurpose a trust signal for a commercial function and claim users should have understood the meaning changed.
- Dark patterns extend beyond checkout flows. The deceptive design wasn't in a purchase or subscription interface — it was embedded in the platform's fundamental information architecture.
- The Commission will act against the largest platforms. X is one of the world's most prominent platforms. The Commission fined it anyway, establishing that no platform is too large or too politically connected to be held to account.
- The €550 million figure demonstrates the penalty framework in action. This is the DSA's turnover-based enforcement applied in practice, not just in theory.
The Investigation Pipeline
X was the first major enforcement action, but it is far from the last. The European Commission has opened formal proceedings against multiple Very Large Online Platforms, with dark patterns and manipulative design featuring prominently in every case. The pace of enforcement is accelerating.
TikTok (February 2024)
The Commission opened formal proceedings against TikTok in February 2024, investigating addictive design features affecting minors. The investigation examines autoplay mechanisms, notification design, the “rabbit hole” effect of content recommendation systems, and age verification failures. The core question under Article 25 is whether TikTok's interface manipulates users — particularly children — into spending more time on the platform than they would through free, informed choice. TikTok Lite's rewards programme, which incentivised engagement through gamified rewards, was subsequently suspended in the EU.
AliExpress (March 2024)
Formal proceedings were opened against AliExpress (operated by Alibaba Group) in March 2024. The investigation targets illegal product notifications, addictive design patterns, and lack of trader traceability. The dark pattern dimension focuses on how AliExpress designs its marketplace interface to steer purchasing behaviour — urgency cues, manipulative product recommendations, and interface flows designed to impair free decision-making about what and whether to buy.
Meta (May 2024)
The Commission opened proceedings against Meta in May 2024 regarding its “pay or consent” advertising model on Facebook and Instagram. Users are presented with a binary choice: pay for an ad-free experience or consent to personalised advertising based on behavioural tracking. The Commission is examining whether this framing constitutes a dark pattern that materially distorts users' ability to make free choices about their data. The investigation also covers addictive design features targeting children on Instagram, including design elements that exploit psychological vulnerabilities in younger users.
Temu (October 2024)
Proceedings against Temu were opened in October 2024, targeting addictive design, gamification elements, and manipulative purchasing practices. Temu's interface is notable for heavy use of gamification — spinning reward wheels, countdown timers, streak-based incentives, and flash deal mechanics — which the Commission is examining as potential manipulative design under Article 25. The question is whether these features impair users' ability to make rational, informed purchasing decisions.
Shein
Shein was designated as a Very Large Online Platform in April 2024, bringing it under the Commission's direct supervision. The Commission is investigating dark patterns in product recommendations and returns processes. Shein's interface features aggressive urgency cues, gamified purchasing flows, and opaque product sourcing information — all of which place it firmly in the enforcement spotlight for Article 25 violations.
For the latest status of all proceedings, the Commission maintains a public DSA enforcement tracker.
Who Must Comply: VLOPs, VLOSEs, and Everyone Else
A common misconception is that the DSA's dark pattern ban only applies to the largest platforms. It does not. Article 25 applies to all online platforms operating in the EU. The tiered structure of the DSA determines who enforces against you and what additional obligations you face — but the core prohibition is universal.
Very Large Online Platforms (VLOPs)
Platforms with 45 million or more monthly active users in the EU (approximately 10% of the EU population) are designated as VLOPs. The European Commission has designated more than 20 VLOPs, including Google (Maps, YouTube, Shopping, Play), Meta (Facebook, Instagram), Amazon Marketplace, Apple App Store, X, TikTok, LinkedIn, Pinterest, Snapchat, Booking.com, Wikipedia, AliExpress, Zalando, Temu, and Shein.
VLOPs face the full weight of DSA obligations beyond Article 25:
- Annual systemic risk assessments — including risks arising from manipulative interface design
- Independent audits — annual third-party compliance audits by organisations meeting strict independence criteria
- Data access for researchers — vetted researchers can access platform data to study dark patterns and systemic risks
- Enhanced transparency reporting — detailed public reports on content moderation, recommender systems, and advertising
- Direct European Commission enforcement — the Commission itself investigates and sanctions VLOPs
Very Large Online Search Engines (VLOSEs)
Search engines meeting the same 45 million monthly active user threshold are designated as VLOSEs and face identical additional obligations. Google Search and Bing have been designated as VLOSEs. They are subject to the same systemic risk assessments, independent audits, and direct Commission enforcement as VLOPs.
Regular Online Platforms
Platforms below the 45 million user threshold are still bound by Article 25's dark pattern prohibition. The key difference is enforcement: regular platforms are supervised by national Digital Services Coordinators (DSCs) in each EU member state rather than the European Commission directly. Each member state has appointed a DSC — typically the existing media regulator, telecommunications authority, or a newly created body.
A SaaS product with 50,000 EU users faces the same Article 25 dark pattern prohibition as Google. The enforcement body is different, but the legal standard is identical. National DSCs have the power to investigate, issue compliance orders, and impose penalties within their jurisdictions.
Penalty Framework
The DSA's penalty structure is more aggressive than GDPR — and that is by design.
| Penalty Type | Maximum | Notes |
|---|---|---|
| Non-compliance fine | Up to 6% of worldwide annual turnover | Imposed by the European Commission for VLOPs/VLOSEs; by national DSCs for regular platforms |
| Periodic penalty payment | Up to 5% of average daily worldwide turnover | Accrues for each day of continuing non-compliance |
| Incorrect or misleading information | Up to 1% of worldwide annual turnover | Applies to failing to respond to information requests or supplying false data |
For context: GDPR's maximum fine is 4% of worldwide annual turnover. The DSA ceiling of 6% is 50% higher. For a company like Meta, with approximately €135 billion in annual revenue, the theoretical maximum DSA fine exceeds €8 billion. For a mid-size SaaS company with €50 million in revenue, the maximum is €3 million — significant enough to demand board-level attention.
Periodic penalties are particularly punitive. At 5% of average daily turnover, a company earning €100 million annually faces daily penalties of approximately €13,700 for each day of continuing non-compliance. That compounds rapidly.
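The arithmetic behind those figures is simple enough to sanity-check. A minimal sketch with illustrative numbers; actual fines are set case by case within the DSA's ceilings, not computed by formula.

```typescript
// Illustrative only: the DSA sets ceilings, regulators set actual amounts.
const annualTurnoverEur = 100_000_000;             // example: €100M turnover

const maxFine = annualTurnoverEur * 0.06;          // 6% ceiling = €6,000,000
const avgDailyTurnover = annualTurnoverEur / 365;  // ≈ €273,973
const maxDailyPenalty = avgDailyTurnover * 0.05;   // 5% ceiling ≈ €13,699/day

// 90 days of continuing non-compliance at the daily ceiling:
console.log(Math.round(maxDailyPenalty * 90));     // ≈ €1,232,877
```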
The Commission can also order interim measures — requiring a platform to change its interface design immediately, before the investigation concludes — and can accept commitments from platforms, making voluntary compliance changes legally binding. X's €550 million fine demonstrates that this framework produces real, material consequences. The penalty represented the DSA's turnover-based approach in action.
How This Affects Non-EU Businesses
The DSA applies to platforms offering services to EU users, regardless of where the company is established. A US, UK, or Australian company operating an online platform that is accessible to EU users is subject to Article 25.
The definition of “online platform” under the DSA is broad: an intermediary service that stores and disseminates information to the public at the request of the recipient of the service. This covers marketplaces, social media platforms, app stores, review sites, booking platforms, and many SaaS products that host user-generated content.
Non-EU platforms must:
- Designate a legal representative in one of the EU member states where they offer services. Failure to appoint a legal representative is itself a DSA breach.
- Comply with Article 25 in the design of all interfaces accessible to EU users. This may require EU-specific interface variants if the global interface contains patterns that violate the DSA.
- Respond to enforcement actions from the relevant authority — the European Commission for VLOPs/VLOSEs, or the national DSC in the member state where the legal representative is established.
- Submit to transparency and reporting obligations applicable to their size tier.
The Commission has already demonstrated its willingness to enforce against non-EU companies. X is US-headquartered. AliExpress is Chinese-owned. Temu and Shein are Chinese-founded. Geography is not a defence.
If your platform also serves UK users, you face a parallel regime under the UK's Digital Markets, Competition and Consumers Act 2024 (DMCCA), which gives the CMA direct fining powers of up to 10% of worldwide turnover. The two regimes are independent — compliance with one does not guarantee compliance with the other, though the underlying dark pattern categories are largely the same.
5-Step DSA Dark Pattern Compliance Checklist
Step 1: Determine Your Obligation Tier
Identify whether your service qualifies as an “online platform” under the DSA. If you store and disseminate user-generated content to the public, you almost certainly qualify. Count your monthly active EU users to determine whether you fall into the VLOP/VLOSE category (45 million+) or the regular platform tier. This determines which authority supervises you and what additional obligations apply beyond Article 25.
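The threshold logic itself is trivial; the hard part is counting monthly active EU users honestly. A minimal sketch of the classification, assuming a hypothetical helper function. Formal VLOP/VLOSE designation is a Commission decision based on the numbers platforms report, not a self-assessment.

```typescript
// Hypothetical tier check. The 45M threshold comes from the DSA;
// the function shape is illustrative.
const VLOP_THRESHOLD = 45_000_000; // monthly active EU users (~10% of EU population)

type DsaTier = "vlop-or-vlose" | "regular-platform" | "not-an-online-platform";

function dsaTier(storesAndDisseminatesToPublic: boolean, monthlyActiveEuUsers: number): DsaTier {
  if (!storesAndDisseminatesToPublic) return "not-an-online-platform";
  return monthlyActiveEuUsers >= VLOP_THRESHOLD ? "vlop-or-vlose" : "regular-platform";
}

console.log(dsaTier(true, 50_000)); // "regular-platform": still bound by Article 25
```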
Step 2: Audit Every Interface Against the EDPB Taxonomy
Walk through every user-facing flow — registration, onboarding, consent, purchasing, content interaction, account management, subscription lifecycle, and account deletion. Map each design element against the EDPB's six categories: Overloading, Skipping, Stirring, Hindering, Fickle, and Left in the Dark. If an element falls into any category, it needs to be redesigned or documented with a clear justification for why it does not impair user decision-making, as in the sketch below.
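One way to keep the walkthrough systematic is a flow-by-category coverage matrix, so no combination is silently skipped. A sketch, using the flow names above; the verdict labels are illustrative.

```typescript
// Sketch: a flow-by-category coverage matrix so no combination is skipped.
const flows = [
  "registration", "onboarding", "consent", "purchasing",
  "content-interaction", "account-management",
  "subscription-lifecycle", "account-deletion",
] as const;

const categories = [
  "overloading", "skipping", "stirring",
  "hindering", "fickle", "left-in-the-dark",
] as const;

// "justified" means kept as-is with a documented rationale (see Step 4).
type Verdict = "unreviewed" | "clear" | "redesign" | "justified";

const matrix = new Map<string, Verdict>();
for (const flow of flows)
  for (const category of categories) matrix.set(`${flow}/${category}`, "unreviewed");

console.log(matrix.size); // 48 cells: 8 flows x 6 categories
```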
Step 3: Equalise Accept/Reject Prominence
Review every interface where users make a choice — cookie consent, data sharing, subscription upgrades, notification permissions, marketing opt-ins. Ensure the options to accept and reject have equal visual prominence: same button size, same colour weight, same number of clicks. Asymmetric prominence is one of the most common and most easily detected Article 25 violations.
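Size and styling parity can be checked mechanically in the browser. A rough sketch follows: the selectors and the 10% tolerance are assumptions, not a regulatory standard, and click-count parity has to be measured at the flow level rather than per screen.

```typescript
// Rough browser-side parity check for one consent dialog.
function equalProminence(acceptSel: string, rejectSel: string): boolean {
  const accept = document.querySelector<HTMLElement>(acceptSel);
  const reject = document.querySelector<HTMLElement>(rejectSel);
  if (!accept || !reject) return false; // a missing reject option is itself a red flag

  const a = accept.getBoundingClientRect();
  const r = reject.getBoundingClientRect();
  const areaA = a.width * a.height;
  const areaR = r.width * r.height;
  const similarSize = Math.abs(areaA - areaR) <= 0.1 * Math.max(areaA, areaR);

  const styleA = getComputedStyle(accept);
  const styleR = getComputedStyle(reject);
  const similarStyle =
    styleA.fontSize === styleR.fontSize && styleA.fontWeight === styleR.fontWeight;

  return similarSize && similarStyle;
}

// Usage: equalProminence("#consent .accept-all", "#consent .reject-all")
```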
Step 4: Document Design Decisions
Maintain records of why interface elements are designed the way they are, what alternatives were considered, and how you assessed dark pattern risk. For VLOPs, independent auditors will review these decisions. For all platforms, a DSC may request this documentation during an investigation. Design documentation is your first line of defence in an enforcement action.
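The DSA does not prescribe a documentation format; what matters is being able to evidence the decision later. A minimal record shape, with illustrative field names and an example entry:

```typescript
// Illustrative record shape; the DSA prescribes no format,
// only that decisions can be evidenced later.
interface DesignDecisionRecord {
  element: string;                   // what part of the interface
  decision: string;                  // what was shipped
  alternativesConsidered: string[];
  categoriesAssessed: string[];      // EDPB categories reviewed
  justification: string;             // why this does not impair free decision-making
  reviewedBy: string;
  reviewedOn: string;                // ISO date
}

const record: DesignDecisionRecord = {
  element: "checkout countdown timer",
  decision: "Timer reflects the real stock-hold expiry; no artificial deadline",
  alternativesConsidered: ["no timer", "static low-stock label"],
  categoriesAssessed: ["stirring"],
  justification: "Deadline is real and stated; expiry releases the hold",
  reviewedBy: "design-review board",
  reviewedOn: "2025-03-01",
};
```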
Step 5: Implement Continuous Monitoring
A one-time audit is necessary but not sufficient. Dark patterns re-enter platforms through A/B tests, new feature releases, third-party components, and gradual interface evolution. Automated scanning catches patterns that manual review misses — especially those introduced by dynamically generated content, third-party widgets, or A/B test variants optimised for conversion at the expense of user clarity.
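Operationally, this usually means wiring a scan into the release pipeline so regressions block a deploy. A sketch of that gate in a Node CI context; `runScan` stands in for whichever scanner you use, and its signature here is an assumption.

```typescript
// Sketch of a CI gate: block the deploy when a scan reports findings
// not already in the reviewed baseline.
interface ScanResult {
  category: string; // e.g. an EDPB category
  element: string;  // where it was found
}

declare function runScan(url: string): Promise<ScanResult[]>;

async function ciGate(url: string, baseline: Set<string>): Promise<void> {
  const findings = await runScan(url);
  const fresh = findings.filter(f => !baseline.has(`${f.category}:${f.element}`));
  if (fresh.length > 0) {
    console.error(`${fresh.length} new dark-pattern finding(s)`, fresh);
    process.exit(1); // fail the pipeline until the findings are reviewed
  }
}
```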
How TrustScan Helps
TrustScan's AI-powered compliance scanner detects dark patterns across your website and maps findings to regulatory risk categories. The dark pattern categories TrustScan scans for — subscription traps, drip pricing, confirm-shaming, misdirection, sneaking, false urgency, forced continuity, trick questions, disguised advertising, and nagging — map directly to the EDPB's taxonomy and the specific patterns the European Commission investigates under Article 25.
Whether you are a VLOP preparing for an annual audit, a regular platform ensuring compliance before a national DSC reviews your interfaces, or a non-EU business needing to verify that your EU-facing interfaces meet Article 25 standards, automated detection provides the baseline you need.
Scan your website now to identify dark pattern risks and get a compliance baseline before a regulator does it for you.
For a detailed comparison of how DSA penalties compare to the UK DMCCA and Australian Consumer Law enforcement, see our cross-jurisdictional penalty comparison.
Common Questions
Does the DSA apply to non-EU companies?
Yes. The DSA applies to any online platform that offers services to users in the European Union, regardless of where the company is established. Non-EU platforms must appoint a legal representative in one of the EU member states where they offer services. Failure to appoint a representative is itself a DSA breach. A US, UK, or Australian SaaS company with EU users is subject to Article 25 and must comply with the dark pattern prohibition or face enforcement action from the European Commission (for VLOPs) or the relevant national Digital Services Coordinator.
What is the difference between DSA and GDPR dark pattern rules?
The DSA and GDPR address dark patterns from different angles. DSA Article 25 prohibits deceptive and manipulative interface design across the platform's entire user experience: purchasing flows, content recommendation, account management, and everything in between. The GDPR addresses dark patterns specifically in the context of data processing consent: cookie banners, privacy settings, and consent mechanisms. Article 25(2) of the DSA carves out practices already covered by the GDPR, which remain enforced by data protection authorities, but GDPR compliance on consent mechanics does not clear the rest of the interface. A single flow, such as a manipulative cookie consent journey, can attract scrutiny under both regimes through different aspects of its design, and both the European Commission/national DSCs (DSA) and national data protection authorities (GDPR) are actively pursuing enforcement.
What does “materially distort” mean under Article 25?
The concept of “material distortion” in Article 25 draws from the EU Unfair Commercial Practices Directive (2005/29/EC). It means causing — or being likely to cause — users to make decisions they would not have made if the interface had been designed neutrally. The test is whether the design appreciably impairs the user's ability to make an informed decision. Proof that every user was affected is not required — it is sufficient that the design is likely to distort the behaviour of an average user. Critically, intent to deceive is not required. If the design effect is material distortion, the platform is in breach regardless of whether the distortion was deliberate.
What is a VLOP?
A Very Large Online Platform (VLOP) is an online platform with 45 million or more monthly active users in the EU — approximately 10% of the EU's population. VLOPs face enhanced DSA obligations beyond Article 25: annual systemic risk assessments, independent audits, researcher data access, enhanced transparency reporting, and direct enforcement by the European Commission rather than national regulators. The Commission designates VLOPs based on user numbers reported by the platforms themselves. As of 2025, more than 20 platforms have been designated as VLOPs and 2 as VLOSEs (Very Large Online Search Engines — Google Search and Bing), covering all major social media, marketplace, search, and app store platforms operating in the EU.
Don't wait for enforcement to find you
Run a free TrustScan compliance check and get a report of your website's dark pattern risk, mapped to the regulatory categories above, in minutes.
Scan Your Website Free