
The EU Digital Fairness Act: The Law That Will Ban Dark Patterns Across All Digital Services

Updated 3 April 2026 · 10 min read · 2,400 words

The EU Digital Services Act was a landmark. But it only applies to online platforms. The Unfair Commercial Practices Directive dates from 2005 — before smartphones, before app stores, before dark patterns had a name. The European Commission knows there is a gap. The Digital Fairness Act is how it plans to close it: a single legislative instrument that will ban dark patterns across all digital services, not just platforms.

The Digital Fairness Act (DFA) is the EU's next major consumer protection initiative. The European Commission's public consultation closed in October 2025. A fitness check report followed in Q4 2025. A legislative proposal is expected in late 2026. And the scope is enormous: every business-to-consumer digital service operating in the EU, from SaaS applications and mobile games to streaming platforms and connected devices.

If you thought the DSA's dark pattern rules were significant, the DFA will go further — much further.

Why the DFA Exists: The Gaps in Current Law

The EU does not lack digital regulation. It has the DSA, the GDPR, the Unfair Commercial Practices Directive, and the Consumer Rights Directive. So why does it need another law? Because each of these instruments was designed for a different era or a different purpose, and none of them covers the full landscape of digital manipulation.

DSA Article 25: Platforms Only

The Digital Services Act's Article 25 is the EU's strongest existing dark pattern prohibition. But its scope is limited to “online platforms” — intermediaries that host and disseminate user-generated content. That definition captures social media networks, marketplaces, and app stores. It does not capture SaaS products, direct e-commerce sellers, streaming services, mobile utility apps, online booking platforms, or connected devices. A company selling directly to consumers through its own website, without hosting third-party content, falls outside Article 25 entirely.

UCPD: Drafted Before the Smartphone

The Unfair Commercial Practices Directive (2005/29/EC) is the EU's primary consumer protection framework. It prohibits misleading and aggressive commercial practices. But it was drafted in 2005 — before the iPhone, before app stores, before algorithmic personalisation, before the term “dark pattern” existed. The UCPD's concepts of “misleading actions” and “aggressive practices” can be stretched to cover some digital manipulation, but they were not designed for it. Regulators and courts have struggled to apply 2005-era definitions to 2025-era interface design techniques. There is no explicit prohibition of dark patterns, no taxonomy of digital manipulation, and no framework for addressing addictive design, personalised exploitation, or algorithmic nudging.

Consumer Rights Directive: Pre-Digital Economy

The Consumer Rights Directive (2011/83/EU) covers pre-contract information requirements and cancellation rights. It was a step forward when adopted, but it was designed for an economy where the primary digital transaction was ordering a physical product online. It does not address subscription traps as they exist today, nor does it account for in-app purchases, virtual currencies, loot boxes, or the gamification mechanics that drive modern digital spending.

GDPR: Data Consent, Not Commercial Manipulation

The GDPR addresses dark patterns in the context of data processing consent — deceptive cookie banners, manipulative privacy settings, pre-ticked consent boxes. The European Data Protection Board has published detailed guidance on dark patterns in consent interfaces. But the GDPR's scope is data protection, not commercial fairness. A dark pattern that manipulates a user into purchasing a more expensive subscription or spending money on loot boxes does not engage GDPR unless personal data processing is involved.

The DFA Fills the Gap

The Digital Fairness Act is designed to be the single instrument that closes all of these gaps. One law covering dark patterns across all B2C digital services. One framework addressing not just deceptive design but also addictive design, personalised manipulation, and algorithmic exploitation. The European Commission's fitness check announcement confirmed what regulators, consumer organisations, and researchers had been saying for years: the existing framework is not fit for purpose in the digital age.

What the Digital Fairness Act Will Ban

Based on the Commission's consultation document and fitness check results, the DFA is expected to target several categories of digital manipulation that current law does not adequately address.

Dark Patterns Across All Digital Services

The core prohibition: deceptive and manipulative interface design will be banned across every B2C digital service, not just platforms. This includes the same categories the DSA already prohibits for platforms — overloading, skipping, stirring, obstructing, fickle design, and hidden information — but extended to every digital touchpoint where a business interacts with a consumer.

Addictive Design Techniques

The DFA is expected to address design features specifically engineered to maximise engagement and screen time: infinite scroll, autoplay, push notifications designed to trigger compulsive checking, streak mechanics that penalise users for not returning daily, and variable reward schedules borrowed from slot machine design. These techniques are particularly harmful to minors but affect all users. Current law has no framework for addressing them — the DSA touches on this for platforms, but the DFA will create explicit obligations across all digital services.

Personalised Manipulation

The DFA is expected to prohibit the use of personal data to exploit individual vulnerabilities — showing higher prices to users identified as less price-sensitive, targeting urgency cues at users whose behaviour suggests impulsive purchasing tendencies, or adapting dark pattern intensity to individual user profiles. The GDPR regulates the collection and processing of the data used for personalisation, but it does not regulate the commercial manipulation that personalisation enables. The DFA will.

Subscription Traps

Building on the DSA's provisions and national laws (including the UK's DMCCA subscription regime), the DFA is expected to establish EU-wide rules on subscription transparency, cancellation parity, and renewal notices. The aim is consistency: rather than 27 member states developing different subscription rules, the DFA will create a single standard.
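
"Cancellation parity" is easiest to reason about as a step-count comparison: exiting a subscription should take no more effort than entering it. A minimal sketch with hypothetical flow definitions (the step lists are invented for illustration):

```python
# Hypothetical flows: each entry is one user-facing step.
SIGNUP_STEPS = ["enter email", "choose plan", "confirm payment"]
CANCEL_STEPS = ["open settings", "find subscriptions page", "click cancel",
                "dismiss retention offer", "confirm by email link"]

def parity_gap(signup: list, cancel: list) -> int:
    """Positive gap = cancelling is harder than signing up, the
    asymmetry that cancellation-parity rules are meant to remove."""
    return len(cancel) - len(signup)

print(parity_gap(SIGNUP_STEPS, CANCEL_STEPS))  # 2
```

A step count is of course a crude proxy — a single "phone us to cancel" step can be worse than five clicks — but it captures the basic asymmetry regulators object to.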

Influencer Marketing Transparency

Clearer rules on commercial content disclosure. The current patchwork of national advertising standards and UCPD-based enforcement has proven inadequate for the scale and speed of influencer marketing. The DFA is expected to establish explicit labelling requirements for paid partnerships, gifted products, and affiliate relationships across all digital services — not just social media platforms.

Virtual Currencies and Loot Boxes

The DFA is expected to address the use of virtual currencies in games and apps to obscure real-money pricing. Loot boxes — randomised reward mechanisms that function like gambling — are a particular target. The practice of converting real money into in-game tokens, then pricing items in tokens rather than currency, makes it deliberately difficult for users to understand what they are actually spending. Several member states have already legislated on loot boxes individually; the DFA would create a harmonised EU-wide framework.
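
The obscuring effect is straightforward to quantify. A minimal sketch, with invented bundle prices, of how token pricing decouples an item's sticker price from its real-money cost:

```python
# Illustrative token bundles for a hypothetical game: (tokens, price in EUR).
BUNDLES = [(500, 4.99), (1200, 9.99), (2600, 19.99)]

def real_cost(item_tokens: int) -> float:
    """Cheapest real-money outlay that covers an item priced in tokens,
    assuming a single bundle purchase and no existing balance."""
    eligible = [price for tokens, price in BUNDLES if tokens >= item_tokens]
    return min(eligible) if eligible else float("nan")

# A "950 token" cosmetic actually costs 9.99 EUR and strands 250 tokens,
# a leftover balance that nudges the user toward the next purchase.
print(real_cost(950))  # 9.99
```

The point of the sketch: the number the user sees (950) and the number they pay (9.99) are linked only through bundle tiers the user must work out for themselves — which is precisely the transparency gap the DFA is expected to address.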

“Pay or Consent” Models

At issue is the binary choice between paying for a service and consenting to invasive tracking and advertising. Meta's implementation of this model in the EU is already under DSA investigation. The DFA is expected to establish clearer rules on when "pay or consent" models are acceptable, what constitutes a genuine choice, and what safeguards must be in place. This intersects with GDPR consent requirements but goes beyond data protection into commercial fairness.

From Platforms to All B2C Digital Services

The scope expansion from the DSA to the DFA cannot be overstated. It is the single most significant change the DFA introduces.

The DSA's Article 25 applies only to “online platforms” — intermediaries that host and disseminate user-generated content. That is a specific and relatively narrow definition. The DFA will apply to all business-to-consumer digital services. That includes:

  • SaaS applications — project management tools, CRM systems, accounting software, any cloud-based software product
  • Mobile apps — games, utilities, productivity apps, fitness trackers, weather apps
  • Streaming services — video, music, podcast platforms selling directly to consumers
  • E-commerce — including direct sellers, not just marketplaces. A brand selling through its own website will be covered
  • Digital subscriptions — news outlets, magazines, educational platforms, any recurring digital service
  • Online booking services — travel, hospitality, events, appointments
  • Connected devices and IoT interfaces — smart home devices, wearables, connected appliances with user-facing interfaces

This is a massive expansion. Millions of businesses that are currently outside the DSA's dark pattern rules will fall within the DFA's scope. A small SaaS company with no user-generated content is not an “online platform” under the DSA. Under the DFA, it will be a regulated digital service. A direct-to-consumer e-commerce brand is not hosting third-party content. Under the DFA, its purchasing interface will be subject to dark pattern prohibitions.

For businesses already navigating the Australian Consumer Law or UK DMCCA dark pattern provisions, the pattern is clear: every major jurisdiction is moving toward comprehensive dark pattern regulation. The DFA represents the EU's contribution to what is becoming a global standard.

What the Consultation Revealed

The European Commission ran a public consultation from February to October 2025, alongside a comprehensive fitness check of existing EU consumer law. The results confirmed what regulators already suspected — and provided the evidence base for legislative action.

97% of Websites and Apps Use Dark Patterns

The fitness check found that 97% of websites and apps examined used at least one dark pattern. This is not a marginal problem. It is the default state of digital commerce. The most common patterns identified were hidden information (burying important terms and conditions), pre-selected choices (defaulting to the most commercially advantageous option for the business), nagging (repeated prompts to take an action the user has declined), and false urgency (countdown timers and limited-stock warnings with no basis in reality).
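
Two of the most common patterns above leave detectable traces in page markup. As an illustration, here is a minimal heuristic sketch using Python's standard-library `html.parser` (the `DarkPatternScanner` name and the class-name signals are invented for illustration; this is not how the Commission's sweep, or any particular scanner, actually works):

```python
from html.parser import HTMLParser

class DarkPatternScanner(HTMLParser):
    """Heuristic scan for two of the patterns the sweep found most often:
    pre-selected choices and false urgency. Signals are illustrative only."""

    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # Pre-selected choice: a checkbox rendered already ticked.
        if tag == "input" and a.get("type") == "checkbox" and "checked" in a:
            self.findings.append(("pre_ticked_checkbox", a.get("name", "?")))
        # False urgency: class names hinting at a countdown timer.
        classes = a.get("class") or ""
        if any(hint in classes for hint in ("countdown", "urgency")):
            self.findings.append(("possible_false_urgency", classes))

scanner = DarkPatternScanner()
scanner.feed('<form><input type="checkbox" name="marketing" checked>'
             '<div class="countdown-timer">Offer ends in 04:59</div></form>')
print(scanner.findings)
# [('pre_ticked_checkbox', 'marketing'),
#  ('possible_false_urgency', 'countdown-timer')]
```

Markup heuristics like these can only flag candidates — whether a countdown reflects a real deadline, or a default genuinely serves the user, still requires human judgment.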

Consumer Organisations vs Industry Groups

The consultation revealed a predictable but instructive divide. Consumer organisations strongly supported explicit dark pattern prohibition, arguing that existing law was too vague, too slow to enforce, and too easy for businesses to circumvent through novel design techniques. They called for prescriptive rules with clear definitions and meaningful penalties.

Industry groups, by contrast, argued that existing law — particularly the UCPD — was sufficient if properly enforced. They warned against prescriptive regulation that might stifle innovation or create compliance burdens for smaller businesses. However, the Commission's own enforcement data undercut this argument: if existing law were sufficient, 97% of digital services would not be using dark patterns.

The Fitness Check Conclusion

The fitness check confirmed that the UCPD, the Consumer Rights Directive, and the Price Indication Directive all need updating for the digital age. The existing framework was designed for a pre-digital economy and does not adequately address dark patterns, addictive design, personalised manipulation, or the specific challenges of digital services. The DFA is the Commission's response to that conclusion.

Timeline: When Will the DFA Become Law?

EU legislative processes are lengthy. The DFA will not appear overnight. But the trajectory is clear and the milestones are already being met.

  • January 2022: European Commission announces a “fitness check” of EU consumer law for the digital age
  • February – October 2025: Public consultation period — stakeholders submit evidence and positions
  • Q4 2025: Fitness check report published, confirming the need for legislative action
  • Q3 – Q4 2026: Legislative proposal expected from the European Commission
  • 2027 – 2028: European Parliament and Council of the EU negotiate the text (trilogue process)
  • ~2028 – 2029: Adoption of the final text
  • ~2029 – 2030: Implementation by member states (typical 24-month transposition period for directives)

This timeline could shift. EU legislative priorities change with political cycles, and the European Parliament elections in 2029 could accelerate or delay the process. But the direction is locked in. The Commission has invested years of groundwork, conducted the fitness check, run the consultation, and built the evidence base. The DFA is not a speculative possibility — it is an active legislative programme with institutional momentum.

For context on how the DFA's timeline and penalty structure compare to other jurisdictions, see our penalties comparison across the UK, EU, and Australia.

Why You Should Prepare Now — Not in 2030

A 2029 or 2030 implementation date sounds distant. It is not. The businesses that will be best positioned when the DFA takes effect are the ones that start preparing now. There are several compelling reasons not to wait.

The Direction Is Clear

The consultation themes tell you exactly what the DFA will target. You do not need the final legislative text to start auditing your digital services against these categories. Review your interfaces for dark patterns. Check your subscription flows for trap patterns. Assess your personalisation systems for manipulation potential. Examine your addictive design features. Review your in-app purchase and virtual currency transparency. The categories are known. The prohibitions are coming. Waiting for the final text before you prepare is like waiting for the exam paper before you study.

Many DFA Requirements Will Align with Existing Obligations

Companies that are already compliant with the DSA's dark pattern rules will have a significant head start on DFA compliance. The DFA is expected to build on and extend the DSA's framework, not replace it. Similarly, GDPR-compliant consent mechanisms, fair subscription practices, and transparent pricing already align with the direction the DFA is heading. The gap between “DSA-compliant” and “DFA-compliant” will be smaller than the gap between “non-compliant” and “DFA-compliant.”

Enforcement of Existing Law Is Intensifying

You do not need the DFA to face regulatory action for dark patterns. The DSA is being actively enforced. The UCPD still applies. GDPR enforcement against deceptive consent interfaces continues. The UK's CMA is investigating businesses under the DMCCA right now. Australia's ACCC is pursuing dark pattern cases under Australian Consumer Law. Preparing for the DFA means preparing for the regulatory environment that already exists — just with broader scope.

Audit Against Consultation Themes Now

A practical starting point is to audit your digital services against the DFA consultation themes:

  • Dark pattern categories: Review every user-facing flow for deceptive and manipulative design elements
  • Subscription transparency: Check that sign-up, renewal, and cancellation processes are balanced and clear
  • Personalisation: Assess whether your personalisation systems could be characterised as exploiting individual vulnerabilities
  • Addictive design: Evaluate features designed to maximise engagement — infinite scroll, autoplay, notification design, streak mechanics
  • Virtual currencies: If your service uses in-app currencies, assess whether real-money pricing is transparent
  • Influencer and commercial content: Review disclosure practices for paid partnerships and affiliate content

How TrustScan Helps

TrustScan detects the dark pattern categories the DFA will target. Our scanner analyses your website's interface design against the same manipulation techniques the European Commission identified in its consultation — deceptive urgency, hidden information, misleading default settings, obstruction patterns, and more.

Run a free TrustScan audit to establish a baseline. Identify which dark pattern categories appear in your digital services today, so you can address them before the DFA makes them explicitly illegal. Whether you are a platform already subject to the DSA or a direct-to-consumer business that will first encounter dark pattern regulation through the DFA, knowing your current exposure is the first step toward compliance.

Common Questions

When will the Digital Fairness Act become law?

The European Commission is expected to publish a legislative proposal in Q3 or Q4 of 2026. After that, the proposal must go through the European Parliament and Council of the EU for negotiation (the “trilogue” process), which typically takes 12 to 24 months. Adoption is expected around 2028 or 2029, with member state implementation following over a further 24-month transposition period. Realistically, the DFA is unlikely to be fully enforceable before 2030. However, the consultation themes signal exactly what the DFA will prohibit, and existing laws — the DSA, UCPD, and GDPR — already cover many of the same practices and are being actively enforced today.

Does the DFA replace the DSA?

No. The DFA is designed to complement the DSA, not replace it. The DSA will continue to apply to online platforms with its full set of obligations — dark pattern prohibitions under Article 25, transparency requirements, systemic risk assessments for VLOPs, and more. The DFA extends dark pattern regulation beyond platforms to all B2C digital services. Think of it as closing the gap the DSA intentionally left: the DSA regulates platforms; the DFA will regulate everything else. A business that is both a platform (DSA) and a digital service (DFA) will be subject to both instruments. The Commission has indicated that the DFA will be designed to avoid duplication and ensure coherence with the DSA framework.

Will the DFA apply to non-EU companies?

Almost certainly yes. The DFA is expected to follow the same territorial scope as the DSA and GDPR: it will apply to any business that offers digital services to consumers in the EU, regardless of where the business is established. A US SaaS company with EU customers, an Australian e-commerce brand shipping to Europe, or a UK-based streaming service available in EU member states would all fall within scope. Non-EU businesses will likely be required to appoint a legal representative in the EU, as is already required under the DSA. The extraterritorial reach of EU digital regulation is well-established and the DFA is not expected to deviate from this approach.

What is the expected penalty framework for the DFA?

The DFA's penalty framework has not been finalised, but the Commission's consultation materials and the precedent set by other recent EU legislation provide strong indicators. The DSA allows fines of up to 6% of worldwide annual turnover. The GDPR allows up to 4%. The DFA is likely to adopt a penalty ceiling in a similar range — potentially matching the DSA's 6% maximum — with the possibility of periodic penalty payments for continuing non-compliance. The Commission has consistently signalled that penalties must be “effective, proportionate, and dissuasive,” which in practice means turnover-based fines large enough to deter even the biggest companies. For a detailed comparison of penalty frameworks across the EU, UK, and Australia, see our penalties comparison guide.

Don't wait for enforcement to find you

Run a free TrustScan compliance check and get an ACL-mapped report of your website's dark pattern risk in minutes.

Scan Your Website Free