The rise of dark patterns: does competition law make it any brighter?
Lirio Barros, Oxera Consulting LLP, Brussels, Belgium
Timo Klein, Oxera Consulting LLP, Amsterdam, and Utrecht University School of Economics, the Netherlands
Anastasia Shchepetova, Oxera Consulting LLP, Paris, France
Tim Hogg, Oxera Consulting LLP, London, UK


Abstract

Dark patterns are deceptive online interface designs that may nudge users into making decisions that are in the interest of the online business at the expense of the user. This article considers the economics behind dark patterns: what are they, what can economics teach us regarding how they work, how is digitalization changing the economics behind dark patterns and to what extent can competition and consumer protection laws solve the problems that arise?

1 Introduction

Consumer protection in the digital single market is one of the main priorities of European policymakers. On 13 November 2020, the European Commission (Commission) launched the New Consumer Agenda, which presents a vision for EU consumer policy between 2020 and 2025. In particular, the Commission aims to tackle online commercial practices that disregard the right of consumers to make an informed choice, exploit their behavioural biases or distort their decision-making processes ‒ referring specifically to concepts such as dark patterns1 and hidden advertising.2 The Commission committed to gathering evidence on emerging unfair digital commercial practices to update its guidelines and to assess whether additional action is needed at the EU level.

For a few years now, national authorities in European countries have also identified the digital space as a focus area in terms of consumer protection. For example, on 31 October 2022 the Netherlands Authority for Consumers and Markets (Autoriteit Consument en Markt (ACM)) updated its 2020 guidelines for the protection of online consumers3 and in July 2020 the Competition and Markets Authority (CMA) completed its study into the role of ‘choice architecture’ in inhibiting effective consumer decision-making on online platforms and in digital advertising.4

Similarly, in October 2021, the US Federal Trade Commission (FTC) announced plans to increase enforcement against ‘illegal dark patterns’ that mislead consumers into buying subscriptions.5 This policy statement comes shortly after the FTC announced that curbing ‘deceptive and manipulative conduct on the internet’ was to be one of its eight enforcement priority areas over the next 10 years.6

Much of the discussion around deceptive and manipulative online conduct revolves around ‘dark patterns’. We define dark patterns simply as deceptive online interface designs that are used to trick people into making decisions that are in the interests of the online business, but at the expense of the user.7 These designs may pertain to websites, but also to games or apps.

Dark patterns may be especially potent given the way that consumers interact with online choice architecture. For example, in an online setting, many people tend to multi-task between devices, apps, windows, tabs, etc. This limits their ability to concentrate on a single task for an extended period of time.8

Consumers respond to the volume of online information and the distracting qualities of online information with strategies to extract the most relevant information in the shortest amount of time. We are constantly forced to make choices about what is worth our attention, and how much attention to give.9

The following are recognized examples of dark patterns:10

  • Preselection ‒ pushing consumers to consent to certain choices (such as privacy-intrusive settings) because of unclear or burdensome preselection or usage of defaults.

  • Urgency messaging ‒ where an online retailer creates a (possibly false) sense of scarcity that urges consumers to buy immediately.

  • Hidden information or false hierarchy ‒ the use of visuals or language to steer users towards a particular choice, which can include exploitative default settings or ‘nagging’ or ‘confirmshaming’ users to opt into something.

  • ‘Drip pricing’ ‒ additional surcharges that become clear only once a consumer is about to pay for the selected product.

  • Forced registration ‒ forcing the user to register an account before they can make use of the service.

  • ‘Bait-and-switch’ ‒ where an online retailer lures users onto its website with unique, high quality or cheap products that the business knows are not actually in stock, in order to sell alternatives.

  • Hidden advertisements ‒ for example, a social media advertisement disguised as a regular social media post.

  • Difficult cancellations ‒ for example, an enticing free trial subscription that then automatically continues into a paid subscription that is complicated to cancel. This dark pattern is also known as a ‘Roach Motel’, which comes from the eponymous American cockroach bait device, with the strapline ‘Roaches check in, but they don’t check out!’.11

  • ‘Nagging’ ‒ repeatedly pressuring users to opt into something, or ‘guilting’ them into doing so, e.g. by wording the option to decline in such a way that it shames the user into accepting.

Figure 1 provides a stylized illustration of four different types of dark pattern: preselection, difficult cancellation, urgency messaging and drip pricing.

Figure 1 Illustrative dark patterns

Citation: Competition Law Journal 21, 3; 10.4337/clj.2022.03.06

Source: Oxera. Note: These are stylized examples of preselection (bottom left), difficult cancellations (bottom right), excessive urgency messaging (top left), and drip pricing (top right).

So exactly how prevalent are dark patterns? The Commission examined this in a recent study through a review of the top websites and apps at the EU level. It carried out a mystery shopping exercise, through which consumers from across the EU were recruited to assess the use of manipulative commercial practices.

The results revealed that dark patterns were widespread: 97% of the most popular websites and apps used by EU consumers deployed at least one dark pattern. Moreover, dark patterns were rarely found in isolation; more than one was often deployed simultaneously.12

The Commission’s study found that dark patterns were common not just on the websites of large, international retailers but also on the websites of smaller and national merchants. The most widespread forms of dark pattern varied with the type of website. Overall, the five most common types of dark pattern were found to be: (1) hidden information/false hierarchy, (2) preselection, (3) nagging, (4) difficult cancellations, and (5) forced registration.

Dark patterns were equally prevalent across websites and mobile apps and across EU-based and non-EU-based retailers. The Commission’s study also identified instances of data-driven personalized practices, which are considered more harmful than classic dark patterns as they target individual consumers’ specific needs and vulnerabilities.

In an effort to go beyond definitions and inform the debate, this article asks the question: ‘what do we actually know of the economics behind dark patterns?’

It first discusses how behavioural economics can help us to understand the workings of dark patterns (the ‘old’), and how digitalization has changed the costs and benefits behind their implementation and exploitation (the ‘new’). It ends with a discussion of how economics can inform ongoing policy discussions on appropriate regulatory responses, and how it can help to assess the effects of the use of dark patterns.

2 What’s old: the behavioural economics behind dark patterns

Traditional economics assumes that, as humans, we use all available information before us and process this in a purely rational way in order to make optimal decisions. However, in reality there are limits to our ability to do so. Behavioural economics increases the explanatory power of economics by providing it with more realistic psychological foundations.13

To provide a more accurate description of human problem-solving capabilities, behavioural economists coined the term ‘bounded rationality’. Since an individual’s brainpower and time are limited, humans cannot be expected to solve difficult problems ‘optimally’; rather, people adopt rules of thumb as a way of using their effort and time efficiently.14 However, such ‘heuristics’ (or even relying on pure instinct) can lead to systematic and predictable errors in situations of uncertainty.15 These errors can also be referred to as behavioural (or cognitive) biases.16

Behavioural economics tells us that ‒ because of these biases ‒ the way in which information or choices are presented can have a significant impact on the decisions that individuals make.17 This is discussed in more detail in Box 1.

Box 1 Cognitive biases

Behavioural economists have identified a wide range of biases. Some of the most relevant in the context of dark patterns are the following.

Default bias is the tendency of people to disproportionately stick with the status quo. For example, subjects in experimental studies are frequently found to stick to default choices more frequently than would be predicted by standard economics.18 The effect is visible in many important decisions ‒ for example, in the selection of health plans and retirement options.

Scarcity bias is the tendency of people to place a higher value on things that seem scarce.19 Some websites make use of this bias by displaying countdown timers or limited-time messages to create an excessive sense of urgency.

The social norm effect is the tendency of people to value something more because others seem to value it ‒ or, more generally, to simply follow the crowd.20 For example, individuals are more likely to impulse buy if they are shopping with their peers and families than if they are shopping alone.21 Websites that use excessive urgency messaging based on the activity of other users, use disguised social media advertisements or deploy ‘confirmshaming’ may rely on this effect.

Loss aversion is the tendency of people to increase the relevance of (potential) losses in their decision-making relative to corresponding gains. This discrepancy has been labelled the endowment effect, because the value that people associate with something appears to change when it is included in one’s endowment.22 The loss aversion mechanism may play a role in (for example) drip pricing or difficult cancellations traps.

Such insights from behavioural economics are important because they can be used to support benevolent policy objectives: a policymaker can ensure that ‘choice architectures’ are designed such that people are prompted to make better decisions, without impinging on their freedom to choose.

For example, healthy food options can be displayed more prominently within a canteen or supermarket in order to promote a healthier lifestyle. This is often referred to as ‘nudging’, a term which was popularized by Richard Thaler and Cass Sunstein; Thaler received the 2017 Nobel Prize in Economic Sciences for his contributions to behavioural economics.23

However, insights from behavioural economics are not only or always used for good. More recently, Thaler coined the term ‘sludge’ to refer to all activities ‘that are essentially nudging for evil’.24 For example, any type of subscription that is easy to sign up for but incredibly difficult to cancel can be considered a ‘sludge’. Governments can implement sludges ‒ for instance by making it more difficult for people to register to vote. The only practical difference between a sludge and a dark pattern therefore seems to be that a sludge is a concept that can be applied much more broadly to both offline and online practices, whereas the concept of dark patterns is more specifically applied to online practices.25

It is useful to think in terms of choice architecture that either increases or decreases the decision-making ‘friction’. A decrease in friction makes a certain option more likely to be chosen by consumers; an increase in friction does the opposite, making a certain option less likely to be chosen. Thus, dark patterns can involve either insufficient friction (leading to consumers being more likely to make poor decisions) or excessive friction (leading to consumers being less likely to make good decisions).

The above raises the question: ‘what’s new?’ Human cognitive biases are not new, nor is our understanding of them. So is this discussion around dark patterns not simply old wine in new bottles?

3 What’s new: digitalization and ‘hypernudging’

Although behavioural economics provides an established framework for understanding dark patterns, the advent of dark patterns cannot be explained with behavioural economics alone. In particular, two key developments in the digital space have made dark patterns much more pervasive: minimal costs and vast gains.26

3.1 Minimal costs

First, the costs of experimenting with different user interface designs have decreased tremendously. In a digital environment, businesses are able to experiment effectively with different website designs through A/B testing (where some users are shown a different layout to others). This allows businesses to investigate the effects of changes in the user interface at little cost. Cheap A/B testing can help webpage developers to quickly identify ways to incrementally improve the design of a webpage to the benefit of users or to exploit biases and change user behaviour for the benefit of the interface designer.
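As a rough illustration of how cheap such experimentation has become, the assignment logic behind an A/B test fits in a few lines. The sketch below uses deterministic hash-based bucketing, a common technique for splitting users between interface variants; the function and experiment names are hypothetical, and this is a minimal sketch rather than a production experimentation framework.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user into an interface variant.

    Hashing the user and experiment IDs needs no stored state, splits
    users roughly evenly, and shows the same user the same layout on
    every visit.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Roughly half of users land in each bucket, and assignment is stable.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"user-{i}", "checkout-layout")] += 1
print(counts)
```

The business then only needs to log outcomes (clicks, purchases, cancellations) per bucket to measure which design performs better ‒ hence the very low marginal cost of each experiment.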

3.2 Vast gains

Second, the scale of potential gains has vastly increased as a direct result of increased digital distribution and globalization. With many online businesses operating at a large scale, even minor changes to a user interface can have huge benefits for the business in absolute terms.

To see the relevance of both lower costs and higher gains in the digital world, consider this in contrast with a physical grocery store changing the layout of its shelves to entice customers to buy its more profitable products ‒ for example, by placing them in more prominent areas.27 It is clear that the owner can experiment with different layouts. However, implementing such an experiment offline is much less practical than it is online: it takes a significant amount of time to set up different variants of the shelves and to expose these to a sufficient number of (ideally randomized) shoppers. Moreover, once a more profitable layout has been identified, the grocery store owner is able to roll out this more profitable layout only across its own stores.

In contrast, an online business can readily implement and experiment with different webpage designs, assigning users randomly to each design and retaining those designs that show better performance. There is relatively little cost involved in changing the choice architecture ‒ and if the online business operates globally, even minor improvements can immediately be implemented at scale.

This has even raised concerns of ‘hypernudging’,28 where the online choice environment for consumers is adapted in real-time and even personalized to the needs and vulnerabilities of individual consumers based on extensive data collection on online consumer behaviour in order to stimulate more sales.29 While offline shops can display their products in only one way, online shops can condition their interfaces on customer-specific behavioural clues.30

4 What to do: competition policy and regulation

As concerns around dark patterns increase, policymakers, regulators and authorities are faced with the complex challenge of determining what role they should play. For example, can we trust the competitive process to ‘compete away’ undesirable dark patterns, or is intervention or self-regulation required? And, if so, in what form?

4.1 Why competition policy may not be enough

Deceiving users can be bad for business: the threat of customers switching to rivals may pressure firms to abstain from (at least) the most aggressive and obvious forms of dark patterns. Research has shown that users ‒ if they engage in active searching and can easily detect the use of dark patterns ‒ may punish the most aggressive forms of dark patterns if they have options from competitors available.31

However, a vast literature in behavioural industrial organization shows that dark patterns can be employed even in the absence of dominance in the market, especially if consumers do not tend to search for alternatives or are otherwise vulnerable, as firms might find it more profitable to engage in exploiting consumer biases, rather than competing in ‘debiasing’ consumers.32 Moreover, the literature shows that firms may also use more subtle dark patterns to further limit the extent to which consumers search.33 Theoretical behavioural economics research has also shown that, in the presence of consumer biases, firms may be better off increasing certain forms of behavioural exploitation ‒ such as drip pricing ‒ under increased competition.34 Nobel Prize laureates George Akerlof and Robert Shiller even claim more generally that competition can pressure firms to ‘phish for phools’ (i.e. exploit human behavioural weaknesses), because if they do not, they will be replaced by competitors that will.35

Where dark patterns are employed by a dominant firm, issues can potentially be addressed through competition law. In order to understand whether some forms of dark patterns could constitute an abuse of dominance, and therefore amount to an infringement of competition law, one must consider whether such conduct falls within an accepted category of abuse that is likely to be investigated by competition authorities or accepted by a court. Some forms of dark patterns can potentially be tackled as an exclusionary abuse. In this case the competition authority would need to show that the use of dark patterns by a dominant firm enables it to prevent competitors, in whole or in part, from profitably entering or remaining active in a given market (which, as an indirect result, will ultimately have a detrimental impact on consumers). Alternatively, competition authorities may tackle some forms of dark patterns as an exploitative abuse, as is discussed in the context of personalized pricing practices. This would require meeting a high burden of proof by showing that personalized pricing is a form of excessive or unfair pricing, in which prices are both high relative to costs and unfair in and of themselves.36 However, neither of these tools is likely to suffice in tackling emerging practices in a timely manner.

In short, neither competition law enforcement nor a healthy competitive process are likely to eliminate the risk of exploitative dark patterns, but that does not mean that competition policy does not have a role to play. Even if competition might not be a sufficient remedy to solve market inefficiencies generated by firms’ exploitation of consumer behavioural biases, competition is still likely to reduce the harm such inefficiencies cause, even if only for some groups of consumers.

4.2 Regulation and consumer protection

Consumer protection and regulation thus also have a substantial role to play. Indeed, consumer protection law more generally recognizes that markets cannot solve all forms of consumer exploitation. While competition authorities address mostly business-to-business relationships, consumer protection authorities focus on business-to-consumer (B2C) interactions, meaning that they are well-placed to look at potentially exploitative practices like dark patterns or personalized pricing.

Consumer protection law offers a range of policy tools that can have an important role in addressing existing concerns about dark patterns and unfair practices. In the EU the Unfair Commercial Practices Directive (UCPD)37 prohibits unfair B2C practices related to a commercial transaction. The Directive identifies two broad (non-exhaustive) categories of unfair practices: (1) misleading commercial practices, and (2) aggressive commercial practices. In principle, researchers and policymakers agree that the UCPD appears sufficiently broad and flexible to cover and sanction many of the potentially unfair commercial practices employed in the digital sphere, including dark patterns.38 It provides for case-by-case assessment of each practice, but also sets out a blacklist of practices that are considered unfair irrespective of the circumstances; and new digital practices were recently added to the Directive (for example, hiding sponsored/paid promotions in search results and the falsification or misrepresentation of consumer reviews or social endorsements).

In addition, the two legislative initiatives to upgrade the rules governing digital services in the EU have recently come into force: the Digital Services Act (DSA)39 and the Digital Markets Act (DMA).40 The DSA sets out a harmonized framework of rules for a safe, predictable and trusted online environment and the DMA regulates the practices of the largest gatekeeper platforms, which offer platform services such as online intermediation, social media, operating systems, cloud services and search engines. While targeted specifically at online practices, the DSA and DMA are largely complementary to the UCPD and together form a single set of new rules that will be applicable across the EU to create a safer and more open digital space.

Some potentially unfair digital practices can also be tackled by the application of privacy and data protection laws. According to the General Data Protection Regulation (GDPR),41 dark patterns may be classified as unfair processing insofar as they constitute an insufficient provision of information or a misleading display or design.

The legal assessment conducted by the Commission in its study on dark patterns reveals that, although a strong EU legal framework exists which is perceived as sufficiently flexible to cover most unfair commercial practices (i.e. the UCPD, consumer protection legislation and the DSA and DMA), some legislative adjustments are likely to be necessary to ensure that authorities can adequately respond to manipulative dark patterns and personalization.42

4.3 Enforcement

The wide availability of different policy tools allows for a more flexible approach in the treatment of emerging dark patterns but also imposes an additional challenge for authorities to decide which tool to use, e.g. whether to open a competition investigation or a consumer protection investigation when faced with a new case. This may, in most countries, require coordination between competition authorities and consumer protection authorities, particularly where competition law and consumer protection law are enforced by different agencies.

Several authorities and regulators have already undertaken actions that promote cooperation as well as better awareness and prevention. For example, the European Data Protection Board,43 the ACM44 and the French Data Protection Authority (CNIL)45 have released guidelines on dark patterns.

The ACM guidelines contain concrete cases of unfair commercial practices and discuss compliance with the UCPD. The CNIL encourages best practices with regard to the user interface/experience in the context of the GDPR, arguing that interface design is an essential medium through which the implementation of the GDPR is played out.46

In addition, the CMA recently published several papers analysing and summarizing the evidence on online choice architecture and consumer harm47 and the Commission has published a study on dark patterns.48 The CMA has also launched consumer protection cases focussing on online choice architecture, including in November 2022 against Emma Group for using urgency messaging such as misleading countdown clocks in its online choice architecture.49 The OECD has published a paper discussing consumer policy issues associated with consumer data practices and online unfair practices,50 and is working on the topic of dark patterns and consumer vulnerabilities.51

In the EU, several enforcement actions have been taken in both the public and private enforcement areas. For instance, following a coordinated Consumer Protection and Cooperation (CPC) network action, two large online platforms, Booking.com and Expedia, improved the presentation of their accommodation offers, aligning them with EU consumer law.52 Recently the Norwegian Consumer Council (NCC) and other European consumer organizations filed legal complaints against Amazon for creating obstacles for consumers to unsubscribe from its Prime service53 and against Google for tracking users without their consent.54 Similarly, the CNIL found that Google had infringed the GDPR's transparency provisions by not making information regarding the processing of personal data easily accessible.55 In addition, the Directive on Representative Actions for the collective interests of consumers56 may further play a crucial role in the enforcement of consumer law in the digital environment in some countries. The Directive will come into effect from 25 June 2023 and will require Member States to appoint qualified entities that will be empowered to launch collective actions for injunctive and/or redress measures, including for breaches of the UCPD and GDPR. The purpose of the Directive is to raise consumer awareness and interest in filing individual legal complaints.

4.4 Analysing the effects of dark patterns

At the same time, regulators and enforcers are conscious that not all behavioural influencing is harmful to consumers and there may be a substantial risk to innovation and product improvement if regulators and competition authorities take a dogmatic approach to dark patterns and behavioural influencing more generally. Therefore, for practices that are not broadly recognized as being harmful to consumers, effective enforcement requires a case-by-case assessment.

Such a case-by-case assessment would require a further understanding of the economic effects of the various online practices to determine whether they are truly harmful to consumers and competition.

This brings us to the next step in the debate on dark patterns: how to assess their actual effects or harms. Assessing effects can benefit all parties involved. In particular, it can:

  • support regulatory authorities in two main ways: understanding the conditions that make behavioural influencing harmful, and prioritizing their enforcement work;

  • support firms in identifying exploitative practices and managing their regulatory risk; and

  • support users and firms in estimating the appropriate amount of compensation where a dispute arises around the harm suffered.

Although more research and thinking is required in this area, there are several promising avenues to pursue:

  • auditing for dark patterns;

  • using data on consumer outcomes to identify dark patterns; and

  • A/B testing.

a Auditing for dark patterns

It can be worthwhile to check whether online choice architecture has the ‘right amount’ of friction, or contains dark patterns. This is a more holistic approach than a traditional audit, and is popularly termed a ‘sludge audit’ or a ‘dark pattern audit’.

Drawing on the insights from behavioural economics and literature on dark patterns, a dark pattern audit is a qualitative review of online choice architecture. It can also include elements of quantitative analysis, e.g. analysis of reading age.

In carrying out a sludge audit, it is important to keep the target market in mind (and customer segments within the target market). For example, it should be considered how the target market varies in terms of:

  • literacy (e.g. reading age) and financial literacy;

  • experience in interacting with the market in question; and

  • potential vulnerability.
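One quantitative element of such an audit mentioned above, analysis of reading age, can be approximated with a standard readability formula. The sketch below applies the Flesch-Kincaid grade-level formula with a crude vowel-group syllable heuristic; the example strings are invented, and the heuristic is indicative only ‒ a real audit would use a validated readability tool calibrated to the target market.

```python
import re

def syllables(word: str) -> int:
    # Crude heuristic: count groups of vowels; at least one per word.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def reading_grade(text: str) -> float:
    """Approximate Flesch-Kincaid grade level of a piece of interface text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

simple = "Click here to cancel. It takes one step."
dense = ("Cancellation of your subscription necessitates authentication, "
         "verification and confirmation through consecutive interfaces.")
print(round(reading_grade(simple), 1), round(reading_grade(dense), 1))
```

A cancellation flow whose wording scores far above the reading age of the target market is one signal of excessive friction worth flagging in the audit.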

b Using data on consumer outcomes to identify dark patterns

In the context of detecting dark patterns, businesses and regulators can use data on consumer outcomes to identify dark patterns. This is a similar tool to business model analysis, which is used by regulators (including the Financial Conduct Authority) to identify where consumer outcomes and commercial outcomes are not aligned.57 This can include the following steps:

  • Identifying any segments of customers with poor outcomes. This is likely to involve segmenting or clustering customers according to their behaviour (e.g. how they use the product).

  • Identifying whether the business model is reliant on the profitability generated from customers who receive poor outcomes. If so, then there may be an issue with the online choice architecture.

  • Assessing whether the online choice architecture includes sufficient ‘friction’ around the choices that lead to poor consumer outcomes.
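The steps above can be sketched with a toy example. All customer records, segment names and thresholds below are hypothetical; in practice the segments would come from clustering real behavioural data, and the outcome measure would be defined for the market in question.

```python
from collections import defaultdict

# Hypothetical customer records: (segment, profit contribution, good outcome?).
customers = [
    ("engaged", 10.0, True), ("engaged", 12.0, True), ("engaged", 9.0, True),
    ("forgot_to_cancel", 30.0, False), ("forgot_to_cancel", 28.0, False),
    ("forgot_to_cancel", 35.0, False), ("occasional", 5.0, True),
]

profit = defaultdict(float)   # profit per segment
poor = defaultdict(int)       # customers with poor outcomes per segment
size = defaultdict(int)       # segment sizes
for segment, p, good in customers:
    profit[segment] += p
    size[segment] += 1
    poor[segment] += 0 if good else 1

total = sum(profit.values())
for segment in profit:
    share = profit[segment] / total
    poor_rate = poor[segment] / size[segment]
    # Flag segments that drive most profit while receiving poor outcomes.
    flag = "review choice architecture" if share > 0.5 and poor_rate > 0.5 else "ok"
    print(f"{segment}: {share:.0%} of profit, {poor_rate:.0%} poor outcomes -> {flag}")
```

In this toy example the ‘forgot_to_cancel’ segment generates the majority of profit while receiving uniformly poor outcomes ‒ exactly the misalignment between commercial and consumer outcomes that business model analysis is designed to surface.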

c A/B testing of dark patterns

Randomized controlled experiments, often referred to as ‘A/B testing’ in online settings, can quantify the effect of specific features of online choice architecture on consumer decisions and therefore outcomes.

A/B testing is the ‘gold standard’ for understanding the impact of online choice architecture on consumer decisions, as it is conducted ‘in real life’. Firms may first wish to test features of their online choice architecture in a controlled environment, such as a lab or a controlled online setting, at least where there is a risk of the experiment causing poor outcomes. Once the firm is confident that poor outcomes are not likely, an A/B test could be undertaken in the market itself.

If the results of A/B tests provide clear evidence that behavioural influencing is used responsibly, then this should be given due weight by authorities. In the absence of clear evidence authorities can also conduct their own A/B tests to identify the effects of different user interface designs on users.

There are different forms of A/B testing, each of which will be appropriate in different circumstances.

  • Stated preference testing refers to choice environments where consumers are asked to state their opinions, emotions, preferences, etc.

  • Revealed preference testing refers to choice environments where consumers are asked to make decisions that reveal their preferences. Revealed preference tests may be conducted in a controlled environment, where decisions are a ‘role play’, or in ‘the real world’.

In general, stated preference testing is helpful in uncovering consumers’ underlying motivations and emotions, while revealed preference testing gives a higher level of confidence in the external validity of the results.
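For a revealed preference test run in the market itself, the difference in outcomes between two variants can be checked with a standard two-proportion z-test. The sketch below uses only the standard library; the conversion figures are invented for illustration.

```python
from math import erf, sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test on conversion (or cancellation) rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B adds a countdown timer to the checkout.
z, p = two_proportion_z(520, 5000, 610, 5000)  # conversions out of exposures
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these invented figures the uplift from the timer is statistically significant at conventional levels; whether that uplift reflects legitimate persuasion or exploitation of a scarcity bias is then the substantive question for the assessment.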

5 Conclusions

This article sets out the behavioural economics research that explains why dark patterns are effective in influencing consumer decisions. It also describes the commercial incentives and technological developments that drive firms to implement these techniques.

Traditional competition law is unlikely to be sufficient to mitigate the risks of exploitative dark patterns on its own. Regulation and consumer protection law have a substantial role to play, especially in protecting the most vulnerable consumers. Such a wide availability of different policy tools allows for a more flexible approach in the treatment of emerging dark patterns, but also requires efficient coordination among the various authorities.

As regulators and enforcers are conscious that not all behavioural influencing is harmful to consumers, effective enforcement will require a case-by-case assessment of the effects on competition and consumers of practices that are not broadly recognized to be harmful. At the same time, firms should also seek to better understand the effects of their practices in order to effectively comply with existing and new regulatory requirements.

Although more research and thinking are required in this area, economics provides a range of tools that can be used to identify dark patterns and to distinguish them from more benign examples of online choice architecture, including an assessment of outcomes for different groups of consumers, and A/B testing to assess the impact that a particular piece of online choice architecture can have on consumer decision-making.

  • 1

    We adopt the term ‘dark patterns’ throughout the article as it has become established in the literature, but we understand some have raised concerns around the term and that ‘deceptive patterns’ might be a more inclusive way of referring to the phenomenon going forward.

  • 2

    See Commission press release, New Consumer Agenda: European Commission to empower consumers to become the driver of transition (IP/20/2069, 13 November 2020).
  • 3

    See ACM press release, ACM publishes update to the rules regarding online deception (31 October 2022) and ACM Guidelines, Protection of the online consumer. Boundaries of online persuasion (11 February 2020), available at: https://www.acm.nl/sites/default/files/documents/2020-02/acm-guidelines-on-the-protection-of-the-online-consumer.pdf (accessed 24 November 2022). An English language version of the updated Guidelines is not yet available.

  • 4

    CMA, Online platforms and digital advertising, Market Study Final Report (1 July 2020).

  • 5

    See FTC press release, FTC to ramp up enforcement against illegal dark patterns that trick or trap consumers into subscriptions (28 October 2021).
  • 6

    See FTC press release, FTC streamlines consumer protection and competition investigations in eight key enforcement areas to enable higher caseload (14 September 2021).
  • 7

    This term was originally coined in 2010 by Harry Brignull, a user experience specialist, who defined dark patterns as ‘tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something’: see https://www.deceptive.design/ (accessed 24 November 2022). Another common definition is that of user interfaces that lead consumers into making decisions that benefit the online business, but that users would not have made if they were fully informed and capable of selecting alternatives: see

    A. Mathur, G. Acar, M. Friedman, E. Lucherini, J. Mayer, M. Chetty and A. Narayanan, ‘Dark patterns at scale: findings from a crawl of 11K shopping websites’ (2019) 3(CSCW) Proceedings of the ACM on Human-Computer Interaction 1–32.
  • 8

    On this point Benartzi and Lehrer conclude that ‘The ironic takeaway is that, in the age of information, we are less able than ever before to process information, since our attention is all used up’: see

    S. Benartzi and J. Lehrer, The Smarter Screen: What Your Business Can Learn from the Way Consumers Think Online (Hachette UK, 2015), p. 14.

  • 9

    Benartzi and Lehrer (fn 8), p. 22.

  • 10

    See https://www.deceptive.design/ (accessed 24 November 2022) and

    OECD, Roundtable on Dark Commercial Patterns Online: Summary of discussion (19 February 2021), available at: https://www.oecd.org/officialdocuments/publicdisplaydocumentpdf/?cote=DSTI/CP(2020)23/FINAL&docLanguage=En (accessed 24 November 2022).

    A literature review of studies that attempt to detect dark patterns can be found in OECD, ‘Dark Commercial Patterns’, OECD Digital Economy Papers, No. 336 (October 2022), available at: https://www.oecd.org/digital/dark-commercial-patterns-44f5e846-en.htm (accessed 24 November 2022). See also European Commission, Behavioural study on unfair commercial practices in the digital environment: Dark patterns and manipulative personalisation, Final Report (16 May 2022).
  • 11

    See https://en.wikipedia.org/wiki/Roach_Motel (accessed 24 November 2022).

  • 12

    European Commission (fn 10), pp. 18 and 56.

  • 13

    Early advances in behavioural economics were made particularly by Daniel Kahneman and Amos Tversky: see, in particular,

    D. Kahneman and A. Tversky, ‘Prospect theory: an analysis of decision under risk’ (1979) 47(2) Econometrica 263–292 and A. Tversky and D. Kahneman, ‘Advances in prospect theory: cumulative representation of uncertainty’ (1992) 5(4) Journal of Risk and Uncertainty 297–323.

    Kahneman even received the Nobel Prize in Economic Sciences in 2002 for ‘having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty’: see https://www.nobelprize.org/prizes/economic-sciences/2002/press-release/ (accessed 24 November 2022). It is widely considered that Tversky would have won this prize jointly with Kahneman, had he not passed away several years before. In 2011, Kahneman published a bestselling book in which he summarizes much of his research: see

    D. Kahneman, Thinking, Fast and Slow (Farrar, Straus and Giroux, 2011).

  • 14

    H. Simon, ‘A behavioral model of rational choice’ (1955) 69(1) The Quarterly Journal of Economics 99–118 and J. Conlisk, ‘Why bounded rationality?’ (1996) 34(2) Journal of Economic Literature 669–700.

  • 15

    For an early account of this, see, in particular, A. Tversky and D. Kahneman, ‘Judgment under uncertainty: heuristics and biases’ (1974) 185(4157) Science 1124–1131.

    See also Kahneman (fn 13).

  • 16

    We omit here a discussion on whether cognitive biases may be caused by irrationality (i.e. inconsistency in revealed preferences) or whether they are the result of a rational response to cognitive costs (i.e. the mental efforts involved in making decisions).

  • 17

    This is also discussed in T. Hogg and R. Van Dijk, ‘Danger mouse: the opportunities and risks of digital distribution’, Agenda (April 2021), available at: https://www.oxera.com/insights/agenda/articles/danger-mouse-the-opportunities-and-risks-of-digital-distribution (accessed 24 November 2022).
  • 18

    See, for an early account of this, W. Samuelson and R. Zeckhauser, ‘Status quo bias in decision making’ (1988) 1(1) Journal of Risk and Uncertainty 7–59.

  • 19

    See L. Mittone and L. Savadori, ‘The scarcity bias’ (2009) 58(3) Applied Psychology 453–468.

  • 20

    See M. Sherif, The Psychology of Social Norms (Harper, 1936).

  • 21

    See X. Luo, ‘How does shopping with others influence impulsive purchasing?’ (2005) 15(4) Journal of Consumer Psychology 288–294.

  • 22

    A. Tversky and D. Kahneman, ‘Loss aversion in riskless choice: a reference-dependent model’ (1991) 106(4) The Quarterly Journal of Economics 1039–1061 and R. Thaler, ‘Toward a positive theory of consumer choice’ (1980) 1(1) Journal of Economic Behavior & Organization 39–60.

  • 23

    See also L. Fields, ‘The science of misbehaving: Richard Thaler wins the Nobel Prize’, Agenda (October 2017), available at: https://www.oxera.com/insights/agenda/articles/the-science-of-misbehaving-richard-thaler-wins-the-nobel-prize (accessed 24 November 2022).
  • 24

    R. Thaler, ‘Nudge, not sludge’ (2018) 361(6401) Science 431.

  • 25

    On the link between nudging and dark patterns, see also

    A. Waldman, ‘Cognitive biases, dark patterns, and the “privacy paradox”’ (2020) 31 Current Opinion in Psychology 105–109 and C. Bösch, B. Erb, F. Kargl, H. Kopp and S. Pfattheicher, ‘Tales from the dark side: privacy dark strategies and privacy dark patterns’ (2016) 4 Proceedings on Privacy Enhancing Technologies 237–254.
  • 26

    See also OECD, Roundtable of Dark Commercial Patterns Online (fn 10). In addition to the reasons outlined, concerns have been raised about dark patterns that take advantage of biases and preferences at the level of individual consumers based on their data and previous usage patterns: see, for instance,

    Stigler Center, Stigler Committee on Digital Platforms: Final Report (16 September 2019) (Stigler Report), available at: https://www.chicagobooth.edu/research/stigler/news-and-media/committee-on-digital-platforms-final-report (accessed 24 November 2022).
  • 27

    Were the owner of the grocery store to take a more paternalistic approach (i.e. more in line with nudging), they might choose to place the healthier products on the most prominent shelves instead.

  • 28

    European Commission (fn 10), pp. 40–41.

  • 29

    A. Tuinstra, ‘Consumer protection in the online economy’, Agenda (March 2020), available at: https://www.oxera.com/insights/agenda/articles/consumer-protection-in-the-online-economy (accessed 24 November 2022).
  • 30

    For a discussion on this point in the context of sponsored ranking, see

    ACM, Sponsored Ranking: an exploration of its effects on consumer welfare (2 February 2021), para 64, available at: https://www.acm.nl/en/publications/sponsored-ranking-effects-consumer-welfare (accessed 24 November 2022):


    ‘While this may serve consumers in the sense that it facilitates their search, it also means that individual consumers’ consideration sets, i.e. the set of products they compare out of the vast amount of products offered on the platform, can be tailored to extract their maximum willingness to pay. Possibly even more than that, if they perceive the sponsored product to be better than it actually is because it is higher up in the ranking’.

  • 31

    Stigler Report (fn 26), pp. 12 and 211, and

    G. Day and A. Stemler, ‘Are dark patterns anticompetitive?’ (2020) 72(1) Alabama Law Review 1–45.

  • 32

    The seminal paper on this is X. Gabaix and D. Laibson, ‘Shrouded attributes, consumer myopia, and information suppression in competitive markets’ (2006) 121(2) The Quarterly Journal of Economics 505–540.
  • 33

    R. Spiegler, Choice Complexity and Market Competition (2016), available at: https://doi.org/10.1146/annurev-economics-070615-115216 (accessed 24 November 2022).

  • 34

    T. Hossain and J. Morgan, ‘Plus Shipping and Handling: Revenue (Non) Equivalence in Field Experiments on eBay’ (2006) 6(2) The B.E. Journal of Economic Analysis & Policy 1–27 and B. Carlin, ‘Strategic price complexity in retail financial markets’ (2009) 91(3) Journal of Financial Economics 278–287.

  • 35

    A broader discussion on this topic is provided, for example, by A. Tuinstra, S. Onderstal and J. Potters, ‘Experimenten voor mededingingsbeleid’, KVS Preadviezen 2020 (December 2020).

  • 36

    This refers to the two-limb United Brands test in excessive pricing cases: see e.g. G. Niels, H. Jenkins and J. Kavanagh, Economics for Competition Lawyers (Oxford University Press, 2016), section 4.10.

  • 37

    Directive 2005/29/EC concerning unfair business-to-consumer commercial practices in the internal market [2005] OJ L 149/22, as amended by Directive (EU) 2019/2161 as regards the better enforcement and modernisation of Union consumer protection rules [2019] OJ L 328/7.

  • 38

    BEUC, ‘Dark patterns’ and the EU consumer law acquis. Recommendations for better enforcement and reform (9 February 2022), available at: https://www.beuc.eu/position-papers/dark-patterns-and-eu-consumer-law-acquis (accessed 24 November 2022).

  • 39

    Regulation (EU) 2022/2065 on a Single Market for Digital Services (Digital Services Act) [2022] OJ L 277/1.

  • 40

    Regulation (EU) 2022/1925 on contestable and fair markets in the digital sector (Digital Markets Act) [2022] OJ L 265/1.

  • 41

    Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) [2016] OJ L 119/1.

  • 42

    BEUC (fn 38), p. 13.

  • 43

    European Data Protection Board, Guidelines 3/2022 on dark patterns in social media platform interfaces: How to recognise and avoid them (14 March 2022), available at: https://edpb.europa.eu/system/files/2022-03/edpb_03-2022_guidelines_on_dark_patterns_in_social_media_platform_interfaces_en.pdf.

  • 44

    ACM (fn 3).

  • 45

    CNIL, ‘Shaping Choices in the Digital World - From dark patterns to data protection: the influence of ux/ui design on user empowerment’ (2019) 6 IP Reports, available at: https://www.academia.edu/38856784/Shaping_Choices_in_the_Digital_World_From_dark_patterns_to_data_protection_the_influence_of_ux_ui_design_on_user_empowerment (accessed 24 November 2022).
  • 46

    CNIL (fn 45).

  • 47

    CMA, Online Choice Architecture: How digital design can harm competition and consumers, Discussion Paper (CMA155, April 2022), available at: https://www.gov.uk/government/publications/online-choice-architecture-how-digital-design-can-harm-competition-and-consumers (accessed 24 November 2022); CMA, Algorithms: How they can reduce competition and harm consumers (19 January 2021), available at: https://www.gov.uk/government/publications/algorithms-how-they-can-reduce-competition-and-harm-consumers (accessed 24 November 2022); CMA, Online platforms and digital advertising (fn 4).

  • 48

    European Commission (fn 10).

  • 49

    CMA press release, CMA investigates online selling practices based on ‘urgency’ claims (30 November 2022).

  • 50

    OECD, ‘Good Practice Guide on Consumer Data: Avoiding Deceptive and Unfair Practices’, OECD Digital Economy Papers, No. 290 (23 September 2019), available at: https://doi.org/10.1787/e0040128-en (accessed 24 November 2022).

  • 51

    See e.g. OECD, ‘Dark Commercial Patterns’, OECD Digital Economy Papers, No. 336 (October 2022), available at: https://www.oecd.org/digital/dark-commercial-patterns-44f5e846-en.htm (accessed 24 November 2022).
  • 52

    European Commission press release, More transparency: Following EU action, Booking.com and Expedia align practices with EU consumer law (IP/20/2444, 18 December 2020).
  • 53

    Forbrukerrådet (Norwegian Consumer Council) press release, Amazon manipulates customers to stay subscribed (14 January 2021).

  • 54

    Forbrukerrådet (Norwegian Consumer Council) press release, Google under investigation based on complaint by the Norwegian Consumer Council (4 February 2020).
  • 55

    European Data Protection Board press release, The CNIL’s restricted committee imposes a financial penalty of 50 Million euros against GOOGLE LLC (21 January 2019).
  • 56

    Directive (EU) 2020/1828 on representative actions for the protection of the collective interests of consumers [2020] OJ L 409/1.

  • 57

    R. Van Dijk, ‘Should we be cross about cross-subsidies? Experience from the financial services sector’, Agenda (March 2017), available at: https://www.oxera.com/insights/agenda/articles/should-we-be-cross-about-cross-subsidies-experience-from-the-financial-services-sector (accessed 24 November 2022).

Contributor Notes

An earlier version of this article was published as T. Klein, ‘Bits of advice: the true colours of dark patterns’, Agenda (November 2021), available at: https://www.oxera.com/insights/agenda/articles/bits-of-advice-the-true-colours-of-dark-patterns (accessed 24 November 2022). This article represents the views of the authors only and not necessarily those of their affiliations.