Introduction

The UK Competition and Markets Authority (CMA) has recently published a Discussion Paper and accompanying Evidence Review on “Online Choice Architecture” (OCA). Together, these provide a helpful overview of the CMA’s approach to analysing choice architecture, recognising that some practices are likely to be harmful to consumers while others may be beneficial. The Discussion Paper describes different types of OCA, explains how different OCA designs can harm consumers, and offers some observations on the effectiveness of possible remedies. It is the work of the CMA’s Data, Technology and Analytics (DaTA) unit, established in spring 2019.[1] The CMA also held a panel discussion on this topic on 26 May 2022, featuring Professor Cass Sunstein (co-author of ‘Nudge: Improving Decisions about Health, Wealth and Happiness’) as a keynote speaker.

The Discussion Paper builds on the CMA’s experience of considering potential issues raised by OCA in its competition and consumer law cases.

  • Under competition law, the CMA has recently considered potential issues arising from OCA in its markets work, for example the Online Platforms and Digital Advertising and Mobile Ecosystems market studies. The CMA may also tackle issues arising from OCA under its antitrust powers. Abuse of dominance cases advanced by the European Commission while the UK was part of the EU, such as Microsoft (Internet Explorer) and Google Shopping, have involved consideration of OCA.
  • Under consumer law, existing UK legislation is targeted at preventing businesses from misleading consumers both online and offline. For example, the Consumer Protection from Unfair Trading Regulations 2008 (“CPRs”) prohibit ‘unfair commercial practices’ such as drip pricing and bait advertising, misleading actions and omissions, and pressure selling.[2] There are also specific rules for so-called ‘distance contracts’ concluded over the telephone or the internet.[3] The CMA has recently taken action against digital businesses under these rules. For example, in early 2022 it agreed commitments with Sony and Nintendo regarding auto-renewal of gaming subscriptions, as part of a consumer law investigation into online gaming.[4] It is also carrying out an ongoing consumer law investigation into fake online reviews.[5] In addition, the UK government has put forward proposals to reform consumer law that would protect consumers from harm when they take out, renew, and cancel subscriptions online.[6]

OCA is also a hot topic for other competition, consumer protection and data protection authorities around the world: see, for example, the Norwegian Forbrukerrådet’s report on potential ‘dark patterns’ in Amazon’s subscription process,[7] the Australian Competition & Consumer Commission’s (ACCC) digital platform services inquiry (which examined search defaults and choice screens for internet browsers),[8] the French data protection authority’s (CNIL) investigation into Google and Facebook’s cookie consents,[9] and the US Federal Trade Commission’s recent policy statement warning companies against using dark patterns to manipulate users.[10] The European Commission has been looking at these issues in the context of the Digital Markets Act and Digital Services Act, as well as in competition and consumer law enforcement, such as the EU’s Consumer Protection Cooperation action in relation to online hotel bookings. The Discussion Paper refers to a number of these cases and indicates that the CMA will work with regulators in other jurisdictions to progress its work in this area.

What is OCA?

The CMA defines choice architecture as the “design of the environment” through which consumers experience and interact with digital products.  It states that choice architecture is a neutral term which encompasses a broad range of practices, not all of which are inherently harmful. The CMA acknowledges that OCA can be designed to help consumers, and most types of OCA “can be used beneficially and often are”.

The CMA states, however, that some forms of OCA can cause harm to competition and consumers, including in situations where it may also provide a benefit. For example, OCA can “hide crucial information, set default choices that might not align with our preferences, or exploit our attention being drawn to scarce products”.

The Discussion Paper lists 21 different OCA practices identified by the CMA in its review of academic literature and previous cases.  These OCA practices are categorised into: (i) “choice structure” (the design and presentation of options); (ii) “choice information” (the content and framing of information provided); and (iii) “choice pressure” (indirect influence of choices).

  • Choice structure. Businesses can choose how online choices are structured, including which options consumers can or are likely to see, how cognitively challenging or time-consuming it is to make a choice, and how different options are ranked or presented. The CMA finds that there is strong evidence that choice structure practices change consumer decisions – for example, it states that defaults and ranking “exert a strong effect on consumer behaviour” and thereby affect competition.
  • Choice information. Choice architects are able to customise what information is provided to consumers when presenting choices, such as the basic details of the product or service. This information can be framed in ways that highlight certain aspects over others, make it harder to understand or access information, or hide information until consumers have progressed further through the process of their task. The CMA comments that: “manipulating choice information can reduce consumers’ ability to understand and evaluate key pieces of information […] which can distort consumer decision making out of line with their preferences and weaken competitive pressure”. It observes that although there are existing rules[11] regarding choice information practices – aimed at ensuring that accurate information is provided during consumers’ decision making process – these may have limitations, and may need to be supplemented by other types of remedies.
  • Choice pressure. The CMA comments that online businesses are able to exert pressure on consumers to make choices using indirectly related factors, such as consumers’ habits, time pressure or so-called “trusted messengers”. It adds that “where they are fake or misleading, the scarcity or popularity claims and messengers (such as fake reviews) can be particularly harmful”. The CMA notes that existing remedies[12] “generally aim to ensure that consumers are not unduly pressured in decision making, and that the information and tools provided … are relevant, genuine and valuable”.

The CMA notes that there is substantial crossover between these categories, because different types of OCA are “often interlinked in any given context, and … some practices may involve elements of multiple categories” and OCA practices tend to have stronger effects when they are combined.

Table 1: CMA Taxonomy of OCA Practices

  • Choice structure: Defaults; Ranking; Partitioned pricing; Bundling; Choice overload and decoys; Sensory manipulation; Sludge; Dark nudge; Virtual currencies in gaming; Forced outcomes
  • Choice information: Drip pricing; Reference pricing; Framing; Complex language; Information overload
  • Choice pressure: Scarcity and popularity claims; Prompts and reminders; Messengers; Commitment; Feedback; Personalisation

Source: CMA Discussion Paper on Online Choice Architecture (page v).
Note: In the CMA’s original table, red text denotes OCA practices identified as (potentially) harmful by the CMA.

Potential harms from OCA

The Discussion Paper states that digitalisation of markets has increased the potential impact of OCA, and may introduce new issues.  It notes that digitalisation has led to “significant asymmetries” in the amount of information held by businesses and consumers, respectively, and that businesses have greater control in customising and optimising their interactions with consumers online. The CMA also states that OCA “may induce more impulsive, and therefore harmful, purchasing behaviour”, and that consumers’ natural behavioural biases can be exacerbated in the online world. In particular, the CMA finds that users behave differently online – for example, they act more quickly, have shorter attention spans, skim read, and are more likely to rely on recommendations.

The Discussion Paper considers three types of potential harm to consumers and competition that can be caused by OCA practices:

  1. Distortion of consumer behaviour. The CMA states that any business, regardless of its market power, can design OCA to distort consumer behaviour. For example, OCA can be used to influence consumers to purchase unneeded or unsuitable products, spend more than they want to, receive poor value items or services, choose inferior sellers or platforms, or search less for alternatives. Furthermore, OCA practices can also prompt consumers to disclose their data, engage more with a product, and share a product or service on social networks, which could lead to “unwanted marketing advances, privacy invasion, reduced enjoyment, or excessive use”.
  2. Weakening or distortion of competition. Because OCA can change consumer behaviour, the CMA finds that it can also shift businesses’ incentives away from competing on product attributes that benefit the consumer, such as quality and total price paid, towards “less beneficial attributes” such as the price displayed upfront or pressure to buy. This can “weaken or distort competition on the merits of the products and may result in poorer quality, more expensive products, less efficient markets, and reduced trust”.
  3. Market power. Certain OCA designs can be “particularly problematic” where a business has market power, because the business can use OCA to maintain or exploit that market power through “limiting competition or squeezing rivals out.” For example, the CMA states that OCA practices can exacerbate network effects if used unfairly to acquire or retain consumers, making it harder for rivals or entrants to compete. The CMA states that this can lead to poor outcomes for consumers, such as higher prices and lower quality or value for money, unfair contracts, compulsory data sharing, and limited options for switching.

The CMA also discusses four cross-cutting topics that are “relevant to the effectiveness of OCA and the potential harm caused”, namely:

  1. Prevalence. The CMA observes that various academics and authorities are continuing to research the extent to which OCA practices exist online, and in different sectors.
  2. Awareness and learning. The CMA finds that there is low consumer awareness of OCA, and even where consumers are aware, they may still be influenced. As a result, “making consumers aware is therefore not always sufficient to protect them from harm”.
  3. Vulnerability. Harm caused by OCA practices can disproportionately affect vulnerable consumers. For example, they may be unable to cope as effectively with financial loss or negative feelings, or may not be able to complain, or avoid the same experience in future.  The CMA also observes that some types of OCA may be specifically targeted at vulnerable consumers.
  4. Algorithms. The CMA states that because algorithms allow businesses to use data at speed and scale in ways that drive many aspects of consumers’ online experience, they “play an important role in how the OCA of search engines appears to consumers”. It adds that whilst algorithmic personalisation can bring benefits, such as helping consumers find relevant products faster, it can also lead to harm, such as privacy invasion, opaque personalised pricing, discrimination based on personal characteristics, or reduced information diversity.

Remedies

The CMA recognises that the considerable diversity of OCA practices means that choosing the right type and combination of remedies “depends heavily on the specific issue at hand”. However, the CMA sets out some high-level observations on the effectiveness of potentially relevant OCA remedies.

First, some harmful OCA practices can be (and are) prohibited by legislation. For example, the CPRs already prohibit using false time pressure to elicit immediate decision making, or causing consumers to enter transactions that they would not otherwise have entered through a misleading statement or omission. The CPRs also set out a list of ‘hardcore’ commercial practices that will be deemed unfair in all circumstances.[13] The UK Government has proposed adding fake review practices to this list.[14]

Second, information-based remedies can have limits, because consumers “have to not only be able to access information, but also to be able to assess and act on it”. As a result, the CMA concludes that information-based interventions alone may not always be sufficient to significantly change consumer behaviour. The CMA has suggested that authorities may need to take more drastic measures, such as mandating defaults.

Third, behavioural and data science can help to identify OCA practices and put them in context. This can assist in understanding how the relevant practices work, assessing their compliance with existing legislation, investigating harm, and testing potential remedies.

Fourth, remedies can benefit from quantitative and qualitative testing (for example, surveys, field trials, and online experiments) both before and during their implementation, in order to improve their design, estimate their potential impact, and assess whether further interventions are required. This is consistent with previous CMA findings, for example in its review of Regulation and Competition in the UK, which recommended increased use of regulatory “sandboxes”.[15]
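To illustrate what the quantitative testing the CMA describes might look like in practice, the sketch below shows one simple way the impact of a choice-architecture remedy could be estimated in an online experiment: comparing add-on take-up under a pre-ticked default against an active-choice screen using a two-proportion z-test. This is purely illustrative; the variant labels, sample sizes and conversion figures are invented, and the Discussion Paper does not prescribe this (or any particular) statistical method.

```python
# Illustrative sketch only: a simple way a remedy trial might be analysed.
# Hypothetical A/B test: variant A shows a pre-ticked add-on (the existing default),
# variant B shows an active-choice screen (the candidate remedy). The outcome is the
# share of users who purchase the add-on. All figures are invented for illustration.
from math import sqrt, erfc

def two_proportion_ztest(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for the difference between two independent proportions."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal tail
    return p_a - p_b, z, p_value

# Hypothetical trial data: 1,000 users randomly assigned to each arm.
diff, z, p = two_proportion_ztest(successes_a=320, n_a=1000,   # pre-ticked default
                                  successes_b=210, n_b=1000)   # active-choice screen
print(f"Difference in add-on take-up: {diff:.1%} (z = {z:.2f}, p = {p:.4f})")
```

In a real trial, a business or authority would of course also need to consider randomisation, sample sizes, multiple outcome measures and qualitative evidence alongside a simple significance test of this kind.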

Comment

Various forms of OCA have been central to both past and current antitrust cases in digital markets in the UK and Europe, and OCA practices are likely to come under even closer scrutiny in the future. The CMA has stated that harmful OCA is a “key area of focus”, and that it will “more actively investigate practices that may harm consumers or competition using the full range of powers available”. Furthermore, facilitating effective consumer choice is a fundamental tenet of forthcoming digital regulation, such as the Digital Markets Act,[16] the proposed reforms to competition and consumer law in the UK,[17] and the proposed new regulatory regime for digital markets in the UK.[18]

Recent cases show that the CMA is prepared to review businesses’ OCA and take action where it considers this to be appropriate.  For example, in the Mobile Ecosystems market study, the CMA considered the OCA employed by various apps in relation to taking out and cancelling a subscription, and set out a series of recommendations for companies as to how they should comply with relevant consumer protection law.[19]  The Discussion Paper therefore represents a welcome formulation of the CMA’s thinking in this area.  Given that some forms of OCA may have beneficial rather than negative effects, it will be important for the CMA to investigate with a data-driven approach whether a particular form of OCA is having a negative impact in practice and whether there are any positive benefits that should be retained.  Where remedies are being considered, the CMA should work with businesses to consider, design and test any choice-related interventions, in order to ensure that they achieve the desired purpose to the benefit of consumers, and avoid unintended consequences.


[1]              See CMA, The CMA DaTA unit – we’re growing!, 28 May 2019.

[2]              See Consumer Protection from Unfair Trading Regulations 2008.

[3]              See Consumer Contracts (Information, Cancellation and Additional Charges) Regulations 2013.

[4]              See CMA, Press release: CMA welcomes Sony and Nintendo’s gaming subscription improvements, 13 April 2022.

[5]              See CMA, Press release: CMA to investigate Amazon and Google over fake reviews, 25 June 2021.

[6]              UK Government, Reforming Competition and Consumer Policy: Government Response to Consultation, April 2022.

[7]              See Forbrukerrådet, You can log out, but you can never leave, 14 January 2021.

[8]              See ACCC, Digital platform services inquiry – September 2021 interim report, 28 October 2021.

[9]              See CNIL, Cookies: the CNIL fines Google a total of 150 million euros and Facebook 60 million euros for non-compliance with French legislation, 6 January 2022.

[10]             See Federal Trade Commission, FTC to Ramp up Enforcement against Illegal Dark Patterns that Trick or Trap Consumers into Subscriptions, 28 October 2021.

[11]             For example, Regulations 5 and 6 respectively of the CPRs set out the circumstances in which a commercial practice will be deemed a “misleading action” or “misleading omission”.

[12]             For example, Regulations 3 and 7 of the CPRs prohibit “aggressive commercial practices”: where a consumer takes (or is likely to take) a transactional decision they would not have taken otherwise, as a result of “harassment, coercion or undue influence” that significantly impairs (or is likely to significantly impair) their freedom of choice.

[13]             See CPRs, Schedule 1.

[14]             See Department for Business, Energy & Industrial Strategy (BEIS), Reforming Competition and Consumer Policy: Government Response to Consultation (2022), p. 69.

[15]             See CMA, Regulation and Competition – A Review of the Evidence, January 2020, paragraph 1.36 (“Policymakers should also consider making greater use of regulatory “sandboxes” to trial new regulatory approaches”).

[16]             For example, the DMA is expected to mandate the display of choice screens for search engines, browsers, and virtual assistants on gatekeepers’ operating systems.

[17]             See BEIS, Reforming Competition and Consumer Policy: Government Response to Consultation (2022), p. 14 (setting out proposals to tackle “subscription traps”).

[18]             See UK Government, Government response to the consultation on a new pro-competition regime for digital markets, May 2022.

[19]             See CMA, Mobile Ecosystems market study, Appendix K: consumer experiences of app purchases and auto-renewing subscriptions to apps sold through the app stores.