Out-Law News

Harm from digital design risks enforcement action, UK authorities warn


Businesses operating in the UK face enforcement action under data protection, consumer protection and competition law if the design of their websites and apps leads to risks or harm for users, two authorities have warned.

In a joint blog post, Stephen Almond, executive director for regulatory risk at the Information Commissioner’s Office (ICO), and Will Hayter, the Competition and Markets Authority’s (CMA’s) senior director in its Digital Markets Unit, said they want to see “improvements” in digital design practices.

“Used responsibly, online choices can be designed to empower users to make effective and informed choices about the way their personal information is used in digital markets, building customer trust,” they said, urging businesses to build online interfaces around customer interests and preferences; empower user choice and control through design; ensure design choices are “evidence based”; and consider the implications of data protection, consumer and competition law in design practices.

They said: “If we don’t see improvements, the ICO will be taking enforcement action to protect people’s data protection rights, particularly where design practices lead to risks or harms for people at risk of vulnerability. The CMA has been clear that this is a priority area and will continue to tackle problems caused by harmful design through its consumer and competition enforcement powers.”

Alongside the blog, the ICO and CMA issued a press release and published a joint position paper on harmful design in digital markets (42-page / 1.34MB PDF), which outlined their thinking on how ‘online choice architecture’ (OCA) practices can undermine consumer choice and control over personal information.

The new paper, issued under the umbrella of the Digital Regulation Cooperation Forum, follows on from a CMA discussion paper published last year in which it identified 21 types of OCA capable of harming consumers, including ‘drip pricing’, where charges are introduced towards the end of the purchase journey; ‘sludge tactics’, which make it more difficult for consumers to do what they want, for example by adding friction to a cancellation process; ‘dark nudges’, such as one-click purchases; and scarcity and popularity claims, for example informing consumers about limited stock or limited time to buy.
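By way of illustration only, the short TypeScript sketch below models the kind of asymmetric ‘sludge’ the regulators describe as a design-configuration problem. The ConsentDialog type, its fields and the flagsAsymmetry check are hypothetical constructs invented for this example; they are not drawn from the joint paper or from any regulator guidance.

```typescript
// Hypothetical sketch: two configurations of the same consent dialog,
// contrasting a "sludge"-style design with a symmetric one.
// All names and fields here are invented for illustration.

interface ConsentDialog {
  acceptLabel: string;
  rejectVisibleOnFirstScreen: boolean; // hiding "reject" behind extra screens adds friction
  rejectClicksRequired: number;        // clicks needed to refuse, versus one click to accept
  trackingPreTicked: boolean;          // pre-ticked boxes do not amount to valid consent under UK GDPR
}

// A "sludge"-style design: accepting takes one click, rejecting takes several.
const harmfulDesign: ConsentDialog = {
  acceptLabel: "Accept all",
  rejectVisibleOnFirstScreen: false,
  rejectClicksRequired: 4,
  trackingPreTicked: true,
};

// A symmetric design: both choices are equally visible and equally easy.
const symmetricDesign: ConsentDialog = {
  acceptLabel: "Accept all",
  rejectVisibleOnFirstScreen: true,
  rejectClicksRequired: 1,
  trackingPreTicked: false,
};

// A crude heuristic flagging asymmetry between the accept and reject paths.
function flagsAsymmetry(d: ConsentDialog): boolean {
  return !d.rejectVisibleOnFirstScreen || d.rejectClicksRequired > 1 || d.trackingPreTicked;
}

console.log(flagsAsymmetry(harmfulDesign));   // true
console.log(flagsAsymmetry(symmetricDesign)); // false
```

The point of the sketch is simply that harmful OCA is often a measurable property of an interface, such as the number of steps each choice requires, rather than a matter of tone or wording.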

The ICO and CMA said in their new joint paper that poor OCA practices can infringe data protection laws and lead to, or heighten, risks of harm: for example, by bringing about “unwarranted intrusion”, making it “unduly difficult for users to choose freely how their data is processed”, or increasing “the amount of time users must spend to make informed choices about personal information processing or to take actions that align with their privacy preferences”. They said the impact could be “more acute” when users are vulnerable.

“For example, poor OCA practices could lead to a user with a gambling addiction consenting to the use of their personal information for targeted advertising when they would not otherwise have done so,” they said. “This could lead to them being shown a gambling advert which encourages them to gamble, in turn leading to financial loss and possible negative impact on their mental health.”

The ICO and CMA said competition and consumer protection harms related to personal data processing can also arise from poor OCA practices.

“For instance, firms may use OCA practices to nudge consumers towards choices in a way that reinforces their market position and therefore could weaken competition,” they said. “For example, this could be done by using OCA to collect more personal data from consumers than they would be willing to give by choice and by preferencing data collection for the firm’s own services over its competitors.”

They warned that having access to more consumer data could enable firms to “leverage network effects” in a way that disadvantages rivals, and further highlighted how OCA can be used to distort consumer choices by making certain options easier or more desirable to choose over others.

They said: “This can discourage more conscious deliberation of choices (e.g., by undermining the ability to process and assess information independently, or making it more difficult to shop around); misrepresent choices available to consumers; and lead consumers to consent to potentially undesirable services or actions (e.g., to access a desired functionality). This can result in ill-considered or inadvertent decisions that may decrease consumers’ welfare and may not align with their preferences.”

As part of its ongoing OCA work, the CMA is already investigating certain online selling tactics that it suspects could harm consumers, and has recently urged an online retailer to change its practices. Next year, the CMA’s enforcement powers are expected to be substantially strengthened once the Digital Markets, Competition and Consumers Bill is passed by the UK parliament. For example, the CMA will be empowered to decide directly whether consumer protection laws have been broken and to punish breaches by imposing fines of up to 10% of a company’s global annual turnover. It will also have the ability to fine individuals involved in breaching consumer laws.

The risk of harm from digital design practices has also been recognised by EU policymakers. In 2022, the European Commission coordinated a ‘sweep’ that found an increasing use of ‘dark patterns’, which it described as “interfaces designed in a way that push consumers into making choices that may not be in their best interest”. The exercise found that more than a third of the websites scrutinised potentially violated EU legislation on unfair commercial practices because of their use of dark patterns. Consumer body BEUC has called for more consistent enforcement of the Unfair Commercial Practices Directive (15-page / 305KB PDF) to address the issue.

The risk of dark patterns is also specifically recognised in the EU Digital Services Act. Under the legislation, online platform providers must not “design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions”.

‘Very large’ online platforms are also specifically obliged to assess the systemic risks stemming from the design, functioning and use of their services – including by taking into account whether and how the design of their recommender systems and any other relevant algorithmic system plays into those risks.
