The UK’s Competition and Markets Authority (CMA) last week published a paper by its Data, Technology and Analytics (DaTA) Unit on ‘Algorithms: How they reduce competition and harm consumers’, which considered potential harms from the use of algorithmic systems.
The paper reveals the CMA’s plans to launch a substantial programme of algorithm analysis to identify potential harms to competition and consumers, which is intended to equip the CMA with the knowledge to regulate the use of algorithms effectively and to require firms to offer remedies.
To achieve this, the CMA intends to work closely with other UK regulators (such as the ICO, Ofcom and the FCA) and competition and consumer protection authorities in other markets, to share intelligence and take co-ordinated action in certain cases. We expect the paper will attract attention from regulators across the US and Europe – the Dutch Competition Authority has already announced its intention to carry out a market investigation into algorithms, while the German Government and the European Commission are developing their respective AI strategies.
The paper sets out the CMA’s roadmap to becoming one of the leading regulators in this sphere, covering three key areas:
Direct harms to consumers
The CMA has outlined several areas where it considers the use of algorithms could lead to negative outcomes for consumers, particularly:
- Personalisation. Where algorithms are used to offer consumers personalised pricing (a practice the CMA understands not to be widespread) or personalised consumer journeys that result in what the CMA terms “artificial” changes to consumer behaviour.
- Algorithmic discrimination. Where personalisation leads to direct or indirect discrimination against vulnerable consumers.
- Unfair ranking and design. Where algorithmic systems are used to facilitate preferencing of certain products, services or suppliers for commercial advantage.
The CMA considers these practices particularly problematic because they are difficult for consumers to detect themselves and could be used on a scale that makes their effects more pronounced. These areas of focus are consistent with, for example, the CMA’s broader focus on protecting vulnerable consumers and the interaction between choice architecture (and so-called ‘dark patterns’) and consumer behaviour.
Exclusionary practices and collusion
The paper outlines how algorithms could facilitate harmful practices in the context of the CMA’s competition enforcement role. Here the CMA focuses on exclusionary practices in online markets, such as “self-preferencing” and manipulating ranking algorithms to exclude rivals (see the CMA’s Online Platforms and Digital Advertising Market Study in this regard).
The paper also considers how the use of algorithms could lead to collusion between competitors, for example:
- Explicit coordination. Where automated pricing mechanisms, with access to real-time competitor data, reduce competitive tension and could make explicit collusion more stable.
- Hub and spoke coordination. Where firms use the same algorithmic system or delegate pricing decisions to a common intermediary, facilitating information sharing between competitors.
- Autonomous tacit collusion. Where pricing algorithms learn to collude without requiring other information sharing or existing coordination.
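To illustrate the automated pricing mechanisms described above, the following is a hypothetical sketch (not drawn from the CMA paper): a naive repricing rule that simply matches the cheapest rival price, subject to a cost floor. If every seller in a market runs a rule like this against real-time competitor data, any price cut is matched on the next repricing cycle, removing the incentive to undercut and so softening competitive tension.

```python
def reprice(own_cost: float, rival_prices: list[float], margin: float = 0.05) -> float:
    """Match the cheapest rival price, but never go below cost plus a minimum margin.

    This is an illustrative toy rule, not a real pricing system.
    """
    floor = own_cost * (1 + margin)          # lowest price the seller will accept
    cheapest_rival = min(rival_prices)       # observed via (hypothetical) real-time data
    return max(floor, cheapest_rival)

# Two sellers repeatedly repricing against each other quickly converge
# and then hold steady: neither gains by cutting, because any cut is
# matched at the next repricing tick.
prices = {"A": 12.0, "B": 10.0}
for _ in range(5):
    prices["A"] = reprice(own_cost=8.0, rival_prices=[prices["B"]])
    prices["B"] = reprice(own_cost=8.0, rival_prices=[prices["A"]])
```

In this toy example the two prices lock together at the lower seller’s starting price rather than being competed down towards cost, which is the stabilising effect the paper associates with widespread algorithmic price-matching.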
While the CMA notes that there is currently a paucity of evidence as to the extent of algorithmic collusion in markets, the creation of the DaTA Unit and the publication of the paper suggest the CMA plans to further intensify its investigation of this area.
Although algorithms have been an integral part of how many markets and technology companies operate for decades, the CMA considers that advancements in algorithms (often in the form of artificial intelligence) have increased the risk of deliberate or unintended misuse. The CMA therefore sees a strong case for intervention and is considering proposals to ‘regulate’ algorithms, which include:
- working with consumers, developers and experts to formulate guidance and standards for businesses;
- encouraging businesses to set up internal or external ethical oversight mechanisms;
- ordering firms to disclose details of their algorithmic systems to competitors, auditors and regulators;
- requiring firms to conduct and publish algorithmic risk assessments of system changes or even change the operation of key algorithmic systems;
- imposing on-going monitoring and compliance reporting; and
- investigating potential breaches of consumer and competition law and taking enforcement action where necessary.
The CMA envisages that its work on algorithms will play an important ongoing monitoring role in digital markets, which will be overseen by its new Digital Markets Unit (DMU). The CMA is pushing for new powers to support its, and the DMU’s, objectives. The UK Government plans to publish a White Paper on this later this year.
Open questions: what’s next?
A focus on digitalisation is driving the CMA’s current enforcement agenda, and its focus on algorithms is consistent with its broader efforts to tackle misleading online commercial practices, which in its view have the potential to limit informed consumer choice and decision-making. As the CMA expands its internal know-how, we expect it to make increasing use of its information-gathering powers to require firms to explain how their algorithms work and to provide underlying code.
However, the CMA’s progress towards regulating algorithms will depend on its ability to address at least the following challenges:
- the inherently complex and dynamic nature of algorithmic systems, which will make identifying the existence of any actual consumer harms – and quantifying their real-world impact – more difficult;
- the speed at which the CMA is able to progress investigations which involve an increased volume of underlying technical data and code and the extent to which such information is capable of evidencing the existence of consumer harms;
- the level of specificity the CMA will be able to provide about the harms it is seeking to investigate and the balance of such information with the increased information burden to which businesses will be subject;
- the need for the CMA to comply with data privacy and ethics requirements relevant to any data and code to which it would like to have access; and
- the appropriateness of the CMA ‘regulating’ algorithms, and its ability to do so, while also supporting business incentives to continue to innovate and generate other efficiencies from which consumers benefit.
While the CMA appears to recognise the potential for these challenges and has launched a consultation calling for views and evidence – from firms, developers and industry experts – to develop its own map of potential harms from algorithms, it remains to be seen just how far the CMA intends to go in unpicking the algorithmic capabilities underpinning a growing number of businesses in the digital economy.
To read more about these issues and developments globally, please see our report on global antitrust in 2021: 10 key themes – antitrust, data and tech.