Freshfields TQ

Technology quotient - the ability of an individual, team or organization to harness the power of technology

New tech team for UK competition regulator

The UK’s Competition and Markets Authority (CMA) announced last month that it would build a new technology team to review the use of algorithms, artificial intelligence (AI) and big data.

Andrea Coscelli, the CMA’s new chief executive, revealed to the Financial Times that the team would consist of data scientists, computer experts and economists. A Chief Data and Digital Insights Officer – a new position currently being advertised on the government’s website – will build and lead the team of (initially) 10 people. The successful applicant will play a key role in helping the CMA to define and deliver a data and digital insight strategy. Mr Coscelli has described the role as “the first of its kind in Europe”.

Further information about the CMA's intentions for the team was also revealed this week in the CMA's Annual Plan consultation 2018/2019. The CMA stated its intention to prevent the use of algorithms and AI from becoming a vehicle for collusion, and to prohibit the abuse of market power in digital markets. In particular, the CMA is interested in how companies use online data and in the growth of algorithms in business decision-making, including price discrimination.

The CMA anticipates that the role of the team will include:

  • improving the CMA’s capture and use of data;
  • understanding how firms use data and algorithms, the interactions between consumer issues and data ownership, and the implications for consumers and competition;
  • establishing connections with tech businesses and research communities in the UK and internationally; and
  • building a relationship with the Information Commissioner’s Office, other relevant regulators and the growing government data community.

This team will help the CMA understand the increasingly sophisticated algorithms and automated tools deployed by businesses, which are not always easy to monitor. For instance, unlike the actions of employees, the actions of an algorithm are less likely to be detectable from a trail of email correspondence, which can make it difficult to ascertain the intention behind a program or to assign responsibility for its actions.

The new team will also assist the CMA in taking on UK aspects of cross-border digital cases in the post-Brexit landscape, which will no longer fall under the remit of the European Commission (EC).

What has the CMA done already?

Mr Coscelli told the Financial Times that, compared with five years ago, the work of the CMA is much more focused on digital technologies. The CMA has led a number of enforcement investigations in which data and technology have been integral to the harms identified, for example: online auction platforms, online gambling, secondary ticketing and an Amazon Marketplace cartel. In the Amazon Marketplace case, the CMA imposed a fine of over £160,000 on an online seller of posters and frames for agreeing prices with a competitor, an arrangement implemented using automated repricing software.

The CMA has also undertaken two market-wide projects in this area: Commercial Use of Consumer Data; and Digital Comparison Tools (DCTs). Among the points raised by the CMA in the DCTs study were concerns about the use of sophisticated ranking algorithms to present results, which may be difficult for consumers to understand.

The CMA Policy Director responsible for the DCTs market study, Will Hayter, appeared before the UK House of Lords Select Committee on AI on 7 November 2017 (see previous blog post for more detail on this committee). During the session, Mr Hayter discussed the relationship between algorithms and price-fixing:

“You might not necessarily have a smoke-filled room any more, but an email from one firm to another saying, “Let’s agree to fix prices”, even if an algorithm is being used to make that happen, is still susceptible to the same principle-based legislation, as we found in a case where two companies were fixing prices for posters on Amazon Marketplace, for example.”

What about competition regulators internationally?

This news from the CMA is part of a trend of greater scrutiny by competition regulators internationally of how algorithms and other automated systems are deployed in ever more sophisticated ways, fuelled by businesses’ ability to process and analyse ever-growing quantities and types of data.

For example, the EC recognised the potential risk to competition of automated price monitoring tools as part of its recent e-commerce sector inquiry. Specifically, the EC has opened an investigation into alleged restrictions on price setting by a group of consumer electronics manufacturers, including the use of automatic pricing adjustment software.

There is clearly some variation in the outlook of regulators from different jurisdictions as to whether current legislation is fit for purpose in dealing with the challenges of algorithmic decision-making. Last month, Australia’s top competition official, Rod Sims, delivered a speech asserting that Australia’s competition laws are well suited to dealing with these challenges, in particular because the country’s recently revamped competition laws focus on the effects of collusion.

This opinion contrasts with a less certain position taken by regulators elsewhere. For instance, the Chairman of the CMA, David Currie, has questioned whether the legal tools currently at the CMA’s disposal are capable of tackling all the challenges presented by the rise of the algorithmic economy, such as self-learning algorithms. As these kinds of issues crystallise, it is entirely plausible that the regulators will seek greater powers in order to address them.

How can businesses stay compliant?

It is important for regulators not to overlook the numerous benefits that algorithms and other automated systems can bring for consumers and competition – for example, by enabling prices to adjust more quickly to an efficient level, or by reducing businesses’ staff costs, which can feed through into lower prices for consumers.

Despite such potential pro-competitive effects, there is little doubt that a number of regulators are concerned about algorithms’ market impact. They have clearly communicated that businesses must stay vigilant when deploying new digital technologies; as EU Competition Commissioner Vestager stated in a recent speech: “businesses … need to know that when they decide to use an automated system, they will be held responsible for what it does.”

This means that legal and compliance teams within any business – whether ostensibly a “tech” company or otherwise – must closely monitor any algorithms and other automated systems being developed for use by the company, and consider whether their design and use might give rise to any competition concerns.

To take a hypothetical example, say a company is using an algorithm that is programmed to match the price change of a competitor by a pre-determined amount. This may eventually become detectable to others in the market, which could result in competitors responding in a similarly predictable way. The end point could be a pattern of “signalling” leading to the conscious (if ‘algorithmic’) alignment of pricing strategies across market participants. Therefore, even in the absence of an express agreement, such forms of concerted practice may attract attention from regulators.
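
To make that dynamic concrete, the short Python sketch below is entirely hypothetical – the sellers, the starting prices and the MATCH_FACTOR parameter are illustrative assumptions, not drawn from any CMA case. It simulates two sellers that each copy a pre-determined share of the other’s last price change, and shows how a single one-off price rise is enough to pull both prices upward in step, with no agreement ever being made.

    # Hypothetical sketch: two sellers each run a rule that copies a
    # pre-determined share of the rival's last price change. The names and
    # numbers below are illustrative assumptions only.

    MATCH_FACTOR = 1.0  # share of the rival's last price change to copy


    def simulate(start_price: float, initial_rise_by_b: float, rounds: int = 6) -> None:
        """Show how two 'match the rival's last change' rules keep prices aligned."""
        price_a = price_b = start_price
        last_change_a, last_change_b = 0.0, initial_rise_by_b
        price_b += initial_rise_by_b  # seller B makes a one-off price increase

        for r in range(1, rounds + 1):
            change_a = MATCH_FACTOR * last_change_b  # A's algorithm copies B's last move
            change_b = MATCH_FACTOR * last_change_a  # B's algorithm copies A's last move
            price_a += change_a
            price_b += change_b
            last_change_a, last_change_b = change_a, change_b
            print(f"round {r}: A = {price_a:.2f}, B = {price_b:.2f}")


    if __name__ == "__main__":
        simulate(start_price=10.00, initial_rise_by_b=0.50)

The point is not the arithmetic but the predictability: once both pricing rules become visible in market behaviour, each firm can anticipate exactly how the other will respond, and it is that kind of alignment that regulators may treat as a concerted practice.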

Please reach out to one of the contacts in our competition team if you would like advice on whether aspects of the design or use of price-setting or price-monitoring algorithms, or other automated systems, used in your business might raise competition issues, or if you would like to discuss how a regulator might view these practices.

To find out more about additional areas of legal risk arising from AI, take a look at this short video.

Tags

artificial intelligence, digital, competition law, ai, automotive