Are big data and algorithms next on the European Commission’s competition agenda?

Margrethe Vestager has made it clear that the European Commission is looking into potential competition issues arising from big data. Whether tech giants and other big data collectors should brace themselves for the next big showdown with the EU’s competition regulator remains to be seen. However, it is never too early to consider precisely what the competition issues surrounding big data are.

Existing commentary and national enquiries point to two principal activities as candidates for antitrust scrutiny: the algorithmic processing of big data, and the collection of data that may create or enhance dominance and raise barriers to entry.

The algorithmic processing of data inevitably finds itself at the epicentre of a competition law discussion. In their book Virtual Competition, Professors Ezrachi and Stucke foresee the “end of competition as we know it” as a result of the increasing use of algorithms in markets. Competition concerns over algorithms are linked to the impact they can have on prices, particularly through the monitoring of market prices and the implementation of price-fixing agreements. Price-fixing is of course not new to competition law; its algorithmic manifestation would merely introduce new collusive means.
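To see how little machinery such conduct requires, consider the purely hypothetical Python sketch below. The functions fetch_competitor_prices and set_price are invented placeholders, not the API of any real platform; the point is only that monitoring and enforcing an agreed price level is trivially automated.

```python
# Purely hypothetical sketch: policing an agreed price level in code.
# fetch_competitor_prices() and set_price() are invented placeholders,
# not the API of any real platform.

AGREED_PRICE = 9.99   # the price the (hypothetical) cartel has fixed
TOLERANCE = 0.01      # allow for rounding noise

def fetch_competitor_prices() -> dict[str, float]:
    """Placeholder: scrape or query the cartel members' listed prices."""
    return {"rival_a": 10.49, "rival_b": 9.89}

def set_price(price: float) -> None:
    """Placeholder: push our own price to the storefront."""
    print(f"repricing to {price:.2f}")

def police_agreement() -> None:
    prices = fetch_competitor_prices()
    # Flag any member deviating below the agreed price...
    deviators = [n for n, p in prices.items() if p < AGREED_PRICE - TOLERANCE]
    if deviators:
        print(f"deviation detected: {deviators}")
    # ...and keep our own price pinned to the agreed level.
    set_price(AGREED_PRICE)

police_agreement()
```

The legal analysis of such conduct is unchanged: the agreement, not the code, is the infringement; the algorithm merely implements and polices it.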

The more intriguing concern lies with tacit collusion. The algorithmic arsenals at the disposal of market players could allegedly enable a ‘God view’ over the market, with competitors reacting to price changes instantly and thereby diminishing any incentive to compete on price. Identifying and tackling such conduct would certainly prove challenging for competition enforcers using present-day legal tools, all the more so where the increased price transparency that algorithms deliver creates an illusion of competition.
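The following minimal sketch, again hypothetical and using invented placeholder functions, shows why such a rule needs no agreement at all: each firm unilaterally matches the lowest price it observes. If every competitor runs a rule of this kind, any undercut is matched within one polling interval, so cutting price wins no customers and prices can stabilise without a word being exchanged.

```python
# Hypothetical repricing rule of the kind the 'God view' concern contemplates.
# No agreement exists: each firm unilaterally matches the lowest rival price.
import time

def fetch_competitor_prices() -> list[float]:
    """Placeholder: poll rivals' current prices."""
    return [10.20, 10.05, 10.35]

def set_price(price: float) -> None:
    """Placeholder: update our own listed price."""
    print(f"matching the market at {price:.2f}")

def match_the_market(poll_seconds: float = 1.0) -> None:
    while True:
        # When every firm runs this loop, an undercut is matched almost
        # instantly, so undercutting wins no customers.
        set_price(min(fetch_competitor_prices()))
        time.sleep(poll_seconds)
```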

The task of regulators may become insurmountable when considering that advances in artificial intelligence could result in algorithms devising such tacit collusion themselves. The question that lingers once machine learning is taken into account is whether prevailing antitrust frameworks can be relied on to address infringements that are not the direct result of human conduct.
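The toy simulation below illustrates the point under stated assumptions: two tabular Q-learning agents set prices in a stylised duopoly where the cheaper firm serves the whole market, so the competitive outcome is the lowest price on the grid. Nothing in the code instructs the agents to coordinate, and whether supra-competitive prices actually emerge depends heavily on the parameters; the sketch shows only that any coordination would arise from learning rather than from human design.

```python
# Toy illustration: two self-learning pricers in a stylised duopoly.
# Assumptions: discrete price grid, winner-takes-all demand, tabular Q-learning.
import random

PRICES = [1, 2, 3, 4, 5]            # discrete price grid
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.05

def profit(p_own: int, p_rival: int) -> float:
    # Stylised Bertrand demand: the cheaper firm serves the whole market,
    # a tie splits it, and the dearer firm sells nothing.
    if p_own < p_rival:
        return float(p_own)
    if p_own == p_rival:
        return p_own / 2.0
    return 0.0

Q = [{}, {}]  # one Q-table per firm, keyed by ((last prices), own price)

def choose(i, state):
    if random.random() < EPS:  # epsilon-greedy exploration
        return random.choice(PRICES)
    return max(PRICES, key=lambda a: Q[i].get((state, a), 0.0))

state = (random.choice(PRICES), random.choice(PRICES))
for _ in range(200_000):
    a0, a1 = choose(0, state), choose(1, state)
    nxt = (a0, a1)
    for i, (own, rival) in enumerate(((a0, a1), (a1, a0))):
        r = profit(own, rival)
        best_next = max(Q[i].get((nxt, a), 0.0) for a in PRICES)
        old = Q[i].get((state, own), 0.0)
        Q[i][(state, own)] = old + ALPHA * (r + GAMMA * best_next - old)
    state = nxt

print("long-run prices:", state)  # outcomes vary; no human ever fixed them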

At the other end of the big data and competition law spectrum, it has been suggested that exclusivity over valuable and unique big data could constitute a barrier to entry, enhancing the data controller’s market power. This could occur where new entrants neither have access to such data nor can collect it themselves.

A study by the French and German competition authorities suggests that a company’s refusal to provide access to data can be anticompetitive. This engages the ‘essential facilities’ doctrine, which concerns abuses of a dominant position arising from a refusal to deal, whether by declining to supply or to license.

For such an abuse to be demonstrated vis-à-vis big data, demanding requirements must be met under EU law: the data must be indispensable to compete, and the refusal to grant access must prevent the emergence of a new product and be likely to exclude all competition in the secondary market. It becomes clear that the source and nature of the data, as well as the purpose of its processing, will be of pivotal importance in ascertaining any foreclosing effect of a refusal to provide access.

Impeding the development of algorithms may ultimately deprive consumers of increased transparency and narrow their range of choices. Equally, obliging market players to grant access to data they have collected could frustrate incentives to innovate on the back of that data. It is therefore imperative that enforcers tread carefully in assessing new market practices involving big data and ensure that the appropriate distinctions are drawn between varying practices.

Article by Anastasios A. Antoniou, Partner, Head of Competition Practice. Article originally published on LinkedIn.
