
Regulating Big Data Competition

[Karan Shelke is a student at Maharashtra National Law University, Mumbai.]

Several years ago, The Economist noted that data generated through the online portals used by billions of people, particularly consumer data, had become “the new raw material of business: an economic input almost on a par with capital and labour”. Modern computing systems have expanded our ability to collect, store and analyse data through algorithms that can identify general patterns of human behaviour, which can then be influenced in the way the designer intends. As artificial intelligence and machine learning promise to make big data analytics a central feature of virtually every area of business and commerce, competition lawyers and economists are racing to define big data in the parlance of antitrust law.

What is Big Data?

The operative term in the current debate concerning antitrust and the digital economy, however, is often not simply “data” but “big data” – another concept lacking a common definition.[1] The latter is often described as large amounts of different types of data, produced at high speed from multiple sources, whose handling and analysis require new and more powerful processors and algorithms.

Personal data are often provided voluntarily by consumers, for example, in social networks or online shops. Customers provide information in exchange for products and services that are often zero-priced and advertising-financed. The companies obtain information not only about addresses, email contacts, dates of birth or payment details, but also about shopping preferences or, in some cases, even photos and videos.

Economic Value of Big Data

Marketing is based on specific market research, which comprises systematic data collection, processing and analysis. According to the Organisation for Economic Co-operation and Development (OECD), big data is a core economic asset that has the potential to create significant competitive advantage for companies and drive innovation and growth.

The benefits of big data include the development of new goods and services that are based on data, improved production, improved marketing through targeted advertisements, and better research and development. Access to data can also enable firms to exploit new business opportunities. By reusing data gathered in the context of one service for a different purpose, undertakings may provide new services based on these data.

Today, the internet is more accessible than it has ever been. Data-driven business models that monetise data have made companies such as Alibaba, Facebook, Google, Microsoft, Amazon and Apple some of the most highly valued companies in the world, placing them among the top 10 by market capitalisation.[2]

Anti-competitive Concerns arising from Big Data

With such a huge competitive advantage accruing to companies that control large data sets, the question arises whether the collection and exploitation of data can raise entry barriers for companies entering the market without such assets at their disposal. These barriers can be technological, legal or behavioural in nature.

The OECD has noted that market concentration and dominance are favoured by the economics of data. It points out that data-driven markets can lead to a “winner takes all” situation where market success leads to concentration.[3]

The German Federal Cartel Office (Bundeskartellamt) issued a press release stating that Facebook had been informed in writing of the authority's preliminary legal assessment in the abuse of dominance proceeding concerning its data collection policies, which effectively force users to accept those terms as a condition of using Facebook. The assessment focused on Facebook's terms of service reserving the right to collect users' data from third-party sources and merge them with data collected on Facebook itself, linking the combined data to a specific user's account irrespective of the privacy settings the user may have chosen.

Big Data as an Essential Facility: Approach by US and EU

1. United States of America

The "essential facilities doctrine" is a specialised form of a dominant company's duty to deal with its competitors. Natural monopolies exist because high fixed costs and low marginal costs make it difficult or economically wasteful to duplicate the facility. The Supreme Court in Trinko[4] virtually eliminated the doctrine as a meaningful basis for liability under American antitrust law. Justice Scalia identified three critical harms that the essential facilities doctrine could create. First, compelling parties with a competitive advantage to share resources undermines the purpose of antitrust law by reducing incentives to invest in those resources. Second, compelled sharing would require federal courts to act as central economic planners, a role they are ill-equipped to play.[5] Third, sharing might actually create opportunities for collusion, which the Court characterised as the “supreme evil of antitrust.”[6]

The classic essential facilities case is one in which a competitor of the dominant company seeks to become its customer, offering payment or a service in exchange for access to a physical resource that the dominant company controls and that the competitor needs in order to survive.

2. European Union

In the EU, a unilateral refusal to grant access to an essential facility is just one example of a potentially unlawful refusal to deal. Something resembling the essential facilities doctrine first provided a basis for EU abuse of dominance liability in Commercial Solvents v. Commission.[7] Later, in Oscar Bronner, the European Court of Justice set forth more clearly the necessary elements for successfully advancing such a claim, including indispensability, i.e., the “essential” character of the product or facility that a dominant firm refuses to share.[8]

The Court also emphasised the indispensability test. The fact that it may not be economically viable for the firm requesting access to replicate the facility because of its smaller size is not enough to support the conclusion that the refusal to grant access is illegal under Article 102 TFEU.[9]


Conclusion

Today, major data-driven companies continue to dominate the market. Whether these companies can hamper competition is a question that needs to be examined.

The scepticism of US courts towards the essential facilities doctrine in anti-competitive cases has created a divergence in approach: European courts, by contrast, generally use the doctrine as an effective tool to police competition in the market. The risks posed by the doctrine have been highlighted briefly above, and they warrant its thoughtful application.

In the meantime, until the major jurisdictions around the world reach a common understanding on the treatment of big data, data-driven companies will continue to collect such data, and some of them will gain significant leverage over consumers and their rivals.

[1] Hu, Han, Yonggang Wen, Tat-Seng Chua and Xuelong Li. “Toward Scalable Systems for Big Data Analytics: A Technology Tutorial.” IEEE Access 2 (2014): 652-687.

[2] S&P Capital IQ: Top 10 companies' (worldwide) market capitalisation on 29 December 2006, 30 December 2011, and 29 December 2017.

[3] OECD (2014), p. 7.

[4] Verizon Communications Inc. v. Law Offices of Curtis V. Trinko, LLP, 540 U.S. 398 (2004).

[5] Ibid.

[6] Ibid.

[7] Istituto Chemioterapico Italiano S.p.A. & Commercial Solvents Corp. v. Comm’n, 1974 E.C.R. 223.

[8] Oscar Bronner GmbH & Co. KG v. Mediaprint Zeitungs- und Zeitschriftenverlag GmbH & Co. KG, 1998 E.C.R. I-7791.

[9] Treaty on the Functioning of the European Union, art. 102.

