The economic impact of data centres

Published on 18 September 2018
Updated on 05 April 2024

Developments in AI, the IoT and smart devices, and other data-heavy applications have led to a major increase in data traffic. It is estimated that global data centre traffic will triple by 2020. This growing volume of data has expanded the scale of data centre development worldwide.

In the past few years, ‘data is the new oil’ has become a catchphrase used to highlight the immense value acquired by this twenty‑first‑century commodity. Although there are several important differences between the two resources (for instance, unlike oil, data generates more data), the analogy remains useful to describe the way information is used to power many modern technological applications.

Modern‑day factories for data

Data centres, used by almost all tech companies – from small startups to Silicon Valley giants – are modern-day factories for data. As data volumes grow, they have become important economic assets for local, national, and regional economies.

A report by the US Department of Commerce shows that large data centres bring millions of dollars in initial investment directly to local communities, which in turn benefits the surrounding areas. This initial investment creates construction jobs, both for the data centre buildings themselves and for public infrastructure, including roads, water services, and electrical and network infrastructure. Once built, data centres operate around the clock and need to be staffed by IT personnel as well as security and operations staff.

Similar reports on the economic impact of data centres in countries such as Norway, Finland, the Netherlands, and the United Kingdom all confirm the same trend. According to Gartner, global investment in data centres reached $181 billion in 2017, and is expected to surpass the $188 billion mark this year. In the USA alone, investment in data centres exceeded $20 billion. Data centres have also caught the eye of real-estate investors, who consider them an important alternative to traditional real estate.

Data localisation rules and the economic factor

New data localisation requirements emerge regularly in countries around the world. Although privacy protection, security, and law enforcement are among the main reasons behind data localisation regulations, the potential for economic growth and development is another reason why some governments oblige companies to store their data within national borders and limit their ability to transfer locally collected data to other countries.

The question is whether data localisation measures and incentives are an efficient way of achieving economic growth or whether, conversely, a regime of data liberalism would be the better option. Defenders of data liberalism might point to large discrepancies in efficiency and costs, along the lines of David Ricardo’s theory of comparative advantage (countries benefit from international trade even when their local labour force is more efficient at producing goods than workers in other countries), while data protectionists might argue along the infant-industry lines of Hamilton and List (an emerging industry needs extra protection from international competition). Both arguments have strong supporters.

Economic factors cannot be ignored, especially in regions that benefit from direct investment. Understanding how data localisation rules and other legal measures affect the deployment of data centres can help governments and the private sector better prepare for the surge in global data centre traffic in the years to come.

Predictions for big data

By 2020, the volume of big data is predicted to grow from 4.4 zettabytes in 2013 to roughly 44 zettabytes (1 zettabyte equals 1 billion terabytes). It was originally thought that the volume of data would double every two years, but the impact of the IoT has triggered even larger volumes, and AI is expected to increase them further. The rate at which data is created is also growing rapidly. For instance, Google processes over 40,000 search queries every second, i.e. more than 3.5 billion searches every day.
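These figures are easy to sanity-check. Below is a minimal Python sketch (not part of the original article) that reproduces the arithmetic: the zettabyte-to-terabyte conversion, the growth factor implied by the 4.4 ZB to 44 ZB prediction compared with doubling every two years, and Google’s per-second query rate scaled to a daily total.

```python
# Minimal sanity check of the figures quoted above (illustrative only).

ZB_IN_TB = 10**9  # 1 zettabyte = 10^21 bytes = 1 billion terabytes

volume_2013_zb = 4.4   # estimated volume of big data in 2013
volume_2020_zb = 44.0  # predicted volume by 2020
years = 2020 - 2013

# Growth implied by the prediction vs. "doubling every two years".
predicted_growth = volume_2020_zb / volume_2013_zb  # 10x over 7 years
doubling_growth = 2 ** (years / 2)                  # about 11.3x over 7 years

print(f"1 ZB = {ZB_IN_TB:,} TB")
print(f"Predicted growth: {predicted_growth:.1f}x over {years} years")
print(f"Doubling every 2 years: {doubling_growth:.1f}x over {years} years")

# Google's search load: 40,000 queries per second over a full day.
queries_per_day = 40_000 * 60 * 60 * 24
print(f"Searches per day: {queries_per_day / 1e9:.2f} billion")  # about 3.46
```

As the output shows, a ten-fold increase over seven years is roughly in line with doubling every two years, and 40,000 queries per second works out to about 3.46 billion searches per day.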


Pedro Vilela Resende Gonçalves is director and researcher at the Institute for Research on Internet & Society. He acts as an assistant curator for the Digital Watch Observatory. This blog was first published in the Geneva Digital Watch newsletter issue no. 33, on 31 August 2018.
