Data centres have become increasingly important over the past 60 years, and are now critical to many businesses’ operations. (Photo by Paul Faith/AFP via Getty Images)

Over the years, data centres have established themselves as the tool for guaranteeing the uninterrupted power and uptime that companies’ computing systems and data storage depend on. In a post-Covid world of prolonged working from home, their importance to the infrastructure industry will be paramount.

Yet despite playing such an integral role for companies and industries the world over, data centres attract little attention, given their often dull, functional appearance and somewhat complicated working practices. This out of sight, out of mind approach is, however, becoming less and less of an option.

A brief history of data centres

Put simply, a data centre is a physical facility that organisations use to store their computer systems, applications and data. They are structured as rooms and/or buildings that provide space, power and cooling for network infrastructures.

They are essentially rooms full of routers, switches, firewalls, storage systems, servers and application delivery controllers whose role is to make sure that their owner’s data can be stored, shared and managed 24/7, 365 days a year with virtually no interruptions.

Companies that use them depend heavily on the reliability of data centres to keep IT operations and data sharing running at all times, which makes continuity the essential element of data centre activity.

It is hard to pin down an exact date when data centres first appeared on the market, but the earliest examples can be traced to the banking sector in the 1950s. Around that time, researchers at the Stanford Research Institute invented the Electronic Recording Method of Accounting (ERMA) computer processing system.


ERMA was first demonstrated to the public in 1955, and first tested on real banking accounts in 1956. Production models (ERMA Mark II) were delivered to the Bank of America in 1959 for full-time use as the bank’s accounting computer and cheque-handling system. In 1961, Barclays opened the UK’s first computer centre for banking. That same year, the US payroll company now called Automatic Data Processing went public and leased its first computer, an IBM 1401.

Another early application of data centres can be found in the partnership between American Airlines and IBM to create Sabre, a passenger reservation system that automated one of the airline’s key business areas. The idea of a data processing system that could create and manage airline seat reservations, and instantly make that data available electronically to any agent at any location, became a reality in 1960, opening the door to enterprise data centres.

Through the decades, data centres went from predominantly being a real estate investment – where organisations would host them in-house – to an infrastructure business, where data centres are built off-site and involve the deployment of fibre, cables, air cooling systems and other equipment.

More recently, data centres have been shifting from an infrastructure model towards a subscription and capacity-on-demand one, where data is hosted on a non-physical platform: the cloud. The cloud – the non-physical place where data is stored, shared and managed – is becoming increasingly important, with some companies developing cloud-based-only data centres. Industry consensus, however, seems to be that not all data can be solely stored in the cloud and that for some companies physical data centres will remain relevant in the years to come.

Physical components

When built off-site and from scratch, data centres are ecosystems that rely on several physical components to function and guarantee the continuity of service that is so essential to companies’ IT systems. These components are:

• Electricity supply systems: this component is particularly important as data centre equipment is highly energy-intensive, with electricity sometimes accounting for 10% of overall costs. A guaranteed supply is needed not only for IT equipment but also for cooling and access control systems. Back-up power typically consists of one or more uninterruptible power supplies and diesel generators, which themselves depend on an efficient electricity supply system.

• Cables, routers and switches: communication is essential in data centres and is often handled through networks running special protocols for computing equipment interconnection. The sites therefore contain sets of ports, routers and switches that transport traffic between data centres’ computing equipment and the outside world. Data cables are typically routed through overhead cable trays. Sometimes they go through raised floors for security reasons.

• Air conditioning: because of the large amount of electricity used to keep nearly all components running, temperatures and humidity inside data centres run high. Air conditioning is therefore essential to avoid equipment failures and interruptions to service.

• Fire protection systems: these systems include passive and active design elements, as well as the implementation of fire prevention programmes in operations. Smoke detectors are usually installed to provide early warning of a developing fire by detecting particles generated by smouldering components prior to the development of any flames.

Market players and site selection

Depending on the type of data centre, market participants in this space include data centre providers – who normally own the core and shell of the data centre, power and cooling equipment, exchanges and cross connects – and the customers, who are the companies that rent the premises from the providers and own servers, storage and network equipment.

Providers of other services including IT infrastructure, power management systems, construction and security solutions are also important players in this market and essentially act as vendors to the above-mentioned actors.

Site-selection criteria

As a data centre’s key feature is the ability to provide an uninterrupted service to its users, certain criteria must be considered when selecting a site for a new development. They include:

• Location: easy access to energy and a strong potential electricity supply need to be guaranteed. Locations that are exposed to natural disasters should be avoided, while proximity to highways, railways and airports is useful for staff and user logistics.

• Infrastructure: the availability of electrical capacity as well as diverse power feeders are essential. Access to utilities for potential expansion and upgrades, as well as an assessment of previous outages, are also important.

• Water: diverse source supplies and water storage facilities are key to the good functioning of air-cooling systems.

• Economics: land, construction, utilities, labour and communications are all essential economic considerations.

Types of data centre

As data centres have evolved over the years, different types have taken shape. Sometimes they reflect the stage of innovation of the industry, but often they serve different purposes for different owners. The different types of centre include:

• Enterprise data centres: this is the oldest type of data centre on the market and the one that bears the highest degree of the real estate investment component. Enterprise data centres are private facilities owned by the company that uses them and are normally hosted in-house.

• Hyperscale data centres: these are enterprise data centres, but on a much larger scale, both in terms of physical size and the scale of business they support. The International Data Corporation classifies any data centre with at least 5,000 servers and 10,000ft² of available space as hyperscale (a simple check against these thresholds is sketched after this list). Microsoft’s twin 470,000ft² complexes in Quincy, Washington, and San Antonio, Texas, are examples of hyperscale data centres.

• Colocation data centres: a colocation data centre is one where a company rents the premises from a third-party provider, away from the company’s own site. As previously mentioned, the data centre provider owns the infrastructure, including the building, cooling facilities and bandwidth, while the company owns the components, including servers, storage and firewalls.

• Edge data centres: while enterprise and hyperscale data centres are located inside or near companies’ sites, edge data centres are positioned at the periphery of existing networks and send data to end users. “Built for versatility and speed, edge data centres tend to be operated by colocation providers. For companies trying to penetrate a local market or improve regional network performance, these facilities are incredibly valuable,” says vXchnge – a colocation services provider – on its website.

• Cloud data centres: these are also off-site, and data and applications are hosted by a cloud services provider such as Amazon Web Services, Microsoft’s Azure or IBM Cloud, as well as other public cloud providers.
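To make the hyperscale threshold concrete, the short sketch below checks a facility against the International Data Corporation criteria quoted above (at least 5,000 servers and 10,000ft² of available space). It is purely illustrative: the function name and the example figures are assumptions, not taken from any real facility’s specification.

```python
# Illustrative check against the IDC hyperscale thresholds cited above:
# at least 5,000 servers and 10,000 sq ft of available space.
def is_hyperscale(servers: int, floor_space_sqft: float) -> bool:
    return servers >= 5_000 and floor_space_sqft >= 10_000

# Hypothetical figures for a large campus and a small server room.
print(is_hyperscale(servers=50_000, floor_space_sqft=470_000))  # True
print(is_hyperscale(servers=800, floor_space_sqft=4_000))       # False
```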

Tiers

Data centres are categorised into tiers according to the level of redundancy built into their infrastructure components.

Although a Tier 4 data centre is more complex than a Tier 1 facility, that does not necessarily mean it is the best fit for a given business’s needs.

Hewlett Packard Enterprise defines the tiers as follows (the downtime figures follow from the uptime percentages by simple arithmetic, as illustrated after the list):

• Tier 1: A Tier 1 data centre has a single path for power and cooling, and few, if any, redundant and back-up components. It has an expected uptime of 99.671% (28.8 hours of downtime annually).

• Tier 2: A Tier 2 data centre has a single path for power and cooling, and some redundant and back-up components. It has an expected uptime of 99.741% (22 hours of downtime annually).

• Tier 3: A Tier 3 data centre has multiple paths for power and cooling, and systems in place to update and maintain it without taking it offline. It has an expected uptime of 99.982% (1.6 hours of downtime annually).

• Tier 4: A Tier 4 data centre is built to be completely fault-tolerant and has redundancy for every component. It has an expected uptime of 99.995% (26.3 minutes of downtime annually).
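As a rough sanity check on those figures, expected annual downtime is simply (100% minus uptime) multiplied by the 8,760 hours in a year. The minimal Python sketch below applies that formula to the uptime percentages listed above; small differences from the quoted downtime figures come down to rounding.

```python
# Convert an uptime percentage into expected annual downtime.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours in a non-leap year

def annual_downtime_hours(uptime_percent: float) -> float:
    return (1 - uptime_percent / 100) * HOURS_PER_YEAR

for tier, uptime in [("Tier 1", 99.671), ("Tier 2", 99.741),
                     ("Tier 3", 99.982), ("Tier 4", 99.995)]:
    hours = annual_downtime_hours(uptime)
    print(f"{tier}: {uptime}% uptime ≈ {hours:.1f} hours "
          f"({hours * 60:.0f} minutes) of downtime per year")
```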

A promising future

During the outbreak of Covid-19, data centres proved their resilience. The telecoms industry points to them as safe harbours for investment, as the pandemic massively accelerated already-existing demand growth.

With large numbers of people having to work from home during the various lockdowns imposed – a situation that is expected to continue for many months, and permanently for some – tools such as video conference calls, to name just one, saw a sudden upsurge in usage.

Daily active users of Zoom rose from ten million in December 2019 to 200 million in March 2020, while daily downloads of the app increased by 3,857% over the same three-month period.
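For readers who want to see how such growth figures are derived, a percentage increase is simply (new − old) ÷ old × 100. The snippet below applies that formula to the user numbers quoted above; note that the 3,857% figure refers to daily downloads, whose underlying values are not given in this article, so they are not recalculated here.

```python
# Percentage increase between two values: (new - old) / old * 100.
def pct_increase(old: float, new: float) -> float:
    return (new - old) / old * 100

# Zoom daily active users quoted above: 10m (Dec 2019) -> 200m (Mar 2020).
print(f"{pct_increase(10_000_000, 200_000_000):.0f}% increase in daily active users")  # 1900%
```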

“The speed at which this occurred is astounding: network traffic has increased since the global pandemic, with carriers reporting surges in their network traffic of more than 25% – AT&T in the US versus its baseline – and 50% – Vodafone in Europe versus its baseline,” says data centres and digital infrastructure investor Digital Colony. “As the pandemic puts high-capacity workloads on the network and data usage requires more bandwidth, additional digital infrastructure is needed to deal with these unprecedented times.”

With the development of new technology and the progress towards a digital economy, data is often seen as the new oil. Like oil, it needs distribution points, so data centre infrastructure will have to keep being built. Protectionist trends, under which governments claim sovereignty over and protection of their citizens’ data, are also expected to produce plenty of opportunities for investors.

The cloud

Cloud-based data centres have been rising in popularity in recent years with large companies shifting processes such as emails, instant messaging and other applications to non-physical storage.

A big phenomenon for the industry is the growth of public clouds such as those developed by Alibaba, Oracle and IBM.

Unlike private clouds, public clouds share computing services among different customers, even though each customer’s data and applications running in the cloud remain hidden from other cloud customers.

The key features and advantages of the public cloud include scalability, cost-effectiveness, utility-style payment methods, reliability, flexibility and location. While this is one of the sector’s new trends, it seems unlikely that data centres will in future solely exist on the cloud as certain company data will need to remain offline for security reasons.

It is likely, however, that data centres will have a smaller physical footprint as more components are digitised. What will not shrink is the importance of the data centre itself: its dull, functional appearance masks an integral component of any company’s future strategy. Choosing the right kind of centre will only become more important in the post-Covid age.