The data centre market consumes massive amounts of energy and water in its developing role as a key economic and research platform for this data-led era. The last year of working from home, home schooling and binge-watching has seen immense growth in data generation and consumption.
The market will continue to experience huge growth, partly driven by the number of connected devices, up from 18.4 billion in 2018 to an expected 29.3 billion by 2030. To support this expansion the network of data centres continues to grow, and with it the necessary access to power, cooling and connectivity that drives this new world order.
Gartner predicts that public cloud services will grow to a $368 billion market by 2022, with all major countries experiencing between 15% and 33% growth. This can only drive colocation expansion, which in turn will develop the regional cores that service a multitude of Edge deployments.
This is illustrated by the 70-plus new-build data centre projects underway in 12 European countries in 2021, totalling an additional 851,000m2 of new space (ResearchAndMarkets – European Data Centre Market Report 2021-2025). The number of data centres needed will continue to rise with the explosion of IoT, autonomous vehicles and the increasingly heavy use of data in AI-led applications and development.
Furthermore, the growth of small edge data centres, each possibly drawing up to 10kW, will require a roll-out of tens of thousands of units across individual countries to monitor, analyse and act on specific applications such as intelligent roads and vehicles, hospitals and many more. As well as this local activity, much of that data will need to be uploaded to data centre hubs for storage or further processing, requiring massive data storage facilities, all of which need power and water.
The Uptime Institute recently stated that average data centre PUE in 2020 was 1.58, a figure that has not improved significantly in the last seven years. Many data centre developers are still wedded to a chilled-air approach to technology spaces, which rolls out the older style of fan-assisted servers. This legacy approach consumes large amounts of water and up to 30% of the data centre’s energy in cooling, while restricting server capability at a time when greatly increased data throughput is expected. A more enlightened approach, whether in data centres or at the Edge, is chassis-level precision immersion liquid cooling technology, which has a significantly lower PUE of 1.03.
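Those PUE figures translate directly into the share of facility energy lost to cooling and other overhead. A minimal sketch, using the figures quoted above (the function name is my own):

```python
def overhead_fraction(pue):
    """Fraction of total facility energy consumed by non-IT
    overhead (cooling, power distribution), given that
    PUE = total facility energy / IT equipment energy."""
    return 1 - 1 / pue

# Industry-average PUE (Uptime Institute, 2020) vs. a
# chassis-level precision immersion deployment.
air_cooled = overhead_fraction(1.58)  # ~0.37, i.e. ~37% overhead
immersion = overhead_fraction(1.03)   # ~0.03, i.e. ~3% overhead

print(f"Air-cooled overhead: {air_cooled:.1%}")
print(f"Immersion overhead:  {immersion:.1%}")
```

In other words, at a PUE of 1.58 roughly a third of every kilowatt entering the facility never reaches a server, while at 1.03 almost all of it does.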
Liquid cooling is 1,000 times more efficient than air cooling and eliminates the requirement for refrigerants. It also removes the need for server fans while dramatically increasing the compute density that can be effectively managed in each server rack. Within the technology space, liquid cooling in HPC environments allows guaranteed operational temperatures within tightly configured server clusters.
The classic data centre design uses cool airflow across the hot components within server racks to maintain the operational temperature of the technology suite. It is often the case that more energy is used to power the mechanical systems cooling the technology hall than to run the servers, ITE and network equipment. At a time when AI (artificial intelligence) applications require high-performance computing (HPC), air cooling, and the excessive water it uses, is no longer the most effective strategy for cooling the technology suite. Pressure from investors, customers, legislators and the public is causing the industry to rethink its data centre cooling strategy.
Template to optimise energy
Liquid cooling of servers is the most energy-efficient way to drive the data centre industry forward. It provides a template to optimise energy use in the technology suite, so more power drives the applications on the servers, rather than the cooling systems. As local 'edge' data centre designs are implemented, these too benefit from liquid cooling technology over air cooling techniques, in what are much more constricted spaces.
Today’s view on sustainability and CO2 reduction in data centres is partly driven by cost considerations, as energy and water become more expensive and less available, as well as the threat of legislation to reduce emissions. In addition, the massive increase in AI and ML (machine learning) applications, which require HPC and GPU-rich servers to process compute-dense workloads, has pushed average rack power usage from 5kW/rack to upwards of 15kW/rack, and in instances such as Iceotope’s Ku:l 2 up to 42kW/rack. These HPC configurations no longer work effectively with air-cooled processes.
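The density figures above imply a simple consolidation calculation: the higher the power a rack can sustain, the fewer racks (and the less floor space) a given HPC load needs. A hedged sketch using the article's densities (the 420kW total load is a hypothetical value of my own):

```python
# Racks needed to host a hypothetical 420kW HPC load at the
# three rack densities quoted above.
it_load_kw = 420

for label, kw_per_rack in [("legacy air-cooled", 5),
                           ("typical HPC", 15),
                           ("immersion (Ku:l 2)", 42)]:
    racks = -(-it_load_kw // kw_per_rack)  # ceiling division
    print(f"{label}: {racks} racks at {kw_per_rack} kW/rack")
```

At 5kW/rack the load spreads across 84 racks; at 42kW/rack it fits in 10, which is the space argument behind retrofitting high-density liquid-cooled racks.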
The HPC market was valued at $39.1 billion in 2019 and is expected by research companies to grow at a compound annual growth rate (CAGR) of 6.5% from 2020 to 2027. Also referred to as supercomputing, it involves increasingly large parallel processing clusters that push data through at high speed and accuracy, shortening the time to results. This capability is making HPC a must-have for many corporations, governments and research facilities, which is contributing to the market’s growth.
As new-build and legacy data centres consider the requirements of accommodating HPC environments, with their concentrations of servers, the heat created must be effectively removed and dealt with. Precision immersion technology can now capture and efficiently reject more than 97 per cent of the heat generated.
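That capture ratio can be turned into a rough figure for the heat a rack makes available for rejection or reuse. A minimal sketch, assuming the >97 per cent figure above (the function name is my own):

```python
def heat_captured_kw(it_load_kw, capture_ratio=0.97):
    """Heat (kW) captured by the coolant from an immersion-cooled
    deployment; capture_ratio from the >97% figure quoted above."""
    return it_load_kw * capture_ratio

# A single 42kW immersion rack yields roughly 40.7kW of
# capturable heat.
print(f"{heat_captured_kw(42):.1f} kW")
```

Because the heat leaves the rack in a liquid rather than exhaust air, it is in a form far easier to reject efficiently or export for reuse.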
Liquid cooling techniques have become more flexible in the devices they can accommodate. To greatly increase energy efficiency in legacy data centres, it is possible and cost-effective to retrofit chassis-level liquid-cooled servers into standard racks in the technology suites. In fact, a shared space can greatly increase efficiency: it is not unusual for technology suite layouts to have void spaces where air cooling is not effective. These areas can be used by liquid-cooled technology, increasing total space utilisation and creating a more efficient suite.
Today, the opportunity, and requirement, for many data centres is to review their energy and water usage against best practice and liquid-cooled server technology, to understand how they can create a roadmap to upscale their compute capabilities for the next growth market. This also delivers the financial, space and emissions benefits that liquid-cooled immersion techniques offer data centres from hyperscale to the edge.