These applications are also, quite literally, heating up the compute environments they sit within. Data centres and enterprise compute centres have been targeting energy efficiency for a number of years, as economic and, more recently, social pressure has required them to reduce the energy wasted in cooling computational processes. Standard approaches to cooling technology spaces have relied on chilled air circulated across entire white spaces, traditionally using mechanically refrigerated forced air to draw heat away from the server and comms racks. This ageing process design has not delivered a significant reduction in the industry’s average pPUE of around 1.6 over the past seven years, although evaporative cooling has gone some way to combating this. Furthermore, recent events, such as home working, home schooling and social isolation, have increased loads across the data and communications infrastructure as it has been relied upon to keep businesses operational during lockdown.

The latest processors and their associated electronics are running 10X hotter than previous chip families, driving rack and cabinet energy requirements up from the industry average of 7kW to well above 70kW. Compute density continues to be transformed by Moore’s Law, and the by-product of power in is, of course, heat out.

Last week, I saw in my local press that we are being advised to take shorter showers to save water as the warmer weather continues. Along with energy, the other major resource that data centres can consume – if using evaporative cooling – is water. I read a while back that, in the US, researchers expect data centres to be responsible this year (2020) for the consumption of 660 billion litres of water (in the generation of grid power and onsite cooling). The cooling part of that equation is around 1.8 litres of water for every kWh the data centre consumes. Reference: https://www.datacenterknowledge.com/archives/2016/07/12/heres-how-much-water-all-us-data-centers-consume
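To put that 1.8 litres per kWh figure into context, here is a rough back-of-the-envelope sketch of what it implies for a single facility. The 1 MW load and round-the-clock operation are assumptions chosen purely for illustration, not measured figures.

```python
# Illustrative only: estimate cooling-related water use from facility energy,
# using the ~1.8 litres per kWh intensity cited above. The 1 MW facility size
# and continuous operation are assumptions made for this example.

WATER_LITRES_PER_KWH = 1.8       # cited cooling water intensity
facility_load_kw = 1_000         # assumed 1 MW average facility draw
hours_per_year = 24 * 365

annual_energy_kwh = facility_load_kw * hours_per_year
annual_water_litres = annual_energy_kwh * WATER_LITRES_PER_KWH

print(f"Annual energy:     {annual_energy_kwh:,.0f} kWh")
print(f"Cooling water use: {annual_water_litres:,.0f} litres "
      f"(~{annual_water_litres / 1e6:.1f} million litres)")
```

Under those assumed numbers, evaporative cooling alone would account for roughly 16 million litres of water a year at a single 1 MW site.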

Technology is predicted to provide online access to over 66% of the global population by 2023. This expansion will only be sustainably achievable when a change in data centre design is accepted and implemented. This change will not only affect the design of large ‘out of town’ data centres, but also the thousands of Edge facilities that will appear in order for IoT and 5G to roll out successfully. A key design change must be the shift from air-cooled to liquid-cooled capabilities within the technology spaces. The innovation explosion needs the Edge and the Edge needs liquid cooling.

IT equipment vendors are repackaging current server and cooling technology as ‘Edge-ready’. Many of these solutions are indistinguishable from the technology found in clean-room environments, with air cooling, fans and a requirement for a continuous water supply. This usually means that the housing must suit the technology, rather than the Edge data centre being able to blend with its environment. The Edge is an opportunity to place technology close to the point of generation and use, where active data is monitored, analysed and actioned locally, whilst management updates and storage data are periodically communicated between the Edge and regional or national enterprise data centres. This segregation will reduce network latency and optimise traffic volumes.

So, how do we effectively and efficiently relocate the processing power of the data centre into a discreet, minimised container that is suitable for harsh environments and close to the point of use? There are tried and tested cooling technologies that can demonstrate considerable savings in energy, along with other benefits to operators, while adding value to customers.

Liquid cooling is not a new concept. Its association with HPC and number-crunching financial and research environments might appear incongruous with colocation data centre and Edge requirements, but recent advances in chassis-level systems place it in the sweet spot of existing infrastructure. Its efficiency is undeniable: with a pPUE as good as 1.03, it delivers industry-leading energy usage effectiveness, an enviable target for any data centre. Our technology works with all kinds of servers. It is scalable, flexible and sealed, so there is no dirt, dust or gaseous ingress, providing a longer MTTF and extended device life cycles.
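To make that pPUE comparison concrete, the sketch below contrasts the cooling overhead implied by the industry-average 1.6 quoted earlier with the 1.03 achievable at chassis level. The 500 kW IT load is an assumption for the example, and pPUE is treated simply as (IT energy + cooling energy) divided by IT energy.

```python
# Illustrative comparison of annual cooling overhead at the two pPUE figures
# quoted in this article (industry average ~1.6 vs chassis-level ~1.03).
# The 500 kW IT load is an assumed value for the sake of the example.

it_load_kw = 500                 # assumed IT load
hours_per_year = 24 * 365
it_energy_kwh = it_load_kw * hours_per_year

for label, ppue in [("air-cooled (pPUE 1.6)", 1.6),
                    ("liquid-cooled (pPUE 1.03)", 1.03)]:
    overhead_kwh = (ppue - 1) * it_energy_kwh
    print(f"{label:28s} cooling overhead ~ {overhead_kwh:,.0f} kWh/year")
```

On those assumptions, the cooling overhead falls from around 2.6 million kWh to roughly 130,000 kWh a year, a reduction of about 95%.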

Eliminating server fans and forced-air cooling dramatically reduces the size and cost of the systems and focuses energy on the processors. Water requirements are radically reduced, as is floor space compared with an existing server configuration, with some estimates as high as 75% less floor area used. These capabilities are not essential only for the Edge: in terms of data centre CapEx savings, chassis-level liquid cooling systems require minimal changes to existing server infrastructure, whilst removing the need for mechanical refrigeration and offering game-changing reductions in energy and space consumption. And, of course, without server fans the data centre space can operate in near silence.

Ongoing savings increase the benefit that liquid cooling provides to data centre and Edge operators, with low maintenance requirements, improved bandwidth capabilities and reductions in latency. Thanks to its cooling characteristics, it also offers access to reusable heat for local residential or business consumption, where that demand exists. Now that’s got to be a hot tech prospect in a very cool way.

For more information on the most effective and cost-efficient data centre and Edge server cooling technology, please get in touch.
