Recent trends in data centre cooling
Currently, keeping computer systems cool accounts for 70% of a data centre's non-IT energy consumption. With the data centre industry growing by 12% annually, saving the energy required for cooling is becoming a crucial issue, and new options are being explored, such as free cooling, thermal energy storage and various forms of liquid cooling.
According to Green Grid, a not-for-profit consortium committed to improving data centre efficiency, there are many potential free-cooling locations around the globe. Green Grid has even produced a free-cooling map, based on guidelines provided by ASHRAE, showing where in the world, and for how many hours a year, outside air could be used in place of air conditioners and other cooling devices. It appears that free cooling is available all year round in over 75% of North America and 97% of Europe.
One of the first IT firms to move towards free cooling was eBay, whose data centre in Phoenix, USA, is free-cooled exclusively by a hot-water circulation system and achieved a Power Usage Effectiveness (PUE) [*] of 1.046 in August 2011, with outdoor temperatures reaching 46°C. Its partial PUE of 1.018 means that at peak efficiency the IT equipment concerned was using more than 98% of all available power, most likely an unprecedented accomplishment. Facebook likewise claims that its data centre in Washington was built specifically to use only free cooling. Google, too, has a fresh-air facility in Belgium that relies exclusively on free cooling; it is also testing nocturnal thermal energy storage elsewhere, in its Taiwan facility for instance, and uses water from the Baltic Sea to cool its servers in Finland. Using air to cool electronics is an easy but not necessarily energy-efficient option.
Several institutions and companies are turning to liquid cooling. The California Institute for Telecommunications and Information Technology (Calit2) at the University of California San Diego (UCSD) has become the inaugural test site for Cool-Flo, a system in which coolant flows through negative-pressure tubes inside three 1U servers on the UCSD campus. Another option, submersion cooling, was first patented by BIOS-IT; liquid submersion also requires less physical space than air cooling. Other solutions involve encapsulating processor cards in liquid, as in Iceotope's liquid cooling system, in which the waste heat generated by the servers is used to passively pump the coolant through the system. France-based geophysical services company CGG Veritas uses submersion racks, which have so far cut its cooling costs by 40%; the company claims that the technology could reduce overall energy consumption by 95%, removing the need for air conditioning, and that the liquid provides 1,200 times more heat retention than air.
[*] Power Usage Effectiveness (PUE) is an indicator designed by Green Grid to compare a facility's total power usage to the amount of power used by its IT equipment, revealing how much is lost in distribution and conversion. The average PUE for data centre facilities is about 1.8, while figures around 1.07-1.2 indicate ultra-efficient technology. However, the PUE indicator has been criticized for having limited relevance, as it covers an excessively large range of applications and workloads, and for inviting apples-to-oranges comparisons.
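To make the arithmetic concrete, here is a minimal sketch in Python of how PUE relates to the share of power that actually reaches the IT equipment. The figures are the eBay values cited above; the function names are illustrative only, not part of any standard tool.

# Minimal sketch of the PUE arithmetic (illustrative names, not a standard tool).

def pue(total_facility_power_kw: float, it_power_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power.
    A PUE of 1.0 would mean every watt entering the facility reaches the IT load."""
    return total_facility_power_kw / it_power_kw

def it_share(pue_value: float) -> float:
    """Fraction of facility power that actually reaches the IT equipment (1 / PUE)."""
    return 1.0 / pue_value

# eBay's Phoenix figures from the text:
print(f"PUE 1.046  -> {it_share(1.046):.1%} of power reaches the IT equipment")
print(f"pPUE 1.018 -> {it_share(1.018):.1%} of power reaches the IT equipment")
# A PUE of 1.8 (the industry average) would mean only ~55.6% reaches the IT load.

Running this reproduces the figures in the text: a partial PUE of 1.018 corresponds to roughly 98.2% of the available power reaching the IT equipment, which is the "more than 98%" claim above.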