Free Air & Hot Racks: New Paradigms in Handling ICT Heat
Handling our gear's heat has always been an issue for installations large and small. ICT equipment has typically taken as much energy again to remove its heat as it took to power it in the first place, and sometimes twice as much (PUE of 2.0+), driving up both energy costs and carbon footprints. Early efforts focused on the two obvious tactics: making both the gear and the air conditioning more efficient. We now see these augmented by innovative new approaches to the problem.
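To make the PUE figure concrete, here is a minimal sketch of the calculation, using hypothetical load numbers chosen purely for illustration (the metric itself is simply total facility energy divided by IT equipment energy):

```python
# PUE (Power Usage Effectiveness) = total facility energy / IT equipment energy.
# All figures below are hypothetical, for illustration only.
it_load_kw = 500.0        # power drawn by servers, storage, and network gear
cooling_kw = 550.0        # power spent removing that heat
other_overhead_kw = 50.0  # lighting, power distribution losses, etc.

total_kw = it_load_kw + cooling_kw + other_overhead_kw
pue = total_kw / it_load_kw
print(f"PUE = {pue:.2f}")  # 2.20 here: cooling alone exceeds the IT load
```

A PUE of 2.0 means that for every watt delivered to the IT gear, another watt goes to cooling and other overhead, which is why the cooling plant is such an attractive target for savings.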
'Free air cooling', 'air side cooling', or 'air side economization' goes beyond efficiently chilling recirculated interior air to using outside air to provide low-energy cooling. This has made locations where the air is below 13°C for most hours of the year particularly attractive.
Running equipment rooms under 25°C has always been the standard of data center cooling, but new technologies and operating practices are pushing that limit. This allows for both less cooling using conventional techniques and the use of free air at higher exterior temperatures. The American Society of Heating, Refrigerating and Air Conditioning Engineers' Technical Committee 9.9 on Mission Critical Facilities, Technology Spaces and Electronic Equipment (ASHRAE TC9.9) has raised its recommended maximum to 27°C, but the industry is already pushing beyond that. SGI/Rackable, for example, claims its Cloudtrack C2 cabinet can support server room temperatures up to 40°C, and Microsoft's new Dublin data center is said to be running at 35°C.* Google, on the other hand, typically runs its data centers at ~27°C.
Here is a survey of tactics, basic and innovative, that data centers are using to reduce their cooling loads.
- You can see examples of various hot/cold aisle containment techniques in this article's slideshow of German data centers.
- The Google/Belgium and Microsoft/Dublin data centers both combine higher operating temperatures and air side cooling to eliminate the massive air conditioning units (chillers) typically associated with large data centers. They are referred to as 'chillerless' data centers. (See more about these and other mega data centers.)
- According to HP, its Wynyard (UK) data center uses the "legendary cold wind blowing off England's North Sea. This glacier-cooled coastal air, often bone-chillingly icy, is being innovatively harnessed into a new technology tool: lowering temperatures of IT equipment and plant rooms for an anticipated annual energy saving of 40 percent compared to conventional data centres." HP lists these features:
- Eight 2.2m diameter fans in each of the four halls in the data centre used to supply air and another eight used to exhaust air
- A mixing chamber in the facility recirculates air to maintain conditions in the 5m-high pressurised plenum below the computer equipment
- Humidification and cooling coils in the data centre to tune the outside air condition and remove contaminants
- Innovative cooling solutions are not just for larger centers. Associated Banc-Corp, a regional bank serving the upper midwestern United States, converted cooling at two data centers from compressors to heat exchangers and expects to save US$115,000 per year in energy costs.
- Voonami (UT-USA) has a new data center using evaporative cooling that is "along with other engineering innovations, …expected to trim energy costs by 80 percent over a typical giant data center." The technology, particularly suited to dry climates, is planned to be the sole cooling mechanism for about eight months of the year.
- Interxion's Stockholm data center combines free air cooling and seawater cooling with waste heat reuse and 100% renewable energy. "[In 2009, we] designed and implemented one of the first seawater cooling systems that not only reduced energy costs on the Stockholm campus by 80 percent, but also lowered our PUE from 1.6 to 1.09." This PUE is better than many mega data centers which can use scale to drive efficiency.
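Interxion's quoted numbers hang together if you look at the overhead each PUE implies. A quick sketch of that arithmetic (nothing here beyond the two PUE figures from the quote):

```python
# Overhead implied by a PUE is everything above 1.0, i.e. energy spent
# per watt of IT load on cooling and other non-IT systems.
pue_before, pue_after = 1.6, 1.09
overhead_before = pue_before - 1.0  # 0.60 W of overhead per W of IT load
overhead_after = pue_after - 1.0    # 0.09 W of overhead per W of IT load

reduction = 1 - overhead_after / overhead_before
print(f"Overhead cut by {reduction:.0%}")  # ~85%, in line with the quoted 80% saving
```

The ~85% reduction in non-IT energy is consistent with the 80% reduction in energy costs Interxion claims, once you allow for the overhead items a cooling retrofit cannot touch.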
- Click here for a Canadian data center design that proposes "seasonal ice cooling"; a Norwegian facility also cools with seawater.
- Click here for companies offering liquid-cooled servers.
Some cautionary notes:
- Innovative cooling technology alone will not lower energy costs if not properly operated. US EPA data suggests many data centers do not properly operate their economizers.
- White paper authors from Dell and APC take issue with the "higher is better" hypothesis: "It is anticipated that increasing data center temperatures will reduce the…energy consumption by the data center cooling infrastructure. However, the dynamic nature of IT Equipment cooling fans may diminish or even negate the cooling system gains. Server fans will typically respond to a demand for increased airflow as inlet temperature to the server reaches or exceeds 25°C (77°F), consequentially increasing server energy consumption…In the three scenarios studied, the lowest energy use occurred anywhere between 24°C (76°F) and 27°C (81°F). The trigger in each case is the IT fan power increasing and exceeding the incremental decreases in the energy required to cool."
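The tradeoff the Dell and APC authors describe can be illustrated with a toy model. The numbers and curve shapes below are invented for illustration only (they are not the study's data); the point is the mechanism: chiller energy falls as the setpoint rises, but past roughly 25°C server fans spin up, and fan power grows steeply with speed:

```python
# Toy model of the setpoint tradeoff (illustrative assumptions, not measured data):
# raising the room setpoint cuts chiller energy, but above ~25°C server fans
# ramp up, and fan power scales roughly with the cube of fan speed.
def total_cooling_power_kw(setpoint_c):
    chiller_kw = 200.0 - 6.0 * (setpoint_c - 20.0)        # chilling gets cheaper as setpoint rises
    fan_speed = 1.0 + max(0.0, setpoint_c - 25.0) * 0.15  # fans hold steady, then ramp above ~25°C
    fan_kw = 40.0 * fan_speed ** 3                        # cube law: small speed bumps cost a lot
    return chiller_kw + fan_kw

best = min(range(20, 33), key=total_cooling_power_kw)
print(f"Lowest total energy at ~{best}°C")  # minimum lands in the 24-27°C band here
```

Even with made-up coefficients, the total-energy curve bottoms out in the same 24–27°C band the study reports, because the cubic fan term eventually overwhelms the linear chiller savings.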
- Facebook's Prineville data center: "The built-in penthouse houses the chiller-less air conditioning system that uses 100% airside economization and evaporative cooling to maintain the operating environment." This led to some initial problems with humidity and condensation, causing servers to shut down. The problems and their solutions are presented in an Open Compute blog. Innovation is rarely without challenges, but the payoff in PUE for Facebook has been substantial.
Data centers are not the only ICT facilities using innovative cooling. A British broadcaster is using free air cooling to lessen the HVAC load from studio lighting and equipment. A California postproduction company is using cold air bypass valves in its edit suites to cool the office space.