Insights

It’s a Hot Time to Keep Your Data Centers Cool

By John Stoepler, September 2018

Facebook just announced plans to build its first Asian data center in Singapore and its 15th overall. The project will reportedly cost $1 billion and will help Facebook accommodate the meteoric rise in demand for artificial intelligence, video streaming and other data services, as well as centralize its IT operations. Set to come online in 2022, the facility will stand 11 stories high and occupy 170,000 square meters (1.8 million square feet).

Facebook-Singapore Data Center

Rendering of the Facebook Singapore data center. In addition to its state-of-the-art cooling system, the data center will use a facade made of perforated, lightweight material enabling better airflow.

When architecting the new data center, Facebook’s aim was to create a “hyper-efficient facility” that limits the use of water, energy and land as much as possible. In addition to running on 100 percent renewable energy, with solar power playing a significant role, the new Singapore center will be the first to use the StatePoint™ Liquid Cooling (SPLC) system. Developed in partnership with cooling specialist Nortek Air Solutions, the SPLC employs an evaporative cooling system featuring a “liquid-to-air energy exchanger.” The exchanger cools water by allowing a portion of it to evaporate through a membrane separation layer; the cold water is then used to cool the air throughout the data center.

SPLC BLDG Schematic

Deployed on the rooftop, the SPLC units produce cold water, which is supplied to fan-coil wall (FCW) units. The FCW units use that cold water to cool the servers.

In a Facebook blog post describing the new technology, Facebook thermal engineer Veerendra Mulay said, “ . . . the new cooling system will allow us to build highly energy- and water-efficient Facebook data centers in places where direct cooling is not feasible. Based on our testing for several different locations, we anticipate the SPLC system can reduce water usage by more than 20 percent for data centers in hot and humid climates and by almost 90 percent in cooler climates, in comparison with previous indirect cooling systems.”

“The system will not only protect our servers and buildings from environmental factors, but it will also eliminate the need for mechanical cooling in a wider range of climate conditions and provide additional flexibility for data center design, requiring less square footage in order to cool effectively,” Mulay continued.

Facebook SPLC Unit

Depending on external temperature and humidity levels, the SPLC unit operates in one of three modes to optimize energy efficiency (a simple mode-selection sketch follows the list):

Adiabatic Mode:

  • Designed for warm temperatures.
  • Engages a heat exchanger that cools outside air before it enters the recovery coil to produce cold water.

Super-Evaporative Mode:

  • Designed for hot and humid environments.
  • Uses a separate pre-cooling coil that cools outside air before it is processed.

Energy-Efficient Mode:

  • Designed for use in cool environments.
  • Circulates air through the membrane exchanger to produce cool water.
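
The sketch below illustrates, in Python, how a controller might choose among these three modes based on ambient conditions. The threshold values and the select_mode function are illustrative assumptions only; Facebook and Nortek have not published the SPLC’s actual control logic.

```python
# Minimal sketch of mode selection for an SPLC-style cooling unit.
# All thresholds below are placeholder assumptions, not published values.
from enum import Enum


class Mode(Enum):
    ENERGY_EFFICIENT = "energy-efficient"    # cool climates
    ADIABATIC = "adiabatic"                  # warm climates
    SUPER_EVAPORATIVE = "super-evaporative"  # hot and humid climates


def select_mode(dry_bulb_c: float, relative_humidity: float) -> Mode:
    """Pick a cooling mode from outside temperature (degrees C) and relative humidity (0-1).

    The cutoffs are chosen only to make the example runnable.
    """
    if dry_bulb_c < 20.0:
        return Mode.ENERGY_EFFICIENT
    if dry_bulb_c >= 30.0 and relative_humidity >= 0.60:
        return Mode.SUPER_EVAPORATIVE
    return Mode.ADIABATIC


if __name__ == "__main__":
    # A Singapore-like afternoon (hot and humid) maps to super-evaporative mode.
    print(select_mode(dry_bulb_c=32.0, relative_humidity=0.75))
```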

This environmentally adaptive technology enables Facebook to construct data centers in harsher climates rather than restricting its building sites to cooler geographic locales, where moderate outside temperatures make it easier to keep servers cool. Because the SPLC system produces cold water rather than cold air, thermal restrictions are largely mitigated, giving Facebook the latitude to build data centers in even the most forbidding conditions. In turn, Facebook will be able to provide lower-latency services to customers in regions with high levels of dust, oppressive humidity or elevated salinity (e.g., rural desert regions in the southwestern US).

SW US Desert

Whether you own a data center outright or use colocation offerings such as an Equinix data center, the stakes for cooling innovation are especially high in rural desert regions.

In late August, Facebook also announced that it intends to power its global operations with 100 percent renewable energy by the end of 2020. The Singapore data center attests to that goal and the company’s effort to battle climate change. According to an article in Fortune about Facebook’s aforementioned renewable energy pledge, Facebook “. . . currently has contracts for more than three gigawatts of solar and wind energy, all of which are on the same grid as its data centers.”

With the current immense investment in data centers and heightened environmental concerns, data center cooling innovation is not only accelerating but also taking on new dimensions. Microsoft, for example, deployed a data center at the European Marine Energy Centre in the Orkney Islands, UK, in June 2018. The experimental data center is Phase 2 of a cutting-edge research project called Project Natick. Housing 12 racks containing 864 standard Microsoft data center servers with FPGA acceleration and 27.6 petabytes of disk, the Natick data center has enough storage for approximately five million movies. The Natick Phase 1 vessel was operated on the seafloor approximately one kilometer off the U.S. Pacific coast in the latter half of 2015.

Project Natick Microsoft

Microsoft’s Project Natick seeks to understand the benefits and difficulties in deploying subsea data centers. Phase 2 of the project features deployment of a full-scale data center module in the North Sea, powered by renewable energy.

Not to be outdone, Google recently disclosed through its blog that it is using an artificial intelligence (AI) algorithm to actively manage the cooling systems of its global data centers. The move is part of an initiative by Google to use AI to improve operational efficiency – an undertaking that began in earnest after its 2014 acquisition of DeepMind, a UK-based AI company. Initially, DeepMind was limited to offering cooling guidance and recommendations to data center operators (i.e., an “AI-powered recommendation system”). The operators would then take those recommendations and choose whether and when to implement them. But after careful study, Google has decided to toss the keys to DeepMind and let it have complete control over cooling system management.

The dynamic graph below from Google plots AI performance over time relative to the historical baseline before DeepMind assumed control over the data center cooling function. Performance is measured in kW/ton (i.e., kilowatts of energy input per ton of cooling delivered), so lower values are better. Over nine months, the DeepMind AI control system improved performance by nearly 30 percent.

Google Dynamic Graph

Graph courtesy of Google.
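
As a back-of-the-envelope check on that figure, the sketch below shows the arithmetic behind a roughly 30 percent reduction in kW/ton. The baseline and current values are invented for illustration only; Google’s post reports the relative improvement, not these absolute numbers.

```python
# Illustrative arithmetic only: the absolute kW/ton values below are assumed,
# not taken from Google's data.
baseline_kw_per_ton = 0.70   # assumed historical baseline (pre-DeepMind control)
current_kw_per_ton = 0.50    # assumed value after nine months of AI control

# Lower kW/ton means less energy per ton of cooling, so the improvement is the
# fractional reduction relative to the baseline.
improvement = (baseline_kw_per_ton - current_kw_per_ton) / baseline_kw_per_ton
print(f"Efficiency improvement: {improvement:.0%}")  # prints "Efficiency improvement: 29%"
```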

The available technologies and applications for achieving and sustaining optimally efficient data center cooling are proliferating. Global technology pacesetters Facebook, Microsoft and Google are each demonstrating, in their own way, the range of approaches that data center operators can take.

What cooling options are you exploring for your data center? Contact us for a free consultation to fully assess your alternatives and make an informed decision to protect both your initial data center investment as well as the environment.