Project Natick: Underwater Data Centers

Microsoft’s crazy idea to cool data centers for free.

Image Credit: MCKIBILLO via IEEE

To take advantage of freely available cool air, Google has opened a data center in Finland and Facebook has opened one in a Swedish town near the Arctic Circle. Now Microsoft wants to follow suit by building a data center underwater. Not only does the company argue that this plan is feasible, it thinks it could also “reduce construction costs, make it easier to power these facilities with renewable energy, and even improve their performance,” according to Sean James, a Microsoft engineer.

Before James’s claims can be validated, though, putting a data center underwater must overcome a number of challenges:

  1. The container must stay dry.
  2. Seawater should cool the servers efficiently.
  3. Containers must stay free of barnacles and other sea life that could inhibit the cooling process.

Why go through all this trouble to build something underwater when you could build it in Finland or Sweden like the other tech giants? James points to several advantages beyond the cooling itself.

First, the Microsoft team argues that underwater pods avoid the hassle of constructing data centers on land. Today, companies must deal with building codes, taxes, electricity supply, and network connectivity in each country before installing a single rack of servers.

The ocean, by contrast, provides a relatively uniform environment for these underwater pods. Pods can be built almost on demand (rather than planned out and negotiated with governments and landowners long in advance) and deployed at any coastal site with little customization. Microsoft’s goal is to have a pod operational within 90 days of purchase.

Image Credit: Spectrum IEEE

Microsoft also points to the remote locations of traditional data centers as a limit on how fast servers can respond to requests. The company notes that almost half the world’s population lives within 100 km of the ocean, concluding that bringing pods closer to where people live will cut latency.
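The latency argument comes down to propagation delay. As a rough sketch (the distances and the two-thirds-of-c fiber speed are illustrative assumptions, not figures from the article), the ideal round-trip time over fiber scales linearly with distance:

```python
# Back-of-envelope propagation delay over optical fiber.
# Assumption: signals in fiber travel at roughly 2/3 the vacuum speed of light.
SPEED_OF_LIGHT_KM_S = 299_792            # km/s in vacuum
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time over fiber, in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_S * 1000

# A user 100 km from a coastal pod vs. 2,000 km from an inland data center
# (hypothetical distances for comparison):
print(round(round_trip_ms(100), 2))      # → 1.0
print(round(round_trip_ms(2000), 2))     # → 20.01
```

Real request times are dominated by routing, queuing, and server processing on top of this floor, but the floor itself is why proximity matters.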

Lastly, Microsoft notes that many data centers cool the surrounding air by evaporating water, which consumes it. Natick instead uses the surrounding seawater to carry heat out of the air, so it does not “consume” water in a non-renewable manner. Even at intermediate depths of 10–200 m, the water remains between 14–18 °C, making it an ideal environment for cooling data centers.
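To see why cold seawater is such an effective heat sink, a quick sizing calculation using the heat-capacity relation Q = ṁ·c·ΔT helps. The pod power and allowed temperature rise below are hypothetical values chosen for illustration; the article does not give Natick’s actual thermal load:

```python
# Rough sizing of the seawater flow needed to carry away server heat,
# from Q = m_dot * c_p * delta_T. All load figures here are illustrative.
SEAWATER_DENSITY = 1025.0   # kg/m^3
SEAWATER_CP = 3990.0        # J/(kg*K), approximate specific heat of seawater

def flow_rate_m3_per_s(heat_kw: float, delta_t_k: float) -> float:
    """Volumetric seawater flow needed to absorb heat_kw while warming
    the coolant stream by delta_t_k kelvin."""
    m_dot = heat_kw * 1000 / (SEAWATER_CP * delta_t_k)   # mass flow, kg/s
    return m_dot / SEAWATER_DENSITY

# Hypothetical 240 kW pod, warming its coolant stream by 3 °C:
print(round(flow_rate_m3_per_s(240, 3.0), 4))  # → 0.0196  (about 20 L/s)
```

A flow on the order of tens of liters per second is modest for a submerged heat exchanger, which is consistent with the article’s point that the warming just downstream of a vessel is tiny.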

In fact, Microsoft’s pilot pod, named Leona Philpot, successfully showed that submerged pods could keep temperatures low with a lower energy overhead than mechanical cooling or free-air approaches.

Image Credit: Spectrum IEEE

There’s still work left to be done. While Natick’s current cooling mechanism is economical and efficient for standard servers, more exotic approaches (dielectric liquids or high-pressure helium gas) might be needed to cool denser, more power-hungry servers. Keeping ocean creatures away is also a challenge, as barnacles could disrupt the outflow of heat from these data centers.

Lastly, while Microsoft downplays the environmental impact, noting that “the water just meters downstream of a Natick vessel would get a few thousandths of a degree warmer at most,” a long-term study may be required to observe the effects of thermal pollution from submerged pods.

If Microsoft figures out a way to deploy these pods safely without harming the environment, this might be the most significant advance in data center cooling since DeepMind’s AI cut the energy used to cool Google’s data centers by 40% last year.

You can read the original IEEE Spectrum article, “Dunking the Data Center,” in the March 2017 issue, online or in print.

Yitaek Hwang
Yitaek is the Director of R&D at Leverege who loves learning about IoT, machine learning, and artificial intelligence. He graduated from Duke University with a dual degree in electrical/computer and biomedical engineering and is a huge Cameron Crazie.