
The Raritan Blog

What Can We Learn From Microsoft’s Underwater Data Center Experiment?

Posted on March 23, 2016 by Michael Bord

Project Natick has made serious waves in the data center industry over the past two months. Natick is the name of Microsoft’s subsea data center research project, which aims to cut the cost of cooling modern infrastructure. It may also yield a service that offers content providers extra capacity in close proximity to billions of end users. So what can we learn from Project Natick that can be applied to today’s data centers?

In an article from February 2016, James Vincent from The Verge wrote:

“Placing data centers underwater not only helps keep their contents cool, but also has logistical advantages. Microsoft points out that half of the world's population lives within 200 kilometers of the ocean, making subsea systems potentially easier to deploy when extra capacity is needed.”

But deploying a data center underwater clearly poses its share of challenges:

“Data centers on land are open for engineers to fix and replace servers whenever needed, but Microsoft wants its undersea systems to go without maintenance for years at a time. "We see this as an opportunity to field long-lived, resilient data centers that operate 'lights out' — nobody on site — with very high reliability for the entire life of the deployment, possibly as long as 10 years," says the company.”

Read more of Vincent’s article in the latest edition of the Hot Aisle.

Although the project is in its infancy, there are several takeaways for data center operators. First, cooling costs are a serious concern. Since most data centers will continue to operate on land for the foreseeable future, it’s an issue that will need to be tackled sooner rather than later.

A thermostat is rarely the best indicator of conditions at the rack, so data centers would be wise to invest in environmental sensors. Sensors take real-time readings that sync to DCIM monitoring software, and the data that’s amassed makes it easier to make smart decisions: Should certain high-density equipment be grouped together? Should CRAC and CRAH setpoints be adjusted?
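As a rough sketch of how that sensor data can drive decisions, the check below flags racks whose inlet temperatures run above the ASHRAE-recommended ceiling. The rack names, readings, and threshold handling here are illustrative, not the behavior of any particular DCIM product:

```python
# Illustrative sketch: flag racks whose inlet temperature readings exceed
# the ASHRAE-recommended upper bound -- the kind of check DCIM monitoring
# software automates against live sensor feeds. Rack names and readings
# are made up for the example.

ASHRAE_RECOMMENDED_MAX_C = 27.0  # ASHRAE class A1 recommended inlet maximum

def hot_racks(inlet_temps_c):
    """Return rack IDs whose latest inlet reading is above the threshold."""
    return sorted(
        rack for rack, temp in inlet_temps_c.items()
        if temp > ASHRAE_RECOMMENDED_MAX_C
    )

readings = {"rack-a1": 24.5, "rack-a2": 28.1, "rack-b1": 26.9, "rack-b2": 29.3}
# Racks that may need airflow changes, regrouping, or a CRAC adjustment:
print(hot_racks(readings))  # → ['rack-a2', 'rack-b2']
```

In practice a DCIM tool would trend these readings over time rather than act on a single sample, but the decision logic is the same.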

The second takeaway is that data centers are becoming more widely distributed geographically.  This isn’t a shock given the rise of cloud computing, and edge and colo data centers over the last few years.  But, it does raise the important question of how best to administer equipment remotely.

There are many options to consider, including RDP, VNC, and embedded service processors (ESPs). Although software tools and ESPs offer many advantages, they can’t match the reliability of a remote access solution like KVM. For anyone planning a remote lights-out deployment (or, someday, an underwater one), that shouldn’t be ignored.
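The reliability argument comes down to dependencies: in-band tools like RDP and VNC fail along with the host OS, while hardware KVM stays reachable. A toy sketch of that fallback reasoning, with simulated reachability probes (the path names and probe results are hypothetical, not a real API):

```python
# Toy sketch of a remote-access fallback chain: prefer in-band software
# paths, but fall back to out-of-band hardware (KVM) when the host OS is
# unresponsive. Probe results are simulated booleans standing in for real
# reachability checks; all names are illustrative.

def choose_access_path(probes):
    """Return the first reachable access path in preference order.

    `probes` maps a path name ("rdp", "vnc", "esp", "kvm") to True/False.
    """
    # RDP and VNC depend on a healthy OS; KVM works at the hardware level.
    for path in ("rdp", "vnc", "esp", "kvm"):
        if probes.get(path):
            return path
    return None  # nothing reachable -- time for a site visit

# When the OS has hung, only the hardware KVM path still answers:
print(choose_access_path({"rdp": False, "vnc": False, "esp": False, "kvm": True}))
# → kvm
```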

Will Project Natick change the data center industry? Maybe.  But for now, it’s in most data centers’ best interest to meet the challenges that exist today. 

 


Learn how companies like F5 monitor cooling in their data center environment.

