February 21, 2019
Everyone’s talking about machine learning these days. But what exactly is it, and will we be seeing more of it in the future? We recently sat down with Raritan’s Technical Product Manager, Paul Mott, to discuss this growing trend and its potential effects on the data center industry.
Question 1: What are your thoughts on machine learning and the role it will play in the data center space?
Machine learning has already had a profound impact on larger enterprises with the scale and resources to both develop and deploy machine learning across their data center infrastructure. The most widely publicized example is Google and its work with the DeepMind platform: by letting the data center run on autopilot and make small tweaks that weren’t obvious to human operators but made a huge difference overall, Google was able to reduce the energy used for cooling by up to 40%.
These innovations in data center management have been around for some time, and it’s not only large-scale enterprises that can benefit. Numerous third-party platforms and services can analyze power consumption, cooling, and other environmental data over long periods to gather insights and recommend changes. Some can even automatically adjust cooling to match IT workloads and use energy more efficiently.
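To make that idea concrete, here is a minimal, purely illustrative sketch of matching cooling to IT load. The function name, thresholds, and units are assumptions for illustration, not any specific vendor’s platform or API:

```python
# Hypothetical sketch: nudge a supply-air setpoint based on measured rack power.
# All names, thresholds, and step sizes here are illustrative assumptions.

def recommend_setpoint(rack_power_kw, current_setpoint_c,
                       low_kw=5.0, high_kw=12.0, step_c=0.5):
    """Raise the setpoint when racks are lightly loaded (save cooling energy);
    lower it when load is high (protect equipment); otherwise leave it alone."""
    if rack_power_kw < low_kw:
        return current_setpoint_c + step_c
    if rack_power_kw > high_kw:
        return current_setpoint_c - step_c
    return current_setpoint_c

print(recommend_setpoint(3.2, 22.0))   # light load: setpoint rises to 22.5
print(recommend_setpoint(14.8, 22.0))  # heavy load: setpoint drops to 21.5
```

Real platforms use far richer models than a two-threshold rule, but the feedback loop — measure IT load, adjust cooling, repeat — is the same.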
Eventually, the aperture will expand beyond energy and cooling efficiency, and we will start to see advances in hardware failure prediction, hardware lifecycle management, security, asset management, and other fundamental data center tasks that haven’t been optimized yet. We are just beginning to scratch the surface of how machine learning can be used for industrial applications like the data center, and it’s really exciting to see what comes next.
Question 2: How is Raritan involved in machine learning?
Raritan lays the foundation for machine learning in the data center by giving operators the ability to monitor and collect data at a very granular level across all our power products. This matters because machine learning algorithms need a tremendous amount of training data to feed their neural networks. It really starts with the IT workload. Each workload consumes a specific amount of power and generates a specific amount of heat depending on the hardware it runs on. This can’t easily be discovered by monitoring at the panel/RPP alone, because workloads shift around. Instead, the data needs to be captured at the most fundamental measurement point, be it the server cabinet or even the individual server or device. Only then can workload power be tracked precisely, leading to better training data and better overall optimization by the algorithms driving these machine learning systems.
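The attribution step described above can be sketched in a few lines. Everything here is a hypothetical illustration — the outlet names, readings, and workload map are invented, and no Raritan API is being modeled:

```python
# Hypothetical sketch of outlet-level power attribution.
# Outlet IDs, wattages, and the workload map are illustrative assumptions.

outlet_readings_w = {"pdu1-outlet4": 182.5, "pdu1-outlet5": 176.9,
                     "pdu2-outlet4": 181.1, "pdu2-outlet7": 64.3}

# Which workload is currently running on the server fed by each outlet.
workload_map = {"pdu1-outlet4": "search-index", "pdu1-outlet5": "search-index",
                "pdu2-outlet4": "search-index", "pdu2-outlet7": "batch-etl"}

def power_by_workload(readings, mapping):
    """Sum per-outlet watts into per-workload totals -- the kind of
    labeled, granular sample a training pipeline could ingest."""
    totals = {}
    for outlet, watts in readings.items():
        workload = mapping.get(outlet, "unknown")
        totals[workload] = totals.get(workload, 0.0) + watts
    return totals

print(power_by_workload(outlet_readings_w, workload_map))
```

Panel-level metering would only ever see the combined total; it is the per-outlet readings that let shifting workloads be tracked individually.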
Raritan power products not only provide this key functionality; they excel at generating and collecting data. With patented metering technology that is the most accurate in the industry, Raritan power products generate large volumes of highly accurate data. Each outlet or inlet on a PDU can generate up to six different data points, all core power variables that data center operators care about. The focus is not only on power but on environmental conditions as well: every Raritan power product can accommodate up to 12 sensor packages, so air flow, temperature, and humidity can also be monitored around IT equipment. That environmental data is equally important, which is why Raritan has invested in designing and developing these sensors alongside our PDU technology.
The Bottom Line
No one really knows what the future holds when it comes to the data center industry and machine learning. All we can do is be part of the conversation and continue to evolve.
To learn more about Raritan’s product offering, please visit our website.