Posted on February 25, 2019 by Jessica Ciesla
Since their inception, data centers seem to have been in a near-constant state of evolution. As we approach the end of another decade, the next evolution cycle is nearing, and with it, traditional data centers must grapple with emerging technologies, environmental concerns, and ever-increasing costs. In fact, a recent Gartner study predicted that traditional data centers will turn to Artificial Intelligence (AI) and machine learning to combat these looming challenges, and that roughly 30 percent of data centers that fail to implement machine learning solutions will cease to be operationally and economically viable by the year 2020.
Posted on February 21, 2019 by Jessica Ciesla
Everyone’s talking about machine learning these days. But what exactly is it? And will we be seeing more of it in the future? Well, we recently sat down with Raritan’s Technical Product Manager, Paul Mott, to discuss this growing trend and the potential effects it could have on the data center industry.
Question 1: What are your thoughts on machine learning and the role it will play in the data center space?
Machine learning has already had a profound impact on some of the larger enterprises that have the scale and resources to not only develop but deploy machine learning across their data center infrastructure. The most widely publicized example is Google and what they have done with their DeepMind platform. They were able to reduce their energy consumption by up to 40% by letting the data center run on autopilot and make small tweaks that weren't obvious to human operators but made a huge difference overall.
Posted on February 12, 2019 by Jessica Ciesla
Monitoring a data center shouldn't require continuous onsite visits or numerous web interfaces that slow down response times and complicate data analysis. In fact, effectively monitoring a data center is a task that can and should be optimized by leveraging best-in-class environmental sensors and DCIM software solutions. Fortunately for data centers across the globe, Raritan has developed award-winning SmartSensors that provide the insights that data managers need to understand real-time environmental data and analyze performance trends.
What Is A SmartSensor?
At its core, SmartSensor is Raritan's environmental monitoring platform. It has been strategically engineered for easy deployment, accurate data results, and heightened levels of insight into a data center's operational levels. Available as plug-and-play options for EMX rack controllers, PX Intelligent PDUs, PX inline meters, and branch circuit monitors, the SmartSensors alleviate the need for a separate controller. In fact, the information gathered is instantaneously sent to a DCIM software solution, so that data managers can gain vital insights into real-time environmental data via a single web interface.
Through the SmartSensors, data center managers can react more effectively to any environmental hazards that are threatening the life or performance of mission-critical IT assets. The ability to react more quickly and with greater knowledge of the potential threat is one of the reasons that SmartSensor has achieved best-in-class status.
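To illustrate the kind of reaction the paragraph above describes, here is a minimal sketch of a threshold-based alerting layer such as a DCIM tool might run over incoming sensor readings. The sensor names, threshold values, and alert levels are hypothetical illustrations, not Raritan defaults or APIs.

```python
# Hypothetical sketch: classify rack-inlet temperature readings
# against warning/critical thresholds, as a DCIM alerting layer might.
# All names and thresholds here are illustrative assumptions.

def classify_reading(celsius, warn=27.0, crit=32.0):
    """Return an alert level for a single temperature reading."""
    if celsius >= crit:
        return "critical"
    if celsius >= warn:
        return "warning"
    return "ok"

# Example readings from (hypothetical) sensors reporting to one interface.
readings = {"rack-a1-inlet": 24.5, "rack-b3-inlet": 29.1, "rack-c2-inlet": 33.0}
alerts = {name: classify_reading(t) for name, t in readings.items()}
```

In this sketch, a single pass over all readings surfaces which racks need attention, which is the practical payoff of funneling every sensor into one monitoring interface rather than checking devices one by one.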
Posted on February 7, 2019 by Jessica Ciesla
It's no secret that machine learning has infiltrated everything from the technology sector to the consumer market. From smartphone assistants to Google's Artificial Intelligence (AI) advancements, machine learning continues to help tackle the most challenging problems in the world. When it comes to data centers, the use of machine learning has led to incredible advances in efficiency and reduced energy consumption.
Google's AI Leads To A Breakthrough In Energy Consumption
Google's DeepMind AI program has been used in their data centers to reduce the amount of energy used on a daily basis. The implications of this type of machine learning are potentially industry-changing. Take, for example, the amount of energy that is used to cool a data center. From cooling the servers to maintaining optimal temperatures in a rapidly warming world, data centers require an astronomical level of energy to cool. Fortunately, AI programs like DeepMind are designed to not only reduce the amount of energy needed to cool a data center, but to also provide actionable insights into a global issue: climate change.
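The core idea behind this kind of optimization can be sketched in miniature: learn a model of cooling energy from historical operating data, then search the model for a setpoint no operator has tried. The data, the quadratic model, and the "chilled-water setpoint" framing below are synthetic illustrations of the concept, not Google's actual system.

```python
# Toy sketch of ML-driven cooling optimization: fit a model to noisy
# historical (setpoint, energy) data, then search it for a lower-energy
# operating point. Everything here is synthetic and illustrative.
import random

random.seed(0)

# Synthetic history: cooling energy (kW) at chilled-water setpoints (C).
# The true optimum sits near 19 C, a value operators never tried.
setpoints = [14, 15, 16, 17, 18, 20, 21, 22, 23, 24]
history = [(s, 0.8 * (s - 19.0) ** 2 + 50.0 + random.gauss(0, 0.3))
           for s in setpoints]

# Least-squares fit of energy ~ w0 + w1*s + w2*s^2 via normal equations.
X = [[1.0, s, s * s] for s, _ in history]
y = [e for _, e in history]
A = [[sum(row[i] * row[j] for row in X) for j in range(3)] for i in range(3)]
b = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(3)]

def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    m = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [a - f * c for a, c in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

w0, w1, w2 = solve3(A, b)

# Search the learned curve for the setpoint with lowest predicted energy.
best_energy, best_setpoint = min(
    (w0 + w1 * s + w2 * s * s, s) for s in [t / 10 for t in range(140, 241)])
```

The learned curve recovers a minimum near 19 C even though no historical sample used that setpoint: the small, non-obvious tweak a human operator would likely miss, which is exactly the kind of gain the DeepMind work reported at far greater scale.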
Posted on January 30, 2019 by Jessica Ciesla
Content originally sourced from IDC.
As discussed in Part 1 of this series, traditional data centers must prepare for future changes and challenges within the industry. These changes include the impacts of emerging technologies, higher costs, and environmental concerns. As a direct response to these changes, data centers will need to respond to three influential factors as they seek to increase operational efficiencies in the years to come.
Influential Factor Recap
In the first part of this series, we discussed the three key factors that will influence the future of data centers in the coming years. These three factors include: