Posted on August 10, 2010 by James Cerwinski
If colder is not better and we agree you can save energy by increasing the temperature — why are data center managers still over-cooling? The reason is that many data center managers don’t have the information to safely increase the temperature. The following Gartner quote adds an important new element — the need to monitor for hot spots.
"Gradually raise the temperature at the server inlet point to run up to 24 degrees Celsius (75 degrees Fahrenheit), but use sensors to monitor for hot spots." — Gartner, 29 July 2009
ASHRAE recommends that you measure and record temperature and humidity at the geometric center of the air intake of the top, middle, and bottom pieces of racked equipment, at 50 mm (2 in.) from the front of the equipment. For example, if there are 20 servers in a rack, measure the temperature at the center of the first, tenth or eleventh, and twentieth server as shown in the following picture.
Source: ASHRAE Thermal Guidelines for Data Processing Environments
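For racks that are not exactly 20 servers tall, the same top/middle/bottom rule is easy to apply in a monitoring script. Here is a minimal sketch (the function name and 1-based numbering are my own assumptions, not part of the ASHRAE text):

```python
def sample_positions(n_servers: int) -> list[int]:
    """Return 1-based positions of the top, middle, and bottom servers
    to monitor, following ASHRAE's top/middle/bottom guidance."""
    return [1, (n_servers + 1) // 2, n_servers]

# For a 20-server rack this picks the first, tenth, and twentieth server,
# matching the example above.
print(sample_positions(20))  # [1, 10, 20]
```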
Are you monitoring your temperature and humidity according to ASHRAE’s recommendation?
Posted on August 4, 2010 by Website Administrator
Ok, it’s July 20th, 1969 and America had finally realized JFK’s dream and commitment that we would land a man on the moon and return him safely within the decade of the 60s. Everybody remembers Neil Armstrong’s first few steps, and his famous first few words. (Although there is still some controversy as to the actual words he used, even when listening to the audio tape). Neil was pioneering the manned lunar exploration era. He became synonymous with the space program and his role became one of leadership and understanding.
Eighteen minutes later, Buzz Aldrin walked down the same ladder, and walked on the same landscape, ultimately uttering the words, “Beautiful, beautiful. Magnificent desolation.” Buzz was walking the exact same steps, but was doing so in the context of Neil’s 18-minute old path. Who remembers this detail about Buzz (not just his fun name, but the fact that he was the second person to walk on the moon)?
Very few people do. In fact, in every new adventure mankind has embarked upon, it is the pioneers that are remembered. The first one to do something. The leaders who take ownership and set direction. Ultimately these pioneers are the leaders who set the tone in which all others must operate.
Posted on August 4, 2010 by James Cerwinski
In my last post, I explained that "it is widely accepted that you can save energy by avoiding over cooling a data center." That seems like a fair statement to accept, but I have had one person question it, explaining that the fans on the servers might have to run more often at higher temperatures.
Let us look at what ASHRAE and Gartner have to say on this.
ASHRAE takes the following position in its book titled "Best Practices for Datacom Facility Energy Efficiency":
“Environmental conditions have a substantial impact on energy efficiency and total cost of ownership in a datacom facility.”
“Allowing for increased temperature and humidity dead bands will eliminate “fighting” between adjacent supply air units, which is a significant source of inefficiency in some existing facilities.”
David J. Cappuccio, Gartner managing vice president and chief of research for the Infrastructure teams, has taken the following position: "Data center managers can save up to 4 percent in energy costs for every degree of upward change in the baseline temperature, known as a set point. The higher set point means less frequent use of air conditioning, which saves the energy used to run cooling systems."
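To get a feel for what that rule of thumb means in dollars, here is a rough back-of-the-envelope sketch. The baseline cost is an illustrative assumption, and I am compounding the "up to 4 percent" figure at each degree, which Gartner's quote does not spell out:

```python
# Illustrative only: baseline cooling cost is a made-up figure, and the
# 4%-per-degree savings (per Gartner's rule of thumb) is compounded here.
baseline_annual_cooling_cost = 100_000.0  # dollars, hypothetical
savings_per_degree = 0.04                 # up to 4% per degree of set point

def cost_after_raise(degrees: int) -> float:
    """Estimated cooling cost after raising the set point by `degrees`,
    compounding the per-degree savings at each step."""
    return baseline_annual_cooling_cost * (1 - savings_per_degree) ** degrees

for d in (1, 3, 5):
    print(f"+{d} degrees: ${cost_after_raise(d):,.0f}")
```

On these assumptions, a 5-degree raise trims the hypothetical $100,000 cooling bill to roughly $81,500 — real savings depend entirely on the facility.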
Tell me about your experiences.
Posted on August 2, 2010 by James Cerwinski
Who is ASHRAE, and how can they help data center managers save energy?
The American Society of Heating, Refrigerating and Air-Conditioning Engineers ("ASHRAE"), founded in 1894, is an international organization with more than 51,000 members. ASHRAE fulfills its mission of advancing heating, ventilation, air conditioning and refrigeration to serve humanity and promote a sustainable world through research, standards writing, publications and continuing education. One such publication is "Best Practices for Datacom Facility Energy Efficiency," in which they define the recommended temperature and humidity levels as measured at the inlet of datacom equipment. A valuable point to note is that the 2009 recommended upper limit temperature is 80.6 degrees F, which is an increase of 3.6 degrees F over the 2004 recommendation. It is also important to note that the recommended range is defined more precisely on a psychrometric chart, in which each point is defined by the dry-bulb temperature and the relative humidity.
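As a simple first check against that recommendation, you can compare a dry-bulb inlet reading to the recommended temperature band (64.4–80.6 degrees F, i.e., 18–27 degrees C). To be clear, this is a simplified sketch of my own: the full ASHRAE recommendation is a region on a psychrometric chart that also bounds humidity and dew point, which this temperature-only check ignores:

```python
# Temperature-only check against the 2009 ASHRAE recommended range.
# The full recommendation also constrains humidity/dew point on a
# psychrometric chart; this sketch covers dry-bulb temperature alone.
RECOMMENDED_LOW_F = 64.4   # 18 C
RECOMMENDED_HIGH_F = 80.6  # 27 C

def inlet_temp_ok(temp_f: float) -> bool:
    """True if a dry-bulb inlet temperature (deg F) is within the
    recommended band."""
    return RECOMMENDED_LOW_F <= temp_f <= RECOMMENDED_HIGH_F

print(inlet_temp_ok(75.0))  # a raised set point, still within the band
print(inlet_temp_ok(82.0))  # above the recommended upper limit
```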
It is widely accepted that you can save energy by avoiding over cooling a data center. More on this in my next post.
Posted on July 28, 2010 by Allen Yang
At Raritan’s data centers, we integrate network monitoring and management with console management and power management into a single global IT dashboard. From the IT dashboard, IT members can see the current hot issues and track their progress, or click into the network monitoring system to see the details of the topology map and the health of the network and servers. What’s cool is that, with Dominion PX, the network monitor also displays environmental sensor information, such as temperature, on the same screen, making it really convenient for IT members. The only glitch was that we initially forgot to make it clear that the temperature was reported in units of 1/10 degree Celsius; I almost had a heart attack when I saw the temperature numbers for the first time.
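The fix for that glitch is a one-line conversion before the value ever hits the dashboard. A minimal sketch (the raw value and function names are my own, for illustration):

```python
def raw_to_celsius(raw: int) -> float:
    """Convert a raw sensor reading in tenths of a degree Celsius
    to degrees Celsius."""
    return raw / 10.0

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9.0 / 5.0 + 32.0

# A raw reading of 245 looks alarming until you know the units:
# it is really 24.5 degrees C (76.1 degrees F).
raw = 245
print(celsius_to_fahrenheit(raw_to_celsius(raw)))
```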
The dashboard summarizes power consumption information for each data center; for details, IT administrators can then click through to access Power IQ. This is convenient enough for now, but we are thinking of correlating server utilization with data center power consumption in the future. We are also waiting for the next release of Power IQ to calculate data center PUE numbers, which we currently calculate manually.
Having integrated network management with power management, we are now integrating Raritan CC-SG and KVM into the same IT dashboard to provide the same one-click convenience for securely accessing servers and VMs through Raritan CC-SG/KVM devices. We can do all this because of Raritan’s support for open standards and popular authentication and management services, including SNMP, IPMI and Active Directory, in its products. It feels really good to see that Raritan products integrate easily with popular network management tools.