Rising PUEs highlight benefits of temperature monitoring equipment
Friday, Apr 26th 2013

Although major data center facilities can use humidity and temperature monitoring to lower annual energy bills, new research shows that the average power usage effectiveness (PUE) of data centers has risen over the past few years.

In the data center world, PUE is the prime indicator of how efficient a facility is: the number is the ratio of the total power entering the facility to the electricity actually consumed by the hardware inside. Although PUE measurements are not always exact, the ideal ratio is 1, which would indicate that the data center is using all incoming electricity to power its servers, TechTarget reported.
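As a rough illustration of that ratio, the short Python sketch below computes PUE from two power readings; the kilowatt figures are invented for the example and are not drawn from any survey.

# A minimal sketch of the PUE ratio described above: total power entering the
# facility divided by the power consumed by the IT hardware. The kilowatt
# figures are hypothetical, chosen only to illustrate the calculation.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT equipment power."""
    return total_facility_kw / it_equipment_kw

print(pue(2900.0, 1000.0))  # 2.9 -- most of the incoming power never reaches the servers
print(pue(1000.0, 1000.0))  # 1.0 -- the ideal: every watt goes to the hardware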

However, a new survey of 300 IT staff members at large U.S. corporations, conducted by data center solutions provider Digital Realty, found that their data centers are far from the ideal ratio. IDG News reported that the average PUE was 2.9. In comparison, a 2011 survey of 500 data centers found average PUEs of around 1.8, and the U.S. Environmental Protection Agency pegged the average PUE at 1.91 in 2009, Data Center Knowledge reported.

"While a PUE of 2.9 seems terribly inefficient, we view it as more being closer to the norm than the extremely low (close to 1) figures reported in the media," Digital Realty CTO Jim Smith to IDG News. "In our view, those [lower] figures represent what a very small number of organizations can achieve based on a unique operating model."

How to improve data center efficiency
One of the reasons why major corporations like Google can have a data center PUE close to 1 is because the facilities are relatively new and built with energy efficiency in mind. In contrast, most data centers are still in older buildings and were not created with PUE optimization in mind, according to Digital Realty.

For one, most operations are designed to minimize downtime, not electricity usage. As a result, many data centers have a great deal of redundancy built into them. Extra storage hardware or other equipment often draws electricity not to accomplish mission-critical tasks, but solely to ensure business continuity by reducing potential downtime and latency as much as possible, according to IDG.

In addition, the survey found cooling to still be a primary energy hog in data center environments. In part because many companies house older equipment in buildings not designed for computer rooms, the amount of electricity required to keep facilities at ideal conditions remains high.

However, part of the problem may be that data center operators are letting outdated ideas about eliminating downtime negatively affect their temperature management strategies. According to Energy Star, server inlet temperatures should ideally fall between 65 and 80 degrees Fahrenheit. Nevertheless, some facilities managers keep the computer room at 55 F to be especially sure that hardware does not break down from excess heat.
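As a simple sketch of how that guidance could be applied, the hypothetical Python check below flags inlet readings that fall outside the Energy Star band; the sensor values and function name are assumptions made for illustration.

# Hypothetical check of server inlet readings against the Energy Star
# recommended band of 65 to 80 degrees Fahrenheit.
INLET_MIN_F = 65.0
INLET_MAX_F = 80.0

def out_of_band(readings_f):
    """Return the inlet temperatures (degrees F) outside the recommended range."""
    return [t for t in readings_f if t < INLET_MIN_F or t > INLET_MAX_F]

# A room held near 55 F is flagged on every reading, even though the hardware
# is in no danger of overheating; readings inside the band pass cleanly.
print(out_of_band([55.2, 54.8, 56.1]))  # all three readings are below the band
print(out_of_band([72.0, 75.5, 78.9]))  # [] -- within the Energy Star range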

The problem with this arrangement is that it requires companies to spend more money than necessary on cooling. Energy Star reported that for every 1 degree Fahrenheit a data center raises its setpoint, organizations can expect to reduce their annual energy bills by up to 5 percent. To make sure that the computer room does not become too warm, facilities managers should leverage advanced temperature monitoring equipment that can instantly alert them if and when conditions deviate from the ideal.
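A minimal sketch of that savings arithmetic appears below, assuming the up-to-5-percent figure applies per degree the setpoint is raised and that successive degrees compound; the baseline bill and setpoints are invented for illustration.

# Hypothetical estimate of the annual energy bill after raising the cooling
# setpoint, compounding the up-to-5-percent-per-degree savings cited above.
# The baseline bill and setpoints are made up; compounding is one modeling choice.
SAVINGS_PER_DEGREE_F = 0.05  # upper bound of the per-degree savings cited by Energy Star

def estimated_annual_bill(current_bill, current_setpoint_f, new_setpoint_f):
    """Return the projected bill after raising the setpoint by whole degrees F."""
    degrees_raised = max(0.0, new_setpoint_f - current_setpoint_f)
    return current_bill * (1.0 - SAVINGS_PER_DEGREE_F) ** degrees_raised

# Raising a 55 F room to 68 F (still inside the 65-80 F band) on a
# $500,000 annual bill projects to roughly half the original cost
# under this upper-bound assumption.
print(round(estimated_annual_bill(500_000.0, 55.0, 68.0)))  # 256671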