Benefits of boosting energy efficiency in the data center
Tuesday, Sep 1st 2015
With all of the "going green" initiatives that have gained so much popularity recently, it is interesting that more people are not pushing for greater energy efficiency in America's data centers. U.S. data centers account for about 2 percent of all electricity used in America, consuming more than 100 billion kilowatt-hours of electricity.
Although these numbers seem high, it isn't as though data centers are actively trying to have high energy needs. That's just the nature of the beast. GreenBiz reported that a single rack of blade servers can generate as much heat in one hour of uptime as four Weber gas grills. Servers require a great deal of electricity to run, and a hefty amount more to keep the server room at an optimal temperature.
So with all of this in mind, it is easy to see the ecological impact of keeping connected in the modern world. And while thinking of the environment and limited resources is certainly a good thing, there is more to being energy efficient than saving the world.
Energy lost is money lost
Data centers, on average, spend a huge amount of money each year on the energy they consume. While much of this spending is necessary, a good portion goes to what is referred to as wasted energy. The Department of Energy and others measure this waste through power usage effectiveness. PUE is calculated by dividing the total energy consumed by the data center by the energy consumed by its IT equipment alone, which indicates how much energy goes to overhead rather than to computing. The current average PUE for most data centers is about 2.0.
Per Brashers, founder of the big data infrastructure consultancy Yttibrium, stated that "a 1 megawatt facility with a PUE of 1.90 spends more than $1 million on waste energy, whereas a facility with a PUE of 1.07 spends $148,000 on waste energy." This example makes it clear that energy efficiency is cost effective on top of being good for the environment.
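To make the arithmetic behind such comparisons concrete, here is a minimal sketch of how annual waste-energy cost follows from PUE. The electricity rate ($0.10/kWh) and the assumption that "1 megawatt" refers to the IT load are illustrative choices, not figures from the article, so the resulting dollar amounts will not exactly match the quote above.

```python
def pue(total_kwh, it_kwh):
    """Power usage effectiveness: total facility energy / IT energy."""
    return total_kwh / it_kwh

def annual_waste_cost(it_load_kw, pue_value, rate_per_kwh):
    """Cost of non-IT (overhead) energy over one year.

    Waste energy = (PUE - 1) * IT energy, since everything drawn
    beyond the IT load (cooling, power conversion, lighting) is overhead.
    """
    hours_per_year = 8760
    it_kwh = it_load_kw * hours_per_year
    waste_kwh = (pue_value - 1.0) * it_kwh
    return waste_kwh * rate_per_kwh

# Illustrative comparison: assumed 1,000 kW IT load at $0.10/kWh.
print(round(annual_waste_cost(1000, 1.90, 0.10)))  # high-overhead facility
print(round(annual_waste_cost(1000, 1.07, 0.10)))  # efficient facility
```

Even under these rough assumptions, the gap between the two facilities is several hundred thousand dollars per year, which is the point of the quoted example.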
So how can data centers cut down on costs while improving energy efficiency?
Well, a good place to start is the cooling systems that keep servers from overheating. As the Weber grill comparison above shows, servers generate serious heat. Because these machines are so sensitive to environmental changes, temperature monitoring and proper cooling are a must for any data center.
That being said, many data centers are raising the server room temperature to cut costs. Data Center Knowledge reported that some facilities are pushing their server room temperatures up to 80 degrees Fahrenheit. The industry standard used to be keeping the server room as close to 72 degrees as possible, and while that remains the recommended optimum, certain data centers are pushing the limits to reduce cooling costs.
Data centers that raise their server room temperature to save on cooling run the risk of overheating their equipment. Although even a few degrees above normal operating temperature can yield real savings, those savings are dwarfed by the cost of repairing or completely replacing an overheated server. It is therefore extremely important to invest in a state-of-the-art temperature sensor. The Watchdog 1000 has a built-in high temperature alarm that triggers a text message to be sent out if there is ever a server overheating situation that must be dealt with.
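The alarm behavior described above amounts to a simple threshold check: read the sensor, and if the reading exceeds a setpoint, dispatch an alert. The sketch below illustrates that logic only; the names `read_temperature_f` and `send_alert` are hypothetical placeholders, not the Watchdog 1000's actual interface, and the 80-degree setpoint is borrowed from the article's discussion for illustration.

```python
HIGH_TEMP_THRESHOLD_F = 80.0  # illustrative alarm setpoint, in Fahrenheit

def check_temperature(read_temperature_f, send_alert):
    """Poll a sensor once and alert if the reading exceeds the threshold.

    read_temperature_f: callable returning the current reading in Fahrenheit
    send_alert: callable that dispatches a notification (e.g., a text message)
    Returns the reading so callers can log it.
    """
    reading = read_temperature_f()
    if reading > HIGH_TEMP_THRESHOLD_F:
        send_alert(f"Server room at {reading:.1f}F exceeds "
                   f"{HIGH_TEMP_THRESHOLD_F:.0f}F threshold")
    return reading

# Example with a stubbed sensor and a list standing in for the notifier:
alerts = []
check_temperature(lambda: 84.2, alerts.append)  # fires one alert
check_temperature(lambda: 72.0, alerts.append)  # within limits, no alert
```

In a real deployment this check would run on a polling loop or be driven by hardware interrupts, but the decision itself is this one comparison.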