Even as the data center becomes an increasingly critical part of business infrastructure, many organizations continue to rely on outdated facilities that may never have been designed to house computing equipment, let alone to get the best performance out of next-generation server hardware. As a result, data center efficiency at most companies falls far below the widely publicized metrics of industry leaders. Yet with environmental monitoring tools offering more data than ever, companies may no longer have an excuse to avoid change.
The state of data center efficiency
While Facebook and Google are approaching power usage effectiveness (PUE) scores of 1.0, meaning their data centers require almost no power beyond what the servers themselves consume, the industry average is much higher, according to a recent study from wholesale data center provider Digital Realty. The company found that the average North American data center has a PUE of 2.9, that just 20 percent of businesses achieve a score below 2.0, and that 9 percent have a PUE of 4.0 or higher.
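PUE itself is a simple ratio: total facility power divided by the power delivered to IT equipment. A minimal sketch, using illustrative numbers rather than figures from the Digital Realty study:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT load.

    A PUE of 1.0 means every watt goes to computing; a PUE of 2.9
    means the facility draws 1.9 W of overhead (cooling, lighting,
    power distribution losses) for every 1 W reaching the servers.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,450 kW in total to run 500 kW of servers:
print(round(pue(1450, 500), 2))  # → 2.9
```

Reading it the other way around: a facility at the industry-average 2.9 spends nearly twice as much power on overhead as on computing, while one near 1.0 spends almost nothing.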
"While a PUE of 2.9 seems terribly inefficient, we view it as more being closer to the norm than the extremely low (close to 1) figures reported in the media," Digital Realty CTO Jim Smith told PC World in an email. "In our view, those figures represent what a very small number of organizations can achieve based on a unique operating model."
According to Patrick Flynn, lead sustainability strategist at modular data center company IO, those gross inefficiencies are due to outdated design approaches and an aversion to innovation in the data center world. In a column for Wired, Flynn referred to the mindset that treats the data center as a massive construction project leading to a resource-draining product as "Data Center 1.0."
Moving toward Data Center 2.0
This traditional approach treats the data center much like any other piece of commercial real estate, rather than as a technological investment that needs innovation just like any other part of IT. With today's environmental monitoring technology providing new levels of oversight, data center operators can improve their planning and stop relying on conventional wisdom about what has worked in the past, Flynn wrote. Rather than making massive infrastructure investments to protect against problems that never arise, companies can see where they actually need to focus resources in the data center.
"Many engineering re-designs don't make it from schematic drawing to implementation because designers don't have the data to prove a new approach is better than the old one," Flynn explained. "Real-time visibility will give data center managers the ammunition to rethink their planning, sourcing, delivery and operating models, and will help designers turn their dreams into reality. No longer will an industry that lights all other categories be left in the dark."
Flynn added that the "if it ain't broke, don't fix it" mentality can no longer be applied to data centers, as companies that think this way are simply passing up efficiency savings. A well-planned data center armed with operational data can reduce total operating costs by 70 percent and requirements for space, power, cooling and labor by 80 percent.
"Sustainability and energy efficiency are key drivers of the improvement engine," Flynn wrote. "Just like cars, refrigerators and cell phones, data centers improve in performance from one generation to the next."
Using tools such as the temperature sensors and humidity monitors offered by ITWatchDogs, businesses can improve operational oversight and treat their data center as a constantly evolving part of the IT stack rather than as an unwieldy real estate project to build and then ignore.
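The kind of operational oversight described above often comes down to checking live sensor readings against an allowable environmental envelope. The sketch below is purely illustrative (the data structures are invented, not an ITWatchDogs API); the thresholds loosely follow ASHRAE's widely cited recommended envelope for data centers, roughly 18-27 °C with moderate relative humidity:

```python
# Allowable ranges per metric: (low, high). Assumed for illustration.
RECOMMENDED = {
    "temp_c": (18.0, 27.0),
    "humidity_pct": (20.0, 80.0),
}

def out_of_range(readings: dict) -> list:
    """Return (metric, value) pairs that fall outside the envelope."""
    alerts = []
    for metric, (low, high) in RECOMMENDED.items():
        value = readings.get(metric)
        if value is not None and not (low <= value <= high):
            alerts.append((metric, value))
    return alerts

# A hot spot at 31.5 °C trips a temperature alert:
print(out_of_range({"temp_c": 31.5, "humidity_pct": 45.0}))
```

In practice the readings would be polled from networked sensors and the alerts fed into capacity-planning and cooling decisions, which is the real-time visibility Flynn describes.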