Understanding the difference between relative and absolute humidity in the data center
Monday, Aug 31st 2015

Like most modern electronics, servers and other pieces of data center equipment have very specific environmental needs. This is especially true when it comes to humidity.

As many IT professionals know, an incorrect humidity level in the server room can seriously affect server uptime and company productivity. Although people might assume any humidity in the server room is a bad thing, some water in the air is actually necessary for the proper functioning of IT equipment. Too much leads to condensation and eventual corrosion of electrical components, while too little causes electrostatic buildup that can damage servers just as badly as excess moisture.

It is for this reason that the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) created a guide for optimally storing IT equipment, stating that equipment like servers should be kept at a relative humidity between 40 and 55 percent. That being said, quite a few people within the data center industry are questioning whether server room humidity should be managed by relative or absolute humidity.

While most people understand what humidity is, few actually know the difference between relative and absolute humidity. This can lead to disastrous results, as the two terms are by no means interchangeable. Knowing the difference between them provides a deeper understanding of what goes into effective server surveillance.

Relative and absolute: The difference
Basically, the difference between these two terms boils down to the amount of water that is actually in the air versus the amount that could be. Absolute humidity describes how much water vapor is currently in a particular sample of air, independent of temperature. In data center practice it is usually expressed in grams of water per kilogram of dry air.
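For readers who like to see the arithmetic, here is a minimal sketch of that grams-per-kilogram figure. It assumes the standard psychrometric mixing-ratio relation and a sea-level air pressure of 1013.25 hPa; the function name and the example vapor pressure are illustrative choices, not values taken from any ASHRAE document.

```python
def humidity_ratio_g_per_kg(vapor_pressure_hpa, total_pressure_hpa=1013.25):
    """Grams of water vapor per kilogram of dry air.

    Standard psychrometric relation: w = 0.622 * e / (p - e),
    where e is the water vapor's partial pressure and p is total air pressure.
    """
    e = vapor_pressure_hpa
    return 1000.0 * 0.62198 * e / (total_pressure_hpa - e)


# Example: air carrying 10 hPa of water vapor at sea level holds about 6.2 g/kg.
print(round(humidity_ratio_g_per_kg(10.0), 1))
```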

However, the most familiar way of expressing absolute humidity is the dew point: the temperature at which the air becomes saturated. If the dew point for a particular day is 40 degrees Fahrenheit, the water in the air will start condensing into dew once the air cools to 40 degrees.
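As a rough illustration, the sketch below estimates a dew point from an air temperature and humidity reading using one common Magnus-type approximation. The constants and function name are my own choices for the example, not figures drawn from the article or from ASHRAE.

```python
import math

# One common Magnus-type parameterization for saturation over water;
# other constant sets in the literature differ slightly.
A, B = 17.62, 243.12


def dew_point_c(temp_c, rh_percent):
    """Approximate dew point (deg C) from air temperature and relative humidity."""
    gamma = math.log(rh_percent / 100.0) + A * temp_c / (B + temp_c)
    return B * gamma / (A - gamma)


# Air at 20 deg C and 50 percent RH starts condensing at roughly 9.3 deg C.
print(round(dew_point_c(20.0, 50.0), 1))
```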

On the other side of this is relative humidity (RH), which builds on absolute humidity. Relative humidity is the ratio of the amount of water actually in a sample of air to the amount that air could hold at its current temperature before becoming saturated. So if a particular sample of air has an RH of 40 percent, that sample is 40 percent saturated with water vapor.
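In code, that ratio is usually taken between the actual vapor pressure (which equals the saturation pressure at the dew point) and the saturation pressure at the air temperature. The sketch below is illustrative only, reusing the same Magnus-type approximation as the previous example.

```python
import math


def saturation_vapor_pressure_hpa(temp_c):
    """Magnus approximation of saturation vapor pressure over water, in hPa."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))


def relative_humidity_percent(temp_c, dew_point_c):
    """RH = actual vapor pressure / maximum possible vapor pressure at this temperature."""
    actual = saturation_vapor_pressure_hpa(dew_point_c)  # air is saturated at its dew point
    possible = saturation_vapor_pressure_hpa(temp_c)
    return 100.0 * actual / possible


# Air at 25 deg C with a 10 deg C dew point is only about 39 percent saturated.
print(round(relative_humidity_percent(25.0, 10.0), 1))
```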

Why does it matter?
So with all of this in mind, why does any of it matter for server surveillance? Well, as it turns out, the IT storage industry seems to have neglected absolute humidity ranges for a long time.

As stated above, ASHRAE has long published a relative humidity range for IT equipment, but it wasn't until 2008 that it included absolute humidity ranges as well. The currently accepted lower limit for absolute humidity is a dew point of around 41.9 degrees Fahrenheit, provided the facility is also following proper data center temperature guidelines.
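As a hypothetical example of how monitoring software might apply those numbers, the check below uses only the figures quoted in this article (40 to 55 percent RH and a 41.9 degree Fahrenheit dew point floor). The function name and the simple pass/fail logic are purely illustrative; the real ASHRAE envelope also bounds temperature and an upper dew point.

```python
def check_against_article_figures(rh_percent, dew_point_f):
    """Flag readings that fall outside the figures quoted in this article.

    Illustrative only: the full ASHRAE envelope also constrains dry-bulb
    temperature and an upper dew-point limit.
    """
    issues = []
    if not 40.0 <= rh_percent <= 55.0:
        issues.append(f"RH of {rh_percent}% is outside the 40-55% range")
    if dew_point_f < 41.9:
        issues.append(f"dew point of {dew_point_f} F is below the 41.9 F floor")
    return issues


# A dry winter reading trips both checks.
print(check_against_article_figures(rh_percent=32.0, dew_point_f=35.0))
```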

The reasoning for this addition is that relative humidity fluctuates drastically with temperature. During a cold winter month, the outside temperature might be a frigid 15 degrees Fahrenheit; with a dew point of just 10 degrees, the RH works out to 80.2 percent. That percentage is considered extremely high, yet the air is holding very little water, and the average person would not describe that kind of day as "humid." Adding a dew point to these server surveillance guidelines fills in the rest of the picture of what is happening to IT equipment.
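For the curious, that 80.2 percent figure can be reproduced with the same Magnus-type approximation used in the earlier sketches; again, the helper names are mine and the formula is one common approximation among several.

```python
import math


def saturation_vapor_pressure_hpa(temp_c):
    """Magnus approximation of saturation vapor pressure over water, in hPa."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))


def f_to_c(temp_f):
    return (temp_f - 32.0) * 5.0 / 9.0


temp_c = f_to_c(15.0)       # 15 deg F outside air
dew_point_c = f_to_c(10.0)  # 10 deg F dew point

rh = 100.0 * saturation_vapor_pressure_hpa(dew_point_c) / saturation_vapor_pressure_hpa(temp_c)
print(round(rh, 1))  # prints 80.2 -- a very high RH despite very dry air
```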

And while all of this can seem complicated, companies like ITWatchDogs offer server surveillance equipment designed to monitor exactly these data points. The Watchdog 100 has a built-in sensor that lets facility managers track both relative humidity and dew point, giving them better information and more server uptime.
