CHARLOTTE, N.C. — It truly is one of Mother Nature’s rarest gifts: dry air in August. But before we unbox a delectable drop in humidity, let me settle a score I have with the way most people record it.
You see, we have two main ways of measuring humidity, and the more commonly used one is relative humidity. It's a percentage that generally represents how close the dew point is to the outside temperature – that is, how close the air is to condensation.
But here’s why you shouldn’t rely on it: relative humidity is misleading! For example, a 50-degree day in winter with a 40-degree dew point has a relative humidity of nearly 70%. Meanwhile, an average summer day in the Carolinas – 90 degrees with a dew point of 70 – would have an RH of only about 50%. Yet most people would say the summer day *feels* much muggier.
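If you like to check the math, here is a rough Python sketch that reproduces those two numbers. It uses the common Magnus approximation for saturation vapor pressure, so the coefficients and results are approximate, not an official formula from any weather service:

```python
import math

def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure (hPa) via the Magnus formula."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

def relative_humidity(temp_f, dew_point_f):
    """Relative humidity (%) from temperature and dew point in Fahrenheit."""
    temp_c = (temp_f - 32) * 5 / 9
    dew_c = (dew_point_f - 32) * 5 / 9
    # RH is the ratio of the actual vapor pressure (set by the dew point)
    # to the most the air could hold at its current temperature.
    return 100 * saturation_vapor_pressure(dew_c) / saturation_vapor_pressure(temp_c)

print(round(relative_humidity(50, 40)))  # winter day: roughly 68% RH
print(round(relative_humidity(90, 70)))  # Carolina summer day: roughly 52% RH
```

Same dew-point gap of 10 or 20 degrees, very different percentages, simply because the temperature term changes.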
The key here is that the DEW POINT has risen from 40 degrees all the way up to 70. The higher the dew point, the muggier it feels, no matter what the relative humidity says.
The dew point is an absolute measure of how much moisture is in the air.
Take a room 20 feet wide, 20 feet long, and 20 feet tall, with air inside at a dew point of 50 degrees. If you wrung all of the moisture out of that room, it would give you about one water bottle's worth of liquid.
Now, if you did the same thing at a dew point of 70 degrees, the moisture content – the amount of humidity you *feel* – doubles, despite no change in temperature.
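A quick sketch shows that doubling directly. This assumes a room air temperature of about 75 degrees and the same Magnus approximation as above, so the exact masses depend on those assumptions (and on how big a bottle you picture); the point to notice is the ratio:

```python
import math

ROOM_VOLUME_M3 = (20 * 0.3048) ** 3   # 20 ft x 20 ft x 20 ft, in cubic meters
R_VAPOR = 461.5                       # gas constant for water vapor, J/(kg K)
ROOM_TEMP_K = 297.0                   # assumed ~75 F room air temperature

def vapor_pressure_pa(dew_point_f):
    """Actual vapor pressure (Pa) from the dew point, via the Magnus approximation."""
    dew_c = (dew_point_f - 32) * 5 / 9
    return 100 * 6.112 * math.exp(17.67 * dew_c / (dew_c + 243.5))

def water_mass_kg(dew_point_f):
    """Mass of water vapor in the room, from the ideal gas law for the vapor."""
    density = vapor_pressure_pa(dew_point_f) / (R_VAPOR * ROOM_TEMP_K)
    return density * ROOM_VOLUME_M3

low, high = water_mass_kg(50), water_mass_kg(70)
print(f"dew point 50 F: {low:.2f} kg of water")
print(f"dew point 70 F: {high:.2f} kg of water")
print(f"ratio: {high / low:.2f}")     # comes out very close to 2: the moisture doubles
```

Raise the dew point from 50 to 70 and the amount of water hanging in that room roughly doubles, no matter what the thermometer reads.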
Furthermore, the dew point can never rise above the actual air temperature. That's why it very rarely feels humid in the winter – cold air caps how high the dew point can climb – and why muggy dew points are possible all summer long.
Here’s the point: relative humidity factors in the actual temperature, which can paint a misleading picture. The dew point alone is all you need: the higher it is, the muggier it feels, regardless of temperature.