Most folks look at humidity expressed as a percentage, known as relative humidity, or just RH. But weather nerds and professionals rely on the dew point as a more accurate indicator of atmospheric moisture. This morning’s reading was a good example of why RH is a misleading number when it comes to expressing how humid it actually feels.
This morning’s air temperature was 57 degrees. The dew point was 56. That made the relative humidity 99%.
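For the curious, RH can be estimated from just those two numbers. Here is a minimal sketch using the Magnus approximation; the helper names and coefficients are illustrative, not from the post, and station instruments round a bit differently, so a one-degree spread lands in the mid-to-upper 90s.

```python
import math

def f_to_c(temp_f: float) -> float:
    """Convert Fahrenheit to Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

def saturation_vapor_pressure(temp_c: float) -> float:
    """Saturation vapor pressure (hPa) via the Magnus approximation."""
    return 6.112 * math.exp(17.625 * temp_c / (243.04 + temp_c))

def relative_humidity(temp_f: float, dew_point_f: float) -> float:
    """Relative humidity (%) from air temperature and dew point."""
    e = saturation_vapor_pressure(f_to_c(dew_point_f))  # actual vapor pressure
    es = saturation_vapor_pressure(f_to_c(temp_f))      # saturation vapor pressure
    return 100.0 * e / es

# This morning's reading: 57°F air temperature, 56°F dew point.
print(round(relative_humidity(57, 56)))  # ~96 -- a one-degree spread means near-saturated air
```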
Yet even with the humidity at 99%, it felt refreshing outside, not muggy at all. So a 99% relative humidity reading is not a good indicator of how humid the air actually feels.
The dew point is a much better indicator of how humid and uncomfortable the air feels. The air doesn’t start to feel noticeably muggy until the dew point approaches 65°, and once dew points reach the 70s, it feels oppressively humid.
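Put into code, that rule of thumb might look like this; the category names are illustrative, not an official scale.

```python
def mugginess(dew_point_f: float) -> str:
    """Rough comfort category based on dew point (°F), per the rule of thumb above."""
    if dew_point_f < 65:
        return "comfortable"
    elif dew_point_f < 70:
        return "noticeably muggy"
    else:
        return "oppressively humid"

print(mugginess(56))  # comfortable -- this morning's reading, despite 99% RH
print(mugginess(72))  # oppressively humid
```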
That’s why most weather folks gauge humidity by the dew point rather than by relative humidity.