Humidity has been a part of weather forecasts for as long as we’ve gotten our news over the air. At the beginning of most weather forecasts, our friendly neighborhood weatherperson tells us the sky conditions at the moment, the current temperature, and the relative humidity. Over the past couple of decades, though, the relative humidity has started to fall by the wayside in favor of the dew point. The dew point is a much more useful measure of how much moisture is in the air, but how does it relate to relative humidity?
The amount of water vapor in the air can dictate what kind of weather we see and how comfortable we are once we step outside. Relative humidity is technically defined as the air’s vapor pressure divided by its equilibrium vapor pressure, expressed as a percentage. Equilibrium vapor pressure means that “there is no net evaporation or condensation,” according to Alistair Fraser, professor emeritus of meteorology at Penn State. At equilibrium, otherwise known as the saturation point, water molecules are entering and leaving the condensed state at the same rate. When the relative humidity is cited as 50 percent, the air is halfway to its saturation point, and net evaporation is still occurring. Warm air requires more water vapor than cool air to reach its saturation point, which is why an 85°F afternoon can get much muggier than a day that only makes it to 50°F—the latter can still be humid, sure, but it’s not like walking into a sauna.
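If you want to see that ratio in action, here’s a minimal sketch in Python. It estimates the equilibrium (saturation) vapor pressure with the Magnus formula, a standard meteorological approximation that isn’t mentioned in the article itself; the constants and example numbers are purely illustrative assumptions.

```python
import math

def saturation_vapor_pressure(temp_c):
    """Approximate equilibrium (saturation) vapor pressure in hPa.

    Uses the Magnus approximation; these constants are one widely
    used set, chosen here for illustration.
    """
    return 6.1094 * math.exp(17.625 * temp_c / (243.04 + temp_c))

def relative_humidity(vapor_pressure_hpa, temp_c):
    """Relative humidity (%): actual vapor pressure over equilibrium vapor pressure."""
    return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure(temp_c)

# Air at 25°C holding about 15.8 hPa of water vapor is roughly halfway
# to its saturation point (~31.6 hPa), i.e. about 50 percent relative humidity.
print(round(relative_humidity(15.8, 25.0)))  # ~50
```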
The dew point is the temperature to which the air needs to cool in order to become completely saturated, or reach 100 percent relative humidity. Once the air temperature cools below its dew point, water vapor in the atmosphere will condense. Because relative humidity depends on how close the air temperature is to the dew point, it goes up and down like a roller coaster during the day: it climbs at night as the air temperature approaches the dew point, and it falls as the air temperature warms farther and farther away from the dew point.
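For the curious, the dew point can be estimated from the air temperature and relative humidity by running that same Magnus approximation in reverse. This is just a sketch under the same assumptions as above (one common set of Magnus constants, illustrative numbers), not how forecasters actually measure it.

```python
import math

A, B = 17.625, 243.04  # Magnus approximation constants (°C), one common choice

def dew_point_c(temp_c, rh_percent):
    """Estimate the temperature (°C) the air must cool to for 100% relative humidity."""
    gamma = math.log(rh_percent / 100.0) + A * temp_c / (B + temp_c)
    return B * gamma / (A - gamma)

# A 30°C (86°F) afternoon at 50 percent relative humidity would need to cool
# to roughly 18.4°C (about 65°F) before the air saturates and dew can form.
print(round(dew_point_c(30.0, 50.0), 1))
```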
The dew point is a little more abstract than the relative humidity, but it’s an effective way of telling you how much moisture is present in the air because it means the same thing no matter how warm or cold it is outside. A 40°F dew point is comfortable whether the air temperature is 60°F or 100°F. This consistency allows us to index the dew point to comfort levels, giving us a quick understanding of how muggy or pleasant it is outside.
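That consistency is easy to demonstrate with a quick calculation: hold the dew point at 40°F and let the air temperature change, and the relative humidity swings wildly even though the actual amount of moisture hasn’t budged. The numbers below come from the same Magnus approximation used earlier, so treat them as ballpark figures.

```python
import math

def saturation_vapor_pressure(temp_c):
    # Magnus approximation (one common constant set, used here for illustration)
    return 6.1094 * math.exp(17.625 * temp_c / (243.04 + temp_c))

def f_to_c(temp_f):
    return (temp_f - 32.0) * 5.0 / 9.0

def rh_from_dew_point(temp_f, dew_point_f):
    """Relative humidity (%) given air temperature and dew point, both in °F."""
    return 100.0 * (saturation_vapor_pressure(f_to_c(dew_point_f))
                    / saturation_vapor_pressure(f_to_c(temp_f)))

# The same 40°F dew point shows up as very different relative humidities:
print(round(rh_from_dew_point(60, 40)))   # roughly 48% on a 60°F day
print(round(rh_from_dew_point(100, 40)))  # roughly 13% on a 100°F day
```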
It’s downright dry outside when the dew point is at or below the freezing point. Dew point readings between the freezing mark and about 55°F are pretty comfortable. A dew point between 55°F and 60°F is noticeably humid. It’s muggy when the dew point climbs above 60°F, and it’s uncomfortable outside when it ticks above 65°F. Any dew point readings above 70°F are oppressive and even dangerous, the kind of stickiness you experience in the tropics or during a brutal summer heat wave. It’s rare for the dew point to reach 80°F, but it can happen in extremely moist environments like mature cornfields or certain tropical regions.
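Those ranges translate naturally into a simple lookup. The sketch below maps a dew point in °F to the comfort labels described above; where exactly each boundary falls is a judgment call, so the cutoffs here are just one reasonable reading of them.

```python
def dew_point_comfort(dew_point_f):
    """Rough comfort label for a given dew point in °F."""
    if dew_point_f <= 32:
        return "dry"
    if dew_point_f < 55:
        return "comfortable"
    if dew_point_f < 60:
        return "noticeably humid"
    if dew_point_f < 65:
        return "muggy"
    if dew_point_f < 70:
        return "uncomfortable"
    return "oppressive"

print(dew_point_comfort(45))  # comfortable
print(dew_point_comfort(72))  # oppressive
```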
The dew point and relative humidity are closely related, but the former is much more useful than the latter. Relative humidity helps meteorologists predict conditions favorable for wildfires and fog. Other than that, it’s mostly a relic of the old days that shows up in weather reports out of habit. If you want to know the true measure of how comfortable or muggy it is outside, take a look at the dew point.