If you follow weather forecasts, you’ve heard about “relative humidity” (RH). But it’s one of those maddeningly less-than-useful measures of our weather that probably just needs to be retired. That’s wishful thinking, of course. But in an interesting introduction to their latest research into the increasing dryness of the air and the fire risk that attends it, Richard Seager and his colleagues make another plea. Paraphrasing a 1936 paper by D.B. Anderson, they write:
Anderson (1936) points out that RH is not an absolute measure but merely a ratio of two known quantities expressed as a percentage.
If you can do the math quickly in your head, you can keep an intuitive grasp of what RH means in a given situation. But a measure that requires your audience to do mental math to make sense of what you’re telling them is a bad communication strategy. Riffing on Anderson, Seager and colleagues argue for the importance of a different measure that requires no such math – “vapor pressure deficit”. I’ll skip their equations for this:
VPD gives an absolute measure of the atmospheric moisture state independent of temperature. For example, for a given wind speed and atmospheric stability, above a surface that is not water-limited, a specific VPD leads to the same rate of evaporation, regardless of temperature.
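To make the contrast with RH concrete, here’s a small sketch of the calculation. The Tetens approximation for saturation vapor pressure is a standard textbook formula, not necessarily the exact one Seager and colleagues use, so treat this as illustrative: the same 30% RH corresponds to a much larger VPD – much thirstier air – on a hot day than on a cool one.

```python
import math

def saturation_vapor_pressure(temp_c):
    """Saturation vapor pressure in kPa (Tetens approximation)."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vapor_pressure_deficit(temp_c, rh_percent):
    """VPD in kPa: how much more water the air could hold
    (saturation vapor pressure minus actual vapor pressure)."""
    e_s = saturation_vapor_pressure(temp_c)
    e_a = e_s * rh_percent / 100.0  # actual vapor pressure from RH
    return e_s - e_a

# Same relative humidity, very different atmospheric demand for moisture:
cool_day = vapor_pressure_deficit(10.0, 30.0)  # ~0.86 kPa
hot_day = vapor_pressure_deficit(35.0, 30.0)   # ~3.94 kPa
```

A forecast reporting “30% RH” sounds identical on both days; a forecast reporting VPD would show the hot day’s air pulling moisture out of vegetation several times faster.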
Why should we care? Because vapor pressure deficits are rising in the southwestern United States, and they are closely linked to wildfire risk. The public communication question of “VPD” vs. “RH” is really just a sidelight to an important new paper about rising fire risk as the southwest warms. (In particular, they look in detail at VPD during the Rodeo-Chediski and Hayman fires.) But I found it intriguing. I’d love to have it added to my daily forecast page.