
What's the difference between Humidity and Relative Humidity?

Absolute Humidity, often just referred to as 'the humidity', is a measure of the actual amount of water vapour in a particular sample of air. It can be expressed as a partial pressure (vapour pressure, in hPa or millibars), as a mixing ratio (g of water vapour per kg of dry air), or indirectly as a dew point.

Relative Humidity, commonly expressed as a percentage, is the ratio of the amount of water vapour actually present in a sample (the Absolute Humidity) to the amount that would be needed to saturate that same sample at its current temperature.
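In practice, RH can be computed from the air temperature and dew point alone, because air cooled to its dew point is by definition saturated. Here is a minimal Python sketch, assuming the Magnus approximation for saturation vapour pressure (the constants 6.112, 17.67 and 243.5 are one widely used empirical set, not figures from this page):

    import math

    def saturation_vapour_pressure(t_celsius):
        """Saturation vapour pressure in hPa (Magnus approximation)."""
        return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

    def relative_humidity(temp_c, dew_point_c):
        """RH (%) = 100 x actual vapour pressure / saturation vapour pressure.

        The actual vapour pressure equals the saturation value at the
        dew point, since air cooled to its dew point is saturated.
        """
        return (100.0 * saturation_vapour_pressure(dew_point_c)
                / saturation_vapour_pressure(temp_c))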

The two terms are not interchangeable, and confusing them can mislead. For example, on a cold, raw winter's day close to the east coast of England, the dew point might be 1 degC with an air temperature of just 2 degC. This gives a RH of 93%; a 'high' Relative Humidity, yet few would describe such conditions as 'humid'. Conversely, on a hot summer's day with a dew point of 18 degC and an afternoon temperature of 30 degC, the RH is just 49%; a 'low' Relative Humidity, but a high Absolute Humidity.
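Feeding both examples above through the sketch reproduces the quoted figures:

    # Cold east-coast winter's day: T = 2 degC, dew point = 1 degC
    print(round(relative_humidity(2.0, 1.0)))    # -> 93

    # Hot summer's day: T = 30 degC, dew point = 18 degC
    print(round(relative_humidity(30.0, 18.0)))  # -> 49

Note how the summer case holds roughly three times the water vapour (about 21 hPa versus 7 hPa of vapour pressure under the same approximation) despite the much lower RH.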