Systolic blood pressure is typically considered to be measured on a ratio scale. Ratio scales have a true zero point, meaning that a value of zero indicates the absence of the quantity being measured. In the case of systolic blood pressure, a reading of 0 mmHg would indicate no pressure at all, making it a ratio scale measurement.
Ratio.
All pressure readings are on the ratio scale. There is a starting point: atmospheric pressure. If the blood pressure increases by 10%, there is 10% more force being exerted.
A related, tricky case is temperature. In degrees Celsius it is on the interval scale, but converted to kelvins it can be considered a ratio scale, because there is a true starting point and doubling the kelvin value has physical meaning.
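To make the interval-versus-ratio distinction concrete, here is a minimal sketch in Python (the temperature values are illustrative, not from the question): a rise from 10 °C to 20 °C is not a doubling in any physical sense, but the same change expressed in kelvins gives a ratio that does carry meaning.

```python
# Illustrative temperatures (hypothetical values, not from the question).
t1_c, t2_c = 10.0, 20.0

# Celsius ratio: numerically 2.0, but physically meaningless, because
# 0 degrees C is an arbitrary zero (the freezing point of water).
celsius_ratio = t2_c / t1_c

# Convert to kelvins, whose zero is absolute zero (a true zero point).
t1_k, t2_k = t1_c + 273.15, t2_c + 273.15

# Kelvin ratio: about 1.035, i.e. only ~3.5% higher on the absolute
# scale, not "twice as hot".
kelvin_ratio = t2_k / t1_k

print(f"Celsius ratio: {celsius_ratio:.3f}")  # 2.000 (not meaningful)
print(f"Kelvin ratio:  {kelvin_ratio:.3f}")   # 1.035 (meaningful)
```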
I'm new, and did not want to remove the first answer, but I am certain it is incorrect. Here is why:
Actually, the scale is INTERVAL, precisely because of the above-mentioned fact that the starting point is atmospheric pressure, and atmospheric pressure is not an absolute zero point. Say you are at sea level, where normal air pressure is 1 atm = 101.325 kPa = 760 mmHg (millimeters of mercury are the units usually used to report blood pressure), and your systolic blood pressure is 68 mmHg. The absolute pressure is then 760 mmHg (the starting point) + 68 mmHg (your blood pressure) = 828 mmHg.

Now suppose your systolic blood pressure jumps by 10%, to 74.8 mmHg. The new absolute pressure is 760 mmHg (the same starting point) + 74.8 mmHg = 834.8 mmHg, so the ratio of absolute pressures is 834.8 mmHg / 828 mmHg ≈ 1.008. That is, the absolute increase is only about 0.8%, not 10%: the ratio is not maintained. The discrepancy becomes even more significant at higher elevations, where the air pressure is lower.
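The arithmetic in that argument can be checked with a short Python sketch, using the same numbers as above (sea-level atmospheric pressure of 760 mmHg and an illustrative systolic reading of 68 mmHg):

```python
# Reproduce the worked example: reported (gauge) blood pressure vs.
# absolute pressure at sea level.
atmospheric = 760.0      # mmHg, sea-level atmospheric pressure
systolic_gauge = 68.0    # mmHg, reported (gauge) systolic pressure

# A 10% rise in the reported reading.
systolic_gauge_new = systolic_gauge * 1.10       # 74.8 mmHg

# Absolute pressures include the atmospheric baseline.
absolute_old = atmospheric + systolic_gauge      # 828.0 mmHg
absolute_new = atmospheric + systolic_gauge_new  # 834.8 mmHg

# The ratio of absolute pressures is not 1.10 but about 1.008,
# i.e. roughly a 0.8% increase -- the interval-scale argument.
print(f"gauge ratio:    {systolic_gauge_new / systolic_gauge:.3f}")  # 1.100
print(f"absolute ratio: {absolute_new / absolute_old:.3f}")          # 1.008
```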