Best Answer

Because it is in the same units as the original data. For example, if you have a sample of lengths, all in centimetres, the sample variance will be in units of centimetres squared, which can be difficult to interpret, but the sample standard deviation will be in units of centimetres, which is relatively easy to interpret with reference to the data.
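To see the units point concretely, here is a minimal sketch with a made-up sample of lengths in centimetres (the values are illustrative, not from the question):

```python
# Hypothetical sample of lengths, all in centimetres.
lengths_cm = [10.0, 12.0, 11.0, 13.0, 14.0]

n = len(lengths_cm)
mean_cm = sum(lengths_cm) / n  # 12.0 cm

# Sample variance (divide by n - 1): its units are cm squared.
variance_cm2 = sum((x - mean_cm) ** 2 for x in lengths_cm) / (n - 1)

# Sample standard deviation: back in cm, directly comparable to the data.
std_dev_cm = variance_cm2 ** 0.5
```

Here the variance comes out as 2.5 cm², which is hard to relate to the raw lengths, while the standard deviation of about 1.58 cm can be read on the same scale as the data.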

Q: Why is standard deviation a better measure of dispersion than variance?


Continue Learning about Math & Arithmetic

1. Standard deviation is not a measure of variance: it is the square root of the variance.
2. The answer depends on better than WHAT!

They are measures of the spread of distributions about their mean.

The variance or standard deviation.

How do we calculate variance?

An absolute measure of dispersion is a measure of variation from the mean, such as the standard deviation. A relative measure of dispersion, on the other hand, describes the position of a certain value with reference to, or as compared with, the other values, such as percentiles or the z-score.

Related questions

They are effectively the same but the standard deviation is more popular because the units of measurement are the same as those for the variable.

The standard deviation is often regarded as the best measure of dispersion because many data distributions are close to the normal distribution, for which the standard deviation has a natural interpretation.

Both variance and standard deviation are measures of dispersion or variability in a set of data. They both measure how far the observations are scattered away from the mean (or average). While computing the variance, you compute the deviation of each observation from the mean, square it, and average the squared deviations. This somewhat exaggerates the true picture because the numbers become large when you square them. So we take the square root of the variance (to compensate for the excess), and this is known as the standard deviation. This is why the standard deviation is more often used than the variance, but the standard deviation is just the square root of the variance.
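The steps described above (deviation from the mean, square, average, then square root) can be sketched directly, using invented example data, and cross-checked against the standard library:

```python
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = sum(data) / len(data)  # 5.0

# 1. Deviation of each observation from the mean, squared.
squared_devs = [(x - mean) ** 2 for x in data]

# 2. Average the squared deviations: the (population) variance.
variance = sum(squared_devs) / len(data)  # 4.0

# 3. Square root to return to the original units: the standard deviation.
std_dev = variance ** 0.5  # 2.0

# Cross-check with the standard library's population versions.
assert variance == statistics.pvariance(data)
assert std_dev == statistics.pstdev(data)
```

This sketch uses the population formulas (divide by n); the sample versions divide by n - 1 instead, and `statistics.variance`/`statistics.stdev` implement those.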

The standard deviation is the square root of the variance, a measure of the spread or variability of data. It is given by (variance)^(1/2).

The units of a measure of dispersion depend on the units of the data being measured. Common measures of dispersion include the variance and the standard deviation, which have squared units and the same units as the data, respectively. Other measures, such as the coefficient of variation, are unitless and express dispersion relative to the mean.
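As a small illustration with invented data, the coefficient of variation is just the standard deviation divided by the mean, so the units cancel:

```python
import statistics

data = [10.0, 12.0, 14.0, 16.0, 18.0]  # e.g. lengths in cm

mean = statistics.mean(data)   # same units as the data (cm)
sd = statistics.stdev(data)    # sample standard deviation, also in cm

# Coefficient of variation: cm / cm, so a unitless ratio.
cv = sd / mean
```

Because it is unitless, the coefficient of variation lets you compare the relative spread of data sets measured in different units.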

No. The average of the signed deviations from the mean is always zero. The variance is the average squared deviation, which is usually non-zero, and the standard deviation is its square root.
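A quick sketch with made-up numbers shows why the signed deviations cancel while the squared ones do not:

```python
data = [3.0, 7.0, 8.0, 12.0]
mean = sum(data) / len(data)  # 7.5

# The signed deviations from the mean always sum to zero,
# so their average (the "mean deviation" in this signed sense) is zero.
mean_deviation = sum(x - mean for x in data) / len(data)

# Squaring makes every deviation non-negative, so the average squared
# deviation (the variance) is non-zero whenever the data are spread out.
variance = sum((x - mean) ** 2 for x in data) / len(data)
std_dev = variance ** 0.5
```

This is exactly why the variance squares the deviations before averaging: without the squaring, every data set would score zero.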

Variance

A measure of variation, also called a measure of dispersion, is a type of measurement that details how a set of data is scattered from a central or neutral point of origin. Range, variance and standard deviation are three measures of variation that are commonly used.

The standard deviation or volatility (square root of the variance) of returns.