Q: Why is standard deviation better than variance?
Best Answer

Better for what? Standard deviation is used for some calculations, variance for others.

Continue Learning about Statistics

Why is the standard deviation used more frequently than the variance?

The standard deviation has the same measurement units as the variable and is, therefore, more easily comprehended.


How do you calculate salary variance?

I believe you are interested in calculating the variance of a set of data related to salaries. The variance is the square of the standard deviation s, where s = sqrt[ sum((x_i - mean)^2) / (n - 1) ]. The mean of the set is the sum of all the data divided by the number in the sample, and x_i is a single data point (a single salary). If instead of a sample you have the entire population of size N, substitute N for n - 1 in the equation above. You may find more information on the interpretation of variance by searching Wikipedia for variance and standard deviation. Note that an advantage of using the standard deviation rather than the variance is that the standard deviation is in the same units as the mean.
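As a concrete sketch of that formula in Python (the salary figures here are hypothetical, chosen only for illustration):

```python
import math

# Hypothetical salaries; any list of numbers works the same way.
salaries = [42000, 48000, 51000, 55000, 60000]

n = len(salaries)
mean = sum(salaries) / n

# Sample variance: sum of squared deviations divided by (n - 1).
variance = sum((x - mean) ** 2 for x in salaries) / (n - 1)

# The standard deviation is the square root, back in the same units as the mean.
std_dev = math.sqrt(variance)

print(mean, variance, std_dev)

# For an entire population of size N, divide by N instead of (n - 1):
population_variance = sum((x - mean) ** 2 for x in salaries) / n
```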


When is a t test better than a z score?

A t test is better when you don't have the population standard deviation but do have the sample standard deviation. When the population standard deviation is known, so that a z score can be used, the z score is the better choice.
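A rough sketch of the distinction in Python; the sample values, the hypothesised mean of 100, and the "known" population standard deviation of 15 are all made up for illustration:

```python
import math
import statistics

sample = [102, 98, 110, 105, 95, 99, 107, 101]  # hypothetical data
mu0 = 100                                       # hypothesised population mean
n = len(sample)
mean = statistics.mean(sample)

# z statistic: usable because the population standard deviation is "known".
sigma = 15                                      # assumed known, for illustration
z = (mean - mu0) / (sigma / math.sqrt(n))

# t statistic: sigma is unknown, so the sample standard deviation stands in.
s = statistics.stdev(sample)                    # uses the n - 1 denominator
t = (mean - mu0) / (s / math.sqrt(n))           # compare to a t distribution
                                                # with n - 1 degrees of freedom
print(z, t)
```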


What is the relationship between the mean and standard deviation in statistics?

The 'standard deviation' in statistics or probability is a measure of how spread out the numbers are. In mathematical terms, it is the square root of the mean of the squared deviations of all the numbers in the data set from the mean of that set, and it is approximately equal to the average deviation from the mean. If a set of values has a low standard deviation, most of the values are close to the mean; a high standard deviation means that the values generally differ a lot from the mean.

The variance is the standard deviation squared; that is to say, the standard deviation is the square root of the variance. To calculate the variance, take the difference between each number in the set and the mean, square that value, and do the same for every number in the set. Then take the mean of all the squares. The mean of the squared deviations from the mean is the variance, and the square root of the variance is the standard deviation. Note that since the deviations from the mean are squared, the variance, and hence the standard deviation, can never be negative.

For example, take the following data series, each of which has a mean of 3:

3, 3, 3, 3, 3, 3 - all the values equal the mean, so the difference from the mean is zero in each case, the variance is zero, and the standard deviation is the square root of zero, which is zero.

1, 3, 3, 3, 3, 5 - most of the values are the same as the mean, and the differences from the mean are small, so this series has a low standard deviation.

1, 1, 1, 5, 5, 5 - all the values are two higher or two lower than the mean, so of these series this one has the highest standard deviation.
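Those three series can be checked directly; here is a minimal Python sketch using the population definition (the mean of the squared deviations) described above:

```python
import math

def population_std(data):
    """Square root of the mean squared deviation from the mean."""
    mean = sum(data) / len(data)
    variance = sum((x - mean) ** 2 for x in data) / len(data)
    return math.sqrt(variance)

print(population_std([3, 3, 3, 3, 3, 3]))  # 0.0   - no spread at all
print(population_std([1, 3, 3, 3, 3, 5]))  # ~1.15 - small spread
print(population_std([1, 1, 1, 5, 5, 5]))  # 2.0   - largest of the three
```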


Can the mean be less than the standard deviation?

Yes. In general, a mean can be greater than, less than, or equal to the standard deviation. For example, the set {-5, 0, 5} has a mean of 0 and a sample standard deviation of 5.

Related questions

Why is standard deviation a better measure of variance?

1. Standard deviation is not a measure of variance: it is the square root of the variance.
2. The answer depends on better than what!


Can the variance ever be smaller than the standard deviation?

Yes. If the variance is less than 1, the standard deviation will be greater than the variance. For example, if the variance is 0.5, the standard deviation is sqrt(0.5), or about 0.707.
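A quick Python check of both cases; the crossover is at a variance of exactly 1, where the two are equal:

```python
import math

for variance in (0.5, 1.0, 4.0):
    std = math.sqrt(variance)
    # std > variance when variance < 1; std < variance when variance > 1.
    print(variance, std, "std > variance" if std > variance else "std <= variance")
```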


Why is variance bigger than standard deviation?

The variance is the standard deviation squared, or, in other terms, the standard deviation is the square root of the variance. In many cases this means that the variance is bigger than the standard deviation, but not always: it depends on the specific values. The variance exceeds the standard deviation exactly when the standard deviation is greater than 1.


Does variance provide more information than standard deviation?

No. Because standard deviation is simply the square root of the variance, their information content is exactly the same.


Why is standard deviation more often used than variance?

Both variance and standard deviation are measures of dispersion, or variability, in a set of data. Both measure how far the observations are scattered away from the mean (or average). To compute the variance, you take the deviation of each observation from the mean, square it, and sum all of the squared deviations. This somewhat exaggerates the true picture because the numbers become large when you square them. So we take the square root of the variance (to compensate for the excess), and this is known as the standard deviation. This is why the standard deviation is used more often than the variance, even though the standard deviation is just the square root of the variance.
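A small Python sketch of that compensation effect, with made-up observations on a scale of a few hundred: squaring the deviations inflates the numbers, and taking the square root brings the result back to the scale of the data.

```python
import math

data = [200, 210, 190, 220, 180]       # hypothetical observations
mean = sum(data) / len(data)

squared_deviations = [(x - mean) ** 2 for x in data]
variance = sum(squared_deviations) / len(data)   # population form for simplicity
std_dev = math.sqrt(variance)

print(variance)  # 200.0 - much larger than any single deviation from the mean
print(std_dev)   # ~14.1 - back on the same scale as the data
```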


Why is standard deviation a better measure of dispersion than variance?

Because it is in the same units as the original data. For example, if you have a sample of lengths, all in centimetres, the sample variance will be in units of centimetres², which might be difficult to interpret, but the sample standard deviation will be in centimetres, which is relatively easy to interpret with reference to the data.


Why will the standard deviation of a set of data always be greater than or equal to 0?

Because it is defined as the principal (non-negative) square root of the variance, and the variance, being a mean of squared deviations, can never be negative.


Can the standard deviation be greater than the mean?

Yes, the standard deviation can be greater than the mean. For example, the data set {-1, 0, 1} has a mean of 0 but a sample standard deviation of 1.