Q: Why is standard deviation a better measure of variance?

Best Answer

1. Standard deviation is not a measure of variance: it is the square root of the variance.

2. The answer depends on better than WHAT!
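
For illustration, here is a minimal Python sketch (the sample values are made up) of point 1: the standard deviation is simply the square root of the variance.

```python
import math
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # hypothetical sample values

variance = statistics.pvariance(data)  # population variance
std_dev = statistics.pstdev(data)      # population standard deviation

# The standard deviation is the square root of the variance.
assert math.isclose(std_dev, math.sqrt(variance))
print(variance, std_dev)  # variance 4, standard deviation 2
```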

Continue Learning about Math & Arithmetic

Why is standard deviation a better measure of dispersion than variance?

Because it is in the same units as the original data. For example, if you have a sample of lengths, all in centimetres, the sample variance will be in units of centimetres squared, which can be difficult to interpret, but the sample standard deviation will be in centimetres, which is easy to interpret with reference to the data.
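
As a rough sketch of the units point (hypothetical lengths, using Python's statistics module): the variance comes out in squared centimetres while the standard deviation comes out in centimetres, on the same scale as the data.

```python
import statistics

lengths_cm = [10.2, 11.5, 9.8, 10.9, 10.6]  # hypothetical lengths in centimetres

var = statistics.variance(lengths_cm)  # sample variance, in cm^2
sd = statistics.stdev(lengths_cm)      # sample standard deviation, in cm

print(f"variance = {var:.3f} cm^2")
print(f"std dev  = {sd:.3f} cm  (same units as the data)")
```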


When it comes to comparing data from different distributions, what is the benefit of the standard normal distribution?

There may or may not be a benefit: it depends on the underlying distributions. Using the standard normal distribution whatever the circumstances is naive and irresponsible. It also depends on what parameter you are testing. For comparing whether or not two distributions are the same, tests such as the Kolmogorov-Smirnov test or the chi-square goodness-of-fit test are often better. For testing the equality of variances, an F-test may be better.
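
A hedged sketch of the kind of tests mentioned, assuming SciPy is available (the two samples below are simulated, not real data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(loc=0.0, scale=1.0, size=200)  # simulated sample 1
b = rng.normal(loc=0.2, scale=1.5, size=200)  # simulated sample 2

# Kolmogorov-Smirnov: are the two samples drawn from the same distribution?
ks_stat, ks_p = stats.ks_2samp(a, b)

# F-statistic for equality of variances: ratio of the sample variances,
# referred to the F distribution with (n1 - 1, n2 - 1) degrees of freedom.
f_stat = np.var(a, ddof=1) / np.var(b, ddof=1)
dfn, dfd = len(a) - 1, len(b) - 1
f_p = 2 * min(stats.f.cdf(f_stat, dfn, dfd), stats.f.sf(f_stat, dfn, dfd))

print(f"KS p-value: {ks_p:.4f}, F-test p-value: {f_p:.4g}")
```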


Why is variance important when using statistics?

Variance is basically the raw material of statistics. If you don't have variance (differences in scores), you don't have much to work with, or for that matter much to talk or think about. Consider a test where everyone gets the same score. What does that tell you? You might have a measurement problem: the test may be so easy that everyone aces it, or so hard that everyone gets a zero. Now consider two tests on which everyone gets the same score: on the first everyone gets a 15 and on the second everyone gets a 10. That isn't telling you much, is it? These are extreme cases, but in general, more variance is better and less variance isn't so good.
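
A tiny sketch of the extreme case described above (hypothetical scores): when every score is identical the variance is zero and there is nothing to analyse.

```python
import statistics

test_one = [15] * 30               # everyone scores 15
test_two = [10] * 30               # everyone scores 10
spread   = [4, 9, 12, 15, 18, 22]  # hypothetical scores with some variation

print(statistics.pvariance(test_one))  # 0 - nothing distinguishes the test takers
print(statistics.pvariance(test_two))  # 0 - same story, just a different constant
print(statistics.pvariance(spread))    # > 0 - now there is something to analyse
```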


Why do we need the quadratic mean? How is it better than the average (arithmetic) mean?

The quadratic mean (root mean square) of the deviations is a measure of the spread of values about their arithmetic mean. By definition, the arithmetic mean of those deviations is zero and so adds no information; another measure is required, and that is the quadratic mean.
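
A short sketch (made-up values) of why the quadratic mean is needed: the plain mean of the deviations cancels to zero, while their quadratic mean (root mean square) does not; in fact it equals the population standard deviation.

```python
import math
import statistics

data = [3, 7, 7, 19]          # hypothetical values
mean = statistics.mean(data)  # 9

deviations = [x - mean for x in data]

arithmetic_mean_of_devs = sum(deviations) / len(deviations)  # always 0
quadratic_mean_of_devs = math.sqrt(
    sum(d * d for d in deviations) / len(deviations))        # root mean square

print(arithmetic_mean_of_devs)  # 0.0
print(quadratic_mean_of_devs)   # 6.0
print(statistics.pstdev(data))  # 6.0 - the same quantity
```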


What advantages does a tape measure have over a measuring cord?

Compactness: when rolled up a tape measure is physically small, whereas most are 5 metres or more long and 5 metres of cord is a big pile. Accuracy: a tape measure is stiff while a cord will stretch. Ease of use: you can hook the end of a tape measure over the object and walk to the far end to measure, whereas a cord has to be fixed at one end first (with a nail or tape).

Related questions

Why is standard deviation better than variance?

Better for what? Standard deviation is used for some calculations, variance for others.


What is a better measure of variability: range or standard deviation?

The standard deviation is better since it takes account of all the information in the data set. However, the range is quick and easy to compute.
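
A hedged illustration (made-up data) of "takes account of all the information": two samples can have exactly the same range yet very different spreads, which the standard deviation picks up.

```python
import statistics

a = [0, 5, 5, 5, 5, 10]    # values huddle around the middle
b = [0, 0, 0, 10, 10, 10]  # values pushed to the extremes

print(max(a) - min(a), max(b) - min(b))  # 10 10 - the range cannot tell them apart
print(statistics.pstdev(a))              # about 2.89
print(statistics.pstdev(b))              # 5.0 - the standard deviation can
```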


Which is a better measurement of risk -- standard deviation or coefficient of variation?

The standard deviation is an absolute measure of risk while the coefficient of variation is a relative measure. The coefficient is more useful when comparing more than one investment: because the investments have different average returns, the standard deviation alone may understate or overstate the actual risk.
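
A small worked sketch (hypothetical annual returns) of the difference: the coefficient of variation divides the standard deviation by the mean return, giving risk per unit of expected return, which makes investments with very different average returns comparable.

```python
import statistics

# Hypothetical annual returns (in percent) for two investments
investment_a = [4, 6, 5, 7, 3]       # modest return, modest spread
investment_b = [10, 25, 15, 30, 20]  # larger return, larger spread

for name, returns in [("A", investment_a), ("B", investment_b)]:
    mean = statistics.mean(returns)
    sd = statistics.stdev(returns)
    cv = sd / mean  # coefficient of variation: risk per unit of expected return
    print(f"{name}: mean = {mean}%, sd = {sd:.2f}%, cv = {cv:.2f}")
```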


When is a t test better than a z score?

When you don't have the population standard deviation but do have the sample standard deviation. If the population standard deviation is available, the z-test is preferable.
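
A rough sketch of the two situations, assuming SciPy is available (the sample and the hypothesised mean are made up): a one-sample t-test when only the sample standard deviation is available, and a z-test when the population standard deviation is genuinely known.

```python
import math
import numpy as np
from scipy import stats

sample = np.array([5.1, 4.9, 5.4, 5.0, 5.3, 4.8, 5.2])  # hypothetical measurements
mu0 = 5.0                                               # hypothesised population mean

# t-test: population standard deviation unknown, so the sample's is used
t_stat, t_p = stats.ttest_1samp(sample, mu0)

# z-test: only appropriate if the population standard deviation is actually known
sigma = 0.2                                             # assumed known population sd
z = (sample.mean() - mu0) / (sigma / math.sqrt(len(sample)))
z_p = 2 * stats.norm.sf(abs(z))                         # two-sided p-value

print(f"t-test p = {t_p:.3f}, z-test p = {z_p:.3f}")
```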


Is the coefficient of variation a better measure of risk than the standard deviation if the expected returns of the securities being compared differ significantly?

true


Which is better: a score of 92 on a test with a mean of 71 and a standard deviation of 15, or a score of 688 on a test with a mean of 493 and a standard deviation of 150?

score of 92
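
The reasoning is a z-score comparison, sketched below: 92 lies 1.4 standard deviations above its mean, while 688 lies only 1.3 above its own.

```python
def z_score(score: float, mean: float, sd: float) -> float:
    """How many standard deviations a score lies above its mean."""
    return (score - mean) / sd

z1 = z_score(92, 71, 15)     # (92 - 71) / 15  = 1.4
z2 = z_score(688, 493, 150)  # (688 - 493) / 150 = 1.3

print(z1, z2)  # 1.4 > 1.3, so 92 is the relatively better score
```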


What is mean deviation and why is quartile deviation better than mean deviation?


Is it better to calculate your IQ on the basis of a standard deviation of 15 or a standard deviation of 24?

A deviation of 15 is better.


Why do you not take the sum of absolute deviations?

You most certainly can. The standard deviation, however, has better statistical properties.
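
A short sketch (made-up data) contrasting the two: the mean absolute deviation, built from the sum of absolute deviations, is a perfectly legitimate measure of spread; the standard deviation is simply the one with the nicer mathematical properties.

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # hypothetical values
mean = statistics.mean(data)

# Mean absolute deviation: the average of |x - mean|
mad = sum(abs(x - mean) for x in data) / len(data)

# Standard deviation: square root of the average squared deviation
sd = statistics.pstdev(data)

print(mad)  # 1.5
print(sd)   # 2.0
```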


Give an example of how standard deviation can be useful. Also, why is underestimating the standard deviation, as with the Range Rule of Thumb, a better method than overestimating?

To calculate the standard deviation: find the mean of the data set; subtract the mean from each value; square each of those differences; add the squared differences and divide by the number of values (or by one less than the number of values for a sample); finally, take the square root of that result.
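
A short Python sketch of that procedure (made-up values), together with the Range Rule of Thumb estimate mentioned in the question (standard deviation roughly equals range divided by 4):

```python
import math

data = [12, 15, 17, 20, 21, 23, 26, 30]  # hypothetical values

# Step by step: mean, squared deviations, average of the squares, square root.
mean = sum(data) / len(data)
squared_devs = [(x - mean) ** 2 for x in data]
std_dev = math.sqrt(sum(squared_devs) / len(data))  # population form (divide by n)

# Range Rule of Thumb: a quick, deliberately rough estimate.
rough_estimate = (max(data) - min(data)) / 4

print(std_dev, rough_estimate)  # 5.5 4.5 - the rule of thumb underestimates here
```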


Is an IQ of 113 average for a 16-year-old female?

100 is the average IQ whatever your age. A difference of plus or minus 10 points is considered within the expected deviation. An IQ of 113 for an adult is above average, so this would be slightly better still for someone younger. A score of 140 or above (on a scale with a standard deviation of 20) is considered eligible for MENSA.
