Q: Does variance provide more information than standard deviation?

Best Answer

No. Because standard deviation is simply the square root of the variance, their information content is exactly the same.
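A minimal sketch in Python (standard library only, with made-up data) illustrating the point: the standard deviation is just the square root of the variance, so knowing one is equivalent to knowing the other.

```python
import math
import statistics

# Illustrative data (made up for this example)
data = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9]

variance = statistics.variance(data)   # sample variance
std_dev = statistics.stdev(data)       # sample standard deviation

# The standard deviation is exactly the square root of the variance,
# so neither carries more information than the other.
assert math.isclose(std_dev, math.sqrt(variance))
print(f"variance = {variance:.4f}, standard deviation = {std_dev:.4f}")
```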

Related questions

How do you calculate probability given mean and standard deviation?

The mean and standard deviation do not, by themselves, provide enough information to calculate probability. You also need to know the distribution of the variable in question.
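For instance, if one is willing to assume that the variable is normally distributed (an assumption, not something the mean and standard deviation themselves tell you), probabilities can be computed. A minimal sketch in Python using scipy.stats, with made-up numbers:

```python
from scipy.stats import norm

# Hypothetical mean and standard deviation, assuming a normal distribution
mean, std_dev = 100.0, 15.0

# P(X <= 120) under the assumed normal distribution
p = norm.cdf(120, loc=mean, scale=std_dev)
print(f"P(X <= 120) = {p:.4f}")  # about 0.909
```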


How do standard deviation and mean deviation differ from each other?

There are three related measures: 1) the standard deviation, 2) the mean deviation and 3) the mean absolute deviation. The standard deviation is the one calculated most of the time. If our objective is to estimate the variance of the overall population from a representative random sample, it has been shown theoretically that the standard deviation is the best (most efficient) estimator.

The mean deviation is calculated by first calculating the mean of the data and then calculating the deviation (value - mean) for each value. If we then sum these deviations, the result is always zero, so this statistic has little value on its own. The individual deviations may, however, be of interest. See related link.

To obtain the mean absolute deviation (MAD), we sum the absolute values of the individual deviations (and divide by the number of values). This gives a value that, like the standard deviation, is a measure of the dispersal of the data values. The MAD may be transformed to a standard deviation if the distribution is known. The MAD has been shown to be less efficient in estimating the standard deviation, but a more robust estimator (not as influenced by erroneous data) than the standard deviation. See related link.

Most of the time we use the standard deviation to provide the best estimate of the variance of the population.
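A short sketch in Python (made-up data) showing these quantities side by side; note that the signed deviations sum to zero, while the MAD and the standard deviation both measure spread:

```python
import statistics

# Illustrative data (made up for this example)
data = [12.0, 15.0, 14.0, 10.0, 9.0, 15.0]
mean = statistics.mean(data)

deviations = [x - mean for x in data]
sum_of_deviations = sum(deviations)                 # always zero
mad = sum(abs(d) for d in deviations) / len(data)   # mean absolute deviation
std_dev = statistics.stdev(data)                    # sample standard deviation

print(f"sum of deviations = {sum_of_deviations}")   # 0.0
print(f"mean absolute deviation = {mad:.3f}")       # about 2.167
print(f"standard deviation = {std_dev:.3f}")        # about 2.588
```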


What is the sample variance with the mean of 190.3?

The mean, by itself, does not provide sufficient information to make any assessment of the sample variance.


Why take the square in the formula for standard deviation?

The sum of the deviations from the mean is always 0 and so provides no useful information. Taking the absolute value of each deviation is one solution to that; the other is to square each deviation and then, at the end, take the square root of their average, which is what the standard deviation does.
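A tiny numeric sketch in Python (made-up values) of the cancellation problem and how squaring avoids it:

```python
import math

# Tiny made-up data set to illustrate why squaring is needed
data = [2.0, 4.0, 9.0]
mean = sum(data) / len(data)            # 5.0

deviations = [x - mean for x in data]   # [-3.0, -1.0, 4.0]
print(sum(deviations))                  # 0.0: positive and negative deviations cancel

squared = [d ** 2 for d in deviations]  # [9.0, 1.0, 16.0]: no cancellation
variance = sum(squared) / (len(data) - 1)   # sample variance = 13.0
std_dev = math.sqrt(variance)               # back in the original units, about 3.606
print(variance, std_dev)
```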


What is one standard deviation above the mean of 700?

One standard deviation above the mean is 700 + σ, so the mean alone is not enough to provide an answer: the standard deviation is also needed.


What is the disadvantage of mean deviation?

The disadvantage is that the mean deviation of a set of data is always zero and so does not provide any useful information.


What does standard deviation provide when measuring the range of possible outcomes of a distribution?

It is a measure of the spread of the outcomes around the mean value.


Does the variance provide a succinct summary of raw data?

Your question is a bit difficult to answer, as "succinct" usually refers to a description or explanation; Webster's dictionary defines it as "marked by compact precise expression without wasted words." See related link. For this reason, I have reworded your question as follows: does the variance fully describe or summarize the raw data?

The answer is no. For any set of data, many statistical measures can be calculated, including the mean and variance. The variance, or more commonly its square root (the standard deviation), is very useful in describing the dispersion of the data, but it is incomplete as a full description of the data. The mean is also important, and graphs can summarize the data in a more visual manner.
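A small illustration of why the variance (even together with the mean) is an incomplete summary: the two made-up data sets below share the same mean and the same sample variance, yet they are clearly not the same data.

```python
import statistics

# Two made-up data sets with the same mean and the same sample variance
a = [1, 3, 5, 7, 9]
b = [2, 3, 4, 6, 10]

for name, data in (("a", a), ("b", b)):
    print(name,
          "mean =", statistics.mean(data),                 # 5 for both
          "sample variance =", statistics.variance(data))  # 10 for both

# The two sets differ (b is skewed to the right), so the mean and
# variance alone do not fully describe the raw data.
```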


If the mean is 1050 and the standard deviation is 218, what is the consumption level separating the bottom 45 percent from the top 55 percent?

The answer will depend on what the distribution is. Since that crucial piece of information has not been given, no definite answer can be provided.
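If one were willing to assume a normal distribution (an assumption the question does not state), the cutoff would simply be the 45th percentile of that distribution. A minimal sketch with scipy.stats:

```python
from scipy.stats import norm

# Assuming consumption is normally distributed with the given mean and sd
mean, std_dev = 1050, 218

# The 45th percentile separates the bottom 45% from the top 55%
cutoff = norm.ppf(0.45, loc=mean, scale=std_dev)
print(round(cutoff, 1))  # roughly 1022.6 under the normality assumption
```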


How does sample size impact the standard deviation?

If I take 10 items (a small sample) from a population and calculate the standard deviation, then take 100 items (a larger sample) and calculate the standard deviation, how will my statistics change? The smaller sample could have a standard deviation that is higher than, lower than, or about equal to that of the larger sample. It is even possible that the smaller sample is, by chance, closer to the standard deviation of the population.

However, a properly taken larger sample will, in general, give a more reliable estimate of the population standard deviation than a smaller one. There are mathematical results showing that, in the long run, larger samples provide better estimates. This is generally, but not always, true: if the population is changing while you are collecting data, a very large sample may not be representative because it takes time to collect.
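A quick simulation sketch in Python (made-up population parameters) showing how the variability of the standard-deviation estimate shrinks as the sample size grows:

```python
import random
import statistics

random.seed(0)

# Simulated population: normal with mean 100 and standard deviation 5 (made up)
def sd_estimates(sample_size, trials=1000):
    estimates = []
    for _ in range(trials):
        sample = [random.gauss(100, 5) for _ in range(sample_size)]
        estimates.append(statistics.stdev(sample))
    return estimates

for n in (10, 100):
    ests = sd_estimates(n)
    print(f"n={n:4d}: mean estimate = {statistics.mean(ests):.2f}, "
          f"spread of estimates = {statistics.stdev(ests):.2f}")
# The estimates based on n=100 cluster much more tightly around the
# true value of 5 than those based on n=10.
```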


What is rate measure and calculation of errors?

Standard error (statistics), from Wikipedia, the free encyclopedia:

[Figure: the proportion of samples that would fall within 0, 1, 2, and 3 standard deviations above and below the actual value, for a value sampled with an unbiased, normally distributed error.]

The standard error is a method of measurement or estimation of the standard deviation of the sampling distribution associated with the estimation method.[1] The term may also be used to refer to an estimate of that standard deviation, derived from a particular sample used to compute the estimate.

For example, the sample mean is the usual estimator of a population mean. However, different samples drawn from that same population would in general have different values of the sample mean. The standard error of the mean (i.e., of using the sample mean as a method of estimating the population mean) is the standard deviation of those sample means over all possible samples (of a given size) drawn from the population. Secondly, the standard error of the mean can refer to an estimate of that standard deviation, computed from the sample of data being analyzed at the time.

A way of remembering the term standard error is that, as long as the estimator is unbiased, the standard deviation of the error (the difference between the estimate and the true value) is the same as the standard deviation of the estimates themselves; this is true because the standard deviation of the difference between a random variable and its expected value equals the standard deviation of the random variable itself.

In practical applications, the true value of the standard deviation (of the error) is usually unknown. As a result, the term standard error is often used to refer to an estimate of this unknown quantity. In such cases it is important to be clear about what has been done and to attempt to take proper account of the fact that the standard error is only an estimate. Unfortunately, this is not often possible and it may then be better to use an approach that avoids using a standard error, for example by using maximum likelihood or a more formal approach to deriving confidence intervals. One well-known case where a proper allowance can be made arises where Student's t-distribution is used to provide a confidence interval for an estimated mean or difference of means. In other cases, the standard error may usefully be used to provide an indication of the size of the uncertainty, but its formal or semi-formal use to provide confidence intervals or tests should be avoided unless the sample size is at least moderately large. Here "large enough" would depend on the particular quantities being analyzed (see power).

In regression analysis, the term "standard error" is also used in the phrase standard error of the regression to mean the ordinary least squares estimate of the standard deviation of the underlying errors.[2][3]
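A minimal sketch in Python (made-up population: normal with mean 50 and standard deviation 8) comparing the usual estimate of the standard error of the mean, s divided by the square root of n, with the actual spread of sample means over repeated sampling:

```python
import random
import statistics

random.seed(1)

# One observed sample of size 40 (made-up data)
sample = [random.gauss(50, 8) for _ in range(40)]

# Standard error of the mean estimated from this single sample: s / sqrt(n)
sem = statistics.stdev(sample) / len(sample) ** 0.5
print(f"estimated standard error of the mean: {sem:.3f}")

# For comparison: the standard deviation of sample means over many samples
means = [statistics.mean(random.gauss(50, 8) for _ in range(40)) for _ in range(2000)]
print(f"empirical sd of sample means:         {statistics.stdev(means):.3f}")
# Both values should be close to 8 / sqrt(40), i.e. about 1.26
```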


How many digits are in a standard US?

The answer will depend on a standard US what. Since that crucial piece of information is missing from the question, no useful answer can be provided.