Best Answer

Standard deviation is a measure of the dispersion of the data. When the standard deviation is greater than the mean, the coefficient of variation (standard deviation divided by the mean) is greater than one. See: http://en.wikipedia.org/wiki/Coefficient_of_variation If you assume the data are normally distributed, then the lower limit of the interval mean ± one standard deviation (which covers about 68% of a normal distribution) will be negative. If negative values are not realistic, then the assumption of a normal distribution may be in error and you should consider other distributions. Common distributions that take no negative values are the gamma, lognormal and exponential.
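A minimal Python sketch of that check, using an invented all-positive sample: when the standard deviation exceeds the mean, the coefficient of variation comes out above 1 and the lower one-standard-deviation limit goes negative even though the data cannot.

```python
# Sketch (invented data): sd > mean implies CV > 1, and the
# mean - 1*sd limit goes negative despite all-positive values.
import statistics

data = [0.2, 0.5, 1.1, 0.3, 4.8, 0.9, 2.6, 0.1, 7.4, 0.6]

mean = statistics.mean(data)
sd = statistics.stdev(data)   # sample standard deviation
cv = sd / mean                # coefficient of variation

print(f"mean = {mean:.3f}, sd = {sd:.3f}, CV = {cv:.3f}")
print("lower 1-sigma limit:", mean - sd)  # negative, yet every data point is positive
```

Skewed, positive-only data like this is exactly the case where a gamma or lognormal model may fit better than a normal one.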

Q: How does one interpret a standard deviation which is more than the mean?


Continue Learning about Statistics

It is a measure of the spread of the distribution. The greater the standard deviation, the more variability there is in the observations.

Standard deviation calculation is somewhat involved; please refer to the site below for more information.

The standard deviation has the same measurement units as the variable and is, therefore, more easily comprehended.

The more precise a result, the smaller the standard deviation of the data on which it is based.

The standard deviation of a distribution measures the typical spread of the data about the mean (average). If I told you I had a distribution of data with mean 10000 and standard deviation 10, you'd know that most of the data is close to the middle. If I told you I had a distribution of data with mean 10000 and standard deviation 3000, you'd know that the data in this distribution is much more spread out.
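To make that concrete, here is a small Python sketch (with invented numbers) comparing two datasets that share a mean of 10000 but have very different spreads:

```python
# Sketch (invented data): same mean, very different standard deviations.
import statistics

tight = [9990, 9995, 10000, 10005, 10010]    # hugs the mean
loose = [4000, 7000, 10000, 13000, 16000]    # same mean, far more spread

tight_sd = statistics.stdev(tight)
loose_sd = statistics.stdev(loose)

print("means:", statistics.mean(tight), statistics.mean(loose))
print("sds:  ", round(tight_sd, 1), round(loose_sd, 1))
```

Both means are 10000, but the second standard deviation is hundreds of times larger, matching the intuition above.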

Related questions

B because the spread, in this case standard deviation, is larger.

The standard deviation is a number that tells you how scattered the data are about the arithmetic mean. The mean tells you nothing about the consistency of the data. The dataset with the lower standard deviation is less scattered and can be regarded as more consistent.

If I have understood the question correctly, despite your challenging spelling, the standard deviation is the square root of the average of the squared deviations, while the mean absolute deviation is the average of the absolute deviations. One consequence of this difference is that a large deviation affects the standard deviation more than it affects the mean absolute deviation.
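A quick sketch of that sensitivity, using invented data: pushing a single value far from the rest inflates the standard deviation by a larger factor than it inflates the mean absolute deviation.

```python
# Sketch (invented data): one extreme value hits the standard
# deviation harder than the mean absolute deviation.
import statistics

def mad(values):
    """Mean absolute deviation about the arithmetic mean."""
    m = statistics.mean(values)
    return sum(abs(x - m) for x in values) / len(values)

base = list(range(1, 11))             # 1..10
spiked = list(range(1, 10)) + [100]   # same data, one value pushed far out

sd_ratio = statistics.pstdev(spiked) / statistics.pstdev(base)
mad_ratio = mad(spiked) / mad(base)

print("sd grew by  x", round(sd_ratio, 2))
print("mad grew by x", round(mad_ratio, 2))
```

The squaring inside the standard deviation is what gives the outlier its outsized influence.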

Standard deviation tells you how spread out the set of scores is with respect to the mean. It measures the variability of the data. A small standard deviation implies that the data are close to the mean/average (+ or - a small range); the larger the standard deviation, the more dispersed the data are from the mean.

Without more information, it isn't possible to determine the standard deviation from the mean alone.

A large standard deviation means that the data are spread out. Whether a particular standard deviation counts as "large" is relative, but a larger standard deviation always means the data are more spread out than with a smaller one. For example, if the mean were 60 and the standard deviation 1, that would be a small standard deviation: the data are not spread out, and a score of 74 or 43 would be highly unlikely, almost impossible. However, if the mean were 60 and the standard deviation 20, that would be a large standard deviation: the data are much more spread out, and a score of 74 or 43 wouldn't be odd or unusual at all.
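The example above can be restated with z-scores (distance from the mean in standard-deviation units), sketched here in Python:

```python
# Sketch: z-scores for the scores 74 and 43 around a mean of 60,
# under the two spreads from the example above.
def z(score, mean, sd):
    """How many standard deviations `score` lies from `mean`."""
    return (score - mean) / sd

for sd in (1, 20):
    print(f"sd={sd}: z(74)={z(74, 60, sd):+.2f}, z(43)={z(43, 60, sd):+.2f}")
```

With sd = 1 the scores sit 14 and 17 standard deviations out (essentially impossible); with sd = 20 they sit well under one standard deviation out (entirely ordinary).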

If "standard" is meant to be standard deviation, the answer is the second.

It is not. And that is because the mean deviation of ANY variable is 0 and you cannot divide by 0.

You need more than one number to calculate a sample standard deviation, so the single number 9 does not have one.

There is 1) the standard deviation, 2) the mean deviation and 3) the mean absolute deviation. The standard deviation is calculated most of the time. If our objective is to estimate the variance of the overall population from a representative random sample, then it has been shown theoretically that the standard deviation is the best (most efficient) estimator.

The mean deviation is calculated by first calculating the mean of the data and then calculating the deviation (value - mean) for each value. If we then average these deviations, we obtain the mean deviation, which is always zero, so this statistic has little value. The individual deviations may, however, be of interest. See related link.

To obtain the mean absolute deviation (MAD), we average the absolute values of the individual deviations. We obtain a value similar to the standard deviation: a measure of the dispersal of the data values. The MAD may be transformed to a standard deviation if the distribution is known. The MAD has been shown to be a less efficient estimator of the standard deviation, but a more robust one (not as influenced by erroneous data). See related link.

Most of the time we use the standard deviation to provide the best estimate of the variance of the population.
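The three measures described above can be computed side by side; this sketch uses an invented sample to show that the mean deviation is identically zero while the MAD and standard deviation both capture spread.

```python
# Sketch (invented data): mean deviation, mean absolute deviation,
# and population standard deviation of the same sample.
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
m = statistics.mean(data)                              # 5.0

mean_dev = sum(x - m for x in data) / len(data)        # always zero
mad = sum(abs(x - m) for x in data) / len(data)        # mean absolute deviation
sd = statistics.pstdev(data)                           # population standard deviation

print("mean deviation:", mean_dev)
print("MAD:           ", mad)
print("std deviation: ", sd)
```

Here the mean deviation is exactly 0, while the MAD (1.5) and the standard deviation (2.0) are similar in size but not equal, as the answer above notes.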

Yes. Standard deviation depends entirely upon the distribution; it is a measure of how spread out it is (i.e. how far from the mean, on average, the data lie): the larger it is, the more spread out the data are, and the smaller it is, the less. If every data point were the mean, the standard deviation would be zero!

No, if the standard deviation is small the data is less dispersed.
