
The standard deviation and the arithmetic mean measure two different characteristics of a set of data. The standard deviation measures how spread out the data is, whereas the arithmetic mean measures where the data is centered. Because they measure different things, there is no particular relation that must hold between them, and nothing prevents the standard deviation from being greater than the mean.

Actually, there IS a relationship between the mean and standard deviation. A high (large) standard deviation indicates a wide range of scores, i.e. a great deal of variance. Generally speaking, the greater the range of scores, the less representative the mean becomes (if we are using "mean" to indicate "normal"). Consider the following example:

10 students are given a test that is worth 100 points. Only 1 student gets a 100, 2 students receive a zero, and the remaining 7 students get a score of 50.

Arithmetic mean = (1(100) + 2(0) + 7(50)) / 10 students = (100 + 0 + 350) / 10 = 450 / 10

MEAN SCORE = 45

In statistics, the median is the value at the 50th percentile. That means that half of the scores fall below the median and the other half are above it. Using the example above, the scores are: 0, 0, 50, 50, (50, 50), 50, 50, 50, 100. The median is the score that has the same number of occurrences above it and below it. For an odd number of scores, there is exactly one in the middle, and that would be the median. In this example we have an even number of scores, so the "middle 2" scores are averaged for the median value. These "middle" scores are bracketed by parentheses in the list, and in this case are both equal to 50 (which average to 50, so the median is 50).

The standard deviation of these scores is 26.9, which indicates a fairly wide "spread" of the numbers. For a "normal" distribution, most of the scores should center around the same value (in this case 50, which is also known as the "mode", or the score that occurs most frequently), and as you move towards the extremes (very high or very low values), there should be fewer scores.
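To double-check those figures, here is a minimal sketch using Python's standard statistics module (an illustration added here, not part of the original answer). It reproduces the mean of 45, the median and mode of 50, and the population standard deviation of about 26.9:

```python
from statistics import mean, median, mode, pstdev

scores = [0, 0, 50, 50, 50, 50, 50, 50, 50, 100]

print(mean(scores))              # 45   -> arithmetic mean
print(median(scores))            # 50.0 -> average of the two middle scores
print(mode(scores))              # 50   -> most frequent score
print(round(pstdev(scores), 1))  # 26.9 -> population standard deviation
```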


Wiki User

14y ago


Related Questions

Can standard deviation be greater than mean?

Yes, the standard deviation can be greater than the mean.


What does it indicate if the mean is greater than the standard deviation?

It does not indicate anything if the mean is greater than the standard deviation.


Can the mean be less than the standard deviation?

In general, a mean can be greater or less than the standard deviation.


How do you calculate mean and median smaller than standard deviation?

In the same way that you calculate mean and median that are greater than the standard deviation!


Is the mean for a set of data always greater than the standard deviation?

No. The standard deviation is the square root of the variance, not of the mean, and it can be either larger or smaller than the mean. For example, data centered near zero can have a mean close to zero and a much larger standard deviation.
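As a quick illustration (hypothetical data made up here, not from the question), a sample centered near zero has a mean smaller than its standard deviation:

```python
from statistics import mean, pstdev

data = [-10, 0, 10]            # hypothetical data centered on zero
print(mean(data))              # 0     -> the mean
print(round(pstdev(data), 3))  # 8.165 -> the standard deviation, clearly larger
```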


Is mean deviation less than standard deviation?

Yes, the mean deviation is typically less than or equal to the standard deviation for a given dataset. The mean deviation measures the average absolute deviations from the mean, while the standard deviation takes into account the squared deviations, which can amplify the effect of outliers. Consequently, the standard deviation is usually greater than or equal to the mean deviation, but they can be equal in certain cases, such as when all data points are identical.
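A short sketch, reusing the test-score data from the example above, shows the mean absolute deviation coming out smaller than the standard deviation:

```python
from statistics import mean, pstdev

data = [0, 0, 50, 50, 50, 50, 50, 50, 50, 100]
m = mean(data)
mad = mean(abs(x - m) for x in data)  # mean absolute deviation from the mean

print(round(mad, 1))           # 18.0 -> mean deviation
print(round(pstdev(data), 1))  # 26.9 -> standard deviation, larger because
                               #         squaring weights the outliers more
```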


What are all the values a standard deviation can take?

The standard deviation must be greater than or equal to zero.


Why is the standard error a smaller numerical value compared to the standard deviation?

Let sigma = standard deviation. Standard error (of the sample mean) = sigma / square root of (n), where n is the sample size. Since, for any sample size n greater than 1, you are dividing the standard deviation by a number greater than 1, the standard error is smaller than the standard deviation.
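As a small numerical sketch (reusing the hypothetical figures from the test-score example above):

```python
import math

sigma = 26.9  # standard deviation of the scores above
n = 10        # sample size

standard_error = sigma / math.sqrt(n)
print(round(standard_error, 2))  # 8.51 -> smaller than sigma
```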


What is the difference between a general normal curve and a standard normal curve?

A standard normal distribution has a mean of zero and a standard deviation of 1. A normal distribution can have any real number as a mean and the standard deviation must be greater than zero.


Can coefficient of variation be greater than 100?

Yes, the coefficient of variation (CV) can be greater than 100%. The CV is calculated as the ratio of the standard deviation to the mean, expressed as a percentage. If the standard deviation is greater than the mean, which can occur in certain datasets, the CV will exceed 100%, indicating high relative variability compared to the average value.
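A small sketch with a made-up, highly skewed dataset shows the coefficient of variation exceeding 100%:

```python
from statistics import mean, pstdev

data = [1, 1, 1, 1, 50]               # hypothetical, highly skewed data
cv = pstdev(data) / mean(data) * 100  # coefficient of variation in percent

print(round(mean(data), 1))    # 10.8
print(round(pstdev(data), 1))  # 19.6  -> greater than the mean
print(round(cv, 1))            # 181.5 -> well above 100%
```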


Is the standard deviation best thought of as the distance from the mean?

No. The standard deviation is not the distance of any particular value from the mean; individual points can lie much closer to, or much farther from, the mean than one standard deviation, regardless of whether the mean is large or small. Standard deviation is best thought of as spread or dispersion.


What is the z score of 1.0?

It is the value that is one standard deviation greater than the mean of a Normal (Gaussian) distribution.
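As a rough sketch (reusing the hypothetical test-score mean and standard deviation from above), converting a z-score of 1.0 back to a raw value looks like this:

```python
mu, sigma = 45, 26.9  # mean and standard deviation from the test-score example

value_at_z1 = mu + 1.0 * sigma     # the value one standard deviation above the mean
print(value_at_z1)                 # 71.9
print((value_at_z1 - mu) / sigma)  # ~1.0 -> back to the z-score
```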