Best Answer

Generally not, without further justification. Extreme values are often called outliers. Eliminating unusually high values will lower the standard deviation, so removal should be justified on its own merits (for example, a known measurement or recording error), not done simply to shrink the spread. You may want to calculate the standard deviation with and without the extreme values to identify their impact on your results.
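The with-and-without comparison suggested above can be sketched in a few lines of Python. The data here are hypothetical, chosen so that one value (1000) is clearly extreme:

```python
import statistics

# Hypothetical sample: nine typical values plus one extreme value (1000).
data = [10, 12, 11, 13, 12, 11, 10, 13, 12, 1000]
trimmed = [x for x in data if x != 1000]

sd_all = statistics.stdev(data)         # sample standard deviation, outlier included
sd_trimmed = statistics.stdev(trimmed)  # sample standard deviation, outlier removed

# Removing the extreme value lowers the standard deviation dramatically.
print(sd_all, sd_trimmed)
```

With the outlier the standard deviation is in the hundreds; without it, close to 1. Comparing the two numbers quantifies how much a single extreme value dominates the spread.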

Q: When computing standard deviation, should you eliminate the extreme values?


Continue Learning about Math & Arithmetic

The reason the standard deviation of a distribution of means is smaller than the standard deviation of the population from which it was derived is actually quite logical. Keep in mind that standard deviation is the square root of variance, and variance is simply an expression of the variation among values in the population.

Each mean within the distribution of means comes from a sample of values taken randomly from the population. While it is possible for a random sample of multiple values to come entirely from one extreme of the population distribution, it is unlikely. Generally, each sample will contain some values from the lower end of the distribution, some from the higher end, and most from near the middle. In most cases, the values within each sample balance out and average to somewhere near the middle of the population distribution, so the mean of each sample is likely to be close to the population mean and unlikely to be extreme in either direction.

Because most of the means in a distribution of means fall closer to the population mean than many individual values do, there is less variation among the means than among the individual values in the population. Less variation means a lower variance, and thus the square root of that variance - the standard deviation of the distribution of means - is less than the standard deviation of the population from which it was derived.
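The argument above can be checked by simulation. The population below is hypothetical (normal with mean 50 and standard deviation 10); the standard deviation of the sample means comes out much smaller than the population's, close to the theoretical value s / sqrt(n):

```python
import random
import statistics

random.seed(0)

# Hypothetical population: 10,000 values from a normal(50, 10) distribution.
population = [random.gauss(50, 10) for _ in range(10_000)]
pop_sd = statistics.pstdev(population)

# Distribution of means: 1,000 sample means, each from a random sample of n = 25.
n = 25
sample_means = [statistics.mean(random.sample(population, n)) for _ in range(1_000)]
means_sd = statistics.pstdev(sample_means)

# The spread of the means is far smaller than the spread of the population,
# and close to pop_sd / sqrt(n).
print(pop_sd, means_sd)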

Because the standard deviation is based on the square root of the sum of the squared deviations, and squaring gives extreme values (outliers) disproportionately more weight than a measure based on simple absolute differences, such as the mean absolute deviation, would.

It depends entirely on the variance (or standard error).

The z-score is used to convert a variable with a Gaussian (normal) distribution with mean m and standard deviation s to a variable with a standard normal distribution. Since the latter is tabulated, the probability of an outcome at least as extreme as the one observed is easily obtained.

no

Related questions

The median is least affected by an extreme outlier. Mean and standard deviation ARE affected by extreme outliers.
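The claim above is easy to demonstrate. With one hypothetical outlier added, the median is unchanged while the mean and standard deviation jump:

```python
import statistics

# Hypothetical data with and without one extreme outlier.
clean = [4, 5, 5, 6, 6, 7, 8]
with_outlier = clean + [500]

# The median barely moves; the mean and standard deviation jump.
print(statistics.median(clean), statistics.median(with_outlier))
print(statistics.mean(clean), statistics.mean(with_outlier))
print(statistics.stdev(clean), statistics.stdev(with_outlier))
```

Here the median stays at 6 in both cases, while the mean moves from about 5.9 to about 67.6 and the standard deviation grows by more than a factor of 100.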



It would mean that the result was 2 standard deviations above the mean. Depending on the distribution of the variable, it may be possible to attach a probability to this or a more extreme observation.


When you are looking for a simple measure of the spread of the data, but one which is protected from the effects of extreme values (outliers).
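A common outlier-resistant spread measure fitting this description is the interquartile range (IQR). As a sketch with hypothetical data, using Python's `statistics.quantiles`:

```python
import statistics

# Hypothetical data: the same values, with and without one extreme outlier.
clean = [12, 14, 15, 15, 16, 17, 18, 19, 20, 21]
with_outlier = clean + [400]

def iqr(data):
    """Interquartile range: Q3 - Q1, a spread measure resistant to outliers."""
    q1, _, q3 = statistics.quantiles(data, n=4)
    return q3 - q1

# The standard deviation is inflated by the outlier; the IQR barely moves.
print(iqr(clean), iqr(with_outlier))
print(statistics.stdev(clean), statistics.stdev(with_outlier))
```

Because the IQR depends only on the middle half of the sorted data, a single extreme value at either end cannot move it much, whereas the standard deviation can grow by an order of magnitude.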


If you have a variable X distributed with mean m and standard deviation s, then the z-score is z = (X - m)/s. If X is normally distributed, or is the mean of a sufficiently large random sample, then Z has a standard normal distribution: that is, a Gaussian distribution with mean 0 and variance 1. The cumulative distribution of Z is tabulated, so you can look up the probability of observing a value at least as extreme.
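The z-score calculation above can be sketched with Python's `statistics.NormalDist`, which plays the role of the printed table. The mean, standard deviation, and observed value here are hypothetical:

```python
from statistics import NormalDist

# Hypothetical example: X has mean m = 100 and standard deviation s = 15,
# and we observe x = 130.
m, s = 100, 15
x = 130

z = (x - m) / s                    # z-score: (130 - 100) / 15 = 2.0
# Probability of a value at least this extreme in the upper tail,
# from the standard normal distribution.
p_upper = 1 - NormalDist().cdf(z)
print(z, p_upper)
```

A z-score of 2 corresponds to an upper-tail probability of about 0.023, which matches the usual standard normal table entry for z = 2.00.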
