Best Answer

You cannot use deviations from the mean because (by definition) their sum is zero.

Absolute deviations are one way of getting around that problem, and they are used in practice. Their main drawback is that they weight deviations linearly: one deviation of size 2d counts exactly the same as two deviations of size d. That model may be appropriate in some cases.

But in many cases big deviations are much more serious than that, and then a squared version is more appropriate: a deviation of 2d contributes four times as much as a deviation of d.

Conveniently, the squared version is also a feature of many parametric statistical distributions, so the distribution of the "sum of squares" is well studied and understood.
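The three candidates above can be compared directly. A minimal sketch in plain Python (the data values are made up for illustration):

```python
# Compare the three candidate measures of spread on made-up data.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mean = sum(data) / len(data)                 # 5.0

deviations = [x - mean for x in data]
print(sum(deviations))                       # 0.0 -- cancels by definition
print(sum(abs(d) for d in deviations))       # 12.0 -- sum of absolute deviations
print(sum(d * d for d in deviations))        # 32.0 -- sum of squared deviations
```

The raw deviations carry no information about spread (they always sum to zero), while the absolute and squared sums both grow with dispersion; the squared sum penalises the large deviations (the 9.0 here) much more heavily.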

Wiki User

11y ago
Q: Why use the squared version of the sum of deviations from the mean?
Related questions

What is the sum of the deviations from the mean?

The sum of the deviations from the mean is always zero; it is the sum of the squared deviations that is used to measure spread.


The sum of the deviations about the mean always equals what?

The sum of the deviations about the mean always equals zero. (It is the sum of their squares, not the deviations themselves, that gives the total variation.)


What does the sum of the deviations from the mean equal?

Zero.


The sum of the deviations from the mean is always?

0 (zero).


How do you calculate the sum of the deviations from the mean?

Subtract the mean from each value and add up the differences. The result is always zero, whatever the data set.


The sum of deviations of the individual data elements from their mean is?

zero


For which measure of central tendency will the sum of the deviations always be zero?

Mean


The sum of the deviations from the mean is always zero?

The mean x̄ of a set of data is defined as the sum of all the values divided by the number of observations: x̄ = Σxi/n. Each deviation is the difference xi − x̄. When the deviations are added up, the positive and negative deviations cancel exactly, because Σ(xi − x̄) = Σxi − n·x̄, and n·x̄ equals Σxi by the definition of the mean. Hence Σ(xi − x̄) = 0.
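The cancellation Σ(xi − x̄) = Σxi − n·x̄ = 0 can be checked numerically. A small sketch with arbitrary values:

```python
data = [1.0, 3.0, 8.0]            # arbitrary values
n = len(data)
mean = sum(data) / n              # x̄ = Σxi / n = 4.0

# Left side: sum the deviations directly.
lhs = sum(x - mean for x in data)
# Right side: Σxi − n·x̄, which is Σxi − Σxi by definition of the mean.
rhs = sum(data) - n * mean
print(lhs, rhs)                   # both 0.0
```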


What is the Sum of deviation from the mean is?

The sum of deviations from the mean, for any set of numbers, is always zero. For this reason it is quite useless as a measure of spread.


What is the sum of the squared deviations from the mean divided by the count minus one?

The sample variance.


When computing the sample variance the sum of squared deviations about the mean is used for what reason?

You want some measure of how the observations are spread about the mean. If you used the deviations their sum would be zero which would provide no useful information. You could use absolute deviations instead. The sum of squared deviations turns out to have some useful statistical properties including a relatively simple way of calculating it. For example, the Gaussian (or Normal) distribution is completely defined by its mean and variance.
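The "relatively simple" calculation mentioned above is just the sum of squared deviations divided by n − 1. A sketch (data values are made up), cross-checked against Python's standard `statistics` module, which uses the same n − 1 divisor:

```python
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mean = sum(data) / len(data)

# Sample variance: sum of squared deviations about the mean, over n - 1.
sample_var = sum((x - mean) ** 2 for x in data) / (len(data) - 1)
print(sample_var)                    # 32/7, about 4.571
print(statistics.variance(data))     # same value
```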


Can the standard deviation or variance be negative?

No, a standard deviation or variance can never be negative. The deviations from the mean are squared in the formula, and a sum of squares cannot be negative; squaring removes the signs. In the mean absolute deviation the signs of the deviations are simply ignored instead of being squared away (the deviations are not squared there), which likewise guarantees a non-negative result.