Q: Definition for Sum of Absolute Deviations?

Best Answer

The answer depends on absolute deviation from what: the mean, median or some other measure.

Suppose you have n observations, x1, x2, ... xn and you wish to calculate the sum of the absolute deviation of these observations from some fixed number c.

The deviation of x1 from c is (x1 - c).

The absolute deviation of x1 from c is |x1 - c|. This is the non-negative value of (x1 - c). That is,

if (x1 - c) ≥ 0 then |x1 - c| = (x1 - c)

while

if (x1 - c) < 0 then |x1 - c| = -(x1 - c).

The sum of absolute deviations is then these absolute values, |x1 - c|, |x2 - c|, ..., |xn - c|, added together.
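As a quick illustration, here is a minimal Python sketch (the function name, the data and the choice of c are purely illustrative, not part of the original answer) that computes the sum of absolute deviations from a fixed centre c:

    def sum_absolute_deviations(xs, c):
        # Sum of |x - c| over the observations, for a fixed centre c
        return sum(abs(x - c) for x in xs)

    data = [2, 4, 4, 4, 5, 5, 7, 9]
    print(sum_absolute_deviations(data, 5))                      # deviations from c = 5
    print(sum_absolute_deviations(data, sum(data) / len(data)))  # deviations from the mean

Choosing c to be the median minimises this sum, while choosing c to be the mean minimises the sum of squared deviations.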

Related questions

What is the sum of the deviations from the mean?

The sum of the deviations from the mean is always zero: the positive and negative deviations cancel exactly.


The sum of the deviations from the mean is always zero?

The definition of the mean x̄ of a set of data is the sum of all the values divided by the number of observations, x̄ = Σxi/n. The deviations are found by subtracting x̄ from each value. When these deviations are added up, the sum is always zero, because the negative deviations exactly cancel the positive ones: Σ(xi - x̄) = Σxi - n·x̄ = Σxi - Σxi = 0.
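A minimal numerical check (the data here are arbitrary, purely for illustration) shows the cancellation:

    data = [2, 4, 4, 4, 5, 5, 7, 9]
    mean = sum(data) / len(data)
    deviations = [x - mean for x in data]
    print(sum(deviations))   # 0.0, up to floating-point rounding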


Why do you not take the sum of absolute deviations?

You most certainly can. The standard deviation, however, has better statistical properties.


The sum of the deviations about the mean always equals what?

The sum of the deviations about the mean always equals zero. It is the sum of their squares, not of the deviations themselves, that measures the total variation.


Why use the squared version of the sum of deviations from the mean?

You cannot use the raw deviations from the mean because (by definition) their sum is zero. Absolute deviations are one way of getting around that problem, and they are used. Their main drawback is that they treat deviations linearly: a deviation twice as large counts only twice as much, so one large deviation carries the same weight as two deviations of half its size. That model may be appropriate in some cases, but in many cases big deviations are much more serious than that, and a squared version is more appropriate. Conveniently, the squared version is also a feature of many parametric statistical distributions, so the distribution of the "sum of squares" is well studied and understood.
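A small sketch (arbitrary numbers, only to illustrate the weighting) compares the two penalties:

    # One deviation of size 4 versus two deviations of size 2
    print(abs(4), abs(2) + abs(2))   # 4 and 4: equal weight under absolute deviations
    print(4 ** 2, 2 ** 2 + 2 ** 2)   # 16 and 8: the single large deviation counts
                                     # double once the deviations are squared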


Can the standard deviation or variance be negative?

No, neither the standard deviation nor the variance can be negative. The deviations from the mean are squared in the formula, and squares are never negative, so their sum (and its square root) cannot be negative either. In the mean absolute deviation the signs are simply ignored rather than removed by squaring, and the result is likewise never negative.


When computing the sample variance the sum of squared deviations about the mean is used for what reason?

You want some measure of how the observations are spread about the mean. If you used the raw deviations, their sum would be zero, which would provide no useful information. You could use absolute deviations instead. The sum of squared deviations, however, turns out to have some useful statistical properties, including a relatively simple way of calculating it. For example, the Gaussian (or Normal) distribution is completely defined by its mean and variance.
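As a sketch (the function name and data are illustrative, not a standard library routine), the sample variance is the sum of squared deviations about the mean divided by n - 1:

    def sample_variance(xs):
        mean = sum(xs) / len(xs)
        ss = sum((x - mean) ** 2 for x in xs)   # sum of squared deviations about the mean
        return ss / (len(xs) - 1)

    print(sample_variance([2, 4, 4, 4, 5, 5, 7, 9]))   # 32 / 7, about 4.57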


Which measure of central tendency will the sum of the deviations always be zero?

The arithmetic mean. The sum of the deviations from the mean is always zero.


Why the square in least square method?

Because the sum of the deviations would, by definition, always be zero. So there is nothing to be minimised to improve the fit.


How do you find the sum of squared deviations in a set of numbers?

It would be useful to know what the deviations were from; usually they are taken from the mean. In that case, subtract the mean from each number, square each difference, and add up the squares.
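A short sketch (illustrative data) showing both the direct calculation and the common computational shortcut, sum of x squared minus n times the squared mean:

    data = [2, 4, 4, 4, 5, 5, 7, 9]
    mean = sum(data) / len(data)
    ss_direct   = sum((x - mean) ** 2 for x in data)
    ss_shortcut = sum(x * x for x in data) - len(data) * mean ** 2
    print(ss_direct, ss_shortcut)   # both give the same sum of squared deviations (32 here)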


What does the sum of the deviations from the mean equal?

Zero.


Why do you square the deviations and then turn around and find the square root of the sum of the squares?

If you simply added the deviations, their sum would always be zero, so the derived statistic would not add any information. Essentially, the choice is between summing the absolute values or taking the square root of the sum of the squares. The latter has some very useful statistical properties.