The answer depends on absolute deviation from what: the mean, median or some other measure.
Suppose you have n observations, x1, x2, ... xn and you wish to calculate the sum of the absolute deviation of these observations from some fixed number c.
The deviation of x1 from c is (x1 - c).
The absolute deviation of x1 from c is |x1 - c|. This is the non-negative value of (x1 - c). That is,
if (x1 - c) ≥ 0 then |x1 - c| = (x1 - c)
while
if (x1 - c) < 0 then |x1 - c| = -(x1 - c).
The sum of absolute deviations is then these values summed over x1, x2, ..., xn; that is, |x1 - c| + |x2 - c| + ... + |xn - c|.
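As an illustration, here is a minimal Python sketch of that calculation (the data and the value of c below are made-up examples, not values from the question):

    def sum_abs_dev(xs, c):
        # Sum of absolute deviations |x_i - c| over all observations.
        return sum(abs(x - c) for x in xs)

    data = [2, 4, 4, 4, 5, 5, 7, 9]                     # made-up observations
    print(sum_abs_dev(data, c=3))                       # deviations from a fixed c = 3 -> 18
    print(sum_abs_dev(data, c=sum(data) / len(data)))   # deviations from the mean (5.0) -> 12.0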
No, a standard deviation or variance never has a negative sign. The reason is that the deviations from the mean are squared in the formula, and squaring removes the signs. In the mean absolute deviation the deviations are not squared; their sum is taken with the signs ignored, although there is less formal justification for doing so.
It is the mean absolute deviation.
Mean
variation
The sum of deviations from the mean, for any set of numbers, is always zero. For this reason it is quite useless as a measure of spread.
The sum of squared deviations from the mean is sometimes called the error sum of squares.
You most certainly can. The standard deviation, however, has better statistical properties.
The definition of the mean x̄ of a set of data is the sum of all the values divided by the number of observations (x̄ = Σxi/n), and this value is in turn subtracted from each xi to calculate the deviations. When the deviations from the mean are added up, the positive and negative deviations cancel, so the sum is always zero. Going back to the definition of the mean, the equation x̄ = Σxi/n can be rearranged to read Σ(xi - x̄) = Σxi - n·x̄ = 0.
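A quick numerical check of that identity in Python, using made-up data:

    data = [3.5, 7.0, 1.2, 9.9, 4.4]        # made-up data
    mean = sum(data) / len(data)
    # Mathematically the sum of deviations is exactly zero; floating point
    # arithmetic may leave a tiny rounding residue on the order of 1e-15.
    print(sum(x - mean for x in data))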
The sum of the deviations about the mean is the total variance. * * * * * No, it is not - the variance is based on the sum of their SQUARES. The sum of the deviations themselves is always zero.
You cannot use deviations from the mean because (by definition) their sum is zero. Absolute deviations are one way of getting around that problem, and they are used. Their main drawback is that they treat deviations linearly: a large deviation counts only twice as much as one half its size, i.e. the same as two deviations that are half as big. That model may be appropriate in some cases, but in many cases big deviations are much more serious than that, and a squared version is more appropriate (see the sketch below). Conveniently, the squared version is also a feature of many parametric statistical distributions, so the distribution of the "sum of squares" is well studied and understood.
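A small Python sketch of that weighting; the deviation sizes 4 and 2 are made-up numbers:

    big, small = 4, 2                      # made-up deviation sizes

    # Absolute deviations treat errors linearly: the big deviation costs
    # exactly twice the small one, i.e. the same as two small ones together.
    print(abs(big), 2 * abs(small))        # 4 4

    # Squared deviations penalise the big one four times as much as the
    # small one, i.e. twice as much as two small ones together.
    print(big ** 2, 2 * small ** 2)        # 16 8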
You want some measure of how the observations are spread about the mean. If you used the deviations, their sum would be zero, which would provide no useful information. You could use absolute deviations instead. The sum of squared deviations turns out to have some useful statistical properties, including a relatively simple way of calculating it (see the sketch below). For example, the Gaussian (or Normal) distribution is completely defined by its mean and variance.
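One reading of that "relatively simple way of calculating it" is the standard shortcut Σ(xi - x̄)² = Σxi² - (Σxi)²/n, which lets the sum of squares be accumulated in a single pass. A rough Python sketch with made-up data:

    data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]    # made-up data
    n = len(data)
    mean = sum(data) / n

    direct = sum((x - mean) ** 2 for x in data)         # sum of squared deviations
    shortcut = sum(x * x for x in data) - sum(data) ** 2 / n
    print(direct, shortcut)                             # both 32.0

    variance = direct / n                               # population variance
    print(mean, variance)    # 5.0 and 4.0 fully specify a Normal(5, 4) distribution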
For which measure of central tendency will the sum of the deviations always be zero?
Because the sum of the deviations would, by definition, always be zero. So there is nothing to be minimised to improve the fit.
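A small Python sketch of that point with made-up data, fitting a single value c: the sum of raw deviations is linear in c, so there is no minimum to find, while the sum of squared deviations has a genuine minimum at the mean:

    data = [2, 3, 5, 8, 12]                        # made-up data; mean = 6.0

    for c in [0, 3, 6, 9, 12]:
        raw = sum(x - c for x in data)             # 30, 15, 0, -15, -30: just keeps decreasing
        squared = sum((x - c) ** 2 for x in data)  # 246, 111, 66, 111, 246: minimum at the mean
        print(c, raw, squared)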
It would be useful to know what the deviations were from.
Zero.
If you simply added the deviations, their sum would always be zero, so the derived statistic would not add any information. Essentially, the choice was between summing the absolute values of the deviations or summing their squares (and taking the square root of the result, which gives the standard deviation). The latter has some very useful statistical properties.
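As a brief Python sketch of those two alternatives on made-up data, here are the mean absolute deviation and the (population) standard deviation side by side:

    import math

    data = [2, 4, 4, 4, 5, 5, 7, 9]              # made-up data; mean = 5
    n = len(data)
    mean = sum(data) / n

    print(sum(x - mean for x in data))           # raw deviations sum to (essentially) zero
    mad = sum(abs(x - mean) for x in data) / n   # mean absolute deviation -> 1.5
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / n)  # standard deviation -> 2.0
    print(mad, sd)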