Q: Is standard deviation the arithmetic mean of the squared deviations from the mean?
No. The standard deviation is the square root of the mean of the squared deviations from the mean (the mean of the squared deviations is the variance). Also, if the mean is estimated from the same data as the deviations, you lose one degree of freedom, and the divisor in the calculation should be N - 1 instead of N.
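One way to see the two divisors in practice is a minimal sketch in Python using the standard-library statistics module (the data values are made up for illustration):

```python
# Minimal sketch of the N versus N-1 divisors, with made-up values.
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # mean is 5.0

# Population standard deviation: the sum of squared deviations is divided by N
# (appropriate when the mean is known rather than estimated from this sample).
print(statistics.pstdev(data))   # 2.0

# Sample standard deviation: divides by N - 1, because one degree of freedom
# is lost when the mean is estimated from the same data.
print(statistics.stdev(data))    # about 2.14
```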

Wiki User, 14y ago

Continue Learning about Other Math

Can the standard deviation or variance be negative?

No, a standard deviation or variance can never be negative. The reason is that the deviations from the mean are squared in the formula, and squaring removes the signs, so every term is non-negative. (In the mean absolute deviation the deviations are not squared; their signs are simply ignored when they are summed, though that is harder to justify mathematically.)
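A short worked example with made-up numbers shows why. For the data 1, 2, 6 the mean is 3, and

```latex
% Made-up numbers: data 1, 2, 6 with mean \bar{x} = 3.
\begin{align*}
\text{deviations:}         &\quad -2,\; -1,\; 3   && \text{(these sum to } 0\text{)}\\
\text{squared deviations:} &\quad 4,\; 1,\; 9     && \text{(each one is } \ge 0\text{)}\\
\text{variance:}           &\quad \tfrac{4+1+9}{3} = \tfrac{14}{3} \approx 4.67 \;\ge\; 0 &&
\end{align*}
```

Because every squared deviation is at least zero, their average (the variance) and its square root (the standard deviation) cannot be negative.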


What is the square root of the average of the squared deviations from the mean?

It is the standard deviation.
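In symbols, for N values x_1, ..., x_N with mean mu, that quantity is

```latex
\sigma \;=\; \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(x_i - \mu\right)^{2}}
```

(with the divisor N - 1 instead of N when the mean is estimated from the same sample).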


How is standard deviation different from mean absolute deviation?

The standard deviation is the square root of the average of the squared deviations from the mean, while the mean absolute deviation is the average of the absolute deviations from the mean. One consequence of this difference is that a single large deviation affects the standard deviation more than it affects the mean absolute deviation.
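A minimal sketch in Python illustrates the effect of one large deviation (the data are made up, and mean_abs_deviation is a small hypothetical helper, since the standard library does not provide one):

```python
# Made-up data: the same values with and without one extreme observation.
import statistics

def mean_abs_deviation(xs):
    """Average of the absolute deviations from the mean (hypothetical helper)."""
    m = statistics.fmean(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

for xs in ([1, 2, 3, 4, 5], [1, 2, 3, 4, 50]):
    print(xs,
          "std dev:", round(statistics.pstdev(xs), 2),        # 1.41, then 19.03
          "mean abs dev:", round(mean_abs_deviation(xs), 2))  # 1.2,  then 15.2
```

The standard deviation exceeds the mean absolute deviation in both cases, and the gap widens once the large deviation is present, because squaring weights big deviations more heavily.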


Is variance the square root of standard deviation?

No, you have it backwards: the standard deviation is the square root of the variance, so the variance is the standard deviation squared. Usually you find the variance first, since it is the average of the squared deviations from the mean, and then find the standard deviation by taking its square root.
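In symbols:

```latex
\text{variance} = \sigma^{2}
\qquad\Longleftrightarrow\qquad
\sigma = \sqrt{\text{variance}}\,,
\qquad\text{e.g. } \sigma^{2} = 9 \;\Rightarrow\; \sigma = \sqrt{9} = 3.
```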


Why standard deviation is more often used than variance?

Both variance and standard deviation are measures of dispersion or variability in a set of data; both measure how far the observations are scattered away from the mean (or average). To compute the variance, you take the deviation of each observation from the mean, square it, sum all of the squared deviations, and divide by the number of observations (or by N - 1 for a sample). This somewhat exaggerates the true picture, because the squared numbers are large and are expressed in squared units rather than the units of the data. So we take the square root of the variance to bring the measure back to the same scale and units as the data, and this is known as the standard deviation. That is why the standard deviation is more often used than the variance, even though it is just the square root of the variance.
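A minimal sketch in Python (with made-up heights in centimetres) of the point about scale and units:

```python
# Made-up heights in centimetres.
import math
import statistics

heights_cm = [150.0, 160.0, 165.0, 170.0, 180.0]   # mean is 165 cm

variance = statistics.pvariance(heights_cm)  # average squared deviation, in cm^2
std_dev = statistics.pstdev(heights_cm)      # its square root, back in cm

print(variance)   # 100.0, in squared units (cm^2), so it looks inflated
print(std_dev)    # 10.0, on the same scale as the heights themselves (cm)
print(math.isclose(std_dev, math.sqrt(variance)))  # True
```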