
In statistics, sigma is the standard deviation of a variable: a measure of the spread of its values around the mean.

Many variables are distributed approximately according to the Gaussian (Normal) distribution, and even when they are not, the means of repeated samples are approximately Normal (the Central Limit Theorem). For a Gaussian distribution, 95% of observations lie within 1.96*sigma of the mean, which is often rounded to "two sigma". For an exact Gaussian, two sigma actually covers 95.45%, but for an approximately Gaussian variable it is still "around" 95%.
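You can check those coverage figures yourself. This is a quick sketch using the standard identity P(|X - mu| < z*sigma) = erf(z / sqrt(2)) for a Gaussian, with only the Python standard library:

```python
import math

def coverage(z):
    """Fraction of a Gaussian distribution lying within z standard
    deviations of the mean: P(|X - mu| < z*sigma) = erf(z / sqrt(2))."""
    return math.erf(z / math.sqrt(2))

print(round(coverage(1.96), 4))  # 1.96 sigma -> 0.95
print(round(coverage(2.0), 4))   # 2 sigma -> 0.9545
```

So 1.96 sigma gives exactly 95%, while the rounded "two sigma" rule gives 95.45%.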

Thus, for example, average IQ (whatever it measures, which is certainly not intelligence!) is 100 with sigma = 15. So about 95% of the population will have IQs between 100 - 2*sigma and 100 + 2*sigma, that is, between 70 and 130.
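The arithmetic for that interval is just mean plus or minus two standard deviations:

```python
mean, sigma = 100, 15  # conventional IQ scaling

low = mean - 2 * sigma   # 100 - 30
high = mean + 2 * sigma  # 100 + 30
print(low, high)  # 70 130
```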

By the way, if you think my comment about what IQ measures is sour grapes, I assure you that is nowhere near the truth!

Wiki User

11y ago
