
The standard deviation of a data set measures the random variability present in the data. Given any two data sets, it is extremely unlikely that their means will be exactly the same; the standard deviation is used to judge whether the difference between the two means is something that could plausibly happen by chance alone, or whether it reflects a genuine difference.

Also, if you wish to take samples from a population, the inherent variability, as measured by the standard deviation, is a useful quantity for determining the optimum sample size.
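As a minimal sketch of the idea above, here is how the sample standard deviation separates two data sets with the same mean but very different variability. The data values are made up for illustration; `statistics.stdev` uses the sample (n-1) formula.

```python
# Two made-up data sets with the same mean (50) but different spread.
import statistics

set_a = [48, 50, 51, 49, 52]   # tightly clustered around 50
set_b = [30, 70, 45, 60, 45]   # widely dispersed around 50

sd_a = statistics.stdev(set_a)  # sample standard deviation, ~1.58
sd_b = statistics.stdev(set_b)  # sample standard deviation, ~15.41

print(f"Set A: mean={statistics.mean(set_a):.1f}, sd={sd_a:.2f}")
print(f"Set B: mean={statistics.mean(set_b):.1f}, sd={sd_b:.2f}")
```

Both sets have mean 50, so the means alone cannot distinguish them; the standard deviations (about 1.58 versus about 15.41) capture the difference in variability.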

Wiki User

10y ago


Q: What is the purpose of finding standard deviation?
Continue Learning about Statistics

What are the assumptions of standard deviation?

Calculating the standard deviation requires no assumptions: it is simply a formula applied to the data values. Assumptions (such as normality of the data) come into play only when you use the standard deviation to make probability statements, for example that about 68% of a normal distribution lies within one standard deviation of the mean.


What is the difference between standard error of mean and standard deviation of means?

The standard error of the mean (SEM) and the standard deviation of the sample mean are the same thing. However, the standard deviation of the data is not the same as the SEM. To obtain the SEM from the standard deviation, divide the standard deviation by the square root of the sample size.
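The relationship stated above, SEM = SD / sqrt(n), can be sketched in a few lines. The sample values are made up for illustration.

```python
# Compute the standard deviation of a sample, then derive the SEM.
import math
import statistics

sample = [12.0, 15.0, 11.0, 14.0, 13.0]  # made-up sample values
n = len(sample)

sd = statistics.stdev(sample)   # sample standard deviation (~1.581)
sem = sd / math.sqrt(n)         # standard error of the mean (~0.707)

print(f"sd={sd:.3f}, sem={sem:.3f}")
```

The SEM shrinks as the sample grows: quadrupling the sample size halves the standard error, which is why larger samples give more precise estimates of the mean.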


Difference Standard Deviation of a portfolio?

The standard deviation of a portfolio measures the variability of the portfolio's overall return. It differs from the standard deviation of a single asset in that it is generally less than the weighted average of the individual assets' standard deviations: returns that are not perfectly correlated partly offset one another, which is the diversification effect.


What is the purpose of finding the standard deviation of a data set?

The purpose of obtaining the standard deviation is to measure how far the data are dispersed from the mean. Data sets can be widely dispersed or narrowly dispersed, and the standard deviation quantifies that degree of dispersion. For normally distributed data, each band of standard deviations around the mean carries a known probability: about 68% of all values fall within one standard deviation of the mean, so any single datum has roughly a 68% chance of lying within one standard deviation of the mean, and about 95% of the data fall within two standard deviations of the mean.

So how does this help us in the real world? Finance offers a good illustration. In finance, the standard deviation and variance are used to measure the risk of an investment. Assume the mean return is 15%; that is the return we expect to earn. We rarely earn exactly what we expect, so we use the standard deviation to measure how far the actual return is likely to fall from that expected return (the mean). If the standard deviation is 2%, there is roughly a 68% chance the actual return will land between 13% and 17%, and roughly a 95% chance it will land between 11% and 19%. The larger the standard deviation, the greater the risk involved with the investment. That is a real-world example of how the standard deviation is used to measure risk and expected return.
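The investment example above can be computed directly. This sketch uses the same numbers as the answer (expected return 15%, standard deviation 2%) and the approximate 68%/95% rules for a normal distribution.

```python
# Return bands implied by a mean return and its standard deviation,
# assuming returns are roughly normally distributed.
mean_return = 0.15  # expected return (15%)
sd = 0.02           # standard deviation of returns (2%)

one_sd = (mean_return - sd, mean_return + sd)          # ~68% band
two_sd = (mean_return - 2 * sd, mean_return + 2 * sd)  # ~95% band

print(f"~68% of returns in [{one_sd[0]:.0%}, {one_sd[1]:.0%}]")
print(f"~95% of returns in [{two_sd[0]:.0%}, {two_sd[1]:.0%}]")
```

Running this prints the 13%-17% and 11%-19% bands from the answer; doubling `sd` would double the width of both bands, which is exactly what "more risk" means here.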


What is the square of the standard deviation called?

The square of the standard deviation is called the variance. That is because the standard deviation is defined as the square root of the variance.
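The definition above (variance is the square of the standard deviation) is easy to verify numerically. The data values here are made up for illustration.

```python
# Verify that the standard deviation is the square root of the variance.
import math
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # example values

var = statistics.variance(data)  # sample variance (n-1 denominator)
sd = statistics.stdev(data)      # sample standard deviation

assert math.isclose(sd, math.sqrt(var))
print(f"variance={var:.3f}, sd={sd:.3f}")
```

Note that `statistics.variance` and `statistics.stdev` both use the sample (n-1) formula; the population versions are `pvariance` and `pstdev`, and the square/square-root relationship holds for both pairs.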