Q: What are all the values that a standard deviation can possibly take?

Best Answer
Any real value >= 0.

More answers

A follow-up question: if the deviations of five values from their mean are -1, 3, -4, 2, -5, what is the value of the standard deviation?

Continue Learning about Statistics

What are all the values a standard deviation can take?

The standard deviation must be greater than or equal to zero.


What is the relationship between the mean and standard deviation in statistics?

The 'standard deviation' in statistics or probability is a measure of how spread out the numbers are. In mathematical terms, it is the square root of the mean of the squared deviations of all the numbers in the data set from the mean of that set, and it is roughly comparable to the average deviation from the mean. If a set of values has a low standard deviation, most of the values are close to the mean; a high standard deviation means that the values, in general, differ a lot from the mean.

The variance is the standard deviation squared; equivalently, the standard deviation is the square root of the variance. To calculate the variance, subtract the mean from each number in the set and square the difference, doing the same for every number in the set. Then take the mean of all the squares: the mean of the squared deviations from the mean is the variance, and the square root of the variance is the standard deviation.

Take the following data series as examples; the mean of each is 3.

3, 3, 3, 3, 3, 3 - all the values equal the mean, so the standard deviation is zero. The difference from the mean is zero in every case, so after squaring and averaging, the variance is zero, and the square root of zero is zero. Note that since the deviations from the mean are squared, the variance, and hence the standard deviation, can never be negative.

1, 3, 3, 3, 3, 5 - most of the values equal the mean, so this series has a low standard deviation: most of the differences from the mean are small.

1, 1, 1, 5, 5, 5 - every value is two above or two below the mean, so this series has the highest standard deviation of the three.
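
As a quick check on the three example series above, here is a minimal Python sketch using the standard library's statistics module (pvariance and pstdev divide by N, i.e. they treat the data as a whole population):

```python
from statistics import mean, pvariance, pstdev

# The three example series from the answer above; each has mean 3.
for data in ([3, 3, 3, 3, 3, 3],
             [1, 3, 3, 3, 3, 5],
             [1, 1, 1, 5, 5, 5]):
    print(data, "mean:", mean(data),
          "variance:", round(pvariance(data), 3),
          "std dev:", round(pstdev(data), 3))
```

The first series prints a standard deviation of 0 and the last prints 2, the largest of the three, matching the discussion above.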


Is it possible for a standard deviation to be negative?

No, it is not possible, because you take the square of each error, (x - x̄)², and the square of any real number is non-negative.

Improved answer: It is not possible to have a negative standard deviation because the standard deviation is the square root of the variance, and the variance (a mean of squared deviations) is never negative.


How does a sample size impact the standard deviation?

Suppose I take 10 items (a small sample) from a population and calculate the standard deviation, then take 100 items (a larger sample) and calculate the standard deviation again: how will my statistics change? The smaller sample could have a standard deviation that is higher than, lower than, or about equal to that of the larger sample. It is even possible that the smaller sample is, by chance, closer to the standard deviation of the population. However, a properly taken larger sample will, in general, be a more reliable estimate of the population standard deviation than a smaller one; there are mathematical results showing that, in the long run, larger samples provide better estimates. This is generally, but not always, true: if the population is changing while you are collecting data, a very large sample may not be representative, because it takes time to collect.
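
To see this effect, here is a small simulation sketch; the population mean of 100, standard deviation of 20, and trial count are arbitrary choices for illustration:

```python
import random
import statistics

random.seed(0)

def sample_std_estimates(n, trials=2000, mu=100, sigma=20):
    """Draw `trials` samples of size n and return each sample's std dev."""
    return [statistics.stdev(random.gauss(mu, sigma) for _ in range(n))
            for _ in range(trials)]

for n in (10, 100):
    estimates = sample_std_estimates(n)
    # With larger n, the estimates cluster more tightly around sigma = 20.
    print(f"n={n:3d}: mean estimate {statistics.mean(estimates):6.2f}, "
          f"spread of estimates {statistics.stdev(estimates):.2f}")
```

Typically the n=100 estimates sit much closer to 20 than the n=10 ones, though any single small sample can land nearer the true value by chance.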


What is the purpose of finding standard deviation?

The standard deviation of a set of data is a measure of the random variability present in the data. Given any two sets of data, it is extremely unlikely that their means will be exactly the same. The standard deviation is used to determine whether the difference between the means of the two data sets is something that could happen purely by chance (i.e., is reasonable) or not. Also, if you wish to take samples of a population, then the inherent variability - as measured by the standard deviation - is a useful measure to help determine the optimum sample size.
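
The answer does not name a specific procedure, but one common way to formalize "could this difference happen purely by chance?" is a two-sample t-test. A sketch using scipy, with made-up data:

```python
from scipy import stats

group_a = [23, 25, 21, 30, 28, 26]  # illustrative measurements
group_b = [31, 29, 35, 27, 33, 30]

# Welch's t-test: is the gap between the two means large relative to
# the variability (standard deviations) within each group?
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(t_stat, p_value)  # a small p-value suggests the gap is not just chance
```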

Related questions

What are all the values that a standard deviation s can possibly take?

Any non-negative value.


A set of 1000 values has a normal distribution, the mean of the data is 120, and the standard deviation is 20. How many values are within one standard deviation of the mean?

The Empirical Rule states that about 68% of the data falls within 1 standard deviation of the mean. Since 1000 data values are given, take 0.68 × 1000 = 680, so about 680 values lie within one standard deviation of the mean (that is, between 100 and 140).
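
You can verify this with a quick simulation (the seed and the use of random.gauss are just illustrative choices):

```python
import random

random.seed(1)
# Simulate 1000 normally distributed values with mean 120 and std dev 20.
values = [random.gauss(120, 20) for _ in range(1000)]
# Count values within one standard deviation of the mean: 100 to 140.
within = sum(1 for v in values if 100 <= v <= 140)
print(within)  # roughly 680, per the empirical (68-95-99.7) rule
```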


You take an SRS of size n from a population that has mean 80 and standard deviation 20. How big should n be so that the sampling distribution of the sample mean has standard deviation 1?

400. The standard deviation of the sampling distribution of the sample mean (the standard error) is σ/√n. Setting 20/√n = 1 gives √n = 20, so n = 400.
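
A quick simulation check of that arithmetic (the number of repetitions is arbitrary):

```python
import random
import statistics

random.seed(2)
# Repeatedly draw samples of size n = 400 from a population with
# mean 80 and standard deviation 20, recording each sample mean.
means = [statistics.mean(random.gauss(80, 20) for _ in range(400))
         for _ in range(1000)]
print(statistics.stdev(means))  # close to 20 / sqrt(400) = 1
```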


How do you calculate standard deviation without a normal distribution?

You calculate the standard deviation the same way as always: find the mean, sum the squares of the deviations of the samples from that mean, divide by N-1, and take the square root. This has nothing to do with whether you have a normal distribution or not. This is how you calculate the sample standard deviation, where the mean is estimated from the same data as the standard deviation, and the N-1 factor reflects the loss of a degree of freedom in doing so. If you knew the mean a priori, you could calculate the standard deviation of the sample using N instead of N-1.
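
A minimal sketch of both computations in Python (the helper names and the data are made up for illustration):

```python
import math

def sample_std(data):
    """Sample std dev: the mean is estimated from the data, so divide by N-1."""
    m = sum(data) / len(data)
    return math.sqrt(sum((x - m) ** 2 for x in data) / (len(data) - 1))

def std_known_mean(data, mu):
    """Std dev when the true mean mu is known a priori: divide by N."""
    return math.sqrt(sum((x - mu) ** 2 for x in data) / len(data))

data = [2, 4, 4, 4, 5, 5, 7, 9]
print(sample_std(data))           # ~2.138 (N-1 denominator)
print(std_known_mean(data, 5.0))  # 2.0 (N denominator, true mean known)
```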


Why standard deviation is more often used than variance?

Both variance and standard deviation are measures of dispersion or variability in a set of data: they measure how far the observations are scattered away from the mean (or average). To compute the variance, you compute the deviation of each observation from the mean, square it, and average all of the squared deviations. This somewhat exaggerates the true picture, because squaring makes the numbers large and puts them in squared units. So we take the square root of the variance (to compensate for the excess, and to return to the original units of the data), and this is known as the standard deviation. This is why the standard deviation is more often used than the variance, even though the standard deviation is just the square root of the variance.
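
A small illustration of the units point, with made-up height data:

```python
import statistics

heights_cm = [150, 160, 170, 180, 190]   # illustrative heights, in cm
print(statistics.pvariance(heights_cm))  # 200, in squared units (cm^2)
print(statistics.pstdev(heights_cm))     # ~14.14, back in the data's units (cm)
```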


How To Calculate Standard Deviation?

One answer claimed you can calculate the standard deviation by adding the data values together and dividing by the number of pieces of data. That is incorrect: it describes the calculation of the (mean) average. For example, take the five numbers 1, 2, 3, 4, 5. The (mean) average is 3, but the sample standard deviation is about 1.58114 and the sample variance is 2.5, while the population standard deviation is about 1.41421 and the population variance is 2. See standard-deviation.appspot.com/
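
Those figures can be reproduced with the Python statistics module:

```python
import statistics

data = [1, 2, 3, 4, 5]
print(statistics.mean(data))       # 3         (the mean, not the std dev)
print(statistics.stdev(data))      # ~1.58114  (sample std dev, N-1 denominator)
print(statistics.variance(data))   # 2.5       (sample variance)
print(statistics.pstdev(data))     # ~1.41421  (population std dev, N denominator)
print(statistics.pvariance(data))  # 2.0       (population variance)
```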


Why to take square in the formula of standard deviation?

The sum of the deviations from the mean is always 0, so on its own it provides no useful information about spread. Taking absolute deviations is one solution to that; the other is to square the deviations - and then take a square root at the end to return to the original units.
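
A three-line demonstration (the data set is arbitrary):

```python
data = [1, 2, 3, 4, 5, 9]
m = sum(data) / len(data)  # mean = 4
print(sum(x - m for x in data))       # 0.0: raw deviations always cancel out
print(sum(abs(x - m) for x in data))  # 12.0: absolute deviations measure spread
print((sum((x - m) ** 2 for x in data) / len(data)) ** 0.5)  # ~2.58: std dev
```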


Why do you not take the sum of absolute deviations?

You most certainly can. The standard deviation, however, has better statistical properties.

