Q: What happens to the sample size if you increase the standard deviation?

Best Answer

The statistics of the population aren't supposed to depend on the sample size. If they do, that just means that at least one of the samples doesn't accurately represent the population. Maybe both.
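To illustrate with a minimal sketch (assuming a normally distributed population with sigma = 2; the seed and sizes are made up), sample standard deviations computed at several sample sizes scatter around the same population value instead of trending with n:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 2.0  # population standard deviation (made up)

for n in (10, 100, 1000, 10000):
    sample = rng.normal(0.0, sigma, size=n)
    # Estimates fluctuate, but they do not systematically depend on n.
    print(n, np.std(sample, ddof=1))
```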

Wiki User ∙ 14y ago
Related questions

What happens if you estimate population standard deviation with the sample standard deviation?

Not a lot. After all, the sample standard deviation is an estimate of the population standard deviation.


What is the sample standard deviation of 27.5?

A single observation cannot have a sample standard deviation: the formula divides by n - 1, which is zero when n = 1, so the quantity is undefined.
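A quick check (relying on numpy's ddof convention for the n - 1 divisor) makes this concrete:

```python
import numpy as np

# With n = 1 the n - 1 divisor is zero: numpy warns and returns nan.
print(np.std([27.5], ddof=1))  # nan (RuntimeWarning: degrees of freedom <= 0)
```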


What happens to the confidence interval if you increase the sample size and the population standard deviation simultaneously?

Increasing the sample size narrows the confidence interval, while increasing the standard deviation widens it. For a mean, the interval's width is proportional to sigma/sqrt(n), so the net effect depends on the relative rates of the two changes: if sqrt(n) grows faster than sigma, the interval shrinks; if sigma grows faster, the interval widens.
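As a rough sketch (assuming a normal-based interval with z = 1.96; the numbers are invented for illustration), the half-width z*sigma/sqrt(n) shows both effects:

```python
from math import sqrt

def ci_half_width(sigma, n, z=1.96):
    """Half-width of a normal-based confidence interval for a mean."""
    return z * sigma / sqrt(n)

print(ci_half_width(sigma=10, n=100))  # 1.96
print(ci_half_width(sigma=12, n=400))  # 1.18  -> n grew faster: interval shrinks
print(ci_half_width(sigma=25, n=144))  # 4.08  -> sigma grew faster: interval widens
```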


What does the sample standard deviation best estimate?

The standard deviation of the population.


Can a standard deviation of a sample be equal to a standard deviation of a population?

Yes


How do I find the sample standard deviation from the population standard deviation?

If the population standard deviation is sigma (the form with an n divisor), then the corresponding sample standard deviation for a sample of size n is s = sigma*sqrt[n/(n-1)].
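A small numpy sketch (with made-up data) shows the conversion; the ddof argument selects the divisor:

```python
import numpy as np

data = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])  # made-up sample
n = len(data)

pop_form = data.std(ddof=0)     # divides by n
sample_form = data.std(ddof=1)  # divides by n - 1

# Multiplying the n-divisor form by sqrt[n/(n-1)] recovers the sample form.
print(sample_form)                      # 2.138...
print(pop_form * np.sqrt(n / (n - 1)))  # same value
```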


How does one calculate the standard error of the sample mean?

The standard error of the sample mean is calculated by dividing the sample estimate of the population standard deviation (the sample standard deviation) by the square root of the sample size: SE = s/sqrt(n).
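For example, a minimal sketch (the sample values are made up; the helper name is hypothetical):

```python
import numpy as np

def standard_error(sample):
    """SE of the mean: sample standard deviation over sqrt(sample size)."""
    return np.std(sample, ddof=1) / np.sqrt(len(sample))

print(standard_error([12.1, 9.8, 11.4, 10.2, 10.9, 11.7]))
```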


Is sample standard deviation the same as standard deviation?

No. The standard deviation of the population is written with the Greek letter sigma; the sample standard deviation, written s, is an estimate of it computed from a sample. Mathematically, the difference is a factor of n/(n-1) in the variance (Bessel's correction), which makes the sample variance an unbiased estimator of the population variance. Since n/(n-1) is greater than 1, it inflates the estimate; essentially, this compensates for the fact that a sample is unlikely to capture the full variation of the population.
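A quick simulation sketch (assuming a normal population with variance 4; the seed, sample size, and repetition count are arbitrary) shows why the correction is needed:

```python
import numpy as np

rng = np.random.default_rng(0)
true_var = 4.0  # population variance (sigma = 2), chosen arbitrarily
n = 5

biased, unbiased = [], []
for _ in range(20000):
    sample = rng.normal(0.0, np.sqrt(true_var), size=n)
    biased.append(np.var(sample, ddof=0))    # divide by n
    unbiased.append(np.var(sample, ddof=1))  # divide by n - 1

print(np.mean(biased))    # about 3.2: systematically below 4.0
print(np.mean(unbiased))  # about 4.0: the n/(n-1) factor removes the bias
```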


What is the statistic that is used to estimate a population standard deviation?

The sample standard deviation.


How many times would the sample size have to increase to cut the standard deviation by half?

Four. The standard deviation of the sample mean is proportional to 1/sqrt(n), so quadrupling the sample size cuts it in half.
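A one-line check (with a made-up sigma):

```python
from math import sqrt

sigma = 8.0  # made-up population standard deviation
for n in (25, 100):            # quadruple the sample size...
    print(n, sigma / sqrt(n))  # 1.6 -> 0.8: the deviation of the mean halves
```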


Sample standard deviation?

Standard deviation in statistics measures how much the data deviate from the average (mean) value. The sample standard deviation is the same measure computed from a sample, i.e. data collected from a smaller pool than the whole population.


As the standard deviation increases, what happens to the required sample size in order to achieve a specified level of confidence?

It increases. For a fixed margin of error E at confidence multiplier z, the required sample size is n = (z*sigma/E)^2, so a larger standard deviation demands a larger sample.
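A minimal sketch of that sample-size formula (the helper name is hypothetical; normal approximation with z = 1.96):

```python
from math import ceil

def required_n(sigma, margin, z=1.96):
    """Smallest n with z * sigma / sqrt(n) <= margin (normal approximation)."""
    return ceil((z * sigma / margin) ** 2)

print(required_n(sigma=10, margin=2))  # 97
print(required_n(sigma=20, margin=2))  # 385: doubling sigma roughly quadruples n
```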