
Standard deviation is the square root of the variance, so if the variance is 64, the standard deviation is √64 = 8.

Wiki User

15y ago
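This is quick to verify in Python, since the standard deviation is just the square root of the variance:

```python
import math

variance = 64
std_dev = math.sqrt(variance)  # standard deviation = sqrt(variance)
print(std_dev)  # 8.0
```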


Q: Suppose the variance is 64. Find the standard deviation.
Continue Learning about Statistics

How do you calculate salary variance?

I believe you are interested in calculating the variance of a set of salary data. The variance is the square of the standard deviation, where s = sqrt[sum (xi - mean)^2 / (n - 1)]. Here the mean is the sum of all data points divided by the number in the sample, and xi is a single data point (a single salary). If instead of a sample you have the entire population of size N, substitute N for n - 1 in the equation above. You can find more on interpreting these quantities by searching Wikipedia for "variance" and "standard deviation". Note that an advantage of using the standard deviation rather than the variance is that the standard deviation is in the same units as the mean.
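The formula above can be sketched in Python; the salary figures here are made up purely for illustration:

```python
import math

def sample_variance(data):
    """Sample variance: sum of squared deviations from the mean, divided by n - 1."""
    n = len(data)
    mean = sum(data) / n
    return sum((x - mean) ** 2 for x in data) / (n - 1)

salaries = [52000, 48000, 61000, 55000, 49000]  # hypothetical salary data
var = sample_variance(salaries)
std = math.sqrt(var)  # standard deviation is in the same units as the salaries
```

For a full population, divide by `n` instead of `n - 1`.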


How do you find the sample size if you are given the confidence interval and the margin of error as well as the standard deviation?

It depends on what you are estimating. For a proportion, you need an estimate p-hat (with q-hat = 1 - p-hat); then n = p-hat * q-hat * (z / E)^2, where z is the critical value for the confidence level and E is the margin of error. For a mean with a known standard deviation, n = (z * standard deviation / margin of error)^2 - that is, multiply the confidence-level critical value by the standard deviation, divide by the margin of error, then square the whole thing.
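A minimal sketch of both sample-size formulas, rounding up since n must be a whole number (the function names and the example z-value of 1.96 for 95% confidence are my own choices, not from the original answer):

```python
import math

def sample_size_mean(z, sigma, margin):
    """n = (z * sigma / E)^2 for estimating a mean, rounded up."""
    return math.ceil((z * sigma / margin) ** 2)

def sample_size_proportion(z, p_hat, margin):
    """n = p_hat * (1 - p_hat) * (z / E)^2 for estimating a proportion, rounded up."""
    return math.ceil(p_hat * (1 - p_hat) * (z / margin) ** 2)

# 95% confidence (z ≈ 1.96), sigma = 15, margin of error = 3
print(sample_size_mean(1.96, 15, 3))          # 97
# 95% confidence, p-hat = 0.5, margin of error = 0.05
print(sample_size_proportion(1.96, 0.5, 0.05))  # 385
```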


How do you find the standard deviation for data?

To find the standard deviation of a data set: find the mean, subtract the mean from each data point and square each result, average those squared deviations (dividing by n for a population, or by n - 1 for a sample) to get the variance, then take the square root of the variance.
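Python's standard-library `statistics` module does this directly; the data set here is just an example:

```python
import statistics

data = [4, 8, 6, 5, 3, 2, 8, 9, 2, 5]
print(statistics.pstdev(data))  # population standard deviation: 2.4
print(statistics.stdev(data))   # sample standard deviation (n - 1 denominator)
```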


How do I find sample standard deviation from population standard deviation?

If the population standard deviation is sigma, then the estimate of the sample standard deviation for a sample of size n is s = sigma * sqrt[n / (n - 1)].
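The correction factor stated above is straightforward to compute; this sketch just implements that formula (the function name is my own):

```python
import math

def sample_sd_estimate(sigma, n):
    """Estimate the sample standard deviation from a population sigma
    using the correction s = sigma * sqrt(n / (n - 1))."""
    return sigma * math.sqrt(n / (n - 1))

s = sample_sd_estimate(8.0, 25)  # slightly larger than sigma = 8
```

As n grows, sqrt(n / (n - 1)) approaches 1, so the two standard deviations converge.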


How do you find the highest standard deviation in a given number?

Standard deviations measure the spread of a distribution of data. Therefore, a single number cannot have a meaningful standard deviation.