Can standard deviation be calculated using non-normal data?

Standard deviation can be calculated from non-normal data, but it is not always advisable. The formula itself does not require normality; however, the usual interpretations, such as about 68% of values lying within one standard deviation of the mean, do assume a roughly normal distribution, so for strongly skewed data or data with outliers the standard deviation can be a misleading summary of spread.
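
As a rough illustration, here is a minimal Python sketch (the data values are made up) showing that the standard deviation formula works on skewed, non-normal data, but that the familiar "about 68% within one standard deviation" reading no longer holds:

```python
import statistics

# Hypothetical right-skewed data set (made up for illustration)
data = [1, 1, 2, 2, 2, 3, 3, 4, 5, 20]

mean = statistics.mean(data)
sd = statistics.pstdev(data)  # population standard deviation

# Fraction of values within one standard deviation of the mean
within = sum(1 for x in data if abs(x - mean) <= sd) / len(data)

print(f"mean = {mean:.2f}, sd = {sd:.2f}")
print(f"within 1 sd: {within:.0%} (about 68% would be expected for normal data)")
```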

Continue Learning about Other Math

Do variance and standard deviation assume nominal data?

No. Variance and standard deviation do not assume nominal data; in fact, they cannot be calculated from nominal (categorical) data at all, since they require numerical values. You do, of course, have to have some variation, otherwise the variance and standard deviation will be zero.


What determines whether the standard deviation is high?

Standard deviation is a measure of the scatter or dispersion of the data. Two sets of data can have the same mean but different standard deviations; the data set with the higher standard deviation will generally have values that are more scattered. We generally look at the standard deviation in relation to the mean: if the standard deviation is much smaller than the mean, we may consider the data to have low dispersion, while if the standard deviation is much larger than the mean, the data set may have high dispersion. A second cause of a high standard deviation is an outlier, a value that is very different from the rest of the data. Sometimes it is a mistake. For example, suppose I am measuring people's heights and record all the data in meters, except one height which I record in millimeters, making that number 1,000 times larger. This can produce an erroneous mean and standard deviation.
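
A minimal sketch of the height example above, with made-up numbers, using only Python's standard library: recording one height in millimeters instead of meters inflates both the mean and the standard deviation.

```python
import statistics

# Heights in meters (hypothetical values)
heights = [1.62, 1.70, 1.75, 1.68, 1.80]

# The same data, but with the last height accidentally recorded in millimeters
heights_with_error = [1.62, 1.70, 1.75, 1.68, 1800.0]

for label, data in [("correct units", heights), ("one value in mm", heights_with_error)]:
    print(label, "-> mean =", round(statistics.mean(data), 2),
          "sd =", round(statistics.stdev(data), 2))
```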


What are the units of measurement of standard deviation?

Standard deviation has the same units as the data themselves.


What does a large standard deviation mean?

A large standard deviation means that the data is spread out. Whether a standard deviation is considered "large" is relative, but a larger standard deviation always means that the data is more spread out than with a smaller one. For example, if the mean was 60 and the standard deviation was 1, that is a small standard deviation: the data is not spread out, and a score of 74 or 43 would be highly unlikely, almost impossible. However, if the mean was 60 and the standard deviation was 20, that would be a large standard deviation: the data is spread out more, and a score of 74 or 43 wouldn't be odd or unusual at all.
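
Using the numbers from the answer above, a short Python sketch makes the point with z-scores: a score of 74 is 14 standard deviations above the mean when the standard deviation is 1, but only 0.7 above it when the standard deviation is 20.

```python
mean = 60

for sd in (1, 20):
    for score in (74, 43):
        z = (score - mean) / sd          # distance from the mean in standard deviations
        print(f"sd = {sd:2d}, score = {score}: z = {z:+.2f}")
```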


What does standard deviation show us about a set of scores?

Standard deviation tells you how spread out the set of scores is with respect to the mean. It measures the variability of the data. A small standard deviation implies that the data is close to the mean (plus or minus a small range); the larger the standard deviation, the more dispersed the data is from the mean.

Related Questions

If the standard deviation of the data values in a sample is 17, what is the variance of the data values?

The variance of a set of data values is the square of the standard deviation. If the standard deviation is 17, the variance can be calculated as (17^2), which equals 289. Therefore, the variance of the data values in the sample is 289.
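
As a quick sketch of the arithmetic:

```python
sd = 17
variance = sd ** 2   # the variance is the square of the standard deviation
print(variance)      # 289
```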


What would it mean if a standard deviation was calculated to equal 0?

A standard deviation of zero means that all the data points are the same value.
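
A minimal sketch confirming this, with a made-up list of identical values:

```python
import statistics

data = [7, 7, 7, 7, 7]          # every data point is the same value
print(statistics.pstdev(data))  # 0.0
```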


Why is standard deviation the best measure of dispersion?

Standard deviation is considered the best measure of dispersion largely because many real data distributions are close to the normal distribution, for which the standard deviation has a direct interpretation: for example, about 68% of values lie within one standard deviation of the mean.


Is it true that The smaller the standard deviation of a normal curve the higher and narrower the graph?

Yes, that's true. In a normal distribution, a smaller standard deviation indicates that the data points are closer to the mean, resulting in a taller and narrower curve. Conversely, a larger standard deviation leads to a wider and shorter curve, reflecting more variability in the data. Thus, the standard deviation directly affects the shape of the normal distribution graph.
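
As a sketch, the height of the normal density at its peak (x equal to the mean) is 1/(sd * sqrt(2*pi)), so halving the standard deviation doubles the peak height:

```python
import math

def normal_pdf(x, mean, sd):
    """Density of a normal distribution with the given mean and standard deviation."""
    return math.exp(-((x - mean) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

mean = 0
for sd in (2.0, 1.0, 0.5):
    # Evaluate the curve at its highest point, x = mean
    print(f"sd = {sd}: peak height = {normal_pdf(mean, mean, sd):.3f}")
```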


How many of the scores will be within 1 standard deviation of the population mean?

Assuming a normal distribution, about 68% of the data will be within 1 standard deviation of the mean.


What does standard deviation help you find?

Standard deviation helps you identify the relative level of variation from the mean, or from an equation approximating the relationship in the data set. In a normal distribution, 1 standard deviation either side of the mean covers about 68.2% of the data, 2 standard deviations cover about 95.4% of the data, and 3 standard deviations cover about 99.7% of the data.
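
Those percentages can be reproduced from the normal distribution using only Python's standard library (the fraction within k standard deviations is erf(k/sqrt(2))), as a sketch:

```python
import math

def fraction_within(k):
    """Fraction of a normal distribution lying within k standard deviations of the mean."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"within {k} standard deviation(s): {fraction_within(k):.1%}")
```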


What is standard deviation used for?

Standard deviation is a measure of the spread of data.


How do you calculate plus or minus one standard deviation?

To calculate plus or minus one standard deviation from a mean, first determine the mean (average) of your data set. Then calculate the standard deviation, which measures the dispersion of the data points around the mean. Once you have both values, you can find the range by adding and subtracting the standard deviation from the mean: the lower limit is the mean minus one standard deviation, and the upper limit is the mean plus one standard deviation. This range contains approximately 68% of the data in a normal distribution.
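
A short sketch of the calculation with a made-up sample: compute the mean and the standard deviation, then subtract and add the standard deviation to get the lower and upper limits.

```python
import statistics

# Hypothetical sample of scores
data = [52, 55, 58, 60, 61, 63, 65, 70]

mean = statistics.mean(data)
sd = statistics.stdev(data)      # sample standard deviation

lower = mean - sd                # mean minus one standard deviation
upper = mean + sd                # mean plus one standard deviation
print(f"mean = {mean:.1f}, sd = {sd:.1f}, range = [{lower:.1f}, {upper:.1f}]")
```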


If the standard deviation is small the data is more dispersed?

No; if the standard deviation is small, the data is less dispersed.


Which factor does the width of the peak of a normal curve depend on?

The width of the peak of a normal curve depends primarily on the standard deviation of the distribution. A larger standard deviation results in a wider and flatter curve, indicating greater variability in the data, while a smaller standard deviation yields a narrower and taller peak, indicating less variability. Thus, the standard deviation is crucial for determining the spread of the data around the mean.
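
One common way to quantify the width of the peak is the full width at half maximum, which for a normal curve is 2*sqrt(2*ln 2) times the standard deviation, roughly 2.355 standard deviations; a sketch:

```python
import math

# Full width at half maximum of a normal curve: 2 * sqrt(2 * ln 2) * sd
FWHM_FACTOR = 2 * math.sqrt(2 * math.log(2))   # about 2.355

for sd in (0.5, 1.0, 2.0):
    print(f"sd = {sd}: width of the peak at half its height = {FWHM_FACTOR * sd:.3f}")
```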


Is the normal distribution always being defined by the mean and standard deviation?

Yes, the normal distribution is uniquely defined by its mean and standard deviation. The mean determines the center of the distribution, while the standard deviation indicates the spread or dispersion of the data. Together, these two parameters specify the shape and location of the normal distribution curve.