Q: How do you create a data set with a larger standard deviation?

Best Answer

Standard deviation is the square root of the average of the squared deviations of each item from the mean, i.e. the square root of the variance. To increase the standard deviation, therefore, you need to increase the average deviation from the mean.

There are many ways to do this. One is to move each item further away from the mean. For example, take the set [2, 4, 4, 4, 5, 5, 7, 9]. It has a mean of 5 and a (sample) standard deviation of about 2.14. Multiply each item by 2.1 and subtract 5.5, giving the set [-1.3, 2.9, 2.9, 2.9, 5, 5, 9.2, 13.4]. This stretches each item's deviation from the mean by a factor of 2.1, so the mean is still 5 but the standard deviation becomes 2.1 × 2.14 ≈ 4.49.
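
If you want to check this for yourself, here is a minimal Python sketch of the same stretch-around-the-mean trick (standard library only; the variable names are just illustrative):

from statistics import mean, stdev

data = [2, 4, 4, 4, 5, 5, 7, 9]
m = mean(data)                     # 5
print(stdev(data))                 # sample SD, about 2.14

k = 2.1                            # stretch factor: each deviation grows 2.1x
stretched = [m + k * (x - m) for x in data]   # same as x * 2.1 - 5.5 here
print(stretched)                   # about [-1.3, 2.9, 2.9, 2.9, 5.0, 5.0, 9.2, 13.4]
print(mean(stretched))             # still 5 (up to float rounding)
print(stdev(stretched))            # about 4.49 = 2.1 * 2.14

Any stretch factor k works the same way: the mean stays put and the standard deviation is multiplied by k.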

Wiki User ∙ 12y ago

Related questions

What does standard deviation show us about a set of scores?

Standard deviation tells you how spread out the set of scores is with respect to the mean; it measures the variability of the data. A small standard deviation implies that the data lie close to the mean (within plus or minus a small range); the larger the standard deviation, the more dispersed the data are from the mean.
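
A quick illustration (made-up numbers, Python standard library): two sets with the same mean but very different spreads:

from statistics import mean, pstdev

tight = [59, 60, 60, 61]            # clustered near the mean
loose = [40, 55, 65, 80]            # same mean, far more dispersed
print(mean(tight), pstdev(tight))   # 60, about 0.71
print(mean(loose), pstdev(loose))   # 60, about 14.58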


What is the relationship between the relative size of the standard deviation and the kurtosis of a distribution?

As usually defined (the standardized fourth moment), kurtosis is scale-invariant, so it does not depend on the size of the standard deviation at all: rescaling the data changes the standard deviation but leaves the kurtosis unchanged. What the question is getting at — a larger standard deviation giving a smaller, more spread-out peak, and a smaller standard deviation giving a larger peak with the data more centrally located — describes the height and width of the distribution's peak, not its kurtosis.
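
A small sketch of that scale-invariance (plain Python; the kurtosis helper is just the ordinary population formula written out by hand):

from statistics import mean, pstdev

def kurtosis(xs):
    # standardized fourth moment: average of ((x - mean) / sd) ** 4
    m, s = mean(xs), pstdev(xs)
    return sum(((x - m) / s) ** 4 for x in xs) / len(xs)

data = [2, 4, 4, 4, 5, 5, 7, 9]
scaled = [3 * x for x in data]            # triple the spread
print(pstdev(data), kurtosis(data))       # SD 2.0, kurtosis about 2.78
print(pstdev(scaled), kurtosis(scaled))   # SD 6.0, same kurtosis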


Is the mean for a set of data always greater than the standard deviation?

No. The standard deviation is the square root of the variance (not of the mean), and there is no fixed relationship between a data set's mean and its standard deviation. For example, the set [-5, 5] has a mean of 0 but a standard deviation of 5.
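
That two-point counterexample is easy to verify (Python standard library):

from statistics import mean, pstdev

data = [-5, 5]
print(mean(data))    # 0
print(pstdev(data))  # 5.0 -- larger than the mean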


What if I have a very high standard deviation?

The larger the value of the standard deviation, the more scattered the data values are about the mean, and the less precise any estimates based on them are likely to be.


What a large standard deviation means?

A large standard deviation means that the data are spread out. Whether a particular standard deviation counts as "large" is relative, but a larger standard deviation always means the data are more spread out than with a smaller one. For example, if the mean were 60 and the standard deviation 1, that would be a small standard deviation: the data are not spread out, and a score of 74 or 43 would be highly unlikely, almost impossible. However, if the mean were 60 and the standard deviation 20, that would be a large standard deviation: the data are spread out more, and a score of 74 or 43 wouldn't be odd or unusual at all.
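
To put numbers on that example, here is a short Python sketch computing how many standard deviations 74 and 43 sit from a mean of 60 in each case:

for sd in (1, 20):
    for score in (74, 43):
        z = (score - 60) / sd
        print(f"SD={sd}: score {score} is {z:+.2f} SDs from the mean")
# With SD=1, the scores are 14 and 17 SDs out (essentially impossible);
# with SD=20, they are only 0.70 and 0.85 SDs out (quite ordinary).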


What is standard deviation used for?

Standard deviation is a measure of the spread of data.


If the standard deviation is small the data is more dispersed?

No, if the standard deviation is small the data is less dispersed.


Can a standard deviation be less than 1?

Yes. Standard deviation depends entirely upon the distribution; it is a measure of how spread out the data are (i.e. how far from the mean the data are "on average"): the larger it is, the more spread out the data; the smaller, the less spread out. If every data point were the mean, the standard deviation would be zero!
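
For instance (Python standard library, made-up numbers):

from statistics import pstdev

print(pstdev([4.9, 5.0, 5.1]))   # about 0.08 -- well below 1
print(pstdev([5, 5, 5]))         # 0.0 -- no spread at all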


What does one standard deviation mean?

Standard deviation is a measure of variation from the mean of a data set. For a normal distribution, about 68% of the data lies within one standard deviation of the mean (that is, between the mean minus one SD and the mean plus one SD).
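
A quick empirical check of the 68% figure (Python standard library; this assumes the data really are normal, and the mean of 100 and SD of 15 are just example values):

import random
from statistics import mean, pstdev

random.seed(0)
xs = [random.gauss(100, 15) for _ in range(100_000)]
m, s = mean(xs), pstdev(xs)
inside = sum(1 for x in xs if m - s <= x <= m + s)
print(inside / len(xs))   # about 0.68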


Why do we need the standard deviation?

The standard deviation is a measure of the spread of data.


Relation between mean and standard deviation?

The standard deviation is the square root of the variance, i.e. the square root of the average squared deviation from the mean; it measures how far the data typically lie from the mean.


Do variance and standard deviation assume nominal data?

No. Variance and standard deviation are calculated from the numerical values of the data, so they require interval- or ratio-scale measurements and cannot be computed for nominal data at all. You do, of course, have to have some variation; otherwise the variance and standard deviation will be zero.