Best Answer

Assuming a normal distribution, about 68% of the data values will be within 1 standard deviation of the mean.


Q: How many of the scores will be within 1 standard deviation of the population mean?
Continue Learning about Statistics

What percent of the scores in a normal distribution will fall within one standard deviation?

It is 68.3%


What percentage of the normally distributed population lies within plus or minus one standard deviation of the population mean?

Approximately 68.3%


What are importance of mean and standard deviation in the use of normal distribution?

For data sets having a normal distribution, the following properties depend on the mean and the standard deviation. This is known as the empirical rule:

About 68% of all values fall within 1 standard deviation of the mean.
About 95% of all values fall within 2 standard deviations of the mean.
About 99.7% of all values fall within 3 standard deviations of the mean.

So given any value, the mean, and the standard deviation, one can say right away where that value sits relative to 68%, 95%, and 99.7% of the other values.

The mean of any distribution is a measure of centrality, but in the case of the normal distribution it is also equal to the mode and median of the distribution. The standard deviation is a measure of data dispersion or variability. In the case of the normal distribution, the mean and the standard deviation are the two parameters of the distribution, therefore they completely define it. See: http://en.wikipedia.org/wiki/Normal_distribution
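A minimal Python sketch of the empirical-rule percentages above, using only the standard library; the share of a normal population within k standard deviations of the mean works out to erf(k / sqrt(2)):

```python
import math

def within_k_sd(k: float) -> float:
    """Share of a normal population within k standard deviations of the mean."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"within {k} SD: {within_k_sd(k):.2%}")
# within 1 SD: 68.27%
# within 2 SD: 95.45%
# within 3 SD: 99.73%
```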


What percentage of scores fall between minus 3 and plus 3 standard deviations around the mean in a normal distribution?

About 99.7% of scores fall between minus 3 and plus 3 standard deviations around the mean in a normal distribution.


What is the purpose of finding the standard deviation of a data set?

The purpose of obtaining the standard deviation is to measure how dispersed the data are around the mean. Data sets can be widely dispersed or narrowly dispersed, and the standard deviation measures the degree of dispersion. Each number of standard deviations corresponds to a probability that a single datum will fall within that distance of the mean: for a normal distribution, about 68% of all data fall within one standard deviation of the mean, so any single datum has roughly a 68% chance of falling within one standard deviation of the mean, and about 95% of the data fall within two standard deviations of the mean.

So, how does this help us in the real world? The world of finance/investments illustrates a real-world application. In finance, we use the standard deviation and variance to measure the risk of a particular investment. Assume the mean is 15%. That would indicate that we expect to earn a 15% return on an investment. However, we never earn exactly what we expect, so we use the standard deviation to measure the likelihood that the actual return will fall away from that expected return (the mean). If the standard deviation is 2%, there is about a 68% chance the return will actually be between 13% and 17%, and about a 95% chance that the return will be between 11% and 19%. The larger the standard deviation, the greater the risk involved with a particular investment. That is a real-world example of how we use the standard deviation to measure risk and expected return on an investment.
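A short Python sketch of the investment example above; the 15% expected return and 2% standard deviation are just the example values from the answer, and normally distributed returns are assumed:

```python
mean_return = 0.15   # 15% expected return (example value)
sd_return = 0.02     # 2% standard deviation (example value)

for k, coverage in ((1, "about 68%"), (2, "about 95%")):
    lo, hi = mean_return - k * sd_return, mean_return + k * sd_return
    print(f"{coverage} of returns expected between {lo:.0%} and {hi:.0%}")
# about 68% of returns expected between 13% and 17%
# about 95% of returns expected between 11% and 19%
```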

Related questions

Assume that a set of test scores is normally distributed with a mean of 100 and a standard deviation of 20. Use the 68-95-99.7 rule?

68% of the scores are within 1 standard deviation of the mean: 80 to 120. 95% of the scores are within 2 standard deviations of the mean: 60 to 140. 99.7% of the scores are within 3 standard deviations of the mean: 40 to 160.
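A quick Python sketch of these intervals for a mean of 100 and a standard deviation of 20:

```python
mean, sd = 100, 20
for k, pct in ((1, "68%"), (2, "95%"), (3, "99.7%")):
    print(f"{pct} of scores lie between {mean - k * sd} and {mean + k * sd}")
# 68% of scores lie between 80 and 120
# 95% of scores lie between 60 and 140
# 99.7% of scores lie between 40 and 160
```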



With a standard deviation of 10 and a mean of 50, approximately 68 percent of the group members receive scores somewhere between?

Within 1 standard deviation of the mean: between 40 and 60.


What percent of a normal population is within 2 standard deviations of the mean?

For a normal distribution, 95.45% of the population will be within 2 SD of the mean. Explanation: a normally distributed population follows the "bell curve", and the center of this bell curve is the population's mean value. One standard deviation defines two areas (on the left and right side of the central mean value) under the bell curve that each hold 34.13% of the population. The next standard deviation adds two additional areas under the curve, each holding 13.59% of the population. Adding the areas under the curve on both sides gives us (34.13% + 13.59%) x 2 = 95.45%.
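A small Python sketch of this slice-by-slice addition, building the standard normal CDF from math.erf:

```python
import math

def Phi(z: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

band_0_to_1 = Phi(1) - Phi(0)   # ~0.3413, area between the mean and +1 SD
band_1_to_2 = Phi(2) - Phi(1)   # ~0.1359, area between +1 SD and +2 SD
print(f"within 2 SD: {2 * (band_0_to_1 + band_1_to_2):.2%}")   # ~95.45%
```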


Why is the standard deviation one?

The standard deviation provides an indication of what proportion of the sample's distribution falls within a certain distance of the mean (average) for that sample. If your data follow a normal (bell-shaped) distribution, an SD of 1 indicates that about 68% of your data points (scores or whatever else) fall within 1 point (plus or minus) of the average (mean) of the data, and about 95% fall within 2 points.


Why is the standard deviation of a distribution of means smaller than the standard deviation of the population from which it was derived?

The reason the standard deviation of a distribution of means is smaller than the standard deviation of the population from which it was derived is actually quite logical. Keep in mind that standard deviation is the square root of variance, and variance is simply an expression of the variation among values in the population.

Each of the means within the distribution of means is computed from a sample of values taken randomly from the population. While it is possible for a random sample of multiple values to have come from one extreme or the other of the population distribution, it is unlikely. Generally, each sample will consist of some values on the lower end of the distribution, some from the higher end, and most from near the middle. In most cases, the values (both extremes and middle values) within each sample will balance out and average out to somewhere toward the middle of the population distribution. So the mean of each sample is likely to be close to the mean of the population and unlikely to be extreme in either direction.

Because the majority of the means in a distribution of means fall closer to the population mean than many of the individual values in the population, there is less variation among the distribution of means than among individual values in the population from which it was derived. Because there is less variation, the variance is lower, and thus the square root of the variance - the standard deviation of the distribution of means - is less than the standard deviation of the population from which it was derived.
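A simulation sketch of this argument in Python; the population parameters and sample size are arbitrary illustration choices, and theory predicts the SD of the sample means to be roughly sigma / sqrt(n):

```python
import random
import statistics

random.seed(0)
population_mean, population_sd, n = 50, 10, 25   # illustrative values only

# Draw many random samples of size n and record each sample's mean.
sample_means = [
    statistics.mean(random.gauss(population_mean, population_sd) for _ in range(n))
    for _ in range(10_000)
]

print("population SD:          ", population_sd)
print("SD of sample means:     ", round(statistics.stdev(sample_means), 2))
print("predicted sigma/sqrt(n):", round(population_sd / n ** 0.5, 2))  # 2.0
```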


What is the empirical rule?

For data sets having a normal, bell-shaped distribution, the following properties apply: about 68% of all values fall within 1 standard deviation of the mean, about 95% of all values fall within 2 standard deviations of the mean, and about 99.7% of all values fall within 3 standard deviations of the mean.


A set of 1000 values has a normal distribution. The mean of the data is 120 and the standard deviation is 20. How many values are within one standard deviation of the mean?

The empirical rule states that about 68% of the data falls within 1 standard deviation of the mean. Since 1000 data values are given, take 0.68 x 1000, so about 680 values are within 1 standard deviation of the mean.
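The same arithmetic spelled out as a tiny Python sketch, including the one-SD interval for a mean of 120 and an SD of 20:

```python
mean, sd, n_values = 120, 20, 1000
interval = (mean - sd, mean + sd)        # (100, 140)
expected_count = round(0.68 * n_values)  # about 680
print(interval, expected_count)
```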


Is standard deviation an absolute value?

No, not in that sense. The standard deviation is reported as a single non-negative magnitude, but it is applied in both directions from the mean. For example, if you are told that for a population the SD is 5.0, the interval of interest runs from 5.0 below the population mean to 5.0 above it. It defines a region within the distribution: one part starting at the lower bound (the mean minus 5.0) and increasing up to the mean, and another part starting at the mean and increasing up to the upper bound (the mean plus 5.0). Both parts together define the continuous region within one standard deviation of the mean.


What is the 68-95-99.7 rule?

The 68-95-99.7 rule, or empirical rule, says this: for a normal distribution, almost all values lie within 3 standard deviations of the mean.

Approximately 68% of the values lie within 1 standard deviation of the mean (between the mean minus 1 standard deviation and the mean plus 1 standard deviation). In statistical notation, this is represented as μ ± σ.

Approximately 95% of the values lie within 2 standard deviations of the mean (between the mean minus 2 standard deviations and the mean plus 2 standard deviations). The statistical notation for this is μ ± 2σ.

Almost all (actually, 99.7%) of the values lie within 3 standard deviations of the mean (between the mean minus 3 standard deviations and the mean plus 3 standard deviations). Statisticians use the notation μ ± 3σ to represent this.

(www.wikipedia.org)


Standard deviation is helpful in calculating?

Standard deviation is a calculation used in the statistical analysis of a group of data to determine how much the individual values deviate (differ) from the average of the group. For instance, on Stanford-Binet IQ tests, the average (mean) score is 100 and the standard deviation is 15. About 68% of people will be within one standard deviation of the mean and score between 85 and 115 (100 - 15 and 100 + 15), while about 95% of people will be within 2 standard deviations (30 points) of the mean, between 70 and 130.
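A brief Python sketch of the Stanford-Binet example above (mean 100, SD 15):

```python
mean, sd = 100, 15
print(f"about 68% score between {mean - sd} and {mean + sd}")      # 85 and 115
print(f"about 95% score between {mean - 2*sd} and {mean + 2*sd}")  # 70 and 130
```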