Q: What percentage of data would fall within 1.75 standard deviations of the mean?

Best Answer

About 92%. For a normal distribution, the proportion of data within 1.75 standard deviations of the mean is 2 × Φ(1.75) − 1 ≈ 0.92.
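As a quick check (a minimal sketch, assuming SciPy is available), the proportion of a normal distribution within ±k standard deviations of the mean is 2·Φ(k) − 1:

```python
from scipy.stats import norm

# Proportion of a normal distribution within +/- k standard deviations of the mean:
# P(|Z| < k) = 2 * Phi(k) - 1, where Phi is the standard normal CDF.
for k in (1, 1.75, 2, 3):
    p = 2 * norm.cdf(k) - 1
    print(f"within {k} SD: {p:.4f}")

# within 1 SD: 0.6827
# within 1.75 SD: 0.9199
# within 2 SD: 0.9545
# within 3 SD: 0.9973
```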

Related questions

How do you multiply standard deviations?

Multiply them as you would any two numbers. However, you should note that the standard deviation of a product of two variables is not the product of their standard deviations. That is, SD(XY) ≠ SD(X)*SD(Y)
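A quick simulation shows that the standard deviation of a product is not the product of the standard deviations (a sketch using NumPy; the two distributions and their parameters are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(10, 2, size=100_000)   # SD(X) = 2
y = rng.normal(5, 3, size=100_000)    # SD(Y) = 3, independent of X

print(np.std(x * y))          # SD of the product XY, roughly 32 here
print(np.std(x) * np.std(y))  # product of the SDs, roughly 6 -- clearly different
```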


What is abnormal distribution?

A normal distribution has about 95% of all data within 2 standard deviations of the mean (on either side), and about 68% within one standard deviation. This gives the familiar bell-curve shape. An abnormal distribution would be one that is erratic and does not follow this statistical structure.


What would -2 standard deviation below the mean be?

It would mean that the result was 2 standard deviations below the mean. Depending on the distribution of the variable, it may be possible to attach a probability to this, or more extreme, observations.


How would you identify and report deviations and what is the significance of deviations?

identify and report deviations


What measurements fall beyond three standard deviations from the mean?

Usually they would be observations with very low probabilities of occurrence.


What is the use of statistical mean in industry?

For a set of data, the mean is the sum of all observations divided by the number of observations. The mean is frequently quoted together with the standard deviation: the mean describes the central location of the data, while the standard deviation describes the spread. An alternative measure of dispersion is the mean absolute deviation, the average of the absolute deviations from the mean, which is less sensitive to outliers. Hope this helps.
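A minimal sketch of these three quantities in NumPy (the data values are made up for illustration; the last one is deliberately an outlier):

```python
import numpy as np

data = np.array([12.0, 15.0, 9.0, 14.0, 40.0])  # hypothetical measurements; 40 is an outlier

mean = data.mean()                   # central location
sd = data.std(ddof=1)                # sample standard deviation (spread)
mad = np.mean(np.abs(data - mean))   # mean absolute deviation, less affected by the outlier

print(mean, sd, mad)
```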


How many standard deviations is needed to capture 75 percent of data?

It depends on the shape of the distribution. For a normal distribution, a two-tailed range of about -1.15 sd to +1.15 sd captures 75% of the data.
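For the normal case, a two-tailed range covering 75% leaves 12.5% in each tail, so the cut-off is the 87.5th percentile of the standard normal; this can be checked numerically (a sketch assuming SciPy is available):

```python
from scipy.stats import norm

z = norm.ppf(0.875)                 # 87.5th percentile of the standard normal
print(z)                            # about 1.15
print(norm.cdf(z) - norm.cdf(-z))   # about 0.75, i.e. 75% between -z and +z
```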


How do you find the sum of squared deviations in a set of numbers?

It would be useful to know what the deviations were from.
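Assuming the deviations are taken from the mean (the usual case), the sum of squared deviations is Σ(xᵢ − x̄)²; a minimal sketch with made-up numbers:

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
ss = np.sum((x - x.mean()) ** 2)   # sum of squared deviations from the mean
print(ss)                          # 32.0 for this data (mean is 5)
```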


Why is the sample deviation divided by n-1 in business statistics?

The purpose in computing the sample standard deviation is to estimate the amount of spread in the population from which the samples are drawn. Ideally, therefore, we would compute deviations from the mean of all the items in the population, rather than the deviations from the sample mean. However, the population mean is generally unknown, so the sample mean is used in its place. It is a mathematical fact that the deviations around the sample mean tend to be a bit smaller than the deviations around the population mean, and dividing by n-1 rather than n provides exactly the right amount of correction.
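This can be seen in a small simulation (a sketch using NumPy; the population and sample size are arbitrary choices): dividing squared deviations by n systematically underestimates the population variance, while dividing by n-1 is about right on average.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5                                 # small samples drawn from N(0, 2), true variance = 4

biased, unbiased = [], []
for _ in range(50_000):
    sample = rng.normal(0, 2, size=n)
    dev = sample - sample.mean()               # deviations from the *sample* mean
    biased.append(np.sum(dev**2) / n)          # divide by n: too small on average
    unbiased.append(np.sum(dev**2) / (n - 1))  # divide by n-1: Bessel's correction

print(np.mean(biased))    # close to 3.2 (= 4 * (n-1)/n)
print(np.mean(unbiased))  # close to 4.0, the true population variance
```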


How smart are you when your IQ's 145?

What an IQ of 145 means really depends on the test. On some tests it might mean that you are smarter than 99.85% of the population; on others it might mean that you are brighter than about 80% of the population.

Modern IQ tests tend to be designed to give a normal distribution of scores with 100 as the mean. A normal distribution is bell-shaped, so the closer an IQ is to 100, the more people there are with that IQ. Exactly how many people share a given IQ depends on the test's standard deviation. About two thirds of people have an IQ within 1 standard deviation of 100 (the mean). For example, IQ tests commonly have a standard deviation of about 15, which means about two thirds of people have an IQ between 85 and 115. You might call this the average range. About 95% of people will be within two standard deviations, so on the same example about 95% of people will have an IQ between 70 and 130, and 99.7% will be within 3 standard deviations.

So, on an IQ test with a standard deviation of 15, people with an IQ of 130 or more are well above average (in the top 2.5% or so), and if your IQ is 145 then you are in the top 0.15% of the population. However, the standard deviation depends on the test: standard deviations on common tests range from 10 to 24. Because of this, psychologists these days tend to talk about percentile ranges with a certain confidence interval. So you would be far more likely to be told that your IQ is in the 94th percentile with a confidence interval of 90%.
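The percentile corresponding to a given IQ score follows directly from the normal CDF once the test's standard deviation is known (a sketch assuming SciPy is available; a mean of 100 is conventional, and 15 and 24 are used here only as examples of common standard deviations):

```python
from scipy.stats import norm

def iq_percentile(iq, mean=100, sd=15):
    """Fraction of the population scoring below `iq` on a test with this mean and SD."""
    return norm.cdf((iq - mean) / sd)

print(iq_percentile(145))           # about 0.9987 -> top ~0.13% when SD is 15
print(iq_percentile(145, sd=24))    # about 0.970  -> only top ~3% when SD is 24
```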


What does precision processor mean regarding your IQ test?

It means the test replaced standard deviations with fancy titles. Those tests are utter nonsense and do not correlate with scores you would receive on a real normed IQ test.


If average height for women is normally distributed with a mean of 65 inches and a standard deviation of 2.5 inches then approximately 95 percent of all women should be between what and what inches?

A normal distribution with a mean of 65 and a standard deviation of 2.5 would have about 95% of the population between 60 and 70 inches, i.e. within +/- two standard deviations of the mean.
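This can be checked numerically; ±2 standard deviations is the usual rule of thumb, while the exact 95% interval is about ±1.96 standard deviations (a sketch assuming SciPy is available):

```python
from scipy.stats import norm

lo, hi = norm.interval(0.95, loc=65, scale=2.5)  # central 95% of N(65, 2.5)
print(lo, hi)   # about 60.1 and 69.9 inches, i.e. roughly 60 to 70
```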