"Better" compared to what?!

If you just want to stand out among the group, look better than the people around you, and be admired as the least stupid one in the bunch, then maybe the 68 would be great. You scored two standard deviations above the mean, and you're officially the One-eyed Man in the Kingdom of the Blind, the Big Fish in that particular Little Pond, and, like Noah himself, perfect in that generation. You put the 68 in your pocket, and you go out looking for anybody you can find who scored lower than you did.

But if you want to learn stuff, know stuff, and be able to use the stuff you learn, then you only want to compare your score to the exam itself, not to anybody else's performance. A score of 80 is always better than a score of 68, regardless of how anybody else is doing. You put the 80 in your pocket, you reward yourself with a chocolate bar for a job well done, and you make up your mind to see what you can do about those other 20 points before the next exam.

Wiki User

13y ago


Related Questions

When is a t test better than a z score?

A t test is better when you don't have the population standard deviation and must estimate it with the sample standard deviation. When the population standard deviation is known, the z test is preferable.
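As a sketch (the sample, hypothesized mean, and "known" sigma below are all made-up illustrative numbers), the only difference between the two statistics is which standard deviation goes in the denominator:

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical sample of exam scores from a small class
sample = [62, 71, 68, 75, 66, 70, 73, 65]
mu0 = 65            # hypothesized population mean (illustrative)
n = len(sample)

# t statistic: uses the SAMPLE standard deviation, since sigma is unknown
t_stat = (mean(sample) - mu0) / (stdev(sample) / sqrt(n))

# z statistic: only valid when the POPULATION standard deviation is known
sigma = 5.0         # assumed known population sigma (illustrative)
z_stat = (mean(sample) - mu0) / (sigma / sqrt(n))
```

Same numerator in both; only the estimate of spread changes.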


Which is better a score of 92 on a test with a mean of 71 and a standard deviation of 15 or a score of 688 on a test with a mean of 493 and a standard deviation of 150?

The score of 92. Its z-score is (92 - 71)/15 = 1.4, while the 688 corresponds to (688 - 493)/150 = 1.3, so the 92 lies further above its mean.
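A quick sketch of the comparison, using the numbers from the question:

```python
# Compare two exam scores by converting each to a z-score
def z_score(score, mean, sd):
    """Number of standard deviations the score lies above the mean."""
    return (score - mean) / sd

z1 = z_score(92, 71, 15)     # first test:  (92 - 71) / 15
z2 = z_score(688, 493, 150)  # second test: (688 - 493) / 150
# z1 = 1.4 beats z2 = 1.3, so the 92 is the relatively better score
```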


Average usmle score?

The mean is 218, with a standard deviation of 16.


Is 119 an average IQ score for a 7 year old?

An average IQ score for a 7 year old is around 100, just as at any age, so a score of 119 would be considered above average. IQ scores are standardized to have a mean of 100 and a standard deviation of 15, so a score of 119 falls about 1.3 standard deviations above the average ((119 - 100) / 15 ≈ 1.27).


How do you find the mean from raw score z score and standard deviation?

To find the mean from a raw score, z-score, and standard deviation, start from the z-score relationship: Raw Score = Mean + (z × Standard Deviation). Rearranging gives: Mean = Raw Score - (z × Standard Deviation). Simply substitute the values of the raw score, z-score, and standard deviation into this formula to calculate the mean.
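A minimal sketch of the rearranged formula, with illustrative numbers:

```python
# Rearranging z = (raw - mean) / sd  gives  mean = raw - z * sd
def mean_from(raw, z, sd):
    """Recover the mean from a raw score, its z-score, and the SD."""
    return raw - z * sd

# Illustrative: raw score 130 that sits 2 SDs (of 15) above the mean
m = mean_from(130, 2, 15)   # 130 - 2*15 = 100
```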


What if a standard score is 57 and the average is 100 Is that three standard deviations below the mean or almost three?

The answer depends on what the standard deviation is. With the common standard deviation of 15, (100 - 57) / 15 ≈ 2.87, so the score would be almost, but not quite, three standard deviations below the mean.
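For example, assuming the common standard deviation of 15 for standardized scores:

```python
# How many standard deviations below the mean is a score of 57,
# assuming mean 100 and SD 15 (the SD is an assumption here)?
z = (57 - 100) / 15   # negative: below the mean
# about -2.87, so almost, but not quite, three SDs below
```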


What happens to the standard score as the standard deviation increase?

The absolute value of the standardised score decreases.


Standard score chart on the Beery VMI?

The VMI has a mean score of 100 with a standard deviation of 15, so scores between 85 and 115 are considered average.


Why use the T score?

T-score is used when you don't have the population standard deviation and must use the sample standard deviation as a substitute.


How do you calculate standard deviation with the help of z-score?

A z-score alone cannot determine the standard deviation; the very point of standardising is to remove the mean and standard deviation. But if you also know the raw score and the mean, you can recover it: Standard Deviation = (Raw Score - Mean) / z.
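A sketch of that rearrangement, with illustrative numbers:

```python
# Rearranging z = (raw - mean) / sd  gives  sd = (raw - mean) / z
def sd_from(raw, mean, z):
    """Recover the SD from a raw score, the mean, and the z-score."""
    return (raw - mean) / z

# Illustrative: a raw score of 115 that is 1 SD above a mean of 100
s = sd_from(115, 100, 1)   # (115 - 100) / 1 = 15
```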


What happens to the standard score as the standard deviation increases?

The absolute value of the standard score becomes smaller.


Standard deviation is helpful in calculating?

Standard deviation is a calculation. It is used in statistical analysis of a group of data to measure the deviation (the difference) between one datum point and the average of the group.

For instance, on Stanford-Binet IQ tests, the average (mean) score is 100 and the standard deviation is 15. About 68% of people score within one standard deviation of the mean, between 85 and 115 (100 - 15 and 100 + 15), while about 95% score within two standard deviations (30 points) of the mean, between 70 and 130.
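The 68-95 rule ranges from the IQ example above, as a quick sketch:

```python
# Empirical (68-95-99.7) rule applied to Stanford-Binet IQ scores
mean_iq, sd_iq = 100, 15

one_sd = (mean_iq - sd_iq, mean_iq + sd_iq)        # ~68% of scores fall here
two_sd = (mean_iq - 2*sd_iq, mean_iq + 2*sd_iq)    # ~95% of scores fall here
```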