
It is called a standard normal distribution.


Wiki User

14y ago


Related Questions

A standard normal distribution has a mean of ___ and a standard deviation of ___?

Mean 0, standard deviation 1.


How does standard normal distribution differ from normal distribution?

The standard normal distribution has a mean of 0 and a standard deviation of 1, whereas a general normal distribution can have any mean and any positive standard deviation.
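Any normal variable can be converted to the standard normal by subtracting its mean and dividing by its standard deviation. A minimal Python sketch (the function name and sample values are illustrative):

```python
# Standardise values from a general normal distribution N(mu, sigma^2)
# so that they follow the standard normal distribution N(0, 1).
def standardise(values, mu, sigma):
    """Convert raw values to z-scores: z = (x - mu) / sigma."""
    return [(x - mu) / sigma for x in values]

data = [90.0, 100.0, 110.0]               # e.g. a sample from N(100, 10^2)
z = standardise(data, mu=100.0, sigma=10.0)
print(z)  # [-1.0, 0.0, 1.0]
```

The resulting z-scores have mean 0 and standard deviation 1 by construction.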


Does the standard normal distribution have a mean of 1 and a standard deviation of 0?

No. It has a mean of 0 and a standard deviation of 1.


What is the distribution with a mean of 0 and a standard deviation of 1?

It is any standardised distribution; the best-known example is the standard normal distribution.


What is the distribution with a mean of 0 and a standard deviation of 1 called?

The standard normal distribution.


Which normal distribution is also the standard normal curve?

A normal distribution is the standard normal distribution if it has a mean of 0 and a standard deviation of 1.


In a standard normal distribution what is the value of the mean and standard deviation?

The mean of n values is M = (sum of x) / n. For a frequency distribution, M = (sum of x * f) / (sum of f), where x is the class midpoint and f is its frequency. As for the distribution itself: a general normal distribution is described by its parameters and written N(mu, sigma^2), where mu is the mean and sigma is the standard deviation. The STANDARD normal distribution is the N(0, 1) distribution: its mean is 0 and its variance is 1, so its standard deviation is also 1 (since sqrt(1) = 1).
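The frequency-distribution mean formula above can be sketched in a few lines of Python (the midpoints and frequencies are hypothetical):

```python
# Mean of a frequency distribution: M = sum(x * f) / sum(f),
# where x is the class midpoint and f its frequency.
def freq_mean(midpoints, freqs):
    return sum(x * f for x, f in zip(midpoints, freqs)) / sum(freqs)

mids = [5, 15, 25]   # hypothetical class midpoints
f = [2, 3, 5]        # hypothetical frequencies
print(freq_mean(mids, f))  # (10 + 45 + 125) / 10 = 18.0
```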


What is the standard deviation of 155.45?

The standard deviation of a single observation is 0.


What is true of a standard normal probability distribution? (a) A mean of 0 and a standard deviation of 1; (b) a mean of 1 and a standard deviation of 1; (c) any mean and any standard deviation?

(a) is true: a mean of 0 and a standard deviation of 1.


What is the standard deviation for 1.3?

The standard deviation for a single observation is 0.


What would it mean if a standard deviation was calculated to equal 0?

A standard deviation of zero means that all the data points are the same value.
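This can be checked directly with Python's standard library: the population standard deviation is 0 exactly when every data point is equal (the sample values are illustrative):

```python
import statistics

# Population standard deviation is 0 iff all data points are identical.
same = [7.2, 7.2, 7.2]
mixed = [7.0, 7.2, 7.4]
print(statistics.pstdev(same))   # 0.0
print(statistics.pstdev(mixed))  # positive, since the values differ
```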


What is the mean and standard deviation of a distribution of T-scores?

The answer depends on the degrees of freedom (df) of the t-distribution. If df > 1, the mean is 0; for df > 2, the standard deviation is sqrt[df/(df - 2)].
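The standard-deviation formula for Student's t-distribution can be sketched as follows (the function name is illustrative):

```python
import math

# Standard deviation of Student's t-distribution with df degrees of freedom:
# sqrt(df / (df - 2)), defined only for df > 2. (The mean is 0 for df > 1.)
def t_sd(df):
    if df <= 2:
        raise ValueError("standard deviation is undefined for df <= 2")
    return math.sqrt(df / (df - 2))

print(t_sd(10))  # sqrt(10 / 8) = sqrt(1.25), slightly above 1
```

Note that the standard deviation approaches 1 as df grows, matching the fact that the t-distribution converges to the standard normal.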