If the probability density function for the random variable X is f(x), then
first calculate E(X) = integral of x*f(x) dx over the whole real line.
Next calculate E(X^2) = integral of x^2*f(x) dx over the whole real line.
Then Variance(X) = E(X^2) - [E(X)]^2,
and finally, SD(X) = sqrt[Variance(X)].
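As a concrete check of these steps, here is a minimal Python sketch using numerical integration. The exponential pdf with rate 2 is an assumed example, chosen because its true mean and SD are both known to be 0.5.

```python
import numpy as np
from scipy.integrate import quad

# Assumed example pdf: exponential with rate lam = 2, i.e. f(x) = lam*exp(-lam*x)
# for x >= 0. Its true mean is 1/lam = 0.5 and its true SD is also 0.5.
lam = 2.0
f = lambda x: lam * np.exp(-lam * x)

ex,  _ = quad(lambda x: x * f(x),    0, np.inf)   # E(X)
ex2, _ = quad(lambda x: x**2 * f(x), 0, np.inf)   # E(X^2)

variance = ex2 - ex**2     # Var(X) = E(X^2) - [E(X)]^2
sd = np.sqrt(variance)     # SD(X) = sqrt(Var(X))
print(ex, variance, sd)    # approximately 0.5, 0.25, 0.5
```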
It depends what you're asking; the question is unclear as posed. Accuracy of what exactly? Even within statistics, an entire book could be written to address such an ambiguous question. If you are simply asking how the probability that something will occur relates to the mean and standard deviation of a known distribution (such as a normal distribution), then the standard deviation represents the spread of the probability curve. If a curve has a mean of 0 and a standard deviation of 3, predicting a value of 12 (or -12) would likely be inaccurate, because that value lies four standard deviations from the mean. However, with a mean of 0 and a standard deviation of 100, observing a 12 (or -12) would be quite likely. This is simply because the standard deviation provides a simple representation of the horizontal spread of probability along the x-axis.
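To make that 0-mean example concrete, a small sketch (assuming scipy is available) computes how surprising an observation of 12 is under each of the two standard deviations mentioned:

```python
from scipy.stats import norm

# Chance of seeing a value at least as far from the mean (0) as 12,
# under two normal curves with different standard deviations.
for sd in (3, 100):
    z = 12 / sd             # distance from the mean in SD units
    p = 2 * norm.sf(z)      # two-sided tail probability
    print(f"sd={sd}: z={z:.2f}, P(|X| >= 12) = {p:.4f}")
# sd=3   -> z=4.00, p ~ 0.0001 (very unlikely)
# sd=100 -> z=0.12, p ~ 0.9045 (quite likely)
```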
For data sets having a normal distribution, the following properties depend on the mean and the standard deviation. This is known as the Empirical rule:
About 68% of all values fall within 1 standard deviation of the mean.
About 95% of all values fall within 2 standard deviations of the mean.
About 99.7% of all values fall within 3 standard deviations of the mean.
So given any value, along with the mean and standard deviation, one can say right away where that value stands relative to 68%, 95%, and 99.7% of the other values. The mean of any distribution is a measure of centrality, but in the case of the normal distribution it is also equal to the mode and median of the distribution. The standard deviation is a measure of data dispersion or variability. In the case of the normal distribution, the mean and the standard deviation are the two parameters of the distribution, so they completely define it. See: http://en.wikipedia.org/wiki/Normal_distribution
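The three percentages can be verified directly from the standard normal CDF; a quick sketch, assuming scipy:

```python
from scipy.stats import norm

# Area under the normal curve within k standard deviations of the mean.
for k in (1, 2, 3):
    coverage = norm.cdf(k) - norm.cdf(-k)
    print(f"within {k} SD: {coverage:.3%}")
# within 1 SD: 68.269%; within 2 SD: 95.450%; within 3 SD: 99.730%
```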
I don't know about the normal distribution, but for the mean: M = (overall sum of x) / n. For a frequency distribution: M = (overall sum of x * f) / (overall sum of f), where M = mean, x = midpoint, f = frequency, and n = number of values.
* * * * *
A general normal distribution is usually described in terms of its parameters and written N(mu, sigma^2), where mu is the mean and sigma is the standard deviation. The STANDARD normal distribution is the N(0, 1) distribution; that is, it has mean = 0 and variance (and standard deviation) = 1.
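As a worked illustration of the frequency-distribution formula M = sum(x*f) / sum(f), here is a short sketch; the midpoints and frequencies are made-up illustration data:

```python
# Made-up class midpoints (x) and frequencies (f) for illustration only.
midpoints   = [5, 15, 25, 35, 45]
frequencies = [2, 8, 15, 8, 2]

# M = (overall sum of x*f) / (overall sum of f)
mean = sum(x * f for x, f in zip(midpoints, frequencies)) / sum(frequencies)
print(mean)   # 25.0 for this data
```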
It is a variable that can take a number of different values. The probability that it takes a value in any given range is determined by a random process, and the value of that probability is given by the probability distribution function.
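For a continuous random variable, the probability of landing in a range [a, b] is the area under the density between a and b; a minimal sketch, assuming a standard normal variable as the example:

```python
from scipy.stats import norm

# P(a <= X <= b) = F(b) - F(a), where F is the cumulative distribution function.
a, b = -1.0, 2.0
prob = norm.cdf(b) - norm.cdf(a)
print(prob)   # about 0.8186 for a standard normal X
```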
A single number, such as 478912, always has a standard deviation of 0.
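This is easy to confirm with Python's standard library:

```python
import statistics

# A one-element data set has zero spread, so its (population) SD is 0.
print(statistics.pstdev([478912]))   # 0.0
# statistics.stdev (the sample SD) needs at least two points and would raise an error.
```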