Variance is a measure of dispersion: it tells you how far, relative to the mean, the data tend to fall. A high variance indicates that the data are spread out over a wide range of values, whereas a low variance indicates that the data are clustered close to the mean.
Standard deviation (the square root of the variance) measures, on average, how far the data fall from the mean. It is interpreted in much the same way as the variance, but because it is the square root, it is expressed in the same units as the data, and extreme values inflate it less dramatically than they inflate the variance.
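As a quick illustration, here is a minimal Python sketch (using the standard-library statistics module, with made-up data) comparing two datasets that share a mean but differ in spread:

```python
import statistics

# Two hypothetical datasets with the same mean (50) but different spread.
tight = [48, 49, 50, 51, 52]
spread = [10, 30, 50, 70, 90]

for name, data in [("tight", tight), ("spread", spread)]:
    var = statistics.pvariance(data)  # population variance: mean of squared deviations
    sd = statistics.pstdev(data)      # population standard deviation: sqrt(variance)
    print(f"{name}: mean={statistics.mean(data)}, variance={var}, stdev={sd:.2f}")
```

The "spread" data give a variance of 800 versus 2 for the "tight" data, even though both sets have a mean of 50.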
In statistics, lowercase sigma (σ) is the symbol for the standard deviation of a population, and σ² (sigma squared) is the symbol for the variance.
They are measures of the spread of the data and are among the key descriptive statistics.
Any number calculated from a sample of a population is called a "statistic", for example the sample mean or the sample variance.
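For example, here is a minimal Python sketch (with a hypothetical population and sample) computing two such statistics; note that statistics.variance uses the n − 1 denominator appropriate for a sample:

```python
import random
import statistics

random.seed(1)  # reproducible illustration

# Hypothetical population and a random sample drawn from it.
population = list(range(1, 1001))
sample = random.sample(population, 30)

# Any number computed from the sample is a statistic, e.g.:
print(f"sample mean     = {statistics.mean(sample):.1f}")
print(f"sample variance = {statistics.variance(sample):.1f}")  # n - 1 denominator
```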
In budgeting, "variance" has a different meaning, the gap between actual and planned spend: variance % = (actual - budget) / budget × 100.
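Expressed as code, a sketch of that calculation might look like the following (budget_variance_pct is a hypothetical helper name):

```python
def budget_variance_pct(actual: float, budget: float) -> float:
    """Percentage variance of actual spend against a budget."""
    return (actual - budget) / budget * 100

# Spending 1,100 against a 1,000 budget is a +10% (unfavourable) variance.
print(budget_variance_pct(1100, 1000))  # 10.0
```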