Well, honey, a good estimator needs to have a keen eye for details, a solid understanding of the project scope, excellent math skills, and the ability to make educated guesses without breaking a sweat. Basically, they need to be part Sherlock Holmes, part human calculator, and all-around badass at predicting costs. So, if you're looking for someone to estimate your project like a pro, make sure they've got these qualities in spades.
A "good" estimator is one that provides an estimate with the following qualities:
Unbiasedness: An estimator is said to be an unbiased estimator of a given parameter when the expected value of that estimator can be shown to be equal to the parameter being estimated. For example, the mean of a sample is an unbiased estimate of the mean of the population from which the sample was drawn. Unbiasedness is a good quality for an estimator, since, in such a case, a weighted average of several unbiased estimates is a better estimate than any one of those estimates alone. Therefore, unbiasedness allows us to improve our estimates by combining them. For example, if your estimates of the population mean µ are, say, 10 and 11.2 from two independent samples of sizes 20 and 30 respectively, then a better estimate of the population mean µ based on both samples is [20(10) + 30(11.2)] / (20 + 30) = 10.72.
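The size-weighted average described above can be sketched in a few lines of Python. This is only an illustrative sketch; the function name `pooled_mean` is my own, not standard terminology:

```python
# Combining two unbiased estimates of a population mean by weighting
# each sample mean by its sample size (values from the example above).
def pooled_mean(means, sizes):
    """Size-weighted average of independent sample means."""
    return sum(m * n for m, n in zip(means, sizes)) / sum(sizes)

print(pooled_mean([10, 11.2], [20, 30]))  # 10.72
```

Weighting by sample size gives the larger sample more influence, which is exactly what pooling the raw observations would do.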
Consistency: The standard deviation of an estimator is called the standard error of that estimator. The larger the standard error, the greater the error in your estimate. The standard error is a commonly used index of the error entailed in estimating a population parameter based on the information in a random sample of size n drawn from the entire population.
An estimator is said to be "consistent" if increasing the sample size produces an estimate with a smaller standard error. In that sense, the quality of your estimate keeps pace with the sample size: spending more money to obtain a larger sample produces a better estimate.
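The claim above, that a larger sample gives a smaller standard error, is easy to see in a small simulation. This is a rough Monte Carlo sketch using only the standard library; the helper name `se_of_mean`, the standard-normal population, and the replication count are all choices made for the demo:

```python
import random

# Monte Carlo sketch: the standard error of the sample mean shrinks
# as the sample size n grows (roughly like 1/sqrt(n)).
random.seed(0)

def se_of_mean(n, reps=2000):
    """Empirical standard deviation of the sample mean over many samples."""
    means = []
    for _ in range(reps):
        sample = [random.gauss(0, 1) for _ in range(n)]
        means.append(sum(sample) / n)
    mu = sum(means) / reps
    return (sum((m - mu) ** 2 for m in means) / reps) ** 0.5

print(se_of_mean(10) > se_of_mean(100) > se_of_mean(1000))  # True
```

For a standard normal population the theoretical standard errors are about 0.32, 0.10, and 0.03 for n = 10, 100, and 1000, so the ordering in the last line is what the simulation recovers.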
Efficiency: An efficient estimate is one which has the smallest standard error among all unbiased estimators.
The "best" estimator is the one which is the closest to the population parameter being estimated.
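Efficiency can also be checked by simulation. For normally distributed data the sample mean has a smaller standard error than the sample median, so the mean is the more efficient of the two. The sketch below compares them; the helper name `se` and the simulation settings are my own choices for illustration:

```python
import random
import statistics

random.seed(1)

def se(estimator, n=25, reps=4000):
    """Empirical standard error of an estimator on standard-normal samples."""
    vals = [estimator([random.gauss(0, 1) for _ in range(n)])
            for _ in range(reps)]
    mu = sum(vals) / reps
    return (sum((v - mu) ** 2 for v in vals) / reps) ** 0.5

# The mean is more efficient than the median for normal data.
print(se(statistics.mean) < se(statistics.median))  # True
```

Both estimators are unbiased for the center of a normal distribution, so the one with the smaller standard error, the mean, is the more efficient estimator here.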
In statistics, an efficient estimator is an estimator that estimates the quantity of interest in some "best possible" manner.
I believe you want to say, "as the sample size increases". I found this definition on Wikipedia that might help: In statistics, a consistent sequence of estimators is one which converges in probability to the true value of the parameter. Often, the sequence of estimators is indexed by sample size, and so the consistency is as the sample size (n) tends to infinity. Often, the term consistent estimator is used, which refers to the whole sequence of estimators, or to a formula that is used to obtain a term of the sequence. So, I don't know what you mean by "the value of the parameter estimated F"; I think you mean the "true value of the parameter." A good term for what the estimator is attempting to estimate is the "estimand." You can think of this as a destination, and your estimator is your car. Now, if all roads eventually lead to your destination, then you have a consistent estimator. But if it is possible that taking one route will make it impossible to get to your destination, no matter how long you drive, then you have an inconsistent estimator. See related links.
standard error
Coefficient of variation
The sample mean is an unbiased estimator of the population mean because the average of all the possible sample means of size n is equal to the population mean.
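The statement above can be verified exhaustively for a toy population: enumerate every possible sample of size n, take each sample's mean, and check that the average of all those sample means equals the population mean. The population values here are made up purely for illustration:

```python
from itertools import combinations

# Tiny population; enumerate every sample of size n (without
# replacement) and average the sample means.
population = [2, 4, 6, 8]
n = 2
pop_mean = sum(population) / len(population)

sample_means = [sum(c) / n for c in combinations(population, n)]
avg_of_means = sum(sample_means) / len(sample_means)

print(pop_mean, avg_of_means)  # 5.0 5.0 — they match exactly
```

The six possible sample means here are 3, 4, 5, 5, 6, and 7, whose average is exactly the population mean of 5. That equality is what makes the sample mean an unbiased estimator.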