In statistical inference, an estimator is a function of the sample, while a parameter is a function of the population. An estimate is the numerical value the estimator takes for a particular sample.
There are infinitely many possible estimators (each a function of the sample); for example, the mean, median, mode, standard deviation, and variance are all estimators.
Not all of these are good estimators, so when we are interested in estimation we have to choose a good one from among them.
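To make the estimator/estimate distinction concrete, here is a tiny sketch (my own illustration, not part of the original answer; the data values are made up): the rule np.mean is the estimator, and the number it returns for this particular sample is the estimate.
```python
import numpy as np

# hypothetical sample drawn from some population (made-up values)
sample = np.array([4.2, 5.1, 4.8, 5.5, 4.9])

# the estimator (a rule applied to any sample) produces the estimate (a number)
estimate = np.mean(sample)
print("Estimate of the population mean:", estimate)
```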
A good estimator, as common sense dictates, is close to the parameter being estimated. Its quality is evaluated in terms of the following properties:
1. Unbiasedness.
An estimator is said to be unbiased if its expected value is equal to the population parameter being estimated. That is, if θ̂ is an unbiased estimator of θ, then we must have E(θ̂) = θ. Many estimators are "asymptotically unbiased" in the sense that their bias shrinks to a practically insignificant value (approaching zero) as n becomes sufficiently large.
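Here is a small simulation sketch (my own illustration, not part of the original answer; mu = 5, sigma = 2, n = 10 are made-up values): averaging each estimator over many repeated samples approximates its expected value, and the sample variance with divisor n turns out biased while the version with divisor n - 1 is unbiased.
```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 5.0, 2.0, 10, 100_000

# reps independent samples of size n from a Normal(mu, sigma) population
x = rng.normal(mu, sigma, size=(reps, n))

# averaging an estimator over many samples approximates its expected value
print("E[sample mean]      ~", x.mean(axis=1).mean(), "(target", mu, ")")
print("E[variance, ddof=0] ~", x.var(axis=1, ddof=0).mean(), "(target", sigma**2, ")")
print("E[variance, ddof=1] ~", x.var(axis=1, ddof=1).mean(), "(target", sigma**2, ")")
```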
2. Consistency.
If an estimator θ̂ approaches the parameter θ more and more closely as the sample size n increases, θ̂ is said to be a consistent estimator of θ. Stated more rigorously, θ̂ is a consistent estimator of θ if, as n approaches infinity, the probability approaches 1 that θ̂ differs from θ by no more than an arbitrarily small constant (the simulation sketch at the end of this section illustrates this for the sample mean).
The sample mean is an unbiased estimator of µ no matter what form
the population distribution assumes, while the sample median is an
unbiased estimate of µ only if the population distribution is
symmetrical. The sample mean is better than the sample median as an
estimate of µ in terms of both unbiasedness and consistency.
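The following small sketch (illustrative only, with made-up values mu = 5, sigma = 2 and tolerance eps = 0.1) shows consistency of the sample mean: the probability that it lies within eps of mu climbs toward 1 as n grows.
```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, eps, reps = 5.0, 2.0, 0.1, 5_000

for n in (10, 100, 1_000, 10_000):
    # fraction of repeated samples whose mean lands within eps of mu
    hits = sum(abs(rng.normal(mu, sigma, size=n).mean() - mu) <= eps
               for _ in range(reps))
    print(f"n = {n:>6}:  P(|mean - mu| <= {eps}) ~ {hits / reps:.3f}")
```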
3. Efficiency.
The concept of efficiency refers to the sampling variability of an estimator. If two competing estimators are both unbiased, the one with the smaller variance (for a given sample size) is said to be relatively more efficient. Stated somewhat differently, an estimator θ̂₁ is said to be more efficient than another estimator θ̂₂ of θ if the variance of the first is less than the variance of the second. The smaller the variance of the estimator, the more concentrated the distribution of the estimator is around the parameter being estimated and, therefore, the better the estimator is.
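A hedged simulation sketch (my own illustration with made-up values mu = 5, sigma = 2, n = 25): for normal data both the sample mean and the sample median are unbiased for mu, but the mean has the smaller sampling variance, so it is the more efficient of the two.
```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 5.0, 2.0, 25, 100_000

# reps independent samples of size n from a Normal(mu, sigma) population
samples = rng.normal(mu, sigma, size=(reps, n))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

print("Var(sample mean)   ~", means.var())    # roughly sigma^2 / n
print("Var(sample median) ~", medians.var())  # roughly (pi/2) * sigma^2 / n
print("Ratio (median / mean) ~", medians.var() / means.var())
```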
4. Sufficiency.
An estimator is said to be sufficient if it conveys as much information as possible about the parameter that is contained in the sample. The significance of sufficiency lies in the fact that, if a sufficient estimator exists, it is absolutely unnecessary to consider any other estimator; a sufficient estimator ensures that all the information a sample can furnish with respect to the estimation of the parameter is being utilized. For example, the sample mean is a sufficient estimator of the mean µ of a normal population with known variance.
Many methods have been devised for estimating parameters that can provide estimators satisfying these properties. Two important methods are the method of least squares and the method of maximum likelihood.
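To make the maximum likelihood idea concrete, here is a minimal sketch (my own illustration, assuming simulated Bernoulli data with a made-up true p of 0.3): numerically minimizing the negative log-likelihood recovers the familiar closed-form MLE, the sample proportion.
```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
x = rng.binomial(1, 0.3, size=200)  # simulated 0/1 data, true p = 0.3 (made up)

def neg_log_likelihood(p):
    # negative log-likelihood of i.i.d. Bernoulli(p) observations
    return -np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))

result = minimize_scalar(neg_log_likelihood,
                         bounds=(1e-6, 1 - 1e-6), method="bounded")
print("MLE of p (numerical):  ", result.x)
print("MLE of p (closed form):", x.mean())
```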
These are the properties of a good estimator in statistics. They are very important because they determine how well an estimator serves for estimating the population parameter.
Hope you understood what an estimator is and what the properties of a good estimator are.
If you found this helpful, please rate positively. In case of any queries, feel free to ask in the comment box. Thank you.