Properties of Good Estimators

• In the frequentist world view, parameters are fixed, while statistics are random variables that vary from sample to sample (i.e., they have an associated sampling distribution).
• In theory, there are many potential estimators for a population parameter; indeed, any statistic is an estimator.
• What are the characteristics of good estimators?

What is an estimator? Let X be our data, and let θ̂ = T(X) be an estimator of the parameter θ. If we have a parametric family with parameter θ, then an estimator of θ is usually denoted by θ̂.

Small-sample estimator properties. The small-sample, or finite-sample, distribution of the estimator β̂_j for any finite sample size N < ∞ has (1) a mean, or expectation, denoted E(β̂_j), and (2) a variance, denoted Var(β̂_j). Desirable properties of a point estimator include unbiasedness and efficiency. Related topics include obtaining a point estimate of a population parameter, obtaining a confidence interval for a mean when the population standard deviation is known, and obtaining a confidence interval for a mean when the population standard deviation is unknown. When no estimator with desirable small-sample properties can be found, we often must choose between different estimators on the basis of their asymptotic properties.

Unbiasedness. We say that θ̂ is unbiased for θ if E(θ̂) = θ. Note that unbiasedness alone does not make a statistic a sensible choice for a given target: the sample mean Ȳ is also, formally, an estimator of the population minimum.

Relative efficiency (Def 9.1). Suppose θ̂₁ and θ̂₂ are two unbiased estimators for θ, with variances V(θ̂₁) and V(θ̂₂), respectively. Then the relative efficiency of θ̂₁ relative to θ̂₂ is

    eff(θ̂₁, θ̂₂) = V(θ̂₂) / V(θ̂₁).

Example: Suppose X₁, X₂, ..., Xₙ is an i.i.d. random sample from a Poisson distribution with parameter θ.

(Source headers: "Properties of Estimators (BLUE)", Kshitiz Gupta; ECONOMICS 351*, Note 3, M.G. Abbott.)
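The Poisson example above can be used to illustrate relative efficiency by simulation. For a Poisson distribution, both the sample mean and the (unbiased) sample variance estimate the same parameter, since E(X) = Var(X) = θ. A minimal sketch, assuming NumPy; the parameter value, sample size, and seed are illustrative choices, not from the source:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 4.0, 30, 20_000   # illustrative values

# Draw `reps` independent Poisson(theta) samples, each of size n.
samples = rng.poisson(theta, size=(reps, n))

# Two unbiased estimators of theta (for a Poisson, E[X] = Var(X) = theta):
theta1 = samples.mean(axis=1)           # sample mean
theta2 = samples.var(axis=1, ddof=1)    # unbiased sample variance

print("empirical bias of sample mean:    ", theta1.mean() - theta)
print("empirical bias of sample variance:", theta2.mean() - theta)

# Relative efficiency of theta1 to theta2: eff = V(theta2) / V(theta1).
print("estimated relative efficiency:    ", theta2.var() / theta1.var())
```

Both empirical biases come out near zero, while the estimated relative efficiency is well above 1: the sample mean has the smaller variance, so it is the preferred unbiased estimator here.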
2. Point estimators.

In statistics, an estimator is a rule for calculating an estimate of a given quantity based on observed data. Example: X follows a normal distribution, but we do not know the parameters of our distribution, namely the mean (μ) and the variance (σ²).

Some of the properties below are defined relative to a class of candidate estimators, a set of possible T(·) that we will denote by T. The density of an estimator T(·) will be denoted f(t, θ), or, when it is necessary to index the estimator, f_T(t, θ). An estimator cannot in general have all of these properties at the same time, and sometimes they can even be incompatible.

We would like an estimator with small bias and small variance: if one can find several unbiased estimators, we want to use the estimator with the smallest variance.

Invariance of maximum likelihood. If θ̂(x) is a maximum likelihood estimate for θ, then g(θ̂(x)) is a maximum likelihood estimate for g(θ). For example, if θ is a parameter for the variance and θ̂ is its maximum likelihood estimator, then √θ̂ is the maximum likelihood estimator for the standard deviation. This class of estimators has this important property.

Only once we have analyzed the sample minimum can we say for certain whether it is a good estimator or not, but it is certainly a natural first choice.
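The invariance property can be demonstrated concretely for the normal example above. A minimal sketch, assuming NumPy; the true mean, standard deviation, sample size, and seed are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=10.0, scale=2.0, size=500)  # illustrative normal sample

# MLE of the variance of a normal sample: the average squared deviation
# from the sample mean (ddof=0, not the unbiased ddof=1 version).
var_mle = np.mean((x - x.mean()) ** 2)

# By invariance, the MLE of g(theta) is g applied to the MLE of theta,
# so the MLE of the standard deviation is simply the square root:
sd_mle = np.sqrt(var_mle)

print("MLE of variance:          ", var_mle)
print("MLE of standard deviation:", sd_mle)
```

No separate optimization is needed for the standard deviation: maximizing the likelihood over σ directly would yield exactly √(variance MLE), which is what the invariance property guarantees.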