We introduce principles of point estimation, that is, the estimation of a value for the vector of unknown parameters of the density of a variate. The chapter starts by considering some desirable properties of point estimators, a sort of “the good, the bad, and the ugly” classification! The topics covered include bias, efficiency, mean-squared error (MSE), consistency, robustness, invariance, and admissibility. We then introduce methods of summarizing the data via statistics that retain the relevant sample information about the parameter vector, and we see how they achieve the desirable properties of estimators. We discuss sufficiency, Neyman's factorization, ancillarity, Rao-Blackwellization, completeness, the Lehmann–Scheffé theorem and minimum-variance unbiased estimation, and Basu's theorem. We consider the exponential family and its special cases, and conclude by introducing the most common model in statistics, the linear model, which is used for illustrations in this chapter and is covered more extensively in the following chapters.
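As a small illustration of how two of the listed properties interact, the MSE of an estimator decomposes exactly into its variance plus its squared bias. The sketch below is not from the chapter; it is a hypothetical Monte Carlo check of this identity for a shrinkage estimator (a constant `c` times the sample mean, a name chosen here for illustration) of a normal mean.

```python
import random
import statistics

# Hypothetical example: verify MSE(theta_hat) = Var(theta_hat) + Bias(theta_hat)^2
# for the shrunk sample mean c * xbar of the mean of N(mu, sigma^2).
random.seed(0)
mu, sigma, n, c, reps = 2.0, 1.0, 10, 0.9, 20000

estimates = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    estimates.append(c * sum(sample) / n)  # biased when c != 1

bias = statistics.fmean(estimates) - mu
variance = statistics.pvariance(estimates)       # divide by reps, not reps - 1
mse = statistics.fmean((e - mu) ** 2 for e in estimates)

# With the population variance, the decomposition is an algebraic identity,
# so it holds to floating-point accuracy, not just up to Monte Carlo error.
assert abs(mse - (variance + bias ** 2)) < 1e-9
```

Shrinking toward zero trades a little bias for reduced variance, which is why a biased estimator can beat an unbiased one in MSE; this is the kind of comparison the efficiency and admissibility discussion formalizes.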