The Cramér-Rao Lower Bound
The Cramér-Rao inequality establishes a fundamental lower limit on the variance of any unbiased estimator, providing a benchmark against which the efficiency of estimators can be measured.
Fisher Information
The Fisher information of a single observation is
$$I(\theta) = \mathrm{E}\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right] = -\,\mathrm{E}\!\left[\frac{\partial^{2}}{\partial\theta^{2}}\log f(X;\theta)\right].$$
The second equality holds under regularity conditions (interchangeability of differentiation and integration). For $n$ i.i.d. observations, the total Fisher information is $I_n(\theta) = n\,I(\theta)$.
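The two expressions for the Fisher information can be checked by Monte Carlo. The sketch below, which assumes NumPy is available, does this for a Poisson$(\lambda)$ model, where $I(\lambda) = 1/\lambda$; the value of $\lambda$ and the sample size are arbitrary illustrative choices.

```python
import numpy as np

# Monte Carlo check of the two Fisher-information identities for
# Poisson(lam), where I(lam) = 1/lam:
#   E[(d/dlam log f)^2] = -E[d^2/dlam^2 log f].
# lam and the sample size are arbitrary illustrative choices.
rng = np.random.default_rng(0)
lam = 4.0
x = rng.poisson(lam, size=200_000)

# log f(x; lam) = x log lam - lam - log(x!)
score = x / lam - 1.0        # first derivative of log f in lam
hessian = -x / lam**2        # second derivative of log f in lam

i_score = np.mean(score**2)  # E[score^2]
i_hess = -np.mean(hessian)   # -E[hessian]
i_exact = 1.0 / lam          # analytic Fisher information

print(i_score, i_hess, i_exact)
```

With 200,000 draws, both empirical estimates land close to the analytic value $1/\lambda = 0.25$.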
The Inequality
Let $\hat\theta$ be an unbiased estimator of $\theta$ based on $X_1, \ldots, X_n$, where the model satisfies regularity conditions (the support does not depend on $\theta$, differentiation under the integral is valid, and $0 < I(\theta) < \infty$). Then
$$\mathrm{Var}(\hat\theta) \;\ge\; \frac{1}{n\,I(\theta)}.$$
An unbiased estimator achieving this bound is called efficient or a minimum variance unbiased estimator (MVUE).
- Normal $N(\mu, \sigma^2)$ with $\sigma^2$ known: $I(\mu) = 1/\sigma^2$. The CRLB gives $\mathrm{Var}(\hat\mu) \ge \sigma^2/n$, achieved by $\bar{X}$.
- Bernoulli$(p)$: $I(p) = \dfrac{1}{p(1-p)}$. The CRLB gives $\mathrm{Var}(\hat p) \ge \dfrac{p(1-p)}{n}$, achieved by $\hat p = \bar{X}$.
- Poisson$(\lambda)$: $I(\lambda) = 1/\lambda$. The CRLB gives $\mathrm{Var}(\hat\lambda) \ge \lambda/n$, achieved by $\hat\lambda = \bar{X}$.
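One of these examples is easy to verify by simulation. The sketch below, assuming NumPy, checks the Bernoulli case: the sample proportion $\hat p = \bar{X}$ is unbiased with variance exactly $p(1-p)/n$, so its empirical variance should match the CRLB. The values of $p$, $n$, and the replication count are arbitrary illustrative choices.

```python
import numpy as np

# Simulation check of the Bernoulli example: the sample proportion
# phat = xbar is unbiased with Var(phat) = p(1-p)/n, which equals
# the CRLB. p, n, and reps are arbitrary illustrative choices.
rng = np.random.default_rng(1)
p, n, reps = 0.3, 50, 200_000

phat = rng.binomial(1, p, size=(reps, n)).mean(axis=1)

empirical_var = phat.var()
crlb = p * (1 - p) / n  # 1 / (n I(p))

print(empirical_var, crlb)
```

The empirical variance agrees with the bound up to Monte Carlo error, consistent with $\hat p$ being efficient.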
Efficiency and Asymptotic Theory
Under regularity conditions, the MLE is asymptotically efficient:
$$\sqrt{n}\,\bigl(\hat\theta_{\mathrm{MLE}} - \theta\bigr) \;\xrightarrow{\,d\,}\; N\!\bigl(0,\; I(\theta)^{-1}\bigr).$$
That is, the MLE achieves the Cramér-Rao bound asymptotically.
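Asymptotic efficiency can be illustrated with a model whose MLE is biased in finite samples but efficient in the limit. For the Exponential distribution with rate $\lambda$, the MLE is $\hat\lambda = 1/\bar{X}$ and $I(\lambda) = 1/\lambda^2$, so $\sqrt{n}(\hat\lambda - \lambda)$ should be approximately $N(0, \lambda^2)$ for large $n$. The sketch below assumes NumPy; $\lambda$, $n$, and the replication count are arbitrary illustrative choices.

```python
import numpy as np

# For Exponential(rate lam), the MLE is 1/xbar and I(lam) = 1/lam^2.
# The standardized error sqrt(n) * (mle - lam) should have variance
# close to 1/I(lam) = lam^2 when n is large.
# lam, n, and reps are arbitrary illustrative choices.
rng = np.random.default_rng(2)
lam, n, reps = 2.0, 400, 25_000

x = rng.exponential(scale=1 / lam, size=(reps, n))
mle = 1 / x.mean(axis=1)

z = np.sqrt(n) * (mle - lam)
print(z.var(), lam**2)  # empirical vs. asymptotic variance 1/I(lam)
```

The empirical variance of the standardized error slightly exceeds $\lambda^2$ at finite $n$ (the exact variance carries an $O(1/n)$ correction) and approaches it as $n$ grows.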
Not every parameter has an efficient estimator. The CRLB may not be achievable in finite samples, particularly for non-exponential-family distributions. In such cases, the Rao-Blackwell theorem and completeness provide alternative routes to finding the MVUE via sufficient statistics.