
The Cramér-Rao Lower Bound

The Cramér-Rao inequality establishes a fundamental lower bound on the variance of any unbiased estimator, providing a benchmark against which the efficiency of estimators can be measured.


Fisher Information

Definition

The Fisher information of a single observation $X \sim f(x; \theta)$ is
$$I(\theta) = E\left[\left(\frac{\partial}{\partial \theta} \log f(X; \theta)\right)^2\right] = -E\left[\frac{\partial^2}{\partial \theta^2} \log f(X; \theta)\right].$$
The second equality holds under regularity conditions (interchangeability of differentiation and integration). For $n$ i.i.d. observations, the total Fisher information is $I_n(\theta) = nI(\theta)$.
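The expectation in the definition can be checked numerically. The sketch below (a Python illustration; `fisher_info_mc` is a made-up helper name) estimates $E[(\partial_\mu \log f)^2]$ for a normal mean by averaging squared scores, which should converge to $1/\sigma^2$:

```python
import random

def fisher_info_mc(mu, sigma, n_samples=200_000, seed=0):
    """Monte Carlo estimate of I(mu) for N(mu, sigma^2).

    For a single observation, the score is
    d/dmu log f(x; mu) = (x - mu) / sigma^2,
    so the average squared score should converge to 1/sigma^2.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = rng.gauss(mu, sigma)
        score = (x - mu) / sigma**2
        total += score ** 2
    return total / n_samples

# For sigma = 2, the estimate should be close to 1/sigma^2 = 0.25.
print(fisher_info_mc(0.0, 2.0))
```

With 200,000 draws the Monte Carlo error is well below 1%, so the printed value sits very close to 0.25.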


The Inequality

Theorem 8.4 (Cramér-Rao Lower Bound)

Let $\hat{\theta}$ be an unbiased estimator of $\theta$ based on $X_1, \ldots, X_n \sim f(x; \theta)$, where the model satisfies regularity conditions (the support does not depend on $\theta$, differentiation under the integral is valid, and $I(\theta) > 0$). Then
$$\operatorname{Var}(\hat{\theta}) \geq \frac{1}{nI(\theta)}.$$
An unbiased estimator achieving this bound is called efficient, or a minimum variance unbiased estimator (MVUE).
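The inequality follows from the Cauchy-Schwarz inequality applied to the estimator and the score of the sample; a standard sketch, under the regularity conditions stated above:

```latex
% Let S_n(\theta) = \partial_\theta \log \prod_{i=1}^n f(X_i;\theta) be the score.
% Under the regularity conditions:
%   E[S_n] = 0,  Var(S_n) = n I(\theta),
%   Cov(\hat\theta, S_n) = \partial_\theta E[\hat\theta] = 1  (unbiasedness).
\begin{align*}
1 = \operatorname{Cov}(\hat{\theta}, S_n)^2
  &\le \operatorname{Var}(\hat{\theta})\,\operatorname{Var}(S_n)
  && \text{(Cauchy--Schwarz)} \\
  &= \operatorname{Var}(\hat{\theta})\, n I(\theta)
  \quad\Longrightarrow\quad
  \operatorname{Var}(\hat{\theta}) \ge \frac{1}{n I(\theta)}.
\end{align*}
```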

Example (Fisher information for common distributions)
  • Normal $N(\mu, \sigma^2)$ with $\sigma^2$ known: $I(\mu) = 1/\sigma^2$. The CRLB gives $\operatorname{Var}(\hat{\mu}) \geq \sigma^2/n$, achieved by $\bar{X}$.
  • Bernoulli $\text{Ber}(p)$: $I(p) = 1/(p(1-p))$. The CRLB gives $\operatorname{Var}(\hat{p}) \geq p(1-p)/n$, achieved by $\hat{p} = \bar{X}$.
  • Poisson $\text{Poi}(\lambda)$: $I(\lambda) = 1/\lambda$. The CRLB gives $\operatorname{Var}(\hat{\lambda}) \geq \lambda/n$, achieved by $\bar{X}$.
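These bounds can be verified empirically. The sketch below (Python; `var_of_phat` is an illustrative name) simulates the sampling variance of $\hat{p} = \bar{X}$ over repeated Bernoulli samples and compares it with the bound $p(1-p)/n$, which $\bar{X}$ attains exactly:

```python
import random

def var_of_phat(p, n, trials=20_000, seed=1):
    """Sampling variance of p_hat = X_bar over repeated Ber(p) samples of size n."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(trials):
        successes = sum(1 for _ in range(n) if rng.random() < p)
        estimates.append(successes / n)
    mean = sum(estimates) / trials
    return sum((e - mean) ** 2 for e in estimates) / (trials - 1)

p, n = 0.3, 50
crlb = p * (1 - p) / n          # = 0.0042
print(var_of_phat(p, n), crlb)  # simulated variance matches the CRLB
```

Because $\hat{p}$ is unbiased with $\operatorname{Var}(\hat{p}) = p(1-p)/n$ exactly, the two printed numbers agree up to Monte Carlo noise.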

Efficiency and Asymptotic Theory

Theorem 8.5 (Asymptotic Efficiency of the MLE)

Under regularity conditions, the MLE is asymptotically efficient: $\sqrt{n}(\hat{\theta}_{MLE} - \theta) \xrightarrow{d} N(0, 1/I(\theta))$. That is, the MLE achieves the Cramér-Rao bound asymptotically.
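As an illustration using the Poisson example above, where the MLE is $\hat{\lambda} = \bar{X}$, the variance of $\sqrt{n}(\hat{\lambda} - \lambda)$ should be close to $1/I(\lambda) = \lambda$. The sketch below is a simulation (`poisson_draw` and `scaled_mle_var` are hypothetical names; Knuth's multiplication method is used because the standard library has no Poisson sampler):

```python
import math
import random

def poisson_draw(rng, lam):
    """Knuth's method: count uniform draws until their product falls below e^(-lam)."""
    threshold = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= threshold:
            return k
        k += 1

def scaled_mle_var(lam, n, trials=10_000, seed=2):
    """Sampling variance of sqrt(n) * (lambda_hat - lambda) with lambda_hat = X_bar."""
    rng = random.Random(seed)
    vals = []
    for _ in range(trials):
        xbar = sum(poisson_draw(rng, lam) for _ in range(n)) / n
        vals.append(math.sqrt(n) * (xbar - lam))
    mean = sum(vals) / trials
    return sum((v - mean) ** 2 for v in vals) / (trials - 1)

lam, n = 4.0, 100
print(scaled_mle_var(lam, n))  # close to 1/I(lambda) = lambda = 4
```

For the Poisson mean the scaled variance equals $\lambda$ at every $n$, not just asymptotically; for MLEs without a closed form, the same simulation shows the variance approaching $1/I(\theta)$ only as $n$ grows.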

Remark (When the bound is not achievable)

Not every parameter has an efficient estimator. The CRLB may not be achievable in finite samples, particularly for non-exponential-family distributions. In such cases, the Rao-Blackwell theorem and completeness provide alternative routes to finding the MVUE via sufficient statistics.