Common Distributions - Main Theorem
The Central Limit Theorem is perhaps the most important result in probability theory, explaining why the normal distribution appears ubiquitously in nature and justifying many statistical procedures.
Central Limit Theorem (CLT)
Let $X_1, X_2, \ldots, X_n$ be independent and identically distributed random variables with mean $\mu$ and finite variance $\sigma^2$. Define:
$$S_n = X_1 + X_2 + \cdots + X_n, \qquad Z_n = \frac{S_n - n\mu}{\sigma\sqrt{n}}.$$
Then as $n \to \infty$, the standardized sum converges in distribution to the standard normal:
$$Z_n \xrightarrow{d} N(0, 1).$$
Equivalently, for any $x \in \mathbb{R}$:
$$\lim_{n \to \infty} P(Z_n \le x) = \Phi(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-t^2/2} \, dt.$$
This remarkable result states that sums (or averages) of IID random variables become approximately normal, regardless of the original distribution!
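This convergence is easy to see empirically. The sketch below (illustrative choices: Exponential(1) summands, $n = 50$, 20,000 replications, none of which come from the text) standardizes sums of a heavily skewed distribution and checks that the result looks standard normal:

```python
import math
import random
import statistics

# Monte Carlo sketch of the CLT. The distribution, n, and replication
# count are illustrative assumptions, not from the text.
random.seed(42)

n, reps = 50, 20_000
mu, sigma = 1.0, 1.0        # Exponential(1) has mean 1 and variance 1

# Standardize each sum: Z_n = (S_n - n*mu) / (sigma * sqrt(n))
zs = []
for _ in range(reps):
    s = sum(random.expovariate(1.0) for _ in range(n))
    zs.append((s - n * mu) / (sigma * math.sqrt(n)))

# If the CLT holds, Z_n is approximately N(0, 1):
print(round(statistics.mean(zs), 2))              # close to 0
print(round(statistics.stdev(zs), 2))             # close to 1
print(round(sum(z <= 0 for z in zs) / reps, 2))   # close to Phi(0) = 0.5
```

Swapping in any other finite-variance distribution for `expovariate` gives the same limiting behavior, which is exactly the point of the theorem.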
Roll a fair die $n = 100$ times. Each roll has mean $\mu = 3.5$ and variance $\sigma^2 = \frac{35}{12} \approx 2.92$.
The total sum $S_{100}$ has:
$$E[S_{100}] = 100 \cdot 3.5 = 350, \qquad \mathrm{Var}(S_{100}) = 100 \cdot \tfrac{35}{12} \approx 291.67, \qquad \mathrm{SD}(S_{100}) \approx 17.08.$$
By the CLT, $S_{100}$ is approximately $N(350, 291.67)$.
Probability the total is between 330 and 370:
$$P(330 \le S_{100} \le 370) \approx P\!\left(\frac{330 - 350}{17.08} \le Z \le \frac{370 - 350}{17.08}\right) = P(-1.17 \le Z \le 1.17) \approx 0.758.$$
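The die-roll calculation can be checked with a few lines of Python, using `math.erf` to evaluate the standard normal CDF (the numbers 100 rolls and the 330–370 range follow the example above):

```python
import math

# Normal approximation for the sum of 100 fair die rolls.
n = 100
mu, var = 3.5, 35 / 12                     # per-roll mean and variance
mean_s = n * mu                            # 350
sd_s = math.sqrt(n * var)                  # ~17.08

def phi(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# P(330 <= S <= 370) under the N(350, 291.67) approximation
p = phi((370 - mean_s) / sd_s) - phi((330 - mean_s) / sd_s)
print(round(p, 3))   # approximately 0.758
```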
Conditions and Extensions
Lindeberg-Lévy CLT: The basic form requires IID with finite variance.
Lyapunov CLT: Allows independent but non-identically distributed random variables, provided a growth condition on higher moments (the Lyapunov condition) holds.
Berry-Esseen Theorem: Quantifies the rate of convergence:
$$\sup_{x} \left| P(Z_n \le x) - \Phi(x) \right| \le \frac{C \rho}{\sigma^3 \sqrt{n}},$$
where $\rho = E|X_1 - \mu|^3 < \infty$ and $C$ is a universal constant (known to satisfy $C \le 0.4748$). This bounds the approximation error.
For Bernoulli($p$): $\mu = p$, $\sigma^2 = p(1-p)$, and $\rho = E|X_1 - p|^3 = p(1-p)\left[p^2 + (1-p)^2\right]$.
For $p = 0.5$ and $n = 30$: $\rho/\sigma^3 = 1$, so the bound becomes $C/\sqrt{n} \approx 0.4748/\sqrt{30} \approx 0.087$.
The normal approximation is accurate to within about 9%.
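The Berry-Esseen bound is a worst-case guarantee, and for a symmetric case like Bernoulli(0.5) the true error is smaller. The sketch below computes both the bound (using the constant $C \approx 0.4748$) and the exact maximum CDF discrepancy for $n = 30$, checking the supremum at the jump points of the binomial CDF:

```python
import math

# Berry-Esseen bound vs. actual worst-case CDF error for Bernoulli(0.5),
# n = 30. C = 0.4748 is the best known universal constant for the iid case.
n, p, C = 30, 0.5, 0.4748
mu, sigma = p, math.sqrt(p * (1 - p))
rho = p * (1 - p) * (p**2 + (1 - p)**2)    # E|X - p|^3

bound = C * rho / (sigma**3 * math.sqrt(n))
print(round(bound, 3))   # roughly 0.087

def phi(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Exact sup_x |F_n(x) - Phi(x)|: since F_n is a step function, the
# supremum is attained just below or just above a jump point k = 0..n.
err = 0.0
cdf = 0.0
for k in range(n + 1):
    z_k = (k - n * mu) / (sigma * math.sqrt(n))
    err = max(err, abs(cdf - phi(z_k)))          # left limit at the jump
    cdf += math.comb(n, k) * p**k * (1 - p)**(n - k)
    err = max(err, abs(cdf - phi(z_k)))          # value at the jump
print(round(err, 3))     # actual error, below the 0.087 bound
```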
Applications
Statistical Inference: The CLT justifies using normal-based confidence intervals and hypothesis tests for sample means, even when the population distribution is non-normal.
Quality Control: If individual measurements have mean $\mu$ and variance $\sigma^2$, the average $\bar{X}_n$ of $n$ measurements has approximately:
$$\bar{X}_n \approx N\!\left(\mu, \frac{\sigma^2}{n}\right).$$
The standard error $\sigma/\sqrt{n}$ decreases with sample size.
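The $1/\sqrt{n}$ scaling means quadrupling the sample size only halves the standard error. A quick illustration (with $\sigma = 0.5$, an arbitrary choice for demonstration):

```python
import math

# Standard error sigma/sqrt(n) for increasing sample sizes.
# sigma = 0.5 is an illustrative value, not from the text at this point.
sigma = 0.5
for n in (25, 100, 400):
    print(n, round(sigma / math.sqrt(n), 3))   # each 4x in n halves the SE
```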
A factory produces bolts with mean length 10 cm and standard deviation 0.5 cm (unknown distribution). Taking a sample of $n = 100$ bolts:
Standard error: $\sigma/\sqrt{n} = 0.5/\sqrt{100} = 0.05$ cm.
95% confidence interval for the population mean:
$$\bar{X} \pm 1.96 \cdot \frac{\sigma}{\sqrt{n}} = \bar{X} \pm 0.098.$$
If the sample mean is 10.15 cm, we are 95% confident the true mean lies in $(10.052, 10.248)$.
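The interval computation can be sketched directly. Here $\bar{x} = 10.15$ and $\sigma = 0.5$ come from the example; the sample size $n = 100$ is an assumption (the original does not pin it down), and $z = 1.96$ is the standard 95% critical value:

```python
import math

# 95% CI for the bolt example. n = 100 is an assumed sample size.
xbar, sigma, n, z = 10.15, 0.5, 100, 1.96
se = sigma / math.sqrt(n)              # 0.05 cm
lo, hi = xbar - z * se, xbar + z * se
print(round(lo, 3), round(hi, 3))      # roughly (10.052, 10.248)
```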
The CLT explains why the normal distribution dominates statistics: many measurable quantities are sums or averages of independent effects. Heights, test scores, measurement errors—all tend toward normality due to the CLT. This theorem is the theoretical foundation for the ubiquity of the Gaussian distribution in nature and statistics.