Modes of Convergence
There are several notions of convergence for sequences of random variables, each with different strengths and applications. Understanding their relationships is essential for rigorous probability theory.
Four Modes of Convergence
Let $X_1, X_2, \dots$ and $X$ be random variables on a probability space $(\Omega, \mathcal{F}, P)$.
- Almost sure convergence ($X_n \xrightarrow{\text{a.s.}} X$): $P\left(\lim_{n\to\infty} X_n = X\right) = 1$
- Convergence in probability ($X_n \xrightarrow{P} X$): For all $\varepsilon > 0$, $\lim_{n\to\infty} P(|X_n - X| > \varepsilon) = 0$
- Convergence in $L^p$ (mean) ($X_n \xrightarrow{L^p} X$): $\lim_{n\to\infty} E\left[|X_n - X|^p\right] = 0$
- Convergence in distribution ($X_n \xrightarrow{d} X$): $F_{X_n}(x) \to F_X(x)$ at all continuity points $x$ of $F_X$
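A small Monte Carlo sketch can make convergence in probability concrete. Here $X_n$ is the sample mean of $n$ iid Bernoulli$(0.5)$ variables (our own choice of example), which converges in probability to $0.5$ by the law of large numbers, so the estimated $P(|X_n - 0.5| > \varepsilon)$ should shrink as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.05      # the tolerance in the definition of convergence in probability
trials = 2000   # Monte Carlo replications per sample size

def prob_deviation(n):
    # Estimate P(|X_n - 0.5| > eps), where X_n is the mean of n Bernoulli(0.5)s.
    means = rng.binomial(n, 0.5, size=trials) / n
    return np.mean(np.abs(means - 0.5) > eps)

estimates = {n: prob_deviation(n) for n in (10, 100, 1000)}
print(estimates)  # probabilities should decrease as n grows
```

The same experiment with a different $\varepsilon$ would show the same qualitative decay, just at a different rate.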
Relationships
The following implications hold:
$$X_n \xrightarrow{\text{a.s.}} X \;\Rightarrow\; X_n \xrightarrow{P} X \;\Rightarrow\; X_n \xrightarrow{d} X, \qquad X_n \xrightarrow{L^p} X \;\Rightarrow\; X_n \xrightarrow{P} X.$$
None of the reverse implications holds in general, though partial converses exist:
- $X_n \xrightarrow{P} X$ implies there exists a subsequence $X_{n_k}$ with $X_{n_k} \xrightarrow{\text{a.s.}} X$
- $X_n \xrightarrow{d} c$ (a constant) implies $X_n \xrightarrow{P} c$
The "typewriter sequence": on $[0,1]$ with Lebesgue measure, let $X_n = \mathbf{1}_{[j/2^k,\,(j+1)/2^k]}$, where $n = 2^k + j$ with $0 \le j < 2^k$. Then $X_n \xrightarrow{P} 0$ (since the intervals have length $2^{-k} \to 0$), but $X_n$ does not converge a.s. to $0$: every point of $[0,1]$ lies in one interval at each level $k$, so $X_n(x) = 1$ infinitely often.
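The typewriter sequence is easy to simulate; the sketch below uses the indexing convention $n = 2^k + j$ from the text (the function name and the choice of test point are ours):

```python
import numpy as np

def typewriter(n, x):
    # X_n(x) for the typewriter sequence: n = 2**k + j, interval [j/2**k, (j+1)/2**k].
    k = n.bit_length() - 1   # largest k with 2**k <= n
    j = n - 2**k             # offset within level k
    return 1 if j / 2**k <= x <= (j + 1) / 2**k else 0

x = 0.3  # any fixed point of [0, 1] works the same way (non-dyadic avoids ties)
# Level k consists of indices n = 2**k, ..., 2**(k+1) - 1; count hits per level.
hits_per_level = [sum(typewriter(n, x) for n in range(2**k, 2**(k + 1)))
                  for k in range(6)]
print(hits_per_level)  # exactly one hit at every level, so X_n(x) = 1 infinitely often
```

The interval containing $x$ has probability $2^{-k}$, which explains convergence in probability, while the one hit per level explains the failure of almost sure convergence.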
Slutsky's Theorem
If $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{P} c$ (a constant), then:
- $X_n + Y_n \xrightarrow{d} X + c$
- $X_n Y_n \xrightarrow{d} cX$
- $X_n / Y_n \xrightarrow{d} X / c$ (if $c \neq 0$)
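The classic application of the quotient rule is the t-statistic: by the CLT, $\sqrt{n}(\bar{X}_n - \mu)/\sigma \xrightarrow{d} N(0,1)$, and the sample standard deviation $S_n \xrightarrow{P} \sigma$, so Slutsky gives $\sqrt{n}(\bar{X}_n - \mu)/S_n \xrightarrow{d} N(0,1)$ even for non-normal data. A Monte Carlo sketch with exponential data (sample sizes and seed are our own choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 500, 5000
mu = 2.0  # mean of Exponential(scale=2); its sd is also 2, but we never use it
samples = rng.exponential(scale=2.0, size=(reps, n))
# t-statistic: replace the unknown sigma by the sample sd S_n (Slutsky's quotient rule).
t = np.sqrt(n) * (samples.mean(axis=1) - mu) / samples.std(axis=1, ddof=1)
# If t is approximately N(0,1), about 95% of replications satisfy |t| < 1.96.
frac = np.mean(np.abs(t) < 1.96)
print(round(frac, 3))  # should be near 0.95
```

Note that nothing here assumes normal data; Slutsky plus the CLT is what licenses the $N(0,1)$ approximation.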
If $X_n \xrightarrow{d} X$ and $g$ is a continuous function, then $g(X_n) \xrightarrow{d} g(X)$ (the continuous mapping theorem). Combined with Slutsky's theorem, this allows one to derive the limiting distribution of complex statistics from simpler ones, a technique used extensively in asymptotic statistics.
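A minimal sketch of the continuous mapping theorem, under our own choice of example: a standardized binomial count $Z_n \xrightarrow{d} N(0,1)$ by the CLT, and applying the continuous map $g(z) = z^2$ gives $Z_n^2 \xrightarrow{d} \chi^2_1$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 400, 20000
counts = rng.binomial(n, 0.5, size=reps)
z = (counts - n * 0.5) / np.sqrt(n * 0.25)  # CLT: z is approximately N(0,1)
w = z**2                                    # continuous mapping: w is approximately chi2(1)
# chi2(1) has P(W <= 3.841) = 0.95 (3.841 = 1.96**2, the usual 95% critical value).
frac = np.mean(w <= 3.841)
print(round(frac, 3))  # should be near 0.95
```

The same pattern (CLT, then a continuous transformation, then Slutsky to handle estimated nuisance quantities) underlies most standard asymptotic test statistics.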