Martingale Convergence Theorems
The martingale convergence theorems are fundamental results stating that, under appropriate conditions, martingales converge almost surely to a limit. These theorems underpin the theory of stochastic processes, providing conditions for long-run convergence of random sequences.
Doob's martingale convergence theorem
Let $(X_n)_{n \ge 0}$ be a submartingale (or supermartingale) with $\sup_n \mathbb{E}[|X_n|] < \infty$. Then there exists a random variable $X_\infty$ with $\mathbb{E}[|X_\infty|] < \infty$ such that:
$$X_n \to X_\infty \quad \text{almost surely.}$$
For martingales: If $(X_n)$ is a martingale with $\sup_n \mathbb{E}[|X_n|] < \infty$ (bounded in $L^1$), then $X_n \to X_\infty$ a.s.
Intuition: An $L^1$-bounded martingale cannot "wander off" indefinitely; it must converge to a limit. The key is that the expected absolute value (or second moment) is uniformly bounded, preventing the process from escaping to infinity.
An urn initially contains 1 red ball and 1 blue ball. At each step, draw a ball uniformly at random, then return it along with one additional ball of the same color. Let $R_n$ be the number of red balls after $n$ draws and $M_n = R_n/(n+2)$ the proportion of red balls.
Then $(M_n)$ is a martingale: $\mathbb{E}[M_{n+1} \mid \mathcal{F}_n] = M_n$. Moreover, $0 \le M_n \le 1$, so $\mathbb{E}[|M_n|] \le 1$ for all $n$. By Doob's theorem, $M_n \to M_\infty$ a.s. for some random variable $M_\infty$.
Remarkably, $M_\infty \sim \mathrm{Uniform}[0,1]$: the limiting proportion of red balls is uniformly distributed on $[0,1]$. This is a classic example of exchangeability and the de Finetti theorem.
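The urn dynamics are easy to simulate. The sketch below (plain `random`; the run length and number of runs are arbitrary illustrative choices) checks that the final proportions spread out over $[0,1]$ with mean $\approx 1/2$ and variance $\approx 1/12$, rather than concentrating at $1/2$:

```python
import random

def polya_urn(steps, seed=None):
    """Simulate a Polya urn starting with 1 red and 1 blue ball.

    Returns the proportion of red balls after `steps` draws.
    """
    rng = random.Random(seed)
    red, total = 1, 2
    for _ in range(steps):
        if rng.random() < red / total:  # drew a red ball
            red += 1
        total += 1  # one ball of the drawn color is added back
    return red / total

# Across many independent runs the final proportion should look
# Uniform[0,1]: mean ~ 1/2 and variance ~ 1/12, not a point mass at 1/2.
props = [polya_urn(2000, seed=s) for s in range(1000)]
mean = sum(props) / len(props)
var = sum((p - mean) ** 2 for p in props) / len(props)
print(f"mean ~ {mean:.3f}, variance ~ {var:.3f}")
```

The empirical variance near $1/12 \approx 0.083$ is the signature of the uniform limit: each individual run converges, but different runs converge to different limits.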
$L^2$ convergence
If $(X_n)$ is a martingale with $\sup_n \mathbb{E}[X_n^2] < \infty$, then:
- $X_n \to X_\infty$ almost surely.
- $X_n \to X_\infty$ in $L^2$: $\mathbb{E}[(X_n - X_\infty)^2] \to 0$.
- $\mathbb{E}[X_\infty] = \mathbb{E}[X_0]$.
$L^2$ convergence is stronger than almost sure convergence alone: it implies convergence of second moments and allows us to pass limits through expectations.
Let $(S_n)$ be a simple symmetric random walk started at $0$ and $T$ the exit time from the interval $(-a, b)$, with $a, b > 0$ integers. Define $Y_n = S_{n \wedge T}$ (the stopped process). Then:
- $(Y_n)$ is a martingale (stopped processes preserve the martingale property).
- $|Y_n| \le \max(a, b)$ for all $n$, so $\sup_n \mathbb{E}[Y_n^2] \le \max(a, b)^2 < \infty$.
By the $L^2$ convergence theorem, $Y_n \to Y_\infty$ a.s. and in $L^2$. Since $T < \infty$ a.s., $Y_\infty = S_T \in \{-a, b\}$.
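A quick simulation confirms the picture (an illustrative sketch; the symmetric choice of endpoints at distance 5 from the origin is arbitrary): the stopped walk always lands on one of the two boundary points, and its empirical mean is near $\mathbb{E}[Y_0] = 0$:

```python
import random

def stopped_walk(a, b, seed=None):
    """Run a simple symmetric random walk S_n until it exits (-a, b).

    Returns the stopped value S_T, which is -a or b.
    """
    rng = random.Random(seed)
    s = 0
    while -a < s < b:
        s += 1 if rng.random() < 0.5 else -1
    return s

# The stopped martingale Y_n = S_{n ^ T} is bounded, so it converges in L^2
# and E[Y_infinity] = E[Y_0] = 0.  With a = b = 5 the walk is symmetric, so
# the empirical mean of S_T over many runs should be near 0.
finals = [stopped_walk(5, 5, seed=s) for s in range(2000)]
print(f"mean of S_T ~ {sum(finals) / len(finals):.3f}")
```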
Uniform integrability
A family $(X_i)_{i \in I}$ of random variables is uniformly integrable (UI) if:
$$\lim_{K \to \infty} \sup_{i \in I} \mathbb{E}\big[|X_i| \mathbf{1}_{\{|X_i| > K\}}\big] = 0.$$
Equivalently, $\sup_{i} \mathbb{E}[|X_i|] < \infty$ and for any $\varepsilon > 0$, there exists $\delta > 0$ such that $\mathbb{P}(A) < \delta$ implies $\mathbb{E}[|X_i| \mathbf{1}_A] < \varepsilon$ for all $i$.
If $(X_n)$ is a martingale and the family $(X_n)_{n \ge 0}$ is uniformly integrable, then:
- $X_n \to X_\infty$ almost surely.
- $X_n \to X_\infty$ in $L^1$: $\mathbb{E}[|X_n - X_\infty|] \to 0$.
- $X_n = \mathbb{E}[X_\infty \mid \mathcal{F}_n]$ for all $n$ (the martingale is closed).
Uniform integrability is the key to $L^1$ convergence. It ensures that the "tails" of the distribution do not contribute significantly, allowing us to pass limits through expectations.
If $|X_n| \le C$ a.s. for all $n$, then $(X_n)$ is uniformly integrable (trivially: $\mathbb{E}[|X_n| \mathbf{1}_{\{|X_n| > K\}}] = 0$ for $K \ge C$). Hence, bounded martingales converge in $L^1$.
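To see why uniform integrability cannot be dropped, a standard counterexample (sketched here for illustration, not part of the text above) is the "double-or-nothing" martingale: $X_0 = 1$ and $X_{n+1} = 2X_n$ with probability $1/2$, else $X_{n+1} = 0$. Then $\mathbb{E}[X_n] = 1$ for all $n$, yet $X_n \to 0$ a.s., so $L^1$ convergence fails; the family is not UI because all the mass sits on one exponentially rare, exponentially large value:

```python
import random

def doubling_martingale(n, seed=None):
    """X_0 = 1; at each step X doubles with prob 1/2, else drops to 0.

    This is a martingale with E[X_n] = 1 for every n, but X_n -> 0
    almost surely: the family (X_n) is NOT uniformly integrable.
    """
    rng = random.Random(seed)
    x = 1
    for _ in range(n):
        x = 2 * x if rng.random() < 0.5 else 0
        if x == 0:
            break
    return x

samples = [doubling_martingale(30, seed=s) for s in range(10000)]
zero_frac = sum(x == 0 for x in samples) / len(samples)
print(f"fraction absorbed at 0: {zero_frac:.4f}")
```

With 10,000 runs of length 30, essentially every sample is absorbed at 0 (survival probability $2^{-30}$), so the empirical mean is 0 even though every theoretical mean is 1: exactly the limit-exchange failure that UI rules out.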
Doob's upcrossing inequality
Let $(X_n)$ be a submartingale and $U_n[a, b]$ the number of upcrossings of the interval $[a, b]$ by $X_0, \dots, X_n$ (i.e., the number of times the process passes from below $a$ to above $b$). Then:
$$\mathbb{E}\big[U_n[a, b]\big] \le \frac{\mathbb{E}[(X_n - a)^+]}{b - a}.$$
The upcrossing inequality is the key tool in proving Doob's convergence theorem. If $\sup_n \mathbb{E}[|X_n|] < \infty$, then the expected number of upcrossings of every interval $[a, b]$ is finite, which implies that $X_n$ cannot oscillate indefinitely; it must converge.
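The inequality can be checked numerically. The sketch below (simple symmetric random walk, which is a martingale and hence a submartingale; the interval $[-2, 2]$, horizon $n = 400$, and run count are arbitrary illustrative choices) counts completed upcrossings and compares the empirical mean with the bound $\mathbb{E}[(X_n - a)^+]/(b - a)$:

```python
import random

def count_upcrossings(path, a, b):
    """Count completed upcrossings of [a, b]: passages from <= a to >= b."""
    count, below = 0, False
    for x in path:
        if x <= a:
            below = True
        elif x >= b and below:
            count += 1
            below = False
    return count

def walk(n, rng):
    """Simple symmetric random walk path S_0, ..., S_n started at 0."""
    s, path = 0, [0]
    for _ in range(n):
        s += 1 if rng.random() < 0.5 else -1
        path.append(s)
    return path

rng = random.Random(0)
a, b, n, runs = -2, 2, 400, 3000
ups, bound = 0.0, 0.0
for _ in range(runs):
    p = walk(n, rng)
    ups += count_upcrossings(p, a, b)
    bound += max(p[-1] - a, 0)      # (X_n - a)^+
ups /= runs
bound /= runs * (b - a)
print(f"E[U] ~ {ups:.2f}  <=  E[(X_n - a)^+]/(b - a) ~ {bound:.2f}")
```

The empirical upcrossing count sits strictly below the Doob bound, as the theorem guarantees in expectation.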
Backwards martingales
A sequence $(X_n)_{n \le 0}$ indexed by the negative integers, adapted to an increasing filtration $(\mathcal{F}_n)_{n \le 0}$, is a backwards martingale if:
$$\mathbb{E}[X_{n+1} \mid \mathcal{F}_n] = X_n \quad \text{for all } n \le -1.$$
Equivalently, $X_n = \mathbb{E}[X_0 \mid \mathcal{F}_n]$ for all $n \le 0$: reading the indices forwards from $-\infty$ to $0$, the usual martingale property holds.
Every backwards martingale converges almost surely and in $L^1$: there exists $X_{-\infty}$ such that $X_n \to X_{-\infty}$ a.s. and in $L^1$ as $n \to -\infty$.
No additional integrability condition is needed for backwards martingales; they always converge. This is because $X_n = \mathbb{E}[X_0 \mid \mathcal{F}_n]$ for every $n \le 0$, so the family $(X_n)_{n \le 0}$ is automatically uniformly integrable.
Let $(\xi_k)_{k \ge 1}$ be an exchangeable sequence of $\{0,1\}$-valued random variables. Define $X_{-n} = \frac{1}{n}\sum_{k=1}^{n} \xi_k$ and $\mathcal{F}_{-n} = \sigma(X_{-n}, X_{-n-1}, \dots)$. Then $(X_{-n})$ is a backwards martingale: by exchangeability, $X_{-n} = \mathbb{E}[\xi_1 \mid \mathcal{F}_{-n}]$. By the backwards convergence theorem, $X_{-n} \to X_{-\infty}$ a.s.
The de Finetti theorem states that $X_{-\infty} = \Theta$ for some random variable $\Theta$, and conditionally on $\Theta$, the $\xi_k$ are i.i.d. Bernoulli($\Theta$). This is the foundation of Bayesian statistics: exchangeable sequences arise from a mixture of i.i.d. sequences.
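The de Finetti picture can be simulated directly: draw a mixing variable $\Theta$ uniformly, then i.i.d. Bernoulli($\Theta$) coordinates; the running averages (the backwards martingale above, read forwards) converge to $\Theta$. A minimal sketch, with arbitrary sample sizes:

```python
import random

def exchangeable_averages(n, seed=None):
    """Sample Theta ~ Uniform(0,1), then xi_1..xi_n i.i.d. Bernoulli(Theta).

    Returns (Theta, list of running averages S_k / k).  The averages are
    the backwards martingale X_{-k}, and they converge to Theta, the
    de Finetti mixing variable.
    """
    rng = random.Random(seed)
    theta = rng.random()
    s, avgs = 0, []
    for k in range(1, n + 1):
        s += 1 if rng.random() < theta else 0
        avgs.append(s / k)
    return theta, avgs

theta, avgs = exchangeable_averages(50000, seed=1)
print(f"Theta = {theta:.4f}, final running average = {avgs[-1]:.4f}")
```

The final running average recovers $\Theta$ to within sampling error, illustrating both the backwards convergence theorem and the conditional law of large numbers hiding inside de Finetti's theorem.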
Summary
Martingale convergence theorems provide conditions for almost sure, $L^1$, and $L^2$ convergence:
- Doob's theorem: If $\sup_n \mathbb{E}[|X_n|] < \infty$, then $X_n \to X_\infty$ a.s.
- $L^2$ convergence: If $\sup_n \mathbb{E}[X_n^2] < \infty$, then $X_n \to X_\infty$ a.s. and in $L^2$.
- $L^1$ convergence: If $(X_n)$ is uniformly integrable, then $X_n \to X_\infty$ a.s. and in $L^1$.
- Backwards martingales: Always converge (no integrability needed).
These results are central to probability theory, with applications to ergodic theory, statistical inference, and stochastic analysis.