
Proof of the Central Limit Theorem

We prove the Central Limit Theorem using characteristic functions (Fourier transforms), the standard approach in modern probability theory.


Proof

Theorem (CLT): Let $X_1, X_2, \ldots$ be i.i.d. with $E[X_i] = 0$ and $\operatorname{Var}(X_i) = \sigma^2 > 0$. Then $S_n = \frac{X_1 + \cdots + X_n}{\sigma\sqrt{n}} \xrightarrow{d} N(0,1)$.

Step 1: Characteristic functions.

The characteristic function of a random variable $X$ is $\varphi_X(t) = E[e^{itX}]$. By Lévy's continuity theorem, $X_n \xrightarrow{d} X$ if and only if $\varphi_{X_n}(t) \to \varphi_X(t)$ pointwise for all $t \in \mathbb{R}$.

The characteristic function of $N(0,1)$ is $\varphi_Z(t) = e^{-t^2/2}$.
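As a numerical sanity check (not part of the proof), $E[e^{itZ}]$ for $Z \sim N(0,1)$ can be estimated by Monte Carlo and compared against $e^{-t^2/2}$; a minimal sketch:

```python
import math
import random

def empirical_cf(samples, t):
    # phi_X(t) = E[e^{itX}]; for a distribution symmetric about 0 the
    # imaginary part E[sin(tX)] vanishes, so we estimate E[cos(tX)].
    return sum(math.cos(t * x) for x in samples) / len(samples)

random.seed(0)
z = [random.gauss(0.0, 1.0) for _ in range(200_000)]
for t in (0.5, 1.0, 2.0):
    # Empirical estimate vs. the exact value e^{-t^2/2}.
    print(t, empirical_cf(z, t), math.exp(-t * t / 2))
```

With $2 \times 10^5$ samples the Monte Carlo error is on the order of $10^{-3}$, so the two columns agree to about two decimal places.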

Step 2: Compute the characteristic function of SnS_n.

Let $Y_i = X_i / (\sigma\sqrt{n})$, so $S_n = Y_1 + \cdots + Y_n$. By independence and the identical distribution of the $Y_i$:

$$\varphi_{S_n}(t) = \prod_{i=1}^n \varphi_{Y_i}(t) = \left[\varphi_{Y_1}(t)\right]^n = \left[\varphi_{X_1}\!\left(\frac{t}{\sigma\sqrt{n}}\right)\right]^n$$
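This factorization can be checked numerically for a concrete distribution. Taking $X_i \sim \mathrm{Uniform}[-1,1]$ as an illustrative example (so $\sigma^2 = 1/3$ and $\varphi_{X_1}(s) = \sin(s)/s$), a Monte Carlo estimate of $\varphi_{S_n}(t)$ should match $[\varphi_{X_1}(t/(\sigma\sqrt{n}))]^n$; a rough sketch:

```python
import math
import random

random.seed(1)
n = 20
sigma = math.sqrt(1.0 / 3.0)  # Uniform[-1,1] has variance 1/3
trials = 100_000
t = 1.0

# Right-hand side: [phi_{X_1}(t / (sigma sqrt(n)))]^n with phi(s) = sin(s)/s.
s = t / (sigma * math.sqrt(n))
predicted = (math.sin(s) / s) ** n

# Left-hand side: Monte Carlo estimate of E[cos(t * S_n)]
# (the imaginary part vanishes by symmetry).
total = 0.0
for _ in range(trials):
    sn = sum(random.uniform(-1.0, 1.0) for _ in range(n)) / (sigma * math.sqrt(n))
    total += math.cos(t * sn)
print(total / trials, predicted)
```

Both values also land near $e^{-t^2/2} \approx 0.607$, previewing the limit derived below.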

Step 3: Taylor expansion of the characteristic function.

Since $E[X_1] = 0$ and $E[X_1^2] = \sigma^2$, the characteristic function of $X_1$ has the expansion

$$\varphi_{X_1}(s) = E[e^{isX_1}] = 1 + is\,E[X_1] + \frac{(is)^2}{2} E[X_1^2] + o(s^2) = 1 - \frac{\sigma^2 s^2}{2} + o(s^2) \quad \text{as } s \to 0$$

Substituting $s = t/(\sigma\sqrt{n})$:

$$\varphi_{X_1}\!\left(\frac{t}{\sigma\sqrt{n}}\right) = 1 - \frac{\sigma^2}{2} \cdot \frac{t^2}{\sigma^2 n} + o\!\left(\frac{1}{n}\right) = 1 - \frac{t^2}{2n} + o\!\left(\frac{1}{n}\right)$$
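The $o(s^2)$ claim can be illustrated with the same uniform example used above: for $X_1 \sim \mathrm{Uniform}[-1,1]$ the exact characteristic function is $\sin(s)/s$ and the quadratic approximation is $1 - \sigma^2 s^2/2 = 1 - s^2/6$, so the remainder divided by $s^2$ should vanish as $s \to 0$:

```python
import math

sigma2 = 1.0 / 3.0  # variance of Uniform[-1, 1]
for s in (1.0, 0.1, 0.01, 0.001):
    cf = math.sin(s) / s                 # exact characteristic function
    taylor = 1.0 - sigma2 * s * s / 2.0  # 1 - sigma^2 s^2 / 2
    print(s, (cf - taylor) / (s * s))    # remainder / s^2, tends to 0
```

The printed ratio shrinks like $s^2/120$ (the next Taylor term), consistent with the remainder being $o(s^2)$.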

Step 4: Take the nn-th power.

$$\varphi_{S_n}(t) = \left[1 - \frac{t^2}{2n} + o\!\left(\frac{1}{n}\right)\right]^n$$

Using the fundamental limit $\lim_{n\to\infty}(1 + a_n/n)^n = e^a$ when $a_n \to a$:

$$\lim_{n \to \infty} \varphi_{S_n}(t) = e^{-t^2/2}$$

More rigorously: write $\varphi_{X_1}(t/(\sigma\sqrt{n})) = 1 + w_n$ where $w_n = -t^2/(2n) + o(1/n)$. Then $\log(1 + w_n) = w_n + O(w_n^2) = -t^2/(2n) + o(1/n)$, so $n \log(1 + w_n) \to -t^2/2$, giving $\varphi_{S_n}(t) = e^{n\log(1+w_n)} \to e^{-t^2/2}$.
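Ignoring the $o(1/n)$ term, the limit in Step 4 reduces to the elementary fact that $(1 - t^2/2n)^n \to e^{-t^2/2}$, which is easy to verify numerically:

```python
import math

t = 1.5
target = math.exp(-t * t / 2)
for n in (10, 100, 10_000, 1_000_000):
    # (1 - t^2 / 2n)^n should approach e^{-t^2/2} as n grows.
    approx = (1.0 - t * t / (2.0 * n)) ** n
    print(n, approx, abs(approx - target))
```

The absolute error shrinks roughly like $1/n$, matching the $O(w_n^2)$ correction in the logarithm.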

Step 5: Apply Lévy's continuity theorem.

Since $\varphi_{S_n}(t) \to e^{-t^2/2} = \varphi_Z(t)$ pointwise and $e^{-t^2/2}$ is continuous at $t = 0$, Lévy's continuity theorem guarantees $S_n \xrightarrow{d} Z \sim N(0,1)$. $\square$
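Finally, the theorem itself can be sanity-checked by simulation. Using a deliberately skewed illustrative choice, $X_i = \mathrm{Exp}(1) - 1$ (mean 0, variance 1), the empirical CDF of $S_n$ should be close to the standard normal CDF $\Phi$; a minimal sketch:

```python
import math
import random

def std_normal_cdf(x):
    # Phi(x) for the standard normal, via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

random.seed(2)
n, trials = 400, 10_000
samples = []
for _ in range(trials):
    # X_i = Exp(1) - 1 has mean 0 and variance 1, so sigma = 1.
    sn = sum(random.expovariate(1.0) - 1.0 for _ in range(n)) / math.sqrt(n)
    samples.append(sn)

for x in (-1.0, 0.0, 1.0):
    empirical = sum(s <= x for s in samples) / trials
    print(x, empirical, std_normal_cdf(x))
```

Despite the strong skew of the summands, the empirical and normal CDF values agree to within a few hundredths at $n = 400$.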


Remark: Why characteristic functions?

The proof via characteristic functions is elegant because: (1) the characteristic function of a sum of independent variables is the product of their characteristic functions, converting convolution into multiplication; (2) pointwise convergence of characteristic functions (to a limit continuous at 0) is equivalent to convergence in distribution (Lévy's theorem); (3) the Taylor expansion naturally produces the Gaussian $e^{-t^2/2}$.