
Second Moment Method

The second moment method uses variance to prove that a non-negative random variable is positive with high probability. The first moment method alone cannot do this: even $\mathbb{E}[X] \to \infty$ does not by itself imply $\Pr[X > 0] \to 1$. The second moment method bridges this gap.


Statement

Theorem 7.1 (Second Moment Method, Paley-Zygmund)

Let $X$ be a non-negative random variable with $\mathbb{E}[X] > 0$. Then:
$$\Pr[X > 0] \geq \frac{(\mathbb{E}[X])^2}{\mathbb{E}[X^2]}.$$
In particular, since $\operatorname{Var}(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2$:
$$\Pr[X = 0] \leq \frac{\operatorname{Var}(X)}{(\mathbb{E}[X])^2}.$$
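The bound is easy to verify numerically. A minimal Monte Carlo sketch, where the choice of distribution (Binomial) and the sample size are illustrative, not from the text:

```python
import random

# Monte Carlo check of the Paley-Zygmund bound
#   Pr[X > 0] >= (E[X])^2 / E[X^2]
# for a toy non-negative variable X ~ Binomial(10, 0.05).
# The distribution and sample size are illustrative choices.
random.seed(0)
samples = [sum(random.random() < 0.05 for _ in range(10))
           for _ in range(100_000)]

mean = sum(samples) / len(samples)
second_moment = sum(x * x for x in samples) / len(samples)
pr_positive = sum(x > 0 for x in samples) / len(samples)
bound = mean**2 / second_moment

print(f"Pr[X > 0] = {pr_positive:.3f} >= bound = {bound:.3f}")
```

Note that the inequality holds for the empirical distribution as well, since the proof below is just Cauchy-Schwarz, so the assertion `pr_positive >= bound` can never fail, regardless of the random seed.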

Proof

By the Cauchy-Schwarz inequality applied to $X \cdot \mathbf{1}_{X > 0}$:
$$(\mathbb{E}[X])^2 = (\mathbb{E}[X \cdot \mathbf{1}_{X > 0}])^2 \leq \mathbb{E}[X^2] \cdot \Pr[X > 0].$$
Rearranging gives the result. $\square$


Applications

Example (Clique Threshold in $G(n, p)$)

For fixed $k \geq 3$, the threshold for $G(n, p)$ to contain a copy of $K_k$ is $p = n^{-2/(k-1)}$. Let $X$ count copies of $K_k$. Then
$$\mathbb{E}[X] = \binom{n}{k} p^{\binom{k}{2}} \asymp n^k p^{k(k-1)/2}.$$
For $p \gg n^{-2/(k-1)}$, $\mathbb{E}[X] \to \infty$, and the second moment method shows $\Pr[X > 0] \to 1$ by proving $\mathbb{E}[X^2]/(\mathbb{E}[X])^2 \to 1$.
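The threshold behavior is visible even at modest sizes. A Monte Carlo sketch for the $k = 3$ case (triangles, threshold $p = 1/n$); the value of $n$, the constants $c$, and the trial count are illustrative choices:

```python
import itertools
import random

# Monte Carlo illustration of the k = 3 case: the triangle threshold is
# p = n^{-2/(3-1)} = 1/n, so a triangle should appear with probability
# near 0 for p well below 1/n and near 1 for p well above it.
random.seed(1)

def has_triangle(n, p):
    adj = [set() for _ in range(n)]
    for u, v in itertools.combinations(range(n), 2):
        if random.random() < p:
            adj[u].add(v)
            adj[v].add(u)
    # a triangle exists iff some edge (u, v) has a common neighbor
    return any(adj[u] & adj[v] for u in range(n) for v in adj[u] if v > u)

n, trials = 40, 200
results = {}
for c in (0.2, 5.0):          # p = c/n: below vs. above the threshold
    results[c] = sum(has_triangle(n, c / n) for _ in range(trials)) / trials
    print(f"p = {c}/n: estimated Pr[K_3 in G(n, p)] = {results[c]:.2f}")
```

With $p = 0.2/n$ the expected number of triangles is about $\binom{40}{3}(0.2/40)^3 \approx 0.001$, so almost no trial finds one; with $p = 5/n$ it is about $19$, and nearly every trial does.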

Remark (Controlling Correlations)

The key challenge in the second moment method is bounding $\mathbb{E}[X^2] = \sum_{i,j} \mathbb{E}[X_i X_j]$, where $X = \sum_i X_i$ is a sum of indicator variables. The "diagonal" terms ($i = j$) contribute $\mathbb{E}[X]$, while the off-diagonal terms require careful analysis of pairwise correlations.
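For triangles this decomposition can be carried out in closed form: an ordered pair of triangles sharing $s$ vertices shares $\binom{s}{2}$ edges, so their union has $6 - \binom{s}{2}$ edges, and $s = 3$ recovers the diagonal term $\mathbb{E}[X]$. A sketch, where the choice $p = n^{-1/2}$ (well above the triangle threshold $1/n$) is illustrative:

```python
from math import comb

# Exact E[X^2] for X = number of triangles in G(n, p), grouped by the
# number s of shared vertices between an ordered pair of triangles.
# The union of the pair has 6 - C(s, 2) edges; s = 3 is the diagonal,
# contributing E[X].
def second_moment_ratio(n, p):
    t = comb(n, 3)                      # number of potential triangles
    ex = t * p**3                       # E[X]
    ex2 = sum(t * comb(3, s) * comb(n - 3, 3 - s) * p**(6 - comb(s, 2))
              for s in range(4))        # E[X^2], summed over pair types
    return ex2 / ex**2

for n in (30, 100, 300, 1000):
    print(f"n = {n:4d}: E[X^2]/(E[X])^2 = {second_moment_ratio(n, n**-0.5):.4f}")
```

The printed ratios decrease toward $1$ as $n$ grows, which is exactly the condition the clique example needs.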

Example (Connectivity of $G(n, p)$)

The sharp threshold for connectivity of $G(n, p)$ is $p = \frac{\log n}{n}$. Let $X$ be the number of isolated vertices. Then $\mathbb{E}[X] = n(1-p)^{n-1} \sim n e^{-pn}$. For $p = \frac{\log n + c}{n}$, $\mathbb{E}[X] \to e^{-c}$. The second moment method confirms that $X$ is approximately Poisson, so $\Pr[X = 0] \to e^{-e^{-c}}$.
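The Poisson limit can be checked empirically. A Monte Carlo sketch in which $n = 200$, $c = 0$, and the trial count are illustrative choices (at this finite $n$ the empirical value sits near, not exactly at, the limit $e^{-1} \approx 0.37$):

```python
import math
import random

# Monte Carlo check: with p = (log n + c)/n, the number X of isolated
# vertices in G(n, p) is approximately Poisson(e^{-c}), so
# Pr[X = 0] -> exp(-e^{-c}).
random.seed(2)
n, c, trials = 200, 0.0, 200
p = (math.log(n) + c) / n

no_isolated = 0  # trials in which every vertex has degree >= 1
for _ in range(trials):
    deg = [0] * n
    for u in range(n):
        for v in range(u + 1, n):
            if random.random() < p:
                deg[u] += 1
                deg[v] += 1
    no_isolated += all(d > 0 for d in deg)

print(f"empirical Pr[X = 0] = {no_isolated / trials:.2f}, "
      f"limit exp(-e^(-c)) = {math.exp(-math.exp(-c)):.2f}")
```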

Theorem 7.2 (Chebyshev's Inequality)

For any random variable $X$ with finite variance and any $t > 0$:
$$\Pr[|X - \mathbb{E}[X]| \geq t] \leq \frac{\operatorname{Var}(X)}{t^2}.$$
This is the simplest tool derived from the second moment and suffices when the variance is small relative to the square of the expectation.
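To see how loose or tight the bound is, one can compare it against an exact tail probability. A sketch for $X \sim \mathrm{Binomial}(n, 1/2)$, with $\mathbb{E}[X] = n/2$ and $\operatorname{Var}(X) = n/4$; the choice $n = 100$ and the values of $t$ are illustrative:

```python
from math import comb

# Chebyshev's bound Var(X)/t^2 versus the exact two-sided tail
# Pr[|X - n/2| >= t] for X ~ Binomial(n, 1/2), computed from the pmf.
n = 100
mean, var = n / 2, n / 4
pmf = [comb(n, k) / 2**n for k in range(n + 1)]

results = {}
for t in (10, 15, 20):
    exact = sum(p for k, p in enumerate(pmf) if abs(k - mean) >= t)
    results[t] = (exact, var / t**2)
    print(f"t = {t}: exact tail = {exact:.5f}, Chebyshev bound = {var / t**2:.4f}")
```

Chebyshev is always a valid upper bound, but for a binomial the true tail decays much faster than $1/t^2$, which is why sharper tools (Chernoff bounds) are preferred when independence is available.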