
Janson Inequalities

The Janson inequalities provide precise estimates for the probability that none of a collection of "bad" events occurs in a product probability space. They are particularly effective for subgraph containment problems in random graphs.


Setup and Statement

Theorem 7.3 (Janson's Inequality)

Let $\Omega = \prod_{i \in I} \Omega_i$ be a product probability space. For each $\alpha \in \mathcal{A}$ (a finite index set), let $A_\alpha$ be an event determined by the coordinates in $S_\alpha \subseteq I$. Write $\alpha \sim \beta$ if $S_\alpha \cap S_\beta \neq \emptyset$ and $\alpha \neq \beta$. Define
$$\mu = \sum_{\alpha} \Pr[A_\alpha], \qquad \Delta = \sum_{\alpha \sim \beta} \Pr[A_\alpha \cap A_\beta].$$
Then
$$\Pr\left[\bigcap_\alpha \overline{A_\alpha}\right] \leq \exp\left(-\mu + \frac{\Delta}{2}\right).$$
If additionally $\Delta \leq \mu$, then
$$\Pr\left[\bigcap_\alpha \overline{A_\alpha}\right] \geq \exp\left(-\mu - \Delta\right).$$
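As a concrete check, the following Python sketch (an illustration, not from the source; the parameters $n = 6$, $p = 0.15$ are arbitrary) takes the events $A_\alpha$ to be the triangles of $G(n, p)$, computes $\mu$ and $\Delta$ by direct enumeration, and compares the two Janson bounds with a Monte Carlo estimate of $\Pr[\bigcap_\alpha \overline{A_\alpha}]$:

```python
import itertools
import math
import random

# Events A_T: "triple T forms a triangle" in G(n, p).
n, p = 6, 0.15
triples = list(itertools.combinations(range(n), 3))
mu = len(triples) * p**3                 # each triangle needs 3 edges

def edges(t):
    """Edge set of the triangle on vertex triple t."""
    return set(itertools.combinations(t, 2))

# Delta: sum over ordered pairs of distinct triangles sharing an edge
# (the alpha ~ beta relation); a sharing pair spans 5 edges in total.
delta = sum(
    p ** len(edges(s) | edges(t))        # Pr[both triangles present]
    for s in triples for t in triples
    if s != t and edges(s) & edges(t)
)

upper = math.exp(-mu + delta / 2)        # always valid
lower = math.exp(-mu - delta)            # valid here since delta <= mu

# Monte Carlo estimate of Pr[no triangle] for comparison.
rng = random.Random(0)
trials = 20000
hits = 0
for _ in range(trials):
    present = {e for e in itertools.combinations(range(n), 2)
               if rng.random() < p}
    if not any(edges(t) <= present for t in triples):
        hits += 1
est = hits / trials
print(f"lower={lower:.4f}  MC={est:.4f}  upper={upper:.4f}")
```

The estimate lands between the two exponential bounds, which are already within about two percentage points of each other at this tiny scale.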


Extended Janson Inequality

Theorem 7.4 (Extended Janson, Kim-Vu)

Under the same setup, if $\Pr[A_\alpha] \leq \varepsilon$ for all $\alpha$ and $\Delta \leq \delta \mu$, then
$$\Pr\left[\bigcap_\alpha \overline{A_\alpha}\right] = \exp\big(-\mu(1 + O(\delta + \varepsilon))\big).$$
This provides a near-exact estimate when the individual event probabilities are small and the total correlation $\Delta$ is small relative to $\mu$.
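Unpacking the exponent (a proof sketch under the stated hypotheses, assuming $\varepsilon \leq 1/2$; not taken from the source): the upper bound is Theorem 7.3 with $\Delta \leq \delta \mu$,
$$\Pr\left[\bigcap_\alpha \overline{A_\alpha}\right] \leq \exp\left(-\mu + \frac{\Delta}{2}\right) \leq \exp\left(-\mu\left(1 - \frac{\delta}{2}\right)\right),$$
while for the lower bound, Harris's correlation inequality together with the elementary estimate $1 - x \geq e^{-(1+2\varepsilon)x}$ for $0 \leq x \leq \varepsilon \leq 1/2$ gives
$$\Pr\left[\bigcap_\alpha \overline{A_\alpha}\right] \geq \prod_\alpha \big(1 - \Pr[A_\alpha]\big) \geq \exp\big(-(1 + 2\varepsilon)\mu\big).$$
Both sides are of the form $\exp(-\mu(1 + O(\delta + \varepsilon)))$.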

Example (Triangle-Free Random Graphs)

Let $X$ count triangles in $G(n, p)$. Then $\mu = \binom{n}{3} p^3$ and $\Delta = \Theta(n^4 p^5)$: the dominant contribution comes from pairs of triangles sharing an edge, of which there are $\Theta(n^4)$, each occurring with probability $p^5$. When $p = o(n^{-1/2})$ we have $\Delta/\mu = \Theta(np^2) = o(1)$, and Janson's inequality gives
$$\Pr[X = 0] = \exp(-\mu(1 + o(1))) = \exp\left(-\Theta(n^3 p^3)\right).$$
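A quick numerical check of the ratio $\Delta/\mu = \Theta(np^2)$ (an illustration; the choice $p = n^{-3/4}$, which is $o(n^{-1/2})$, is arbitrary):

```python
import math

# For triangles in G(n, p): mu = C(n,3) p^3, and Delta counts ordered
# pairs of distinct triangles sharing an edge (choose the shared edge,
# then the two apex vertices); each such pair spans 5 edges.
ratios = []
for n in (10**2, 10**3, 10**4, 10**5):
    p = n ** -0.75                       # o(n^{-1/2}), below threshold
    mu = math.comb(n, 3) * p**3
    delta = 2 * math.comb(n, 2) * math.comb(n - 2, 2) * p**5
    ratios.append(delta / mu)
    print(f"n={n:>6}  mu={mu:.3e}  Delta/mu={delta / mu:.3e}")
```

The ratio decays like $n^{-1/2}$ for this choice of $p$, so the exponent $-\mu(1 + o(1))$ is essentially exact for large $n$.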

Remark (Comparison with the LLL)

Janson's inequality and the Lovász Local Lemma address similar problems (avoiding bad events) but with different strengths. The LLL guarantees a positive avoidance probability under weaker, purely local conditions, but the quantitative lower bound it yields is typically far from tight; Janson's inequality requires a product space but gives exponentially precise estimates. For random graph applications, Janson is typically preferred.

Example (Small Subgraph Containment)

For a fixed balanced graph $H$ with $v$ vertices and $e$ edges, the number of copies of $H$ in $G(n, p)$ has expectation $\mu \asymp n^v p^e$. When $p \ll n^{-v/e}$ (below the appearance threshold for balanced $H$), Janson's inequality shows $\Pr[\text{no copy of } H] = \exp(-\Theta(\mu))$, confirming Poisson-like behavior.
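For instance, with $H = K_4$ (so $v = 4$, $e = 6$, and $H$ is balanced), $\mu = \binom{n}{4} p^6$, and sweeping $p = n^{-a}$ across the threshold exponent $a = v/e = 2/3$ exhibits the three regimes (an illustration; the off-threshold exponents $0.8$ and $0.55$ are arbitrary):

```python
import math

# mu = C(n, 4) p^6 counts expected copies of K4 in G(n, p); the
# appearance threshold is p = n^{-2/3}. Below it mu -> 0, at it
# mu -> 1/24, above it mu -> infinity.
mus = {}
for a in (0.8, 2/3, 0.55):               # below / at / above threshold
    mus[a] = []
    for n in (10**3, 10**4, 10**5):
        p = n ** -a
        mu = math.comb(n, 4) * p**6
        mus[a].append(mu)
        print(f"a={a:.3f}  n={n:>6}  mu={mu:.3e}")
```

At the threshold, $\mu \to \binom{n}{4} n^{-4} \to 1/24$, the constant of the limiting Poisson regime; below it $\mu \to 0$ and $\Pr[\text{no copy}] = \exp(-\Theta(\mu)) \to 1$.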