
Joint and Conditional Distributions - Key Proof

We prove that for bivariate normal distributions, zero correlation implies independence, a special property of the normal family.

Independence from Zero Correlation

Theorem

Let $(X,Y)$ have a bivariate normal distribution with correlation $\rho$. Then:

$$\rho = 0 \iff X \text{ and } Y \text{ are independent}$$

Proof

Direction ($\Leftarrow$): If $X$ and $Y$ are independent, then:

$$\text{Cov}(X,Y) = E[XY] - E[X]E[Y] = E[X]E[Y] - E[X]E[Y] = 0$$

Therefore $\rho = \frac{\text{Cov}(X,Y)}{\sigma_X\sigma_Y} = 0$. ✓
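This direction is easy to sanity-check numerically. The sketch below (all parameter values are arbitrary choices) samples two independent normals and confirms the sample correlation is near zero:

```python
import numpy as np

# Sketch: independent normals should show sample correlation near zero.
# Means, standard deviations, and sample size are arbitrary choices.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=200_000)   # X ~ N(1, 4)
y = rng.normal(loc=-3.0, scale=0.5, size=200_000)  # Y ~ N(-3, 0.25), drawn independently

rho_hat = np.corrcoef(x, y)[0, 1]  # sample correlation coefficient
print(abs(rho_hat) < 0.05)         # near zero, as the proof predicts
```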

Direction ($\Rightarrow$): This is the interesting direction. Assume $\rho = 0$.

The bivariate normal PDF is:

$$f(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left(-\frac{Q}{2(1-\rho^2)}\right)$$

where:

$$Q = \left(\frac{x-\mu_X}{\sigma_X}\right)^2 - 2\rho\,\frac{(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \left(\frac{y-\mu_Y}{\sigma_Y}\right)^2$$
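As a quick numerical check of this density (a sketch, with arbitrary parameter values, including a nonzero $\rho$), integrating $f(x,y)$ over $y$ at a fixed $x$ should recover the normal marginal $f_X(x)$:

```python
import numpy as np

# Illustrative parameters (arbitrary choices).
mu_x, mu_y, s_x, s_y, rho = 0.5, -1.0, 2.0, 0.8, 0.6

def f(x, y):
    """Bivariate normal PDF, following the formula above."""
    zx, zy = (x - mu_x) / s_x, (y - mu_y) / s_y
    Q = zx**2 - 2 * rho * zx * zy + zy**2
    return np.exp(-Q / (2 * (1 - rho**2))) / (2 * np.pi * s_x * s_y * np.sqrt(1 - rho**2))

# Integrate over y on a wide, fine grid (Riemann sum) at a fixed x.
ys = np.linspace(-20.0, 20.0, 4001)
dy = ys[1] - ys[0]
x0 = 1.2
marginal_at_x0 = np.sum(f(x0, ys)) * dy

# Compare with the N(mu_X, sigma_X^2) marginal density at x0.
f_X = np.exp(-((x0 - mu_x) / s_x)**2 / 2) / (np.sqrt(2 * np.pi) * s_x)
print(np.isclose(marginal_at_x0, f_X))
```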

When $\rho = 0$:

$$Q = \left(\frac{x-\mu_X}{\sigma_X}\right)^2 + \left(\frac{y-\mu_Y}{\sigma_Y}\right)^2$$

Substituting:

$$f(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y} \exp\left(-\frac{1}{2}\left[\left(\frac{x-\mu_X}{\sigma_X}\right)^2 + \left(\frac{y-\mu_Y}{\sigma_Y}\right)^2\right]\right)$$

Factoring the exponential:

$$= \frac{1}{\sqrt{2\pi}\,\sigma_X} \exp\left(-\frac{(x-\mu_X)^2}{2\sigma_X^2}\right) \cdot \frac{1}{\sqrt{2\pi}\,\sigma_Y} \exp\left(-\frac{(y-\mu_Y)^2}{2\sigma_Y^2}\right)$$

$$= f_X(x) \cdot f_Y(y)$$

Since the joint PDF factors into the product of the marginal PDFs, $X$ and $Y$ are independent. ∎
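The factorization can also be verified numerically. This sketch (arbitrary parameters, hypothetical helper names) evaluates the $\rho = 0$ joint density and the product of the two marginals at random points:

```python
import numpy as np

# Arbitrary illustrative parameters.
mu_x, mu_y, s_x, s_y = 1.0, -2.0, 1.5, 0.7

def joint(x, y):
    """Bivariate normal PDF with rho = 0."""
    Q = ((x - mu_x) / s_x)**2 + ((y - mu_y) / s_y)**2
    return np.exp(-Q / 2) / (2 * np.pi * s_x * s_y)

def marginal(t, mu, s):
    """Univariate normal PDF N(mu, s^2)."""
    return np.exp(-(t - mu)**2 / (2 * s**2)) / (np.sqrt(2 * np.pi) * s)

rng = np.random.default_rng(1)
xs, ys = rng.normal(size=1000), rng.normal(size=1000)
# The joint density equals the product of marginals at every point.
print(np.allclose(joint(xs, ys), marginal(xs, mu_x, s_x) * marginal(ys, mu_y, s_y)))
```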


Counterexample for Non-Normal

Example

Let $(X,Y)$ be uniformly distributed on the unit circle $x^2 + y^2 = 1$.

By symmetry, $E[X] = E[Y] = 0$ and $E[XY] = 0$, so $\rho = 0$.

But $X$ and $Y$ are clearly dependent: knowing $X = 0.5$ forces $Y = \pm\sqrt{0.75}$, so $Y$ cannot take just any value.

Zero correlation $\not\Rightarrow$ independence in general.
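The circle example is easy to simulate. The sketch below draws points uniformly on the unit circle and shows near-zero correlation alongside an obvious nonlinear dependence:

```python
import numpy as np

# Sample (X, Y) uniformly on the unit circle via a uniform angle.
rng = np.random.default_rng(42)
theta = rng.uniform(0.0, 2 * np.pi, size=100_000)
x, y = np.cos(theta), np.sin(theta)

rho_hat = np.corrcoef(x, y)[0, 1]
print(abs(rho_hat) < 0.05)            # essentially uncorrelated

# Yet perfectly dependent: X^2 + Y^2 = 1 always.
print(np.allclose(x**2 + y**2, 1.0))

# A nonlinear dependence that correlation misses:
# E[X^2 Y^2] != E[X^2] E[Y^2]  (theoretically 1/8 vs 1/4).
print(abs(np.mean(x**2 * y**2) - np.mean(x**2) * np.mean(y**2)) > 0.05)
```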

Implications

This result explains why multivariate normal distributions are so tractable:

  • Covariance matrix $\boldsymbol{\Sigma}$ completely characterizes the dependence structure
  • If $\boldsymbol{\Sigma}$ is diagonal, the components are independent
  • Within the multivariate normal family, correlation analysis suffices for detecting all dependencies

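A short sketch of the diagonal-$\boldsymbol{\Sigma}$ case (arbitrary parameter values): sampling a bivariate normal with diagonal covariance, even nonlinear transforms of the components stay uncorrelated, as independence requires.

```python
import numpy as np

# Diagonal covariance => independent components (for the normal family).
rng = np.random.default_rng(7)
Sigma = np.diag([4.0, 0.25])   # arbitrary diagonal covariance matrix
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=200_000)
x, y = z[:, 0], z[:, 1]

print(abs(np.corrcoef(x, y)[0, 1]) < 0.05)        # uncorrelated
# Independence is stronger: nonlinear transforms are uncorrelated too.
print(abs(np.corrcoef(x**2, y**2)[0, 1]) < 0.05)
```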
Remark

The equivalence of zero correlation and independence for bivariate (and multivariate) normal distributions is exceptional. For non-normal distributions, uncorrelated variables can be highly dependent. This distinction is crucial in data analysis: correlation only captures linear relationships.