
Common Distributions - Key Properties

Distribution families possess characteristic properties that enable their identification and facilitate calculations. Understanding these properties is crucial for both theoretical work and practical applications.

Closure Under Transformations

Many distribution families are closed under specific transformations.

Normal Distribution:

  • If $X \sim \mathcal{N}(\mu, \sigma^2)$, then $aX + b \sim \mathcal{N}(a\mu + b, a^2\sigma^2)$
  • Sum: If $X_i \sim \mathcal{N}(\mu_i, \sigma_i^2)$ independently, then $\sum a_i X_i \sim \mathcal{N}\left(\sum a_i\mu_i, \sum a_i^2\sigma_i^2\right)$
  • Among distributions with finite variance, the normal is the only stable family: every linear combination of independent normals is again normal
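The closure rules above can be checked numerically. Below is a minimal Monte Carlo sketch with illustrative (hypothetical) parameters: $X \sim \mathcal{N}(2, 9)$ and $Y \sim \mathcal{N}(-1, 4)$ independent, so $3X + 2Y + 5$ should be $\mathcal{N}(9,\, 97)$ by the two rules combined.

```python
import random
import statistics

# Illustrative check: X ~ N(2, 9), Y ~ N(-1, 4) independent.
# Then 3X + 2Y + 5 ~ N(3*2 + 2*(-1) + 5, 9*9 + 4*4) = N(9, 97).
random.seed(0)
samples = [3 * random.gauss(2, 3) + 2 * random.gauss(-1, 2) + 5
           for _ in range(200_000)]
m = statistics.fmean(samples)       # should be close to 9
v = statistics.pvariance(samples)   # should be close to 97
print(round(m, 1), round(v, 1))
```

Note that `random.gauss` takes the standard deviation, not the variance, which is why the draws use 3 and 2 rather than 9 and 4.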

Poisson Distribution:

  • Sum: If $X_i \sim \text{Poisson}(\lambda_i)$ independently, then $\sum X_i \sim \text{Poisson}\left(\sum \lambda_i\right)$
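This Poisson closure is an exact identity, so it can be verified by direct convolution of the pmfs rather than by simulation. A short sketch with illustrative rates $\lambda_1 = 2$, $\lambda_2 = 3.5$:

```python
from math import exp, factorial

def pois_pmf(k, lam):
    # Poisson pmf: e^{-lam} lam^k / k!
    return exp(-lam) * lam**k / factorial(k)

lam1, lam2 = 2.0, 3.5  # illustrative rates
# P(X1 + X2 = k), computed by discrete convolution, should equal the
# Poisson(lam1 + lam2) pmf exactly (up to floating-point error).
for k in range(6):
    conv = sum(pois_pmf(j, lam1) * pois_pmf(k - j, lam2) for j in range(k + 1))
    direct = pois_pmf(k, lam1 + lam2)
    assert abs(conv - direct) < 1e-12
print("convolution matches Poisson(5.5)")
```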

Gamma Distribution:

  • Sum: If $X_i \sim \text{Gamma}(\alpha_i, \lambda)$ independently (same rate $\lambda$), then $\sum X_i \sim \text{Gamma}\left(\sum \alpha_i, \lambda\right)$
Example

If waiting times at two service points are independent $\text{Exponential}(2)$ and $\text{Exponential}(3)$, one might hope their sum follows $\text{Gamma}(1, 2) + \text{Gamma}(1, 3) = \text{Gamma}(2, ?)$

But the rates differ, so the Gamma closure rule does not apply: the sum is not Gamma. Its distribution (a hypoexponential) must instead be found by convolution or via moment generating functions.
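The convolution route can be carried out explicitly for the rates above. For $X \sim \text{Exp}(\lambda_1)$, $Y \sim \text{Exp}(\lambda_2)$ with $\lambda_1 \neq \lambda_2$, the convolution integral gives the closed-form density $\frac{\lambda_1\lambda_2}{\lambda_2 - \lambda_1}\left(e^{-\lambda_1 t} - e^{-\lambda_2 t}\right)$; the sketch below checks this against a brute-force numerical convolution at one (arbitrarily chosen) point:

```python
from math import exp

l1, l2 = 2.0, 3.0  # rates from the example above

def hypo_pdf(t):
    # Closed-form density of Exp(l1) + Exp(l2), l1 != l2 (hypoexponential),
    # obtained by evaluating the convolution integral analytically.
    return l1 * l2 / (l2 - l1) * (exp(-l1 * t) - exp(-l2 * t))

def conv_pdf(t, n=100_000):
    # Numerical convolution: midpoint-rule integral of f1(s) f2(t - s) over [0, t].
    h = t / n
    return sum(l1 * exp(-l1 * s) * l2 * exp(-l2 * (t - s)) * h
               for s in (h * (i + 0.5) for i in range(n)))

t = 0.7  # arbitrary evaluation point
assert abs(hypo_pdf(t) - conv_pdf(t)) < 1e-4
print(round(hypo_pdf(t), 4))
```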

Memoryless Property

Theorem

The exponential distribution is the unique continuous distribution with the memoryless property: $P(X > s + t \mid X > s) = P(X > t)$

Similarly, the geometric distribution is the unique discrete memoryless distribution.

Proof Sketch: The memoryless property implies $\bar{F}(s+t) = \bar{F}(s)\,\bar{F}(t)$, where $\bar{F}(x) = P(X > x)$ is the survival function. The only continuous solutions of this functional equation are $\bar{F}(x) = e^{-\lambda x}$, i.e. the exponential. □

Example

A light bulb has exponentially distributed lifetime with mean 1000 hours. After 500 hours of use, the probability it lasts another 500 hours is $P(X > 1000 \mid X > 500) = P(X > 500) = e^{-500/1000} = e^{-0.5} \approx 0.606$

The bulb doesn't "remember" its age!
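The light-bulb calculation is short enough to verify directly from the survival function $\bar{F}(x) = e^{-\lambda x}$ with $\lambda = 1/1000$:

```python
from math import exp

lam = 1 / 1000  # rate for an exponential lifetime with mean 1000 hours

def survival(x):
    # P(X > x) for the exponential distribution
    return exp(-lam * x)

# Conditional survival past 1000 h, given survival past 500 h:
cond = survival(1000) / survival(500)
# Memorylessness: this equals the unconditional P(X > 500) = e^{-0.5}.
print(round(cond, 4), round(survival(500), 4))
```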

Relationships Between Distributions

Many distributions are related through transformations or limits:

Binomial → Poisson: As $n \to \infty$, $p \to 0$ with $np \to \lambda$: $\text{Binomial}(n, p) \to \text{Poisson}(\lambda)$
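A quick numerical sketch of this limit, with illustrative values $n = 1000$, $p = 0.003$ (so $np = 3$): the two pmfs should already agree closely.

```python
from math import comb, exp, factorial

# Poisson(np) approximating Binomial(n, p) for large n and small p.
n, p = 1000, 0.003   # illustrative values, lambda = np = 3
lam = n * p
binom = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(6)]
pois = [exp(-lam) * lam**k / factorial(k) for k in range(6)]
max_err = max(abs(b - q) for b, q in zip(binom, pois))
print(round(max_err, 5))  # largest pmf discrepancy over k = 0..5
```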

Binomial → Normal: For large $n$: $\dfrac{X - np}{\sqrt{np(1-p)}} \to \mathcal{N}(0,1)$, where $X \sim \text{Binomial}(n,p)$

Poisson → Normal: For large $\lambda$: $\dfrac{X - \lambda}{\sqrt{\lambda}} \to \mathcal{N}(0,1)$, where $X \sim \text{Poisson}(\lambda)$

Exponential → Gamma: The sum of $n$ independent $\text{Exponential}(\lambda)$ random variables is $\text{Gamma}(n, \lambda)$
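A simulation sketch of this relationship, with illustrative values $\lambda = 2$, $n = 3$: the sum should match the $\text{Gamma}(3, 2)$ mean $n/\lambda = 1.5$ and variance $n/\lambda^2 = 0.75$.

```python
import random
import statistics

random.seed(2)
lam, n, trials = 2.0, 3, 100_000  # illustrative parameters
# Each draw is a sum of n independent Exp(lam) waiting times,
# which should be distributed as Gamma(n, lam).
sums = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]
m = statistics.fmean(sums)       # Gamma(3, 2) mean: n/lam = 1.5
v = statistics.pvariance(sums)   # Gamma(3, 2) variance: n/lam^2 = 0.75
print(round(m, 2), round(v, 2))
```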

Chi-squared: If $Z_i \sim \mathcal{N}(0,1)$ independently: $\sum_{i=1}^n Z_i^2 \sim \chi^2_n = \text{Gamma}(n/2, 1/2)$
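The chi-squared construction can likewise be sketched by simulation (with $n = 4$ as an illustrative choice): sums of $n$ squared standard normals should have mean $n$, matching $\chi^2_n$.

```python
import random
import statistics

random.seed(1)
n, trials = 4, 100_000  # illustrative degrees of freedom
# Each draw is a sum of n squared standard normals, i.e. a chi^2_n sample.
sums = [sum(random.gauss(0, 1) ** 2 for _ in range(n)) for _ in range(trials)]
m = statistics.fmean(sums)  # chi^2_4 mean: n = 4
print(round(m, 1))
```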

Example

For $\text{Binomial}(100, 0.5)$:

  • Exact: $P(X = 55) = \binom{100}{55} (0.5)^{100} \approx 0.0485$
  • Normal approximation with continuity correction: $P(54.5 < Y < 55.5)$ where $Y \sim \mathcal{N}(50, 25)$: $\Phi\left(\frac{55.5-50}{5}\right) - \Phi\left(\frac{54.5-50}{5}\right) = \Phi(1.1) - \Phi(0.9) \approx 0.0484$

Excellent approximation!
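Both numbers in this example are reproducible in a few lines, using the error function for the standard normal CDF:

```python
from math import comb, erf, sqrt

def phi(z):
    # Standard normal CDF, written in terms of the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

n, p, k = 100, 0.5, 55
exact = comb(n, k) * p**n                     # exact Binomial(100, 0.5) pmf at 55
mu, sigma = n * p, sqrt(n * p * (1 - p))      # mu = 50, sigma = 5
# Normal approximation with continuity correction: P(54.5 < Y < 55.5).
approx = phi((k + 0.5 - mu) / sigma) - phi((k - 0.5 - mu) / sigma)
print(round(exact, 4), round(approx, 4))
```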

Remark

These relationships reveal the deep interconnections in probability theory. Distributions don't exist in isolation—they form a coherent mathematical structure where one distribution approximates another under limiting conditions, and transformations map one family to another.