
Random Variables and Distributions - Applications

Random variable transformations and distribution relationships provide powerful tools for deriving new distributions from known ones. These connections simplify calculations and reveal deep mathematical structure.

Probability Integral Transform

Theorem

Let $X$ be a continuous random variable with CDF $F_X$ (assumed strictly increasing and continuous). Then: $U = F_X(X) \sim \text{Uniform}(0,1)$

Conversely, if $U \sim \text{Uniform}(0,1)$, then $X = F_X^{-1}(U)$ has CDF $F_X$.

Proof: Let $U = F_X(X)$. For $0 \leq u \leq 1$: $P(U \leq u) = P(F_X(X) \leq u) = P(X \leq F_X^{-1}(u)) = F_X(F_X^{-1}(u)) = u$

This is the CDF of $\text{Uniform}(0,1)$. □
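
The theorem can be checked empirically in Python; a minimal sketch, where the choice of $\text{Exponential}(1)$, the sample size, and the seed are illustrative, not from the text. If $X \sim \text{Exponential}(1)$ with CDF $F(x) = 1 - e^{-x}$, then $U = F(X)$ should be $\text{Uniform}(0,1)$, i.e. have mean $1/2$ and variance $1/12$:

```python
import math
import random

random.seed(0)
n = 100_000

# Draw X ~ Exponential(1), then push each sample through its own CDF.
xs = [random.expovariate(1.0) for _ in range(n)]
us = [1.0 - math.exp(-x) for x in xs]  # U = F(X) = 1 - e^{-X}

mean_u = sum(us) / n
var_u = sum((u - mean_u) ** 2 for u in us) / n
print(mean_u, var_u)  # expect roughly 0.5 and 1/12 ≈ 0.0833
```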

Example

Inverse Transform Sampling: To generate samples from any distribution with CDF $F_X$ using a standard uniform random number generator:

  1. Generate $U \sim \text{Uniform}(0,1)$
  2. Return $X = F_X^{-1}(U)$

For $\text{Exponential}(\lambda)$: $F_X(x) = 1 - e^{-\lambda x}$, so: $X = F_X^{-1}(U) = -\frac{1}{\lambda} \ln(1-U)$

Since $1-U \sim \text{Uniform}(0,1)$ when $U$ does, we can use $X = -\frac{1}{\lambda} \ln U$.
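
The two steps above translate directly to code; a minimal Python sketch (the value of $\lambda$, the sample size, and the seed are illustrative choices):

```python
import math
import random

def sample_exponential(lam, rng=random):
    """Inverse transform sampling: X = -ln(1-U)/lam has CDF 1 - e^{-lam*x}."""
    u = rng.random()  # U ~ Uniform[0, 1)
    return -math.log(1.0 - u) / lam  # 1-U lies in (0, 1], so the log is safe

random.seed(1)
lam = 2.0
samples = [sample_exponential(lam) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(mean)  # should be close to E[X] = 1/lam = 0.5
```

Using `-math.log(1.0 - u)` rather than `-math.log(u)` avoids `log(0)` because Python's `random()` can return exactly 0 but never 1.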

Order Statistics

Definition

Given random variables $X_1, \ldots, X_n$, the order statistics $X_{(1)} \leq X_{(2)} \leq \cdots \leq X_{(n)}$ are the same values arranged in increasing order.

  • $X_{(1)} = \min\{X_1, \ldots, X_n\}$ (minimum)
  • $X_{(n)} = \max\{X_1, \ldots, X_n\}$ (maximum)
  • $X_{(k)}$ is the $k$-th smallest value

Theorem

If $X_1, \ldots, X_n$ are IID with PDF $f$ and CDF $F$, then $X_{(k)}$ has PDF: $f_{(k)}(x) = \frac{n!}{(k-1)!(n-k)!} F(x)^{k-1} [1-F(x)]^{n-k} f(x)$

Example

For $n$ IID uniform random variables on $[0,1]$:

  • $X_{(1)}$ (minimum) has PDF $f_{(1)}(x) = n(1-x)^{n-1}$
  • $X_{(n)}$ (maximum) has PDF $f_{(n)}(x) = nx^{n-1}$
  • In general $X_{(k)} \sim \text{Beta}(k,\, n-k+1)$, so the median $X_{(\lceil n/2 \rceil)}$ has a Beta distribution
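
A Monte Carlo sanity check of the minimum and maximum densities above ($n = 5$, the trial count, and the seed are illustrative choices): integrating $x f_{(1)}(x)$ and $x f_{(n)}(x)$ over $[0,1]$ gives $E[X_{(1)}] = 1/(n+1)$ and $E[X_{(n)}] = n/(n+1)$.

```python
import random

random.seed(2)
n, trials = 5, 100_000
mins, maxs = [], []
for _ in range(trials):
    xs = [random.random() for _ in range(n)]  # n IID Uniform(0,1) draws
    mins.append(min(xs))
    maxs.append(max(xs))

mean_min = sum(mins) / trials
mean_max = sum(maxs) / trials
print(mean_min, mean_max)  # expect roughly 1/6 ≈ 0.167 and 5/6 ≈ 0.833
```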

Box-Muller Transform

Theorem

If $U_1, U_2 \sim \text{Uniform}(0,1)$ are independent, then: $Z_1 = \sqrt{-2\ln U_1}\, \cos(2\pi U_2)$ and $Z_2 = \sqrt{-2\ln U_1}\, \sin(2\pi U_2)$

are independent standard normal random variables: $Z_1, Z_2 \sim \mathcal{N}(0,1)$.

This elegant result provides a method to generate normal random variables from uniform ones.
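
A minimal Python sketch of the transform (the sample size and seed are illustrative choices, and shifting $U_1$ away from 0 is a standard numerical precaution):

```python
import math
import random

def box_muller(rng=random):
    """Return a pair of independent standard normals from two uniforms."""
    u1 = 1.0 - rng.random()  # shift [0,1) to (0,1] so log(0) cannot occur
    u2 = rng.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

random.seed(3)
zs = [z for _ in range(50_000) for z in box_muller()]  # 100,000 normals
mean = sum(zs) / len(zs)
var = sum((z - mean) ** 2 for z in zs) / len(zs)
print(mean, var)  # expect roughly 0 and 1
```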

Reliability and Hazard Functions

Definition

For a non-negative random variable $X$ (e.g., a lifetime):

The reliability function (survival function) is: $R(t) = P(X > t) = 1 - F_X(t)$

The hazard function (failure rate) is: $h(t) = \frac{f_X(t)}{R(t)} = \frac{f_X(t)}{1 - F_X(t)}$

The hazard function represents the instantaneous failure rate given survival to time $t$.

Example

For $X \sim \text{Exponential}(\lambda)$: $R(t) = e^{-\lambda t}, \quad h(t) = \frac{\lambda e^{-\lambda t}}{e^{-\lambda t}} = \lambda$

The constant hazard rate reflects the memoryless property: the failure rate doesn't depend on age.
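
The constant hazard can be confirmed numerically; a minimal sketch, where the value of $\lambda$ and the evaluation points are arbitrary:

```python
import math

lam = 1.5

def pdf(t):
    return lam * math.exp(-lam * t)   # f_X(t) for Exponential(lam)

def surv(t):
    return math.exp(-lam * t)         # R(t) = 1 - F_X(t)

def hazard(t):
    return pdf(t) / surv(t)           # h(t) = f_X(t) / R(t)

for t in (0.1, 1.0, 5.0):
    print(t, hazard(t))  # each hazard value equals lam = 1.5
```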

For the Weibull distribution with shape parameter $\beta$:

  • $\beta < 1$: decreasing failure rate (infant mortality)
  • $\beta = 1$: constant failure rate (exponential distribution)
  • $\beta > 1$: increasing failure rate (wear-out failures)
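
These three regimes can be illustrated directly from the Weibull hazard. With the scale parameter fixed at 1 for simplicity (an illustrative simplification; the general form is $h(t) = (\beta/\eta)(t/\eta)^{\beta-1}$), the hazard reduces to $h(t) = \beta t^{\beta-1}$:

```python
def weibull_hazard(t, beta):
    """Weibull hazard h(t) = beta * t**(beta-1), scale parameter fixed at 1."""
    return beta * t ** (beta - 1)

ts = [0.5, 1.0, 2.0]
print([weibull_hazard(t, 0.5) for t in ts])  # decreasing: infant mortality
print([weibull_hazard(t, 1.0) for t in ts])  # constant 1.0: exponential case
print([weibull_hazard(t, 2.0) for t in ts])  # increasing: wear-out failures
```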

Remark

These transformations and relationships form a toolkit for both theoretical analysis and practical simulation. The probability integral transform enables Monte Carlo simulation, order statistics are central to non-parametric statistics, and hazard functions are fundamental in reliability engineering and survival analysis.