
# Random Variables and Distributions - Main Theorem

The Law of the Unconscious Statistician provides a fundamental method for computing expectations of transformed random variables without explicitly finding the distribution of the transformation.

## Law of the Unconscious Statistician (LOTUS)

<Theorem>
Let $X$ be a random variable and $g: \mathbb{R} \to \mathbb{R}$ a function. Then:

**Discrete case**: If $X$ has PMF $p_X$:

$$E[g(X)] = \sum_x g(x) p_X(x)$$

**Continuous case**: If $X$ has PDF $f_X$:

$$E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x) \, dx$$

provided the sum or integral converges absolutely.
</Theorem>

The theorem states that we can compute $E[g(X)]$ directly from the distribution of $X$ without first deriving the distribution of $Y = g(X)$.

**Proof Sketch (Continuous Case)**: Let $Y = g(X)$. By definition:

$$E[Y] = \int_{-\infty}^{\infty} y f_Y(y) \, dy$$

Through change of variables and the transformation formula for PDFs, this equals:

$$\int_{-\infty}^{\infty} g(x) f_X(x) \, dx \qquad \square$$
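The discrete case of LOTUS is easy to see in action. The sketch below (an illustration not taken from the text, using a fair six-sided die and $g(x) = x^2$ as assumed inputs) computes $E[g(X)]$ directly from the PMF of $X$:

```python
# LOTUS, discrete case: E[g(X)] = sum_x g(x) p_X(x), computed straight
# from the PMF of X -- no need for the distribution of g(X) itself.
from fractions import Fraction

def lotus_discrete(g, pmf):
    """E[g(X)] for a PMF given as a dict {x: P(X = x)}."""
    return sum(g(x) * p for x, p in pmf.items())

# Assumed example: X = outcome of a fair six-sided die, g(x) = x^2.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
second_moment = lotus_discrete(lambda x: x * x, pmf)
print(second_moment)  # 91/6
```

Using exact `Fraction` arithmetic makes the result verifiable by hand: $(1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6$.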

<Example>
Let $X \sim \mathcal{N}(0,1)$ (standard normal). Find $E[X^2]$.

Using LOTUS:

$$E[X^2] = \int_{-\infty}^{\infty} x^2 \cdot \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \, dx$$

By integration by parts, or by recognizing this as the variance of a standard normal: $E[X^2] = 1$.

Without LOTUS, we would need to first find the distribution of $Y = X^2$ (a chi-squared distribution), then compute its expectation, which is much more work!
</Example>
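The integral above can be checked numerically. This sketch (an illustrative check, not part of the text; the truncation interval $[-10, 10]$ and grid size are assumptions) applies the trapezoidal rule to $x^2 f_X(x)$:

```python
# Numerical check that E[X^2] = 1 for X ~ N(0, 1), via LOTUS:
# integrate x^2 * phi(x) over a wide interval on a fine grid.
import math

def normal_pdf(x):
    """Standard normal density phi(x)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def trapezoid(f, a, b, n=200_000):
    """Composite trapezoidal rule for the integral of f on [a, b]."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# The tail beyond |x| = 10 contributes a negligible amount.
second_moment = trapezoid(lambda x: x * x * normal_pdf(x), -10, 10)
print(round(second_moment, 6))  # 1.0
```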

## Applications of LOTUS

**Computing Moments**: For the $k$-th moment of $X$:

$$E[X^k] = \begin{cases} \sum_x x^k p_X(x) & \text{(discrete)} \\ \int_{-\infty}^{\infty} x^k f_X(x) \, dx & \text{(continuous)} \end{cases}$$

**Variance**: Using $\text{Var}(X) = E[X^2] - (E[X])^2$:

$$\text{Var}(X) = E[X^2] - \mu^2 = \int (x-\mu)^2 f_X(x) \, dx$$

<Example>
For $X \sim \text{Exponential}(\lambda)$ with $f_X(x) = \lambda e^{-\lambda x}$ for $x \geq 0$:

$$E[X] = \int_0^{\infty} x \cdot \lambda e^{-\lambda x} \, dx = \frac{1}{\lambda}$$

$$E[X^2] = \int_0^{\infty} x^2 \cdot \lambda e^{-\lambda x} \, dx = \frac{2}{\lambda^2}$$

$$\text{Var}(X) = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}$$
</Example>

## Moment Generating Function

<Definition>
The **moment generating function** (MGF) of $X$ is:

$$M_X(t) = E[e^{tX}]$$

when this expectation exists for $t$ in some neighborhood of 0.
</Definition>

By LOTUS:

$$M_X(t) = \begin{cases} \sum_x e^{tx} p_X(x) & \text{(discrete)} \\ \int_{-\infty}^{\infty} e^{tx} f_X(x) \, dx & \text{(continuous)} \end{cases}$$

The MGF "generates moments": $E[X^k] = M_X^{(k)}(0)$ (the $k$-th derivative at 0).

<Example>
For $X \sim \mathcal{N}(\mu, \sigma^2)$:

$$M_X(t) = e^{\mu t + \sigma^2 t^2/2}$$

Taking derivatives:

- $E[X] = M_X'(0) = \mu$ ✓
- $E[X^2] = M_X''(0) = \mu^2 + \sigma^2$, so $\text{Var}(X) = \sigma^2$ ✓
</Example>

<Remark>
LOTUS is one of the most frequently used results in probability. Its name derives from the fact that early statisticians would apply it "unconsciously"—computing $E[g(X)]$ without realizing they were using a theorem. The MGF is particularly powerful because it uniquely determines a distribution and simplifies calculations involving sums of independent random variables.
</Remark>
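The moment-generating identity $E[X^k] = M_X^{(k)}(0)$ can be sanity-checked numerically. This sketch (illustrative only; the parameter values $\mu = 1.5$, $\sigma = 2$ and the step size $h$ are assumptions) approximates the first two derivatives of the normal MGF at 0 by finite differences:

```python
# Finite-difference check that the normal MGF generates moments:
# for X ~ N(mu, sigma^2), M_X(t) = exp(mu t + sigma^2 t^2 / 2),
# so M'(0) should be mu and M''(0) should be mu^2 + sigma^2.
import math

mu, sigma = 1.5, 2.0  # assumed example parameters

def M(t):
    return math.exp(mu * t + sigma ** 2 * t ** 2 / 2)

h = 1e-4
first = (M(h) - M(-h)) / (2 * h)             # central difference ~ M'(0)
second = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # second difference ~ M''(0)

print(round(first, 4), round(second, 4))  # 1.5 6.25
```

With these parameters, $E[X] = \mu = 1.5$ and $E[X^2] = \mu^2 + \sigma^2 = 6.25$, so $\text{Var}(X) = \sigma^2 = 4$, matching the derivatives computed in the Example above.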