
Joint and Conditional Distributions - Examples and Constructions

Joint distributions model real-world scenarios where multiple measurements or observations occur simultaneously.

Multinomial Distribution

Definition

Extension of the binomial to $k$ categories. If each of $n$ independent trials results in category $i$ with probability $p_i$ (where $\sum p_i = 1$), the counts $(X_1, \ldots, X_k)$ follow:

$$P(X_1=n_1, \ldots, X_k=n_k) = \frac{n!}{n_1! \cdots n_k!}\, p_1^{n_1} \cdots p_k^{n_k}$$

where $\sum n_i = n$.

Example

Roll a fair die 12 times. Let $X_i$ be the count of outcome $i$.

Probability of getting each face exactly twice:

$$P(X_1=2, \ldots, X_6=2) = \frac{12!}{(2!)^6} \left(\frac{1}{6}\right)^{12} \approx 0.00344$$
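As a quick numerical check, the PMF above can be evaluated directly from the formula; a minimal Python sketch (the helper name `multinomial_pmf` is ours, not a library function):

```python
import math

def multinomial_pmf(counts, probs):
    """P(X_1=n_1, ..., X_k=n_k) = n!/(n_1!...n_k!) * p_1^n_1 * ... * p_k^n_k."""
    n = sum(counts)
    coef = math.factorial(n)
    for c in counts:
        coef //= math.factorial(c)  # multinomial coefficient
    prob = float(coef)
    for c, p in zip(counts, probs):
        prob *= p ** c
    return prob

# Each face exactly twice in 12 rolls of a fair die
p = multinomial_pmf([2] * 6, [1 / 6] * 6)
print(f"{p:.5f}")  # ≈ 0.00344
```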

Bivariate Transformations

For $U = g_1(X,Y)$ and $V = g_2(X,Y)$ with an invertible transformation:

$$f_{U,V}(u,v) = f_{X,Y}(x,y)\,|J|$$

where $x = x(u,v)$ and $y = y(u,v)$ denote the inverse transformation and $J$ is its Jacobian determinant:

$$J = \begin{vmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{vmatrix}$$

Example

Polar coordinates: $X, Y \sim \mathcal{N}(0,1)$, independent.

Transform to $(R, \Theta)$, where $X = R\cos\Theta$ and $Y = R\sin\Theta$. The Jacobian of the inverse map is $r$, so:

$$f_{R,\Theta}(r, \theta) = f_{X,Y}(r\cos\theta, r\sin\theta) \cdot r$$

Since $f_{X,Y}(x,y) = \frac{1}{2\pi}e^{-(x^2+y^2)/2}$ and $x^2 + y^2 = r^2$:

$$f_{R,\Theta}(r,\theta) = \frac{r}{2\pi}e^{-r^2/2}$$

The joint density factors into a function of $r$ times a function of $\theta$, so $R$ and $\Theta$ are independent: $R \sim \text{Rayleigh}$ and $\Theta \sim \text{Uniform}(0, 2\pi)$.
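A quick Monte Carlo check of this factorization, using only the standard library (the sample size and tolerances are arbitrary choices):

```python
import math
import random

random.seed(0)
n = 100_000
rs, thetas = [], []
for _ in range(n):
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    rs.append(math.hypot(x, y))                       # R = sqrt(X^2 + Y^2)
    thetas.append(math.atan2(y, x) % (2 * math.pi))   # Theta in [0, 2*pi)

# Theta ~ Uniform(0, 2*pi): sample mean should be near pi
mean_theta = sum(thetas) / n
# R ~ Rayleigh: P(R <= 1) = 1 - exp(-1/2) ≈ 0.3935
frac_below_1 = sum(r <= 1 for r in rs) / n
print(mean_theta, frac_below_1)
```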

Order Statistics

For an IID sample $X_1, \ldots, X_n$ with common density $f$, the joint PDF of the order statistics $(X_{(1)}, \ldots, X_{(n)})$ is:

$$f_{(1),\ldots,(n)}(x_1, \ldots, x_n) = n! \prod_{i=1}^n f(x_i)$$

for $x_1 < x_2 < \cdots < x_n$.

Example

For $n = 3$ Uniform$(0,1)$ variables:

$$f_{(1),(2),(3)}(x,y,z) = 6 \quad \text{for } 0 < x < y < z < 1$$

The joint density is constant over the simplex.
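One consequence worth checking by simulation: for Uniform$(0,1)$ order statistics, $E[X_{(k)}] = k/(n+1)$ (a standard fact, not derived above), so for $n = 3$ the expected order statistics are $1/4$, $1/2$, $3/4$. A sketch:

```python
import random

random.seed(1)
trials = 200_000
sums = [0.0, 0.0, 0.0]
for _ in range(trials):
    x = sorted(random.random() for _ in range(3))  # order statistics of 3 uniforms
    for i in range(3):
        sums[i] += x[i]

means = [s / trials for s in sums]
print(means)  # ≈ [0.25, 0.50, 0.75], matching E[X_(k)] = k/(n+1)
```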

Copulas

Definition

A copula is a joint distribution function on $[0,1]^2$ with Uniform$(0,1)$ marginals. It separates the dependence structure from the marginal distributions.

For any joint CDF $F_{X,Y}$ with marginals $F_X$ and $F_Y$ (Sklar's theorem):

$$F_{X,Y}(x,y) = C(F_X(x), F_Y(y))$$

where $C$ is the copula function.

Common copulas: Gaussian, Student-$t$, and the Archimedean family (Clayton, Gumbel, Frank).
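To make the separation concrete, here is a sketch of sampling from a Gaussian copula using only the standard library; the correlation value `rho = 0.8` and the helper names `norm_cdf` and `corr` are our illustrative choices, not fixed notation:

```python
import math
import random

random.seed(2)

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def corr(a, b):
    """Sample Pearson correlation."""
    m = len(a)
    ma, mb = sum(a) / m, sum(b) / m
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

rho = 0.8  # assumed dependence parameter, chosen for illustration
n = 50_000
us, vs = [], []
for _ in range(n):
    z1 = random.gauss(0, 1)
    # Correlated standard normal via the 2x2 Cholesky factor
    z2 = rho * z1 + math.sqrt(1 - rho ** 2) * random.gauss(0, 1)
    # Applying the normal CDF makes each marginal Uniform(0,1),
    # while the dependence induced by rho survives
    us.append(norm_cdf(z1))
    vs.append(norm_cdf(z2))

mean_u = sum(us) / n  # near 0.5: uniform marginal, dependence lives in C
```

The resulting pairs `(us, vs)` have uniform marginals but remain strongly positively dependent; feeding them through arbitrary inverse marginal CDFs would attach any desired marginals to this same dependence structure.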

Remark

Copulas enable flexible modeling by separating "what" variables we're modeling (marginals) from "how" they're related (dependence structure). This is powerful in finance for modeling joint tail risk.