
Joint and Conditional Distributions - Core Definitions

When analyzing multiple random variables simultaneously, joint distributions capture their collective behavior and dependencies.

Joint Distributions

Definition

For discrete random variables $X$ and $Y$, the joint PMF is $p_{X,Y}(x,y) = P(X = x, Y = y)$.

For continuous random variables, the joint PDF $f_{X,Y}(x,y)$ satisfies $P((X,Y) \in A) = \iint_A f_{X,Y}(x,y) \, dx \, dy$.

The joint CDF is $F_{X,Y}(x,y) = P(X \leq x, Y \leq y)$.

Normalization: $\sum_x \sum_y p_{X,Y}(x,y) = 1$ (discrete) or $\iint f_{X,Y}(x,y) \, dx \, dy = 1$ (continuous).

Example

Toss two fair dice. Let $X$ = first die, $Y$ = second die.

Joint PMF: $p_{X,Y}(i,j) = 1/36$ for $i, j \in \{1,2,3,4,5,6\}$.

Probability both dice show even: $P(X \text{ even}, Y \text{ even}) = \sum_{i \in \{2,4,6\}} \sum_{j \in \{2,4,6\}} \frac{1}{36} = \frac{9}{36} = \frac{1}{4}$.
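This sum can be checked by brute-force enumeration. A minimal Python sketch using exact fractions (the variable names are illustrative, not from the source):

```python
from fractions import Fraction

# Joint PMF of two fair dice: p(i, j) = 1/36 for i, j in {1, ..., 6}.
joint = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

# Normalization: the 36 probabilities sum to 1.
total = sum(joint.values())
assert total == 1

# P(X even, Y even): sum the joint PMF over even outcomes of both dice.
p_both_even = sum(p for (i, j), p in joint.items()
                  if i % 2 == 0 and j % 2 == 0)
print(p_both_even)  # 1/4
```

Using `Fraction` avoids floating-point round-off, so the normalization check is exact.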

Marginal Distributions

Definition

The marginal distributions are obtained by summing/integrating over the other variable:

Discrete: $p_X(x) = \sum_y p_{X,Y}(x,y)$ and $p_Y(y) = \sum_x p_{X,Y}(x,y)$.

Continuous: $f_X(x) = \int f_{X,Y}(x,y) \, dy$ and $f_Y(y) = \int f_{X,Y}(x,y) \, dx$.

The marginal distribution of $X$ ignores $Y$, giving the distribution of $X$ alone.
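For the two-dice joint PMF from the earlier example, the marginal sums can be computed directly. A short sketch:

```python
from fractions import Fraction

# Joint PMF of two fair dice (from the earlier example): p(i, j) = 1/36.
joint = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

# Marginal of X: sum out Y.  Marginal of Y: sum out X.
p_X = {i: sum(joint[(i, j)] for j in range(1, 7)) for i in range(1, 7)}
p_Y = {j: sum(joint[(i, j)] for i in range(1, 7)) for j in range(1, 7)}

# Each marginal is uniform on {1, ..., 6}, as expected for a fair die.
print(p_X[3], p_Y[5])  # 1/6 1/6
```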

Example

Uniform on a Triangle: $(X,Y)$ uniform on the triangle $\{(x,y) : 0 \leq x \leq 1,\ 0 \leq y \leq x\}$.

Joint PDF: $f_{X,Y}(x,y) = 2$ for $0 \leq y \leq x \leq 1$ (the triangle has area $1/2$, so the constant density is $2$).

Marginal of $X$: $f_X(x) = \int_0^x 2 \, dy = 2x, \quad 0 \leq x \leq 1$.

Marginal of $Y$: $f_Y(y) = \int_y^1 2 \, dx = 2(1-y), \quad 0 \leq y \leq 1$.
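The two integrals above can be sanity-checked numerically. A sketch using a midpoint Riemann sum in plain Python (the function names and grid size `n` are arbitrary choices for illustration):

```python
# Sanity-check the triangle marginals by numeric integration (midpoint rule).
def joint_pdf(x, y):
    # Uniform density 2 on the triangle 0 <= y <= x <= 1, zero elsewhere.
    return 2.0 if 0.0 <= y <= x <= 1.0 else 0.0

def marginal_x(x, n=10_000):
    # f_X(x) = integral of f_{X,Y}(x, y) over y in [0, 1]
    dy = 1.0 / n
    return sum(joint_pdf(x, (k + 0.5) * dy) * dy for k in range(n))

def marginal_y(y, n=10_000):
    # f_Y(y) = integral of f_{X,Y}(x, y) over x in [0, 1]
    dx = 1.0 / n
    return sum(joint_pdf((k + 0.5) * dx, y) * dx for k in range(n))

for t in (0.2, 0.5, 0.8):
    assert abs(marginal_x(t) - 2 * t) < 1e-3        # f_X(x) = 2x
    assert abs(marginal_y(t) - 2 * (1 - t)) < 1e-3  # f_Y(y) = 2(1 - y)
```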

Conditional Distributions

Definition

The conditional PMF of $X$ given $Y = y$ is $p_{X|Y}(x|y) = \dfrac{p_{X,Y}(x,y)}{p_Y(y)}$, defined where $p_Y(y) > 0$.

The conditional PDF is $f_{X|Y}(x|y) = \dfrac{f_{X,Y}(x,y)}{f_Y(y)}$, defined where $f_Y(y) > 0$.

Conditional distributions represent the distribution of $X$ given knowledge that $Y = y$.

Example

For the triangle example: $f_{X|Y}(x|y) = \dfrac{f_{X,Y}(x,y)}{f_Y(y)} = \dfrac{2}{2(1-y)} = \dfrac{1}{1-y}, \quad y \leq x \leq 1$.

Given $Y = y$, $X$ is uniform on $[y, 1]$: the conditional density is constant in $x$.
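This conditional uniformity can also be seen empirically. A Monte Carlo sketch (the band width `eps` and $y_0 = 0.5$ are arbitrary illustrative choices) that samples the triangle by rejection and conditions on $Y$ lying in a thin band around $y_0$:

```python
import random

random.seed(42)

# Sample (X, Y) uniform on the triangle 0 <= y <= x <= 1 by rejection from
# the unit square, keeping only draws with Y in a thin band around y0.
y0, eps = 0.5, 0.01
conditioned = []
while len(conditioned) < 5_000:
    x, y = random.random(), random.random()
    if y <= x and abs(y - y0) < eps:  # inside triangle, and Y near y0
        conditioned.append(x)

# If X | Y = y0 is Uniform[y0, 1], its mean is (y0 + 1) / 2 = 0.75.
mean_x = sum(conditioned) / len(conditioned)
assert abs(mean_x - 0.75) < 0.02
```

Conditioning on a thin band rather than the exact event $\{Y = y_0\}$ (which has probability zero) is the standard way to approximate a continuous conditional by simulation.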

Remark

Joint distributions encode all information about multiple random variables. Marginals describe individual behavior, while conditionals describe how one variable behaves given knowledge of another. Together, these concepts form the foundation for multivariate probability theory.