Joint and Conditional Distributions - Core Definitions
When analyzing multiple random variables simultaneously, joint distributions capture their collective behavior and dependencies.
Joint Distributions
For discrete random variables $X$ and $Y$, the joint PMF is:
$$p_{X,Y}(x, y) = P(X = x, Y = y)$$
For continuous random variables, the joint PDF $f_{X,Y}$ satisfies:
$$P((X, Y) \in A) = \iint_A f_{X,Y}(x, y)\, dx\, dy$$
The joint CDF is:
$$F_{X,Y}(x, y) = P(X \le x, Y \le y)$$
Normalization: $\sum_x \sum_y p_{X,Y}(x, y) = 1$ (discrete) or $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx\, dy = 1$ (continuous)
Toss two dice. Let $X$ = first die, $Y$ = second die.
Joint PMF: $p_{X,Y}(x, y) = \frac{1}{36}$ for $x, y \in \{1, 2, \dots, 6\}$
Probability both dice show even: $P(X \text{ even}, Y \text{ even}) = \frac{9}{36} = \frac{1}{4}$
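The dice example can be checked directly by enumerating all 36 equally likely outcomes; a minimal sketch in Python using exact fractions:

```python
from fractions import Fraction

# Joint PMF of two fair dice: p(x, y) = 1/36 for x, y in {1, ..., 6}.
joint = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}

# Normalization check: the joint PMF sums to 1 over the whole support.
assert sum(joint.values()) == 1

# P(X even, Y even): sum the joint PMF over the event of interest.
p_both_even = sum(p for (x, y), p in joint.items()
                  if x % 2 == 0 and y % 2 == 0)
print(p_both_even)  # 1/4
```

Summing the joint PMF over an event is exactly how any probability statement about $(X, Y)$ is computed in the discrete case.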
Marginal Distributions
The marginal distributions are obtained by summing/integrating over the other variable:
Discrete: $p_X(x) = \sum_y p_{X,Y}(x, y)$, $\quad p_Y(y) = \sum_x p_{X,Y}(x, y)$
Continuous: $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy$, $\quad f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx$
The marginal distribution of $X$ ignores $Y$, giving the distribution of $X$ alone.
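Computing a marginal means summing the joint PMF over the other variable. A sketch with a small joint PMF (the table below is invented purely for illustration):

```python
from collections import defaultdict
from fractions import Fraction

# Hypothetical joint PMF on {0, 1} x {0, 1}, stored as {(x, y): probability}.
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
    (1, 0): Fraction(2, 8), (1, 1): Fraction(2, 8),
}

# Marginal of X: sum over y. Marginal of Y: sum over x.
p_X = defaultdict(Fraction)
p_Y = defaultdict(Fraction)
for (x, y), p in joint.items():
    p_X[x] += p
    p_Y[y] += p

print(dict(p_X))  # X turns out fair here: p_X(0) = p_X(1) = 1/2
print(dict(p_Y))  # p_Y(0) = 3/8, p_Y(1) = 5/8
```

Note that each marginal sums to 1 on its own, since summing a marginal just finishes summing the joint PMF over the whole support.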
Uniform on Triangle: $(X, Y)$ uniform on the triangle $\{(x, y) : 0 < y < x < 1\}$.
Joint PDF: $f_{X,Y}(x, y) = 2$ for $0 < y < x < 1$ (area = 1/2, so density = 2)
Marginal of $X$: $f_X(x) = \int_0^x 2\, dy = 2x$, for $0 < x < 1$
Marginal of $Y$: $f_Y(y) = \int_y^1 2\, dx = 2(1 - y)$, for $0 < y < 1$
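The triangle marginals can be sanity-checked by Monte Carlo, assuming the triangle is the region 0 < y < x < 1 (rejection sampling from the unit square gives uniform points on it):

```python
import random

# Sample uniformly from the triangle 0 < y < x < 1 by rejecting
# points of the unit square that fall above the diagonal.
random.seed(0)
n = 100_000
xs = []
while len(xs) < n:
    x, y = random.random(), random.random()
    if y < x:
        xs.append(x)

# If f_X(x) = 2x, then P(X <= 1/2) = integral of 2x from 0 to 1/2 = 1/4.
frac = sum(1 for x in xs if x <= 0.5) / n
print(round(frac, 2))  # ~0.25
```

The empirical fraction landing in $[0, 1/2]$ matches the CDF value predicted by the marginal density $2x$, not the $1/2$ one would expect if $X$ were uniform.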
Conditional Distributions
The conditional PMF of $Y$ given $X = x$ is:
$$p_{Y|X}(y \mid x) = \frac{p_{X,Y}(x, y)}{p_X(x)}, \quad \text{provided } p_X(x) > 0$$
The conditional PDF is:
$$f_{Y|X}(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)}, \quad \text{provided } f_X(x) > 0$$
Conditional distributions represent the distribution of $Y$ given knowledge of $X$.
For the triangle example:
$$f_{Y|X}(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)} = \frac{2}{2x} = \frac{1}{x} \quad \text{for } 0 < y < x$$
Given $X = x$, $Y$ is uniform on $(0, x)$!
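The conditional-uniform claim can be checked empirically by conditioning on a thin slab around a fixed value of X; this sketch assumes the triangle 0 < y < x < 1 with joint density 2:

```python
import random

# Keep uniform points of the triangle 0 < y < x < 1 whose X-coordinate
# falls in a thin slab around x0, approximating conditioning on X = x0.
random.seed(1)
x0, eps = 0.6, 0.01
ys = []
while len(ys) < 20_000:
    x, y = random.random(), random.random()
    if y < x and abs(x - x0) < eps:
        ys.append(y)

# If Y | X = x0 is uniform on (0, x0), then E[Y | X = x0] = x0 / 2 = 0.3.
mean_y = sum(ys) / len(ys)
print(round(mean_y, 2))  # ~0.3
```

Conditioning on a slab rather than the exact event X = x0 is the standard workaround for continuous variables, since P(X = x0) = 0; shrinking eps tightens the approximation at the cost of more rejected samples.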
Joint distributions encode all information about multiple random variables. Marginals describe individual behavior, while conditionals describe how one variable behaves given knowledge of another. Together, these concepts form the foundation for multivariate probability theory.