Expectation and Variance - Core Definitions
Expectation and variance are the two most important numerical summaries of a distribution, characterizing its center and spread respectively.
Expectation (Expected Value)
The expectation (or expected value or mean) of a random variable $X$ is:
Discrete case: $E[X] = \sum_{x} x \, P(X = x)$
Continuous case: $E[X] = \int_{-\infty}^{\infty} x \, f(x) \, dx$, where $f$ is the probability density function of $X$,
provided the sum or integral converges absolutely.
The expectation represents the "average" value of $X$ over many independent repetitions. It is also denoted $\mu$ or $\mu_X$.
Fair Die: $X$ with $P(X = k) = 1/6$ for $k = 1, \dots, 6$:
$E[X] = \frac{1}{6}(1 + 2 + 3 + 4 + 5 + 6) = \frac{21}{6} = 3.5$
Exponential Distribution: $X$ with density $f(x) = \lambda e^{-\lambda x}$ for $x \geq 0$:
$E[X] = \int_{0}^{\infty} x \, \lambda e^{-\lambda x} \, dx = \frac{1}{\lambda}$
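Both expectations can be approximated by simple Monte Carlo averages. Below is a minimal Python sketch; the sample size and the rate $\lambda = 2$ are illustrative choices, not taken from the text.

```python
import random

N = 1_000_000  # number of simulated draws (illustrative choice)

# Fair die: the sample mean should be close to E[X] = 3.5
die_samples = [random.randint(1, 6) for _ in range(N)]
print(sum(die_samples) / N)

# Exponential with rate lam: the sample mean should be close to 1/lam
lam = 2.0  # illustrative rate parameter
exp_samples = [random.expovariate(lam) for _ in range(N)]
print(sum(exp_samples) / N)  # ~0.5
```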
Properties of Expectation
Linearity: For any constants $a, b$ and random variables $X, Y$: $E[aX + bY] = a\,E[X] + b\,E[Y]$
The linearity holds even if $X$ and $Y$ are dependent; no independence assumption is needed! (The simulation sketch below illustrates this.)
Non-negativity: If $X \geq 0$ almost surely, then $E[X] \geq 0$.
Monotonicity: If $X \leq Y$ almost surely, then $E[X] \leq E[Y]$.
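As a quick check of the dependence point above, the following sketch pairs each die draw $X$ with $Y = X^2$, a deliberately dependent choice; the constants $a = 2$, $b = 3$ are also illustrative, not from the text.

```python
import random

a, b = 2.0, 3.0                  # arbitrary constants (illustrative choices)
N = 1_000_000

xs = [random.randint(1, 6) for _ in range(N)]   # fair die draws
ys = [x * x for x in xs]                        # Y = X^2, fully dependent on X

# Empirical E[aX + bY] versus a*E[X] + b*E[Y], using the exact values
# E[X] = 7/2 and E[Y] = E[X^2] = 91/6 for a fair die.
empirical = sum(a * x + b * y for x, y in zip(xs, ys)) / N
theoretical = a * 7 / 2 + b * 91 / 6
print(empirical, theoretical)    # close, despite X and Y being dependent
```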
Variance
The variance of $X$ measures the spread of the distribution:
$\mathrm{Var}(X) = E\big[(X - E[X])^2\big]$
The standard deviation is $\sigma = \sqrt{\mathrm{Var}(X)}$.
The variance is always non-negative: $\mathrm{Var}(X) \geq 0$, with equality if and only if $X$ is constant almost surely.
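Expanding the square and applying the linearity of expectation gives the standard computational shortcut (a short derivation added here; it is the form used in the die calculation below):
$$
\mathrm{Var}(X) = E\big[X^2 - 2\,X\,E[X] + (E[X])^2\big] = E[X^2] - 2\,(E[X])^2 + (E[X])^2 = E[X^2] - (E[X])^2.
$$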
For a fair die: $E[X^2] = \frac{1}{6}(1 + 4 + 9 + 16 + 25 + 36) = \frac{91}{6}$, so $\mathrm{Var}(X) = \frac{91}{6} - \left(\frac{7}{2}\right)^2 = \frac{35}{12} \approx 2.92$.
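The same arithmetic can be verified exactly with a minimal sketch using Python's fractions module (not part of the original text):

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)                       # each face has probability 1/6

mean = sum(p * k for k in faces)         # E[X] = 7/2
second = sum(p * k * k for k in faces)   # E[X^2] = 91/6
var = second - mean ** 2                 # shortcut formula: 35/12
print(mean, second, var)                 # 7/2 91/6 35/12
```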
Properties of Variance
Shift invariance: $\mathrm{Var}(X + c) = \mathrm{Var}(X)$ (adding a constant doesn't change spread)
Scaling: $\mathrm{Var}(aX) = a^2 \, \mathrm{Var}(X)$
Independence: If $X$ and $Y$ are independent: $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$
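All three properties can be checked numerically with a short simulation. This is a sketch only; the Exp(1) samples (which have variance 1) and the constants $a = 3$, $c = 10$ are illustrative choices.

```python
import random

N = 1_000_000
a, c = 3.0, 10.0                                   # illustrative constants

def var(samples):
    """Population variance of a list of samples."""
    m = sum(samples) / len(samples)
    return sum((s - m) ** 2 for s in samples) / len(samples)

xs = [random.expovariate(1.0) for _ in range(N)]   # Var(X) = 1 for Exp(1)
ys = [random.expovariate(1.0) for _ in range(N)]   # independent copy

print(var(xs), var([x + c for x in xs]))                        # shift invariance
print(a * a * var(xs), var([a * x for x in xs]))                # scaling by a^2
print(var(xs) + var(ys), var([x + y for x, y in zip(xs, ys)]))  # additivity under independence
```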
Expectation and variance are built from the first two moments of a distribution. Higher-order moments (which give skewness and kurtosis) provide additional shape information, but mean and variance suffice for many applications, especially with normal distributions, which are completely determined by these two parameters.