Probability Spaces and Events - Main Theorem
Bayes' theorem is the cornerstone of statistical inference, providing a systematic method for updating beliefs in light of new evidence. It connects forward probabilities (from causes to effects) with inverse probabilities (from effects to causes).
Bayes' Theorem
Let $(\Omega, \mathcal{F}, P)$ be a probability space, and let $\{B_1, \ldots, B_n\}$ be a partition of $\Omega$ with $P(B_i) > 0$ for all $i$. For any event $A$ with $P(A) > 0$:

$$P(B_j \mid A) = \frac{P(A \mid B_j)\, P(B_j)}{\sum_{i=1}^{n} P(A \mid B_i)\, P(B_i)}.$$
Proof Outline: Starting from the definition of conditional probability:

$$P(B_j \mid A) = \frac{P(A \cap B_j)}{P(A)} = \frac{P(A \mid B_j)\, P(B_j)}{P(A)}.$$

Applying the law of total probability to the denominator:

$$P(A) = \sum_{i=1}^{n} P(A \mid B_i)\, P(B_i).$$

Substitution yields the result. □
Interpretation and Terminology
In Bayesian terminology:
- $P(B_j)$ is the prior probability of hypothesis $B_j$ (before observing data)
- $P(A \mid B_j)$ is the likelihood of observing data $A$ given hypothesis $B_j$
- $P(B_j \mid A)$ is the posterior probability of hypothesis $B_j$ (after observing data)
- $P(A)$ is the marginal likelihood or evidence
The theorem states: posterior is proportional to likelihood times prior, i.e. $P(B_j \mid A) \propto P(A \mid B_j)\, P(B_j)$, with the evidence $P(A)$ serving as the normalizing constant.
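The discrete update can be sketched in a few lines of code. This is a minimal illustration, not part of the original text; the three hypotheses and their numbers are made up for demonstration.

```python
# A minimal sketch of the discrete Bayes update over a partition
# {B_1, ..., B_n}. The priors and likelihoods below are hypothetical.

def bayes_update(priors, likelihoods):
    """Return the posterior P(B_j | A) for each hypothesis B_j.

    priors:      P(B_j) for each j (must sum to 1)
    likelihoods: P(A | B_j) for each j
    """
    # Marginal likelihood P(A), via the law of total probability
    evidence = sum(p * l for p, l in zip(priors, likelihoods))
    # Posterior = likelihood * prior, normalized by the evidence
    return [p * l / evidence for p, l in zip(priors, likelihoods)]

posterior = bayes_update(priors=[0.5, 0.3, 0.2],
                         likelihoods=[0.1, 0.4, 0.8])
print(posterior)  # the posteriors sum to 1
```

Note that the prior enters multiplicatively: the initially least probable hypothesis ends up most probable here because the data are far more likely under it.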
Medical Diagnosis: A rare disease affects 0.1% of the population. A diagnostic test has 99% sensitivity (detects disease when present) and 95% specificity (correct when disease absent).
Let $D$ = "has disease" and $T$ = "tests positive". Given:
- Prior: $P(D) = 0.001$, $P(D^c) = 0.999$
- Likelihood: $P(T \mid D) = 0.99$, $P(T \mid D^c) = 1 - 0.95 = 0.05$
By Bayes' theorem:

$$P(D \mid T) = \frac{P(T \mid D)\, P(D)}{P(T \mid D)\, P(D) + P(T \mid D^c)\, P(D^c)} = \frac{0.99 \times 0.001}{0.99 \times 0.001 + 0.05 \times 0.999} \approx 0.0194.$$

Despite the test's high accuracy, a positive result indicates only about a 2% probability of disease, because the low base rate (prior) dominates the calculation.
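The arithmetic above is easy to verify numerically. The following sketch uses only the numbers stated in the example:

```python
# Numerical check of the medical-diagnosis example: base rate 0.1%,
# sensitivity 99%, specificity 95% (so the false-positive rate is 5%).
p_disease = 0.001
p_pos_given_disease = 0.99           # sensitivity, P(T | D)
p_pos_given_healthy = 1 - 0.95       # 1 - specificity, P(T | D^c)

# P(T) by the law of total probability
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior P(D | T) by Bayes' theorem
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(f"{p_disease_given_pos:.4f}")  # ≈ 0.0194
```

Of the roughly 5.1% of the population that tests positive, almost all are false positives from the healthy 99.9%, which is why the posterior stays near 2%.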
Continuous Version
For continuous random variables with densities, Bayes' theorem takes the form:

$$\pi(\theta \mid x) = \frac{f(x \mid \theta)\, \pi(\theta)}{\int f(x \mid \theta')\, \pi(\theta')\, d\theta'}.$$

This form is fundamental in Bayesian statistics, where we update our beliefs about a parameter $\theta$ based on observed data $x$.
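The integral in the denominator can be approximated on a grid. The sketch below is an illustration under assumed data, not part of the original text: a uniform prior on a coin's bias $\theta$ and a Bernoulli likelihood for a hypothetical 7 heads in 10 flips, where the exact posterior is Beta(8, 4).

```python
# Grid approximation of the continuous Bayes update, assuming a
# coin-flip model: uniform prior on theta in (0, 1), binomial
# likelihood for 7 heads in 10 flips (made-up data).
from math import comb

n_grid = 1000
thetas = [(i + 0.5) / n_grid for i in range(n_grid)]  # midpoint grid

heads, flips = 7, 10
prior = [1.0] * n_grid                      # uniform prior density pi(theta)
likelihood = [comb(flips, heads) * t**heads * (1 - t)**(flips - heads)
              for t in thetas]              # f(x | theta)

# Unnormalized posterior, then divide by a midpoint-rule estimate of
# the evidence integral so the density integrates to 1 on the grid
unnorm = [l * p for l, p in zip(likelihood, prior)]
evidence = sum(unnorm) / n_grid
posterior = [u / evidence for u in unnorm]

# Posterior mean; the exact Beta(8, 4) mean is 8/12 ≈ 0.667
mean = sum(t * p for t, p in zip(thetas, posterior)) / n_grid
print(round(mean, 3))
```

With a conjugate prior the posterior is available in closed form, but the grid method works for any prior-likelihood pair in one dimension, which makes it a useful sanity check.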
Bayes' theorem is not controversial—it follows directly from the definition of conditional probability. The philosophical debate in statistics centers on whether we should assign probability distributions to parameters (Bayesian approach) or treat them as fixed unknowns (frequentist approach).