Conditional Probability
Grinstead & Snell · GFDL · PDF
Conditional probability is the probability of A given that B has occurred: P(A|B) = P(A ∩ B) / P(B), defined when P(B) > 0. This is how you update beliefs with new information. Bayes' theorem flips the direction: knowing P(B|A), compute P(A|B).
P(A|B) — restricting the sample space
When you learn that B has occurred, the sample space shrinks from Ω to B. The conditional probability P(A|B) re-normalizes by dividing by P(B). Only outcomes in both A and B survive.
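The restriction-and-renormalization view can be checked by brute-force enumeration. A minimal Python sketch (the dice events A and B are illustrative choices, not from the text):

```python
from fractions import Fraction

# Two fair dice: a sample space of 36 equally likely outcomes.
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

A = {o for o in omega if o[0] + o[1] == 8}   # event: the sum is 8
B = {o for o in omega if o[0] == 3}          # event: the first die shows 3

def p(event):
    """Probability of an event under the uniform measure on omega."""
    return Fraction(len(event), len(omega))

# P(A|B) = P(A ∩ B) / P(B): only outcomes in both A and B survive,
# renormalized by the shrunken sample space B.
p_a_given_b = p(A & B) / p(B)
print(p_a_given_b)  # 1/6
```

Note how conditioning changed the answer: unconditionally P(A) = 5/36, but once the first die is known to be 3, only (3, 5) completes the sum.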
The multiplication rule
Rearranging the definition: P(A ∩ B) = P(A|B) · P(B). For a chain of events: P(A ∩ B ∩ C) = P(A) · P(B|A) · P(C|A ∩ B). Multiply along the branches of the tree.
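Multiplying along tree branches can be illustrated with the standard two-aces draw (the example is mine, not from the text):

```python
from fractions import Fraction

# Draw two cards from a 52-card deck without replacement.
p_first_ace = Fraction(4, 52)              # P(A): 4 aces among 52 cards
p_second_ace_given_first = Fraction(3, 51)  # P(B|A): one ace already gone

# Multiplication rule: P(A ∩ B) = P(A) · P(B|A)
p_both_aces = p_first_ace * p_second_ace_given_first
print(p_both_aces)  # 1/221
```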
Bayes' theorem
P(A|B) = P(B|A) · P(A) / P(B). You know how likely the evidence is given the hypothesis (P(B|A)). Bayes lets you compute how likely the hypothesis is given the evidence (P(A|B)). The denominator P(B) is often expanded using the law of total probability.
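A worked sketch of Bayes' theorem with the denominator expanded by total probability. The diagnostic-test numbers (1% prevalence, 95% sensitivity, 5% false-positive rate) are hypothetical, chosen only to illustrate the computation:

```python
from fractions import Fraction

# Hypothetical diagnostic test (illustrative numbers).
p_d = Fraction(1, 100)                 # P(A): prior probability of disease
p_pos_given_d = Fraction(95, 100)      # P(B|A): test positive given disease
p_pos_given_not_d = Fraction(5, 100)   # P(B|¬A): false-positive rate

# Law of total probability expands the denominator:
# P(B) = P(B|A)·P(A) + P(B|¬A)·P(¬A)
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' theorem: P(A|B) = P(B|A) · P(A) / P(B)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(p_d_given_pos)  # 19/118, roughly 0.16
```

Even with a positive result, the posterior stays low because the prior is small: most positives come from the large healthy population.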
Independence
Events A and B are independent if P(A|B) = P(A), meaning B gives no information about A. Equivalently, P(A ∩ B) = P(A) · P(B). Coin flips are independent. Card draws without replacement are not.
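The product criterion makes independence mechanically checkable. A sketch with dice events of my own choosing, one independent pair and one dependent pair:

```python
from fractions import Fraction

# Two fair dice again: 36 equally likely outcomes.
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def p(event):
    return Fraction(len(event), len(omega))

A = {o for o in omega if o[0] == 6}           # first die is 6
B = {o for o in omega if o[1] == 6}           # second die is 6
C = {o for o in omega if o[0] + o[1] >= 11}   # sum is at least 11

# Separate dice are independent: P(A ∩ B) = P(A) · P(B).
assert p(A & B) == p(A) * p(B)

# A and C are dependent: a high sum is evidence the first die is high,
# so the product rule fails.
assert p(A & C) != p(A) * p(C)
```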
Notation reference
| Textbook | Scheme | Meaning |
|---|---|---|
| P(A\|B) | (cond-prob p-ab p-b) | Conditional probability |
| P(A ∩ B) = P(A\|B)P(B) | (* p-a-given-b p-b) | Multiplication rule |
| P(A\|B) = P(B\|A)P(A)/P(B) | (/ (* p-ba p-a) p-b) | Bayes' theorem |
| A ⊥ B | (independent? ...) | Independence |
| P(B) = ∑ P(B\|A_i)P(A_i) | (+ (* p-b-a1 p-a1) ...) | Law of total probability |
Neighbors
Adjacent chapters
- Ch 3: Combinatorics — counting makes conditional probabilities computable
- Ch 1: Discrete Probability — the axioms that conditional probability builds on
- Ch 2: Continuous Probability — conditioning works for densities too (condition on events with positive measure)
- Cognitive Science Ch.2 — Bayes' theorem as a model of human reasoning
- ML Ch.4 — logistic regression uses conditional probability directly
- Statistics Ch.3 — conditional probability in the context of inference
Paper pages
- Fritz 2020 — Markov categories axiomatize conditional probability as morphism composition. Conditioning is a morphism, not a derived operation
- Baez-Fritz 2011 — entropy measures how much conditioning reduces uncertainty
- Staton 2025 — probabilistic programs implement conditioning via observe statements
Related foundations
- Lovelace Ch.2 Bayesian Inference — Bayes' theorem applied to cognitive science
Foundations (Wikipedia)