🎰 Introduction to Probability
Based on Grinstead & Snell's Introduction to Probability, licensed under the GFDL.
If a paper uses distributions, expectations, or Markov chains and you want the ground-level definitions, start here.
| # | Chapter | Description | |
|---|---|---|---|
| 1. | Discrete Probability | Sample spaces, events, and assigning probabilities to outcomes you can count | 🎰 |
| 2. | Continuous Probability | When outcomes are real numbers, probabilities are areas under curves | 🎰 |
| 3. | Combinatorics | Counting arrangements: permutations, combinations, binomial coefficients | 🎰 |
| 4. | Conditional Probability | P(A\|B) = P(A and B) / P(B), and why Bayes' theorem follows | 🎰 |
| 5. | Distributions | Binomial, Poisson, geometric, normal: the named distributions and when they arise | 🎰 |
| 6. | Expected Value | The weighted average of outcomes, and variance as spread around it | 🎰 |
| 7. | Sums of Random Variables | Add independent random variables: means add, variances add, distributions convolve | 🎰 |
| 8. | Law of Large Numbers | Averages converge to the expected value as sample size grows | 🎰 |
| 9. | Central Limit Theorem | Sums of many independent variables approach a normal distribution | 🎰 |
| 10. | Generating Functions | Encode a distribution as a polynomial, then multiply polynomials to convolve | 🎰 |
| 11. | Markov Chains | Memoryless state machines: the next state depends only on the current one | 🎰 |
| 12. | Random Walks | Step left or right with equal probability: will you return to the origin? | 🎰 |
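The conditional-probability definition from chapter 4 can be checked by brute-force enumeration over a finite sample space. A minimal sketch (the two-dice events here are chosen purely for illustration):

```python
from fractions import Fraction
from itertools import product

# Sample space: all 36 equally likely rolls of two dice.
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    """P(event) under the uniform measure on omega."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] + w[1] == 8   # event: the sum is 8
B = lambda w: w[0] % 2 == 0      # event: the first die is even

p_a, p_b = prob(A), prob(B)
p_ab = prob(lambda w: A(w) and B(w))

# Definition: P(A|B) = P(A and B) / P(B)
p_a_given_b = p_ab / p_b

# Bayes' theorem follows because P(A and B) appears in both conditionals.
p_b_given_a = p_ab / p_a
assert p_a_given_b == p_b_given_a * p_a / p_b

print(p_a_given_b)  # Fraction(1, 6)
```

Using `Fraction` keeps the probabilities exact, so the Bayes identity holds as an equality rather than up to floating-point error.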
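Chapter 10's slogan "multiply polynomials to convolve" can be demonstrated in a few lines: encode P(X = k) as the coefficient of x^k, and polynomial multiplication gives the distribution of a sum. A sketch using one fair die (the die example is an assumption for illustration):

```python
from fractions import Fraction

def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists (index = exponent)."""
    out = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

# Generating function of one fair die: coefficient of x^k is P(X = k).
die = [Fraction(0)] + [Fraction(1, 6)] * 6  # x^0 carries probability 0

# Multiplying the generating functions convolves the two distributions,
# yielding the distribution of the sum of two independent dice.
two_dice = poly_mul(die, die)

print(two_dice[7])  # P(sum = 7) = Fraction(1, 6)
```

This is exactly the convolution from chapter 7, repackaged: `poly_mul` and the sum-of-independent-variables rule are the same computation.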
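Chapter 12's question can be probed empirically before proving it: simulate many short 1-D walks and estimate how often the walker revisits the origin. A sketch (the step budget and trial count are arbitrary choices):

```python
import random

def returned_within(steps, rng):
    """Run one 1-D simple random walk; report whether it revisits the origin."""
    pos = 0
    for _ in range(steps):
        pos += rng.choice((-1, 1))
        if pos == 0:
            return True
    return False

rng = random.Random(0)  # fixed seed for reproducibility
trials = 10_000
hits = sum(returned_within(100, rng) for _ in range(trials))

# The 1-D walk is recurrent: the estimate climbs toward 1 as the
# step budget grows, but only slowly (the tail decays like 1/sqrt(n)).
print(hits / trials)
```

The slow convergence toward 1 is the empirical shadow of the recurrence proof in the chapter; in three dimensions the same experiment plateaus well below 1.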
📺 Video lectures: MIT 6.041SC Probabilistic Systems Analysis
Neighbors
- 📊 Statistics — probability distributions reappear as inference tools
- 📡 Information Theory — entropy is expected surprise over a probability distribution
- 🤖 Machine Learning — Bayesian models, Gaussian processes, and sampling
- 🎲 Game Theory — mixed strategies are probability distributions over actions