
Sums of Random Variables

Grinstead & Snell · GFDL · PDF

When you add independent random variables, means add, variances add, and the distribution of the sum is the convolution of the individual distributions. The sum's distribution is wider than either summand's, and as more terms are added it becomes increasingly bell-shaped.

Convolution

If X and Y are independent discrete random variables, the PMF of Z = X + Y is the convolution: P(Z = z) = sum over k of P(X = k) P(Y = z - k). Each possible way the parts can add to z contributes to the total. This is why adding dice produces the familiar triangle-shaped distribution.

[Figure: two narrow distributions X and Y; their sum X + Y is wider and shifted — the convolution is wider and more centered.]
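A minimal Scheme sketch of the convolution formula, assuming a PMF is represented as an association list of (value . probability) pairs; `convolve`, `add-prob`, and `die-pmf` are illustrative names, not library procedures:

```scheme
;; A PMF here is an association list of (value . probability) pairs.
(define die-pmf
  (map (lambda (k) (cons k 1/6)) '(1 2 3 4 5 6)))

;; Add probability p to the entry for value z, creating it if absent.
(define (add-prob pmf z p)
  (cond ((null? pmf) (list (cons z p)))
        ((= (caar pmf) z) (cons (cons z (+ (cdar pmf) p)) (cdr pmf)))
        (else (cons (car pmf) (add-prob (cdr pmf) z p)))))

;; P(Z = z) = sum over k of P(X = k) P(Y = z - k):
;; each (x, y) pair contributes its joint probability to the entry for x + y.
(define (convolve pmf-x pmf-y)
  (let loop ((xs pmf-x) (acc '()))
    (if (null? xs)
        acc
        (loop (cdr xs)
              (let inner ((ys pmf-y) (acc acc))
                (if (null? ys)
                    acc
                    (inner (cdr ys)
                           (add-prob acc
                                     (+ (caar xs) (caar ys))
                                     (* (cdar xs) (cdar ys))))))))))

(assv 7 (convolve die-pmf die-pmf)) ; => (7 . 1/6), the peak of the triangle
```

With exact rationals, the result for two dice is the triangular distribution exactly: P(Z = 7) = 6/36 = 1/6.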

Means add

E[X + Y] = E[X] + E[Y]. This is linearity of expectation from Ch 6, and it holds whether or not X and Y are independent. Roll two dice: E[X + Y] = 3.5 + 3.5 = 7. Roll a hundred dice: E[sum] = 350.

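This is easy to check symbolically in Scheme (a sketch; `expectation` and `die-pmf` are illustrative names, with a PMF assumed to be an association list of (value . probability) pairs):

```scheme
;; Expectation of a PMF: sum of value * probability.
(define die-pmf
  (map (lambda (k) (cons k 1/6)) '(1 2 3 4 5 6)))

(define (expectation pmf)
  (apply + (map (lambda (pair) (* (car pair) (cdr pair))) pmf)))

(expectation die-pmf) ; => 7/2, so two dice give 7/2 + 7/2 = 7
```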

Variances add when independent

Var(X + Y) = Var(X) + Var(Y), but only when X and Y are independent. If they are correlated, a covariance term appears: Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X,Y). For independent variables, covariance is zero, so variances add cleanly.

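A sketch of the same check for variances, computing Var(X) = E[(X − μ)²] directly from a PMF given as (value . probability) pairs (`variance` and `expectation` are illustrative names):

```scheme
(define die-pmf
  (map (lambda (k) (cons k 1/6)) '(1 2 3 4 5 6)))

(define (expectation pmf)
  (apply + (map (lambda (pair) (* (car pair) (cdr pair))) pmf)))

;; Var(X) = E[(X - mu)^2], computed with exact rationals.
(define (variance pmf)
  (let ((mu (expectation pmf)))
    (apply + (map (lambda (pair)
                    (* (expt (- (car pair) mu) 2) (cdr pair)))
                  pmf))))

(variance die-pmf) ; => 35/12 for one die
;; For two independent dice: Var(X + Y) = 35/12 + 35/12 = 35/6
```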

Verify by simulation

Theory says: means add, variances add. Simulation confirms it. Roll two dice many times, compute the sample mean and variance of the sum, and check that they match the formulas.

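A simulation sketch. It assumes a host `random` procedure returning an integer in [0, n), which is common (e.g. in MIT Scheme and Chez) but not part of standard R7RS-small:

```scheme
;; Roll one fair die; `random` is a host-provided, non-standard procedure.
(define (roll) (+ 1 (random 6)))

;; Roll two dice n times; return (sample-mean sample-variance) of the sum.
(define (simulate n)
  (let loop ((i 0) (sum 0) (sumsq 0))
    (if (= i n)
        (let ((mean (exact->inexact (/ sum n))))
          (list mean
                (- (exact->inexact (/ sumsq n)) (* mean mean))))
        (let ((z (+ (roll) (roll))))
          (loop (+ i 1) (+ sum z) (+ sumsq (* z z)))))))

;; (simulate 100000) should return values near 7 and 35/6 ≈ 5.83.
```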

Notation reference

Notation                              Scheme                        Meaning
P(Z = z) = ∑ₖ P(X = k) P(Y = z − k)   (convolve pmf-x pmf-y ...)    Convolution of PMFs
E[X + Y] = E[X] + E[Y]                (+ mu-x mu-y)                 Means always add
Var(X + Y) = Var(X) + Var(Y)          (+ var-x var-y)               Variances add (if independent)
Cov(X, Y) = E[(X − μx)(Y − μy)]       —                             Covariance: dependence correction

Probability chapters

  • 🎰 Ch 6 — expected value and variance (prerequisites for this chapter)
  • 🎰 Ch 5 — the named distributions that get convolved here
  • 🎰 Ch 8 — law of large numbers (why sums of many variables become predictable)


Ready for the real thing? Read Grinstead & Snell, Chapter 7.
