
Generating Functions

Grinstead & Snell · GFDL · PDF

Encode a distribution as a polynomial: g(z) = ∑ P(X = k) zᵏ. Adding independent random variables becomes multiplying their polynomials. Convolution is just polynomial multiplication.

Ordinary generating function

For a discrete random variable X taking values 0, 1, 2, …, the ordinary generating function (OGF) is g(z) = E[zˣ] = ∑ₖ P(X = k) zᵏ. The coefficients are the probabilities. Since the probabilities sum to 1, g(1) = 1, and the derivative gives the mean: g′(1) = E[X].

  g₁(z) = ½ + ½z                  (one coin flip)
  g₂(z) = ½ + ½z                  (one coin flip)
  g₁(z) · g₂(z) = ¼ + ½z + ¼z²    (sum of two flips)

Read off the coefficients: P(S=0) = ¼, P(S=1) = ½, P(S=2) = ¼. Convolution of distributions is multiplication of polynomials.
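The coin-flip example as code: a minimal sketch of `poly-mul` as discrete convolution over coefficient lists, assuming the list representation from the notation table (probabilities indexed from 0; `poly-add` is a helper introduced here).

```scheme
;; A distribution on 0, 1, 2, ... is a list of probabilities indexed from 0.
(define (poly-add p q)            ; coefficient-wise sum; unequal lengths ok
  (cond ((null? p) q)
        ((null? q) p)
        (else (cons (+ (car p) (car q))
                    (poly-add (cdr p) (cdr q))))))

(define (poly-mul p q)            ; polynomial product = discrete convolution
  (if (null? p)
      '()
      (poly-add (map (lambda (c) (* (car p) c)) q)   ; p0 * q
                (cons 0 (poly-mul (cdr p) q)))))     ; shift, recurse on rest

(define coin '(1/2 1/2))          ; P(X=0) = P(X=1) = 1/2
(poly-mul coin coin)              ; ⇒ (1/4 1/2 1/4), the sum of two flips
```

Exact rationals keep the probabilities exact; no floating-point noise in the coefficients.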

Moments from derivatives

The generating function encodes all moments. E[X] = g′(1). E[X(X−1)] = g″(1), so Var(X) = g″(1) + g′(1) − (g′(1))². No summation needed: just differentiate the polynomial and evaluate at z = 1.

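A sketch of the derivative trick, assuming the coefficient-list representation and the `poly-eval` / `poly-deriv` names from the notation table below:

```scheme
(define (poly-eval p z)           ; Horner evaluation of a coefficient list
  (if (null? p)
      0
      (+ (car p) (* z (poly-eval (cdr p) z)))))

(define (poly-deriv p)            ; d/dz: drop c0, multiply each ck by k
  (let loop ((cs (cdr p)) (k 1) (out '()))
    (if (null? cs)
        (reverse out)
        (loop (cdr cs) (+ k 1) (cons (* k (car cs)) out)))))

;; Fair die: g(z) = (z + z^2 + ... + z^6) / 6
(define die '(0 1/6 1/6 1/6 1/6 1/6 1/6))

(define mean (poly-eval (poly-deriv die) 1))              ; g'(1)  = 7/2
(define g2   (poly-eval (poly-deriv (poly-deriv die)) 1)) ; g''(1) = 35/3
(+ g2 mean (- (* mean mean)))                             ; Var(X) = 35/12
```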

Moment generating function

The moment generating function (MGF) is M(t) = E[eᵗˣ]. It is the OGF evaluated at z = eᵗ, i.e. M(t) = g(eᵗ). Its advantage: M′(0) = E[X], M″(0) = E[X²], and in general M⁽ᵏ⁾(0) = E[Xᵏ]. The MGF of a sum of independent variables is the product of their MGFs, exactly like the OGF.

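One way `mgf-die` might look (a hypothetical definition matching the notation table): the die's OGF evaluated at z = eᵗ. A central difference at t = 0 then recovers the mean numerically.

```scheme
(define (mgf-die t)               ; M(t) = g(e^t) for a fair die
  (/ (+ (exp t) (exp (* 2 t)) (exp (* 3 t))
        (exp (* 4 t)) (exp (* 5 t)) (exp (* 6 t)))
     6))

(mgf-die 0)                       ; ⇒ 1: probabilities sum to 1

;; Numerical M'(0) by central difference: should be close to E[X] = 7/2
(define h 1e-6)
(/ (- (mgf-die h) (mgf-die (- h))) (* 2 h))   ; ≈ 3.5
```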

Notation reference

Textbook            Scheme                          Meaning
g(z) = ∑ pₖ zᵏ      '(p0 p1 p2 ...)                 Generating function as coefficient list
g₁ · g₂             (poly-mul g1 g2)                Convolution = polynomial multiplication
g′(1)               (poly-eval (poly-deriv g) 1)    Expected value
M(t) = E[eᵗˣ]       (mgf-die t)                     Moment generating function
Neighbors

Probability chapters

  • 🎰 Ch 9 — Central Limit Theorem (the CLT proof uses MGFs)
  • 🎰 Ch 7 — Sums of Random Variables (convolution the hard way)
  • 🎰 Ch 5 — Distributions (the named distributions encoded as polynomials)

Connections

Translation notes

Polynomials are stored as coefficient lists indexed from 0. The polynomial multiplication is exactly discrete convolution. For continuous distributions, the OGF becomes an integral transform, but the algebraic structure is the same: independence of random variables corresponds to multiplication of their generating functions. The textbook uses this machinery to prove the CLT: take the log of the MGF, expand to second order, and recover the normal MGF.
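The CLT step mentioned above, written out for a centered variable X with variance σ² (a sketch of the standard argument, not the book's exact notation):

```latex
% Standardized sum S_n^* = (X_1 + \cdots + X_n)/(\sigma\sqrt{n}), with E[X_i] = 0.
% Independence turns the MGF of the sum into a product, so
\log M_{S_n^*}(t) = n \log M_X\!\left(\tfrac{t}{\sigma\sqrt{n}}\right)
% Expand \log M_X(s) = \tfrac{\sigma^2}{2} s^2 + O(s^3) around s = 0:
= n\left(\frac{\sigma^2}{2}\cdot\frac{t^2}{\sigma^2 n} + O(n^{-3/2})\right)
= \frac{t^2}{2} + O(n^{-1/2}) \longrightarrow \frac{t^2}{2}
% Hence M_{S_n^*}(t) \to e^{t^2/2}, the MGF of the standard normal.
```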

Ready for the real thing? Read Chapter 10 of Grinstead & Snell.
