Entropy and Diversity
Tom Leinster · 2021 · Cambridge University Press
Prereqs: basic probability (what a distribution is). Baez, Fritz, Leinster 2011 provides the entropy characterization.
Hill diversity numbers form a one-parameter family: at q=0 you count species, at q=1 you get the exponential of Shannon entropy, and at q=2 you get Simpson's reciprocal. Magnitude extends this to metric spaces: a single number measuring the "effective size" of a space.
Hill diversity numbers
Given a probability distribution p = (p₁, ..., pₙ) over n species, the Hill number of order q is D_q(p) = (Σᵢ pᵢ^q)^(1/(1-q)) for q ≠ 1, with D₁ defined by taking the limit q → 1.
This is the "effective number of species." A community with 10 equally-common species has D_q = 10 for all q. A community dominated by one species has D_q close to 1.
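A minimal sketch of the formula above in Python (the notation table below uses Scheme-style names like `(hill-number p q)`; the function name `hill_number` here is my rendering of the same idea). It checks the claim that a community of 10 equally-common species has D_q = 10 at every order:

```python
import math

def hill_number(p, q):
    """Hill diversity of order q: D_q = (sum_i p_i^q)^(1/(1-q)).

    At q = 1 the formula is defined by its limit, which equals
    exp(Shannon entropy in nats).
    """
    p = [pi for pi in p if pi > 0]  # absent species never contribute
    if q == 1:
        return math.exp(-sum(pi * math.log(pi) for pi in p))
    return sum(pi ** q for pi in p) ** (1 / (1 - q))

# Ten equally common species: effective number is 10 at every order.
uniform = [0.1] * 10
print([round(hill_number(uniform, q), 6) for q in (0, 0.5, 1, 2)])
```

For the uniform distribution all four orders agree; skewed distributions make D_q decrease as q grows, because higher q weights dominant species more.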
q=0: species richness
At q=0, all nonzero probabilities contribute equally. D₀ just counts the number of species present. Rare and common species count the same.
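At q = 0 every nonzero pᵢ contributes pᵢ⁰ = 1, so the formula collapses to a count of present species. A two-line sketch (the name `richness` is mine):

```python
# D_0 only asks "is the species present?", so a community with one
# dominant species and two rare ones has the same richness as a
# perfectly even three-species community.
def richness(p):
    return sum(1 for pi in p if pi > 0)

print(richness([0.98, 0.01, 0.01]))  # 3: rare species count fully
print(richness([0.5, 0.5, 0.0]))     # 2: absent species don't count
```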
q=1: exponential of Shannon entropy
At q=1 (taken as a limit), D₁ = exp(H), where H is Shannon entropy in nats (use 2^H if H is measured in bits). This is the "effective number of equally-common species": entropy converted from a logarithmic measure to a count.
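A Python sketch of the q=1 case (the page's notation table writes this as `(shannon-entropy p)`; `shannon_entropy` is my Python rendering). The example distribution is my choice:

```python
import math

def shannon_entropy(p):
    """Shannon entropy in nats (natural log), so that D_1 = exp(H)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# A skewed 3-species community behaves like ~2.83 equally common
# species: fewer than its 3 actual species, more than a monoculture.
p = [0.5, 0.25, 0.25]
d1 = math.exp(shannon_entropy(p))
print(round(d1, 6))  # 2.828427, i.e. 2 * sqrt(2)
```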
q=2: Simpson's reciprocal
At q=2, D₂ = 1/Σpᵢ². This is the reciprocal of the probability that two randomly chosen individuals belong to the same species. Dominant species are heavily weighted.
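Sketched directly (the name `simpson_reciprocal` and the example distributions are mine):

```python
# D_2 = 1 / sum(p_i^2): the reciprocal of the chance that two randomly
# chosen individuals belong to the same species.
def simpson_reciprocal(p):
    return 1 / sum(pi ** 2 for pi in p)

print(round(simpson_reciprocal([0.9, 0.05, 0.05]), 4))  # 1.227
print(round(simpson_reciprocal([1/3, 1/3, 1/3]), 4))    # 3.0
```

The dominated community scores barely above 1 even though three species are present, which is exactly the heavy weighting of dominant species the text describes.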
Magnitude: effective size of a metric space
Magnitude generalizes diversity to metric spaces. Given a finite metric space with distance matrix d, the magnitude is the sum of the entries of a weight vector w satisfying Zw = 1 (the all-ones vector), where Z_ij = e^(-d_ij). It measures the "effective number of points": points close together contribute less than points far apart.
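A self-contained sketch of the 3-points-on-a-line case that the translation notes mention, solving Zw = 1 with a small pure-Python Gaussian elimination (the coordinates 0, 1, 2 are my choice). For finite subsets of the real line, magnitude is known in closed form as 1 + Σ tanh(gap/2), which this linear solve should reproduce:

```python
import math

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

points = [0.0, 1.0, 2.0]  # three points on a line, unit gaps
Z = [[math.exp(-abs(a - b)) for b in points] for a in points]
w = solve(Z, [1.0, 1.0, 1.0])  # weights with Zw = all-ones
magnitude = sum(w)
print(round(magnitude, 6))  # ~1.924234: between 1 (all points blurred
                            # together) and 3 (well-separated points)
```

Shrinking the gaps drives the magnitude toward 1; spreading the points apart drives it toward 3, matching the "effective number of points" reading.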
Notation reference
| Paper | Scheme | Meaning |
|---|---|---|
| D_q(p) | (hill-number p q) | Hill diversity of order q |
| H(p) | (shannon-entropy p) | Shannon entropy |
| |A| | ; magnitude | Magnitude (effective size) |
| Z_ij = e^(-d_ij) | (exp (- d)) | Similarity matrix entry |
| w | ; weight vector, Zw = 1 | Magnitude weights |
Neighbors
Other paper pages
- Baez, Fritz, Leinster 2011 – characterizes Shannon entropy via functoriality
- Chen, Vigneaux 2023 – unifies magnitude and entropy categorically
- Sato 2023 – divergences on monads (related information measures)
Related foundations
- Lebl Ch. 8 Metric Spaces – the metric space structure that magnitude measures
Translation notes
The Hill number computations are exact for finite distributions. The magnitude example is conceptual: solving the linear system Zw = 1 for an arbitrary metric space requires a matrix inverse, which the examples only sketch (the magnitude discussion describes 3 points on a line). In the book, magnitude is defined for arbitrary enriched categories via the Euler characteristic of a category, a construction that subsumes metric spaces, posets, and graphs under one definition. The Hill numbers translate exactly; the categorical generalization of magnitude does not.
Hill number examples: Exact. Magnitude example: Analogy.
Framework connection: Hill diversity and magnitude measure the Natural Framework's information budget, where exp(entropy) is the effective number of distinguishable states at each pipeline stage. (The Natural Framework)