
Complexity and O-Notation

Nievergelt & Hinrichs · Ch. 1 · Algorithms and Data Structures

An algorithm's cost is not a single number. It is a growth rate: how the cost scales as the input grows. Big-O, big-Omega, and big-Theta classify these rates. They let you compare algorithms without benchmarking.

[Figure: cost versus input size n for the growth rates log n, n, and 2ⁿ]

Big-O: upper bounds

f(n) is O(g(n)) if there exist positive constants c and n₀ such that f(n) ≤ c · g(n) for all n ≥ n₀. Informally: f grows no faster than g, up to a constant factor.

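The definition can be checked numerically. The sketch below (not from the book; `is_bounded`, the sample functions, and the witness constants c = 4, n₀ = 10 are all illustrative) tests f(n) ≤ c · g(n) over a finite range of n:

```python
# Numerically check the big-O definition for a sample f and g.
# Illustrative example: f(n) = 3n^2 + 10n is O(n^2), witnessed by c = 4, n0 = 10,
# since 3n^2 + 10n <= 4n^2 exactly when n >= 10.

def is_bounded(f, g, c, n0, n_max=10_000):
    """True if f(n) <= c * g(n) for all n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n * n + 10 * n
g = lambda n: n * n

print(is_bounded(f, g, c=4, n0=10))  # True: the bound holds from n0 on
print(is_bounded(f, g, c=2, n0=10))  # False: c = 2 is too small for any n0
```

A finite scan is evidence, not proof, but it makes the role of the witnesses c and n₀ concrete: the same f and g succeed or fail depending only on the constants chosen.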

Big-Omega and Big-Theta

Big-Omega is the mirror: f(n) is Ω(g(n)) if f grows at least as fast as g. Big-Theta is the sandwich: f(n) is Θ(g(n)) when both O and Ω hold, meaning f and g grow at the same rate, up to constant factors.

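The sandwich can be checked the same way. A minimal sketch (again illustrative, not the book's code; `is_theta` and the constants c₁ = 2, c₂ = 3, n₀ = 5 are assumptions) tests both bounds at once:

```python
# Numerically check the big-Theta sandwich: c1*g(n) <= f(n) <= c2*g(n) for n >= n0.
# Illustrative example: f(n) = 2n^2 + 5n is Theta(n^2) with c1 = 2, c2 = 3, n0 = 5,
# since 2n^2 <= 2n^2 + 5n always, and 2n^2 + 5n <= 3n^2 exactly when n >= 5.

def is_theta(f, g, c1, c2, n0, n_max=10_000):
    """True if c1*g(n) <= f(n) <= c2*g(n) for all n0 <= n <= n_max."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

f = lambda n: 2 * n * n + 5 * n
g = lambda n: n * n

print(is_theta(f, g, c1=2, c2=3, n0=5))  # True: sandwiched from n0 = 5 on
print(is_theta(f, g, c1=2, c2=3, n0=1))  # False: at n = 1, f(1) = 7 > 3*g(1) = 3
```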

Comparing growth rates

The hierarchy: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ). An algorithm in a lower class will always win for large enough n, regardless of constant factors.

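The "lower class eventually wins" claim can be made concrete with a crossover search. The sketch below (illustrative; the 100× handicap and the `crossover` helper are assumptions, not from the book) finds the first n where an n log n algorithm with a large constant factor undercuts a constant-free n² algorithm:

```python
import math

# Even with a 100x constant-factor handicap, n log n eventually beats n^2:
# 100 * n * log2(n) < n^2 holds as soon as 100 * log2(n) < n.

def crossover():
    """Return the smallest n >= 2 where 100 * n * log2(n) < n^2."""
    n = 2
    while 100 * n * math.log2(n) >= n * n:
        n += 1
    return n

n = crossover()
print(n)  # first input size where the n log n algorithm is cheaper
```

Below the crossover the constant factor dominates and the n² algorithm is cheaper; above it the growth rate dominates, which is exactly why the hierarchy ignores constants.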