Recursion

Nievergelt & Hinrichs · Ch. 2 · Algorithms and Data Structures

A recursive algorithm solves a problem by solving smaller instances of the same problem. Divide and conquer splits the input, solves each part, and combines the results. The master theorem tells you the cost without unrolling the recurrence.

[Figure: merge-sort recursion tree for [5, 3, 8, 1, 7, 2, 6, 4] — the array splits into halves, then pairs, then merges back up; log n levels with O(n) work per level gives O(n log n), yielding [1, 2, 3, 4, 5, 6, 7, 8].]

Recursive thinking

Every recursive function needs a base case (when to stop) and a recursive case (how to reduce the problem toward the base case). The recursion tree shows all the calls: its depth is the maximum call-stack depth, and the total work across all its nodes is the running time.
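A minimal sketch of the base-case/recursive-case pattern (illustrative Python, not the book's Scheme listing):

```python
def factorial(n: int) -> int:
    """Compute n! by recursion on n."""
    if n == 0:                       # base case: when to stop
        return 1
    return n * factorial(n - 1)      # recursive case: reduce n toward 0
```

The recursion tree here is a single chain of n + 1 calls, so both the depth and the total work are O(n).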

Divide and conquer

Split the problem into a subproblems, each of size n/b; solve each recursively; combine the results in O(n^d) time. The recurrence is T(n) = a T(n/b) + O(n^d).
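Merge sort is the canonical instance, with a = 2, b = 2, and an O(n) combine step (d = 1). A sketch in Python (illustrative, not the book's Scheme code):

```python
def merge_sort(xs: list) -> list:
    """Sort xs by divide and conquer: split, recurse, merge."""
    if len(xs) <= 1:                 # base case: already sorted
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])      # a = 2 subproblems of size n/b, b = 2
    right = merge_sort(xs[mid:])
    # combine: merge two sorted halves in O(n) time, so d = 1
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```

Running it on the array from the figure reproduces the merge-up picture level by level.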

The master theorem

For T(n) = a T(n/b) + O(n^d): if d > log_b(a), then T(n) = O(n^d). If d = log_b(a), then T(n) = O(n^d log n). If d < log_b(a), then T(n) = O(n^(log_b a)). Merge sort: a=2, b=2, d=1. Since d = log_2(2) = 1, we get O(n log n).
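The three cases can be mechanized by comparing d against the critical exponent log_b(a). A small helper (my own illustrative sketch; the function name and output format are not from the text):

```python
import math

def master_theorem(a: int, b: int, d: float) -> str:
    """Asymptotic bound for T(n) = a*T(n/b) + O(n^d)."""
    crit = math.log(a, b)            # critical exponent log_b(a)
    if d > crit:
        return f"O(n^{d})"           # combine step dominates
    if math.isclose(d, crit):
        return f"O(n^{d} log n)"     # work balanced across levels
    return f"O(n^{crit:.3g})"        # leaves dominate
```

For merge sort, master_theorem(2, 2, 1) lands in the balanced case and reports the O(n log n) bound derived above.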
