A recursive algorithm solves a problem by solving smaller instances of the same problem. Divide and conquer splits the input, solves each part, and combines the results. The master theorem tells you the cost without unrolling the recurrence.
Recursive thinking
Every recursive function needs a base case (when to stop) and a recursive case (how to reduce the problem). The recursion tree shows all the calls: its depth is the maximum call-stack depth, and its total work is the running time.
Scheme
; Factorial: the simplest recursion.
; Base case: 0! = 1
; Recursive case: n! = n * (n-1)!
(define (factorial n)
  (if (= n 0)
      1
      (* n (factorial (- n 1)))))
(display "5! = ") (display (factorial 5)) (newline)
(display "10! = ") (display (factorial 10))
Python
# Factorial: recursive and iterative
def factorial_rec(n):
    if n == 0:
        return 1
    return n * factorial_rec(n - 1)

def factorial_iter(n):
    result = 1
    for i in range(1, n + 1):
        result *= i
    return result
print(f"5! = {factorial_rec(5)}")
print(f"10! = {factorial_iter(10)}")
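To make the call-stack claim above concrete, here is a minimal sketch that instruments factorial to report its maximum recursion depth (the helper names `factorial_depth` and `stats` are illustrative, not standard):

```python
# Instrumented factorial: track the current and maximum call depth.
# For input n, the recursion tree is a chain of n + 1 calls (n down to 0),
# so the call stack reaches depth n + 1.

def factorial_depth(n, depth=1, stats=None):
    if stats is None:
        stats = {"max_depth": 0}
    stats["max_depth"] = max(stats["max_depth"], depth)
    if n == 0:
        return 1, stats["max_depth"]
    result, _ = factorial_depth(n - 1, depth + 1, stats)
    return n * result, stats["max_depth"]

value, depth = factorial_depth(5)
print(f"5! = {value}, call-stack depth = {depth}")  # depth is 5 + 1 = 6
```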
Divide and conquer
Split the problem into a subproblems of size n/b, solve each recursively, combine in O(n^d) time. The recurrence is T(n) = a T(n/b) + O(n^d).
Scheme
; Merge sort: the canonical divide-and-conquer.
; Split in half, sort each half, merge.
(define (merge-sort lst)
  (if (or (null? lst) (null? (cdr lst)))
      lst
      (let* ((mid (quotient (length lst) 2))
             (left (take lst mid))
             (right (drop lst mid)))
        (merge-lists (merge-sort left)
                     (merge-sort right)))))

(define (take lst n)
  (if (= n 0)
      (list)
      (cons (car lst) (take (cdr lst) (- n 1)))))

(define (drop lst n)
  (if (= n 0)
      lst
      (drop (cdr lst) (- n 1))))

(define (merge-lists a b)
  (cond ((null? a) b)
        ((null? b) a)
        ((<= (car a) (car b))
         (cons (car a) (merge-lists (cdr a) b)))
        (else
         (cons (car b) (merge-lists a (cdr b))))))
(display (merge-sort (list 5 3 8 1 7 2 6 4)))
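The same divide-and-conquer sorts just as naturally in Python; this is a direct sketch of the algorithm above (not Python's built-in `sorted`, which uses Timsort):

```python
# Merge sort: split in half, sort each half, merge in linear time.

def merge_sort(lst):
    if len(lst) <= 1:
        return lst                      # base case: already sorted
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

def merge(a, b):
    result = []
    i = j = 0
    while i < len(a) and j < len(b):    # take the smaller front element
        if a[i] <= b[j]:
            result.append(a[i])
            i += 1
        else:
            result.append(b[j])
            j += 1
    result.extend(a[i:])                # one of these is empty
    result.extend(b[j:])
    return result

print(merge_sort([5, 3, 8, 1, 7, 2, 6, 4]))  # [1, 2, 3, 4, 5, 6, 7, 8]
```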
The master theorem
For T(n) = a T(n/b) + O(n^d): if d > log_b(a), then T(n) = O(n^d). If d = log_b(a), then T(n) = O(n^d log n). If d < log_b(a), then T(n) = O(n^(log_b a)). Merge sort: a=2, b=2, d=1. Since d = log_2(2) = 1, we get O(n log n).
Scheme
; Master theorem cases for T(n) = a*T(n/b) + O(n^d)
; Case 1: d > log_b(a) => O(n^d)
; Case 2: d = log_b(a) => O(n^d * log n)
; Case 3: d < log_b(a) => O(n^(log_b a))
; Merge sort: a=2, b=2, d=1
(define a 2)
(define b 2)
(define d 1)
(define log-b-a (/ (log a) (log b)))
(display "a=2, b=2, d=1") (newline)
(display "log_b(a) = ") (display log-b-a) (newline)
(display "d = log_b(a), so Case 2: O(n^1 * log n) = O(n log n)") (newline)
(newline)
; Binary search: a=1, b=2, d=0
(display "Binary search: a=1, b=2, d=0") (newline)
(display "log_2(1) = 0 = d, Case 2: O(log n)")
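Binary search makes the a=1, b=2, d=0 case concrete: one recursive call on half the input, constant work per level. A minimal recursive sketch in Python:

```python
# Recursive binary search on a sorted list.
# Recurrence: T(n) = T(n/2) + O(1), so a=1, b=2, d=0.
# d = log_2(1) = 0, Case 2: O(n^0 * log n) = O(log n).

def binary_search(sorted_lst, target, lo=0, hi=None):
    if hi is None:
        hi = len(sorted_lst)
    if lo >= hi:
        return -1                           # base case: not found
    mid = (lo + hi) // 2
    if sorted_lst[mid] == target:
        return mid
    if sorted_lst[mid] < target:            # recurse on the right half
        return binary_search(sorted_lst, target, mid + 1, hi)
    return binary_search(sorted_lst, target, lo, mid)

data = [1, 2, 3, 4, 5, 6, 7, 8]
print(binary_search(data, 7))  # 6
print(binary_search(data, 9))  # -1
```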
Neighbors
Cross-references
🔢 Discrete Math Ch.2 — induction: the proof technique behind recursive correctness
🐱 Milewski Ch.19 — F-algebras and catamorphisms: recursion as a universal construction