Recursion & The Master Theorem

Many divide-and-conquer algorithms have a running time expressed as a recurrence relation. The Master Theorem provides a "cookbook" method for solving recurrence relations of a specific form, allowing us to determine their time complexity without drawing out a full recurrence tree.

The theorem applies to recurrences of the form:

T(n) = aT(n/b) + f(n)
where:

- `a ≥ 1` is the number of subproblems.
- `b > 1` is the factor by which the input size is reduced.
- `f(n)` is the cost of the work done outside the recursive calls (dividing and combining).
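
As a concrete instance, merge sort splits its input into `a = 2` halves of size `n/2` and does `Θ(n)` work to merge them, giving `T(n) = 2T(n/2) + Θ(n)`. Below is a minimal Python sketch (function names are illustrative):

```python
def merge_sort(arr):
    """Sort a list; running time follows T(n) = 2T(n/2) + Theta(n)."""
    if len(arr) <= 1:                 # base case: constant work
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])      # a = 2 subproblems of size n/b = n/2
    right = merge_sort(arr[mid:])
    return merge(left, right)         # f(n) = Theta(n) combine step

def merge(left, right):
    """Merge two sorted lists in linear time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```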

Visualizing the Recurrence Tree

[Recurrence tree diagram] The root does `f(n)` work and spawns `a` subproblems of size `n/b`, so level 1 does `a * f(n/b)` work in total. The tree has height `log_b n`, and the number of leaves (base cases) is `a^(log_b n) = n^(log_b a)`.
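
Summing level by level, level `i` contributes `a^i * f(n/b^i)` work. The short sketch below (the helper name `level_work` is illustrative) tabulates these contributions for the sample recurrence `T(n) = 2T(n/2) + n`, where every level contributes the same `Θ(n)` amount:

```python
def level_work(n, a, b, f):
    """Work a^i * f(n / b^i) contributed by each level i of the recursion tree."""
    levels = []
    i = 0
    while n / b**i >= 1:              # stop once subproblems drop below size 1
        levels.append(a**i * f(n / b**i))
        i += 1
    return levels

# Example: T(n) = 2T(n/2) + n -- every level does 64 units of work
print(level_work(64, 2, 2, lambda m: m))   # [64.0, 64.0, 64.0, 64.0, 64.0, 64.0, 64.0]
```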

The Three Cases

To use the theorem, we compare `f(n)` with `n^(log_b a)`. The asymptotically larger of the two (or the fact that they are comparable) determines which case applies; note that the three cases do not cover every `f(n)`, since Cases 1 and 3 require a polynomial gap between the two functions.

Case 1: `f(n)` is polynomially smaller

If `f(n) = O(n^(log_b a - ε))` for some constant `ε > 0`, then the work at the leaves dominates.

T(n) = Θ(n^(log_b a))
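
For example, `T(n) = 8T(n/2) + n` has `n^(log_2 8) = n^3`. Since `f(n) = n = O(n^(3 - ε))` (take `ε = 1`), Case 1 applies and `T(n) = Θ(n^3)`.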

Case 2: `f(n)` is comparable

If `f(n) = Θ(n^(log_b a) * log^k n)` for some constant `k ≥ 0`, then the work is distributed evenly across the tree.

T(n) = Θ(n^(log_b a) * log^(k+1) n)
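
For example, merge sort's recurrence `T(n) = 2T(n/2) + n` has `n^(log_2 2) = n`, so `f(n) = Θ(n * log^0 n)` with `k = 0`, and Case 2 gives `T(n) = Θ(n log n)`.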

Case 3: `f(n)` is polynomially larger

If `f(n) = Ω(n^(log_b a + ε))` for some `ε > 0`, and if the "regularity condition" `a * f(n/b) ≤ c * f(n)` holds for some constant `c < 1` and sufficiently large `n`, then the work at the root dominates.

T(n) = Θ(f(n))
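
For example, `T(n) = 2T(n/2) + n^2` has `n^(log_2 2) = n` and `f(n) = n^2 = Ω(n^(1 + ε))` (take `ε = 1`). The regularity condition holds with `c = 1/2`, since `2 * (n/2)^2 = n^2 / 2`, so `T(n) = Θ(n^2)`.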

Practice Tool

Enter the components of your recurrence `T(n) = aT(n/b) + f(n)` to solve it.
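
For comparison, here is a minimal sketch of such a solver, assuming the driving function is a plain polynomial `f(n) = n^d` (in that special case the regularity condition of Case 3 holds automatically and Case 2 is checked with `k = 0`):

```python
import math

def solve_master(a, b, d):
    """Solve T(n) = a*T(n/b) + n^d with the Master Theorem (polynomial f only)."""
    if a < 1 or b <= 1 or d < 0:
        raise ValueError("need a >= 1, b > 1, d >= 0")
    crit = math.log(a, b)                  # critical exponent log_b(a)
    if math.isclose(d, crit):
        return f"Theta(n^{d:g} * log n)"   # Case 2 (k = 0)
    if d < crit:
        return f"Theta(n^{crit:g})"        # Case 1: leaves dominate
    return f"Theta(n^{d:g})"               # Case 3: root dominates

# Illustrative runs:
print(solve_master(2, 2, 1))   # Theta(n^1 * log n)  -- merge sort
print(solve_master(8, 2, 1))   # Theta(n^3)
print(solve_master(2, 2, 2))   # Theta(n^2)
```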