Recursion & The Master Theorem
Many divide-and-conquer algorithms have running times expressed as recurrence relations. The Master Theorem provides a "cookbook" method for solving recurrences of a specific form, letting us determine their asymptotic complexity without expanding a full recurrence tree.
The theorem applies to recurrences of the form `T(n) = aT(n/b) + f(n)`, where:
`a ≥ 1`: the number of subproblems.
`b > 1`: the factor by which the input size is reduced.
`f(n)`: the cost of the work done outside the recursive calls (dividing and combining).
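For instance, merge sort fits this form with `a = 2` (two recursive calls), `b = 2` (each call gets half the input), and `f(n) = Θ(n)` for the merge step. A minimal sketch:

```python
def merge_sort(arr):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    # a = 2 recursive calls, each on n/b = n/2 elements.
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # f(n) = Theta(n): merging the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```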
Visualizing the Recurrence Tree
Expanding the recurrence produces a tree: the root does `f(n)` work, its `a` children each solve a subproblem of size `n/b`, and so on down to the base case. The tree has depth `log_b n`, and the number of leaves is `a^(log_b n) = n^(log_b a)`; this is why `n^(log_b a)` is the benchmark against which the theorem compares `f(n)`.
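To make the tree concrete, the helper below (an illustrative sketch, not from the original; the name `level_costs` and its signature are my own) tabulates the work done at each level for given `a`, `b`, and `f`:

```python
def level_costs(a, b, f, n):
    """Work done at each level of the recurrence tree for T(n) = a*T(n/b) + f(n).

    Level i has a**i subproblems, each of size n / b**i.
    Assumes integer n and b for computing the tree depth.
    """
    # Depth of the tree: how many times we can divide n by b before hitting 1.
    depth, m = 0, n
    while m > 1:
        m //= b
        depth += 1
    # Total work at level i = (number of subproblems) * (cost per subproblem).
    return [a**i * f(n / b**i) for i in range(depth + 1)]

# For T(n) = 2T(n/2) + n, every level costs exactly n,
# and there are Theta(log n) levels: total Theta(n log n).
print(level_costs(2, 2, lambda m: m, 16))  # → [16.0, 16.0, 16.0, 16.0, 16.0]
```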
The Three Cases
To use the theorem, we compare `f(n)` with `n^(log_b a)`. The larger of the two functions determines the solution.
Case 1: `f(n)` is polynomially smaller
If `f(n) = O(n^(log_b a - ε))` for some constant `ε > 0`, then the work at the leaves dominates.
T(n) = Θ(n^(log_b a))
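As a worked example (my own, not from the original), take `T(n) = 8T(n/2) + n^2`:

```latex
% Case 1: a = 8, b = 2, f(n) = n^2
n^{\log_b a} = n^{\log_2 8} = n^3, \qquad
f(n) = n^2 = O\!\left(n^{3-\varepsilon}\right) \text{ with } \varepsilon = 1
\quad\Rightarrow\quad T(n) = \Theta(n^3)
```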
Case 2: `f(n)` is comparable
If `f(n) = Θ(n^(log_b a) * log^k n)` for some constant `k ≥ 0`, then the work is distributed evenly across the tree.
T(n) = Θ(n^(log_b a) * log^(k+1) n)
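Merge sort is the classic instance of this case (a worked example, added here for illustration):

```latex
% Case 2: T(n) = 2T(n/2) + \Theta(n)
n^{\log_b a} = n^{\log_2 2} = n, \qquad
f(n) = \Theta\!\left(n \log^{0} n\right) \;(k = 0)
\quad\Rightarrow\quad T(n) = \Theta(n \log n)
```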
Case 3: `f(n)` is polynomially larger
If `f(n) = Ω(n^(log_b a + ε))` for some `ε > 0`, and if the "regularity condition" `a * f(n/b) ≤ c * f(n)` holds for some constant `c < 1` and sufficiently large `n`, then the work at the root dominates.
T(n) = Θ(f(n))
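A worked instance of this case (illustrative, not from the original): consider `T(n) = 2T(n/2) + n^2`.

```latex
% Case 3: a = 2, b = 2, f(n) = n^2
n^{\log_b a} = n^{\log_2 2} = n, \qquad
f(n) = n^2 = \Omega\!\left(n^{1+\varepsilon}\right) \text{ with } \varepsilon = 1
```

The regularity condition holds with `c = 1/2`:

```latex
a\,f(n/b) = 2\left(\tfrac{n}{2}\right)^{2} = \tfrac{n^2}{2} \le \tfrac{1}{2}\,f(n)
\quad\Rightarrow\quad T(n) = \Theta(n^2)
```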
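The three cases can be mechanized for simple driving functions. The sketch below is illustrative only: it assumes `f(n) = Θ(n^d · log^k n)`, a form for which the Case 3 regularity condition holds automatically, and the function name `master_theorem` is my own, not from the original.

```python
import math

def master_theorem(a, b, d, k=0):
    """Solve T(n) = a*T(n/b) + Theta(n^d * log^k n) via the Master Theorem.

    Returns a human-readable Theta bound. Only handles f(n) of this
    polynomial-times-polylog form.
    """
    crit = math.log(a, b)  # the critical exponent log_b a
    if abs(d - crit) <= 1e-9:
        # Case 2: f(n) comparable to n^(log_b a), gain one log factor.
        return f"Theta(n^{crit:g} * log^{k + 1} n)"
    if d < crit:
        # Case 1: leaves dominate.
        return f"Theta(n^{crit:g})"
    # Case 3: root dominates; regularity holds for this form of f.
    return f"Theta(n^{d:g} * log^{k} n)" if k else f"Theta(n^{d:g})"

print(master_theorem(2, 2, 1))  # merge sort → Theta(n^1 * log^1 n)
print(master_theorem(8, 2, 2))  # → Theta(n^3)
print(master_theorem(2, 2, 2))  # → Theta(n^2)
```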