Growth of divide-and-conquer recursion: master theorem (CSCI 2824, Spring 2015)

In this lecture we introduce divide-and-conquer recursions and the master theorem for estimating their growth.

(Section 4.8 of the textbook)

A divide-and-conquer recursion is a recursive sequence $\{T(n)\}_n$ of the form

$$T(n) = L \cdot T\!\left(\left\lceil \frac{n}{K} \right\rceil\right) + g(n), \qquad T(1) = \text{some positive constant},$$

where $L \geq 1$, $K > 1$ and $g:\mathbb{N}\to\mathbb{N}$. Loosely speaking, a divide-and-conquer recursion captures the number of operations performed by a divide-and-conquer algorithm on a problem of size $n$. As a recursive algorithm, divide-and-conquer proceeds by repeatedly breaking the current problem down into smaller subproblems of the same nature, until the subproblems are small enough that their solutions are "trivial".

Simple examples of divide-and-conquer include binary search and merge sort.
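To make this concrete, here is a minimal Python sketch of merge sort (our own illustration, not taken from the textbook): it splits the input into $L = 2$ subproblems of size about $n/2$ (so $K = 2$), and the merge step does linear work, so $g(n) \in \Theta(n)$.

```python
def merge_sort(a):
    """Sort a list by divide and conquer: two subproblems of half the
    size (L = 2, K = 2), plus a linear-time merge (g(n) in Theta(n))."""
    if len(a) <= 1:                  # base case: trivially sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])       # first subproblem
    right = merge_sort(a[mid:])      # second subproblem
    # merge step: this is the g(n) work, linear in len(a)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

print(merge_sort([5, 2, 8, 1, 9, 3]))   # [1, 2, 3, 5, 8, 9]
```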

Each divide-and-conquer algorithm is characterized by the following features:

  • the work needed to split the problem into smaller subproblems of the same nature and to combine their solutions (the amount of such work is captured by the function $g$);

  • the number $L \geq 1$ of subproblems that need to be solved after each break-down;

  • the size of each subproblem, usually given by $\left\lceil \frac{n}{K} \right\rceil$ or $\left\lfloor \frac{n}{K} \right\rfloor$. (It is very common to drop the floor or ceiling in the expression for the recursion $T(\cdot)$, by abuse of notation.)

With the roles of $L$, $K$, $g$ in mind, it is not difficult to understand the intuition behind the divide-and-conquer recursion. For more information on divide-and-conquer algorithms, see e.g. the Wikibook chapter.
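To see the recursion itself in action, the sketch below (the helper make_T is hypothetical, written just for illustration) evaluates $T(n)$ directly from given $L$, $K$, and $g$.

```python
from functools import lru_cache
from math import ceil

def make_T(L, K, g, T1=1):
    """Build the recursion T(n) = L * T(ceil(n/K)) + g(n), T(1) = T1."""
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return T1                      # base case T(1)
        return L * T(ceil(n / K)) + g(n)   # L subproblems of size ~n/K
    return T

# e.g. binary search: one half-size subproblem, constant work per level
T_search = make_T(L=1, K=2, g=lambda n: 1)
print(T_search(1024))   # 11, i.e. about log2(n) operations
```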

What we are interested in is the growth of divide-and-conquer recursions: the growth of such a recursion tells us the efficiency of the corresponding divide-and-conquer algorithm. It turns out that the growth of $T(n)$ depends on whether the function $g(n)$ grows faster than, at the same rate as, or slower than the polynomial $n^q$, where $q = \log_K(L)$.

Master theorem

Consider the recursive function $T:\mathbb{N}\to\mathbb{R}$ defined by

$$T(n) = L \cdot T\!\left(\left\lceil \frac{n}{K} \right\rceil\right) + g(n), \qquad T(1) = \text{some positive constant},$$

with $L \geq 1$, $K > 1$ and $g:\mathbb{N}\to\mathbb{N}$.

Define $q \triangleq \log_K(L)$. Then the following is true.

  • Case 1: If $g \in O(n^p)$ for some $0 < p < q$, then $T \in \Theta(n^q)$.

  • Case 2: If $g \in \Theta(n^q)$, then $T \in \Theta(n^q \log(n))$.

  • Case 3: If $g \in \Omega(n^r)$ for some $r > q$, then $T \in \Theta(g)$. (Strictly speaking, this case also needs a mild regularity condition, e.g. $L \cdot g(\lceil n/K \rceil) \leq c \cdot g(n)$ for some constant $c < 1$ and all large $n$; it holds for the polynomial work functions in the examples below.)
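When $g$ is a pure power $c \cdot n^p$ (as in all the examples below), applying the theorem reduces to comparing $p$ with $q$. The helper below is a hypothetical sketch of that comparison, not a general implementation.

```python
from math import log2, isclose

def classify(L, K, p):
    """Classify T(n) = L*T(ceil(n/K)) + Theta(n^p), assuming p > 0
    and that g is (a constant multiple of) the power n^p."""
    q = log2(L) / log2(K)          # q = log_K(L)
    if isclose(p, q):              # Case 2: g in Theta(n^q)
        return f"Theta(n^{q:g} * log n)"
    if p < q:                      # Case 1: g grows slower than n^q
        return f"Theta(n^{q:g})"
    return f"Theta(n^{p:g})"       # Case 3: g dominates

print(classify(2, 2, 1))   # merge sort: Theta(n^1 * log n)
print(classify(8, 2, 2))   # Example 4 below: Theta(n^3)
```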

Example 1

We estimate the growth rate of the recursive function $T(n) = 2T\!\left(\left\lceil \frac{n}{2} \right\rceil\right) + n$.

  • Take $L = 2$, $K = 2$, $g(n) = n$.

  • Then $q \triangleq \log_K(L) = \log_2(2) = 1$.

  • Note that $g(n) = n \in \Theta(n^q)$.

  • Hence we end up with Case 2, i.e., $T \in \Theta(n \log(n))$.
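As a quick numerical sanity check (our own illustration), the sketch below evaluates this recursion directly; by Case 2 the ratio $T(n)/(n \log_2 n)$ should level off at a constant, and it does (it tends to 1 here).

```python
from functools import lru_cache
from math import ceil, log2

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(ceil(n/2)) + n, with T(1) = 1
    return 1 if n <= 1 else 2 * T(ceil(n / 2)) + n

for n in (2**6, 2**10, 2**14):
    print(n, T(n) / (n * log2(n)))   # 1.166..., 1.1, 1.071...: flattens toward 1
```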

Example 2

We estimate the growth rate of the recursive function $T(n) = 2T\!\left(\left\lceil \frac{n}{2} \right\rceil\right) + n^2$.

  • Take $L = 2$, $K = 2$, $g(n) = n^2$.

  • Then $q \triangleq \log_K(L) = \log_2(2) = 1$.

  • Note that $g(n) = n^2 \in \Omega(n^r)$ for $r = 2 > q = 1$.

  • Hence we end up with Case 3, i.e., $T \in \Theta(n^2)$.
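The same kind of check works here (again our own illustration): by Case 3 the ratio $T(n)/n^2$ should approach a constant, and numerically it tends to 2.

```python
from functools import lru_cache
from math import ceil

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(ceil(n/2)) + n**2, with T(1) = 1
    return 1 if n <= 1 else 2 * T(ceil(n / 2)) + n**2

for n in (2**6, 2**10, 2**14):
    print(n, T(n) / n**2)   # approaches 2, consistent with Theta(n^2)
```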

Example 3

We estimate the growth rate of the recursive function $T(n) = 2T\!\left(\left\lceil \frac{n}{2} \right\rceil\right) + \log(n)$.

  • Take $L = 2$, $K = 2$, $g(n) = \log(n)$.

  • Then $q \triangleq \log_K(L) = \log_2(2) = 1$.

  • Note that $g(n) = \log(n) \in O(n^p)$ for any $0 < p < 1$, i.e., $g$ grows slower than $n^q = n$.

  • Hence we end up with Case 1, i.e., $T \in \Theta(n)$.
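Checking numerically once more (our own illustration, taking the logarithm base 2 for concreteness, which only changes the constant): by Case 1 the ratio $T(n)/n$ should approach a constant; here it tends to 3.

```python
from functools import lru_cache
from math import ceil, log2

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(ceil(n/2)) + log2(n), with T(1) = 1
    return 1 if n <= 1 else 2 * T(ceil(n / 2)) + log2(n)

for n in (2**6, 2**10, 2**14):
    print(n, T(n) / n)   # approaches 3, consistent with Theta(n)
```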

Example 4

We estimate the growth rate of the recursive function $T(n) = 8T\!\left(\left\lceil \frac{n}{2} \right\rceil\right) + 1000n^2$.

  • Take $L = 8$, $K = 2$, $g(n) = 1000n^2$.

  • Then $q \triangleq \log_K(L) = \log_2(8) = 3$.

  • Note that $g(n) = 1000n^2 \in O(n^p)$ for $p = 2 < q = 3$, so $g$ grows slower than $n^q = n^3$.

  • Hence we end up with Case 1, i.e., $T \in \Theta(n^3)$.
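And one last numerical check (our own illustration): by Case 1 the ratio $T(n)/n^3$ should level off; here it tends to 1001, showing that the large constant in $g$ affects only the constant factor, not the growth rate.

```python
from functools import lru_cache
from math import ceil

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 8*T(ceil(n/2)) + 1000*n**2, with T(1) = 1
    return 1 if n <= 1 else 8 * T(ceil(n / 2)) + 1000 * n**2

for n in (2**6, 2**10, 2**14):
    print(n, T(n) / n**3)   # approaches 1001, consistent with Theta(n^3)
```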