From charlesreid1

Statement of Master Theorem

The Master Theorem analyzes divide-and-conquer recurrences of the form T(n) = a T(\frac{n}{b}) + f(n), where a \geq 1 and b > 1. There are three distinct cases:

Case 1: Subproblems Dominate (Leaf-Heavy)

If f(n) = O( n^{\log_b{a} - \epsilon} ) for some constant \epsilon > 0, then T(n) = \Theta( n^{\log_b{a}} )

Case 2: Split/Recombine ~ Subproblems

If f(n) = \Theta( n^{\log_b{a}} \log^{k}{n} ) for some constant k \geq 0, then T(n) = \Theta( n^{\log_b{a}} \log^{k+1}{n} ) (note that usually k=0)

Case 3: Split/Recombine Dominates (Root Heavy)

If f(n) = \Omega( n^{\log_b{a} + \epsilon} ) for some constant \epsilon > 0, and if a f(\frac{n}{b}) \leq c f(n) for some constant c < 1 and all sufficiently large n (the regularity condition), then T(n) = \Theta( f(n) )
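The three cases can be sketched as a small decision procedure. This is my own sketch, not part of the original statement, restricted to polynomial driving functions f(n) = n^d, for which the Case 3 regularity condition holds automatically:

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d), for polynomial f(n) = n^d.

    Simplified sketch: with f polynomial, the epsilon conditions
    reduce to comparing d against the critical exponent log_b(a).
    """
    c = math.log(a, b)            # critical exponent log_b(a)
    if d < c:                     # Case 1: subproblems (leaves) dominate
        return f"Theta(n^{c:.3f})"
    elif d == c:                  # Case 2 with k = 0
        return f"Theta(n^{c:.3f} * log n)"
    else:                         # Case 3: split/recombine (root) dominates
        return f"Theta(n^{d})"

print(master_theorem(2, 2, 1))    # mergesort-style recurrence: Case 2
```

For example, master_theorem(2, 2, 1) lands in Case 2, matching the classic mergesort recurrence.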

Revisiting Big O, Big Theta, Big Omega

Big O

When we say

f(n) = O(g(n))

what we mean is, there are some constants c > 0, n_0 > 0 such that

0 \leq f(n) \leq c g(n)

for all n \geq n_0.

We say that f(n) is bounded above by g(n).

Example: 2n^2 = O(n^3)
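This example can be checked numerically. The witnesses c = 1 and n_0 = 2 satisfy the definition, since 2n^2 \leq n^3 rearranges to n \geq 2 (a quick sketch, not part of the original notes):

```python
# Witnesses for 2n^2 = O(n^3): c = 1, n_0 = 2.
# 2n^2 <= 1 * n^3 rearranges to n >= 2.
assert all(2 * n**2 <= 1 * n**3 for n in range(2, 10_000))
print("2n^2 <= n^3 holds for all checked n >= 2")
```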

Big O corresponds roughly to less than or equal to

Note that this equal sign notation is NOT symmetric - n^3 != O(n^2)

We can also think about a set definition - f(n) is in some set of functions that are like g(n)

O(g(n)) can be defined as a set of functions h(n) such that:

\{ h(n) : \exists c > 0, n_0 > 0 \text{ such that } 0 \leq h(n) \leq c g(n) \text{ for all } n \geq n_0 \}

Macro Convention

Macro convention - a set appearing in a formula represents an anonymous function in that set.

This is analogous to anonymous (lambda) functions in functional programming languages.


f(n) = n^3 + O(n^2)

This means there is a function h(n) in the set O(n^2) such that f(n) = n^3 + h(n).

That is, the O(n^2) term represents SOME function in that set.

Can also read statements as having a leading "for all" on the left-hand side and a "there exists" on the right-hand side, and read the equal sign as "is":

n^2 + O(n) = O(n^2)

is a true statement, but the reverse is not true.

This means, for any f(n) in O(n), there exists some function g(n) in O(n^2) such that if you add n^2 to the function f(n), you get g(n).

Big Omega Notation

f(n) = \Omega( g(n) )

means f(n) is AT LEAST of order g(n):

\exists c > 0, n_0 > 0 : 0 \leq c g(n) \leq f(n)

for sufficiently large n.

Example: \sqrt{n} = \Omega(\log{n})
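The witnesses here can be checked numerically as well; c = 1 and n_0 = 1 work, since \sqrt{n} \geq \log{n} for all n \geq 1 (a sketch of mine, using the natural log):

```python
import math

# Witnesses for sqrt(n) = Omega(log n): c = 1, n_0 = 1.
# sqrt(n) - log(n) is minimized at n = 4, where it is 2 - log(4) > 0.
assert all(math.sqrt(n) >= 1 * math.log(n) for n in range(1, 10_000))
print("sqrt(n) >= log(n) holds for all checked n >= 1")
```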

Big Theta Notation

f(n) = \Theta( g(n) ) = \Omega( g(n) ) \bigcap O(g(n))

means f(n) grows the same as g(n):

\exists c_1 > 0, c_2 > 0, n_0 > 0 : c_1 g(n) \leq f(n) \leq c_2 g(n)

for all n \geq n_0.


Example 1

Suppose we have a recursion relation:

T(n) = 2T(\frac{n}{2}) + 10n

For the Master Theorem, we start by evaluating c = \log_{b}{a}:

c = \log_{b}{a} = \log_{2}{2} = 1

n^c = n^1 = n

Comparing this with f(n) = 10n we find that these two functions have the same form. Specifically, we have:

f(n) = \Theta( n^c \log^{k}{n} )

for c = 1 and k = 0.

Now our final statement is that we have case 2 (split/merge and subproblem solution steps are balanced):

T(n) = \Theta( n \log{n} )

This corresponds to the case where the function n^c, which represents the computational cost of solving the subproblems, grows at the same rate as the merge step, f(n), which in this case was 10n.
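As a sanity check (my addition, assuming a base case T(1) = 1), iterating the recurrence shows T(n) / (n \log_2{n}) settling toward the constant 10 from f(n) = 10n:

```python
import math

def T(n):
    # T(n) = 2 T(n/2) + 10n, with an assumed base case T(1) = 1
    if n <= 1:
        return 1
    return 2 * T(n // 2) + 10 * n

# For n = 2^k, unrolling gives T(n) = n * (1 + 10k), so
# T(n) / (n * log2(n)) = (1 + 10k) / k, which tends to 10.
for k in (10, 15, 20):
    n = 2**k
    print(n, T(n) / (n * math.log2(n)))
```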

Example 2

T(n) = 2 T(\frac{n}{2}) + \frac{n}{log n}

In this case, the Master Theorem does not apply: the gap between n^{\log_b{a}} = n and f(n) = \frac{n}{\log{n}} is a factor of \log{n}, which is asymptotically smaller than n^{\epsilon} for every \epsilon > 0, so the difference is non-polynomial.

Example 3

T(n) = 2T(\frac{n}{4}) + n^{0.51}

We start with calculating c = log_b a:

c = \log_{b}{a} = \log_{4}{2} = \frac{1}{2}

So now we compare f(n) = n^{0.51} to n^c = \sqrt{n}

We can see that

f(n) = n^{0.51} = \Omega( n^{0.5 + \epsilon} ) \text{ for } \epsilon = 0.01

so we have Case 3

We need to check the regularity condition:

2 f( \frac{n}{4} ) \leq c f(n)

2 \left( \dfrac{n}{4} \right)^{0.51} = \dfrac{2}{4^{0.51}} n^{0.51} \approx 0.986 \, n^{0.51} \leq c \, n^{0.51}

which is true for any constant c with 0.986 \leq c < 1.

So the final answer is, Case 3:

T(n) = \Theta(n^{0.51})
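Numerically (my addition, with an assumed base case T(1) = 1), the ratio T(n) / n^{0.51} stays bounded, creeping toward 1 / (1 - 2/4^{0.51}) \approx 72.6; convergence is slow precisely because the regularity constant c \approx 0.986 is so close to 1:

```python
def T(n):
    # T(n) = 2 T(n/4) + n^0.51, with an assumed base case T(1) = 1
    if n <= 1:
        return 1.0
    return 2 * T(n // 4) + n ** 0.51

# For n = 4^k the ratio r_k = T(n) / n^0.51 satisfies
# r_k = (2 / 4^0.51) * r_{k-1} + 1, a contraction since 2/4^0.51 < 1.
for k in (5, 10, 15):
    n = 4**k
    print(n, T(n) / n ** 0.51)
```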

Example 4

T(n) = 0.5 T(\frac{n}{2}) + \frac{1}{n}

Careful - the Master Theorem does not apply here! a = 0.5 < 1 violates the assumption a \geq 1 (there must be at least one subproblem).

Example 5

T(n) = 16 T(\frac{n}{4}) + n!

Intuitively, you can identify off the bat that the factorial is probably going to dominate. Proving it mathematically:

a = 16, \quad b = 4, \quad c = \log_{b}{a} = \log_{4}{16} = 2

Now we compare f(n) = n! to n^c = n^2:

f(n) = n! = \Omega( n^{2 + \epsilon} ) \text{ for any constant } \epsilon > 0

We have case 3, so we check the regularity condition:

16 f(\frac{n}{4}) \leq c f(n)

16 \left( \dfrac{n}{4} \right)! \leq c \, n!

This holds for all sufficiently large n: n! exceeds \left( \frac{n}{4} \right)! by a factor that grows without bound, so the left side is eventually smaller than c \, n! for any fixed 0 < c < 1.

Therefore, final answer is:

Case 3

T(n) = \Theta(n!)
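A quick numeric check (my addition, with an assumed base case T(1) = 1): the ratio T(n) / n! is already indistinguishable from 1 at modest n, since the root-level n! term swamps everything below it. (n is kept small so the ratio fits in a float.)

```python
import math

def T(n):
    # T(n) = 16 T(n/4) + n!, with an assumed base case T(1) = 1
    if n <= 1:
        return 1
    return 16 * T(n // 4) + math.factorial(n)

# T(n) / n! -> 1: the 16 * T(n/4) term is vanishingly small next to n!.
for n in (16, 64):
    print(n, T(n) / math.factorial(n))
```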

Example 6

T(n) = \sqrt{2} \quad T \left( \dfrac{n}{2} \right) + \log{n}

a = \sqrt{2} = 2^{\frac{1}{2}}, \quad b = 2, \quad c = \log_{2}{\left( 2^{\frac{1}{2}} \right)} = \frac{1}{2}

Now we compare f(n) = \log{n} with n^c = n^{\frac{1}{2}}

If we compare the graphs of these two functions, the square root grows faster: \log{n} = O(\sqrt{n}).

In fact the gap is polynomial: \log{n} = O( n^{\frac{1}{2} - \epsilon} ) for any constant 0 < \epsilon < \frac{1}{2}, so f(n) = O( n^{c - \epsilon} ).

Final answer:

Case 1

T(n) = \Theta(\sqrt{n})
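Checking numerically (my addition, with an assumed base case T(1) = 1): T(n) / \sqrt{n} approaches a constant, consistent with Case 1:

```python
import math

def T(n):
    # T(n) = sqrt(2) * T(n/2) + log(n), with an assumed base case T(1) = 1
    if n <= 1:
        return 1.0
    return math.sqrt(2) * T(n // 2) + math.log(n)

# For n = 2^k, T(n)/sqrt(n) gains only k*log(2)/2^(k/2) per level,
# a convergent series, so the ratio settles to a constant.
for k in (10, 20, 30):
    n = 2**k
    print(n, T(n) / math.sqrt(n))
```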

Example 7

T(n) = 3 T \left( \frac{n}{2} \right) + n

Using the values of a and b, c = \log_{b}{a} = \log_{2}{3} \approx 1.58

Now we compare f(n) = n with n^{\log_{2}{3}}:

f(n) = n = O( n^{\log_{2}{3} - \epsilon} ) \text{ for any constant } 0 < \epsilon \leq \log_{2}{3} - 1

Therefore we have Case 1:

T(n) = \Theta( n^{\log_{2}{3}} )
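Numerically (my addition, with an assumed base case T(1) = 1): for n = 2^k, unrolling gives T(n) = 3^{k+1} - 2^{k+1}, so T(n) / n^{\log_2{3}} = T(n) / 3^k tends to 3:

```python
import math

def T(n):
    # T(n) = 3 T(n/2) + n, with an assumed base case T(1) = 1
    if n <= 1:
        return 1
    return 3 * T(n // 2) + n

# T(2^k) = 3^(k+1) - 2^(k+1), so T(n) / n^(log2 3) -> 3.
for k in (10, 20, 30):
    n = 2**k
    print(n, T(n) / n ** math.log2(3))
```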