
Statement of Master Theorem

The Master Theorem analyzes divide-and-conquer recurrences of the form T(n) = a T(n/b) + f(n), where a >= 1 and b > 1 are constants and f(n) is the cost of splitting the problem and recombining the results. It distinguishes three cases:

Case 1: Subproblems Dominate (Leaf-Heavy)

If f(n) = O(n^(log_b(a) - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b(a))).

Case 2: Split/Recombine ~ Subproblems

If f(n) = Θ(n^(log_b(a)) (log n)^k) for some constant k >= 0, then T(n) = Θ(n^(log_b(a)) (log n)^(k+1)) (note that usually k = 0).

Case 3: Split/Recombine Dominates (Root Heavy)

If f(n) = Ω(n^(log_b(a) + ε)) for some constant ε > 0, and if a f(n/b) <= c f(n) for some c < 1 and all sufficiently large n (the regularity condition), then T(n) = Θ(f(n)).
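As a quick illustration (not part of the original statement), here is a small Python sketch that classifies recurrences T(n) = a T(n/b) + f(n) whose driving term is a plain polynomial, f(n) = Θ(n^d). For polynomial f(n), the three cases reduce to comparing d against log_b(a), and the Case 3 regularity condition holds automatically.

    import math

    def master_theorem_polynomial(a, b, d):
        """Classify T(n) = a*T(n/b) + Theta(n^d) by the Master Theorem."""
        crit = math.log(a, b)   # critical exponent log_b(a)
        if math.isclose(d, crit):
            # Case 2 with k = 0: the driving term matches n^(log_b(a))
            return "Case 2: T(n) = Theta(n^%g log n)" % d
        elif d < crit:
            # Case 1: the subproblem (leaf) work dominates
            return "Case 1: T(n) = Theta(n^%g)" % crit
        else:
            # Case 3: the split/recombine work dominates; for f(n) = n^d the
            # regularity condition a*f(n/b) <= c*f(n) holds with c = a/b**d < 1
            return "Case 3: T(n) = Theta(n^%g)" % d

    print(master_theorem_polynomial(2, 2, 1))   # e.g. merge sort: Theta(n log n)
    print(master_theorem_polynomial(8, 2, 2))   # Theta(n^3)
    print(master_theorem_polynomial(2, 2, 2))   # Theta(n^2)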

Revisiting Big O, Big Theta, Big Omega

Big O

When we say f(n) = O(g(n)),

what we mean is, there are some constants c > 0 and n0 such that

f(n) <= c g(n)

for all sufficiently large n (that is, for all n >= n0).

We say that f(n) is bounded above by g(n).

Example: 2n^2 = O(n^3)

Big O corresponds roughly to less than or equal to

Note that this equal-sign notation is NOT symmetric: n^2 = O(n^3), but n^3 != O(n^2).

We can also think about a set definition: f(n) is in a set of functions that grow no faster than g(n).

O(g(n)) can be defined as the set of functions h(n) such that there exist constants c > 0 and n0 > 0 with 0 <= h(n) <= c g(n) for all n >= n0.
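To make the constants concrete, here is a small Python check (illustrative only, not from the original page) of witness constants for 2n^2 = O(n^3), and of the fact that no constant works in the reverse direction.

    # Witnesses for 2n^2 = O(n^3): c = 2 and n0 = 1 work.
    c, n0 = 2, 1
    assert all(2 * n**2 <= c * n**3 for n in range(n0, 10000))

    # The reverse fails: for any fixed c, n^3 eventually exceeds c * n^2.
    # With c = 1000 the inequality n^3 <= c * n^2 first breaks at n = 1001.
    c = 1000
    print(next(n for n in range(1, 10000) if n**3 > c * n**2))   # 1001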

Macro Convention

Macro convention: a set appearing in a formula represents an anonymous function in that set.

Note that this observation is the key mathematical underpinning of functional programming and function-based languages.

Example: f(n) = n^3 + O(n^2)

This means there is a function h(n) in the set O(n^2) such that f(n) = n^3 + h(n).

Sets used this way represent SOME function in the set, not a specific one.

We can also read such statements as having a leading "for all," and read the equal sign as "is":

n^2 + O(n) = O(n^2)

is a true statement, but the reverse is not true.

This means, for any f(n) in O(n), there exists some function g(n) in O(n^2) such that if you add n^2 to the function f(n), you get g(n).
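As a loose illustration of this convention (my own analogy, not from the original page), the O(n) on the left-hand side can be thought of as an anonymous function drawn from the set, much like a lambda in Python:

    # h is "some" member of O(n) -- the anonymous function the notation hides.
    h = lambda n: 3 * n + 7

    # The claim n^2 + O(n) = O(n^2) says that adding n^2 to any such h
    # produces a member g of O(n^2).
    g = lambda n: n**2 + h(n)

    # Witness constants for g(n) <= c * n^2: c = 2 and n0 = 10 work here.
    assert all(g(n) <= 2 * n**2 for n in range(10, 100000))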

Big Omega Notation

f(n) = Ω(g(n)) means f(n) is AT LEAST c g(n) for some constant c > 0,

for sufficiently large n (n >= n0).

Example: sqrt(n) = Ω(log n)

Big Theta Notation

f(n) = Θ(g(n)) means f(n) grows at the same rate as g(n): there are constants c_1, c_2 > 0 such that

c_1 g(n) <= f(n) <= c_2 g(n)

for n >= n0.
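A couple of numerical sanity checks of these definitions (illustrative only; the constants below are just one valid choice, not from the original page):

    import math

    # Big Omega: sqrt(n) = Omega(log n), witnessed by c = 1 and n0 = 1.
    assert all(math.sqrt(n) >= 1.0 * math.log(n) for n in range(1, 100000))

    # Big Theta: n^2/2 - 3n = Theta(n^2), sandwiched between c1*n^2 and c2*n^2
    # with c1 = 1/14, c2 = 1/2, n0 = 8.
    c1, c2, n0 = 1/14, 1/2, 8
    f = lambda n: n**2 / 2 - 3 * n
    assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 100000))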

Examples

Example 1

Suppose we have a recursion relation of the form T(n) = a T(n/b) + f(n) with f(n) = 10n and a = b (so the number of subproblems matches the shrink factor).

For the Master Theorem, we start by evaluating the critical exponent c = log_b(a) = 1, so the critical function is n^(log_b(a)) = n.

Comparing this with f(n) = 10n, we find that these two functions have the same form. Specifically, we have f(n) = Θ(n^c (log n)^k)

for c = 1 and k = 0.

Now our final statement is that we have Case 2 (split/merge and subproblem solution steps are balanced):

T(n) = Θ(n log n)

This corresponds to the case where the function n^(log_b(a)), which represents the computational cost of solving the subproblems, grows at the same rate as the merge step f(n), which in this case was 10n.
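Since the constants of the recurrence are not reproduced above, here is a hypothetical instance consistent with the analysis (f(n) = 10n and log_b(a) = 1, e.g. a = b = 2), evaluated numerically to show the Θ(n log n) behavior:

    import math
    from functools import lru_cache

    # Hypothetical instance of the Example 1 recurrence: T(n) = 2 T(n/2) + 10n
    # (any a = b gives log_b(a) = 1; the 10n driving term is from the text).
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 1
        return 2 * T(n // 2) + 10 * n

    # If T(n) = Theta(n log n), the ratio T(n) / (n log2 n) levels off at a constant.
    for k in (5, 10, 15, 20):
        n = 2**k
        print(n, T(n) / (n * math.log2(n)))   # approaches 10 from above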

Example 2

In this case, the Master Theorem does not apply - there is a non-polynomial difference between f(n) and n^(log_b(a)), so the recurrence falls into the gap between the cases.

Example 3

We start by calculating the critical exponent c = log_b(a).

So now we compare f(n) to the critical function n^(log_b(a)).

We can see that f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0,

so we have Case 3.

We need to check the regularity condition, that a f(n/b) is at most a constant fraction (< 1) of f(n) for large n,

which is true.

So the final answer is Case 3: T(n) = Θ(f(n)).
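The specific recurrence for this example is not reproduced above, so here is a hypothetical Case 3 instance, T(n) = 2 T(n/2) + n^2, with a numerical check of the regularity condition:

    # Hypothetical Case 3 instance: a = 2, b = 2, f(n) = n^2, so log_b(a) = 1
    # and f(n) = Omega(n^(1 + epsilon)) with epsilon = 1.
    a, b = 2, 2
    f = lambda n: n**2

    # Regularity condition: a f(n/b) <= c f(n) for some constant c < 1.
    # Here a f(n/b) = 2 (n/2)^2 = n^2 / 2, so c = 1/2 works for every n.
    c = 0.5
    assert all(a * f(n / b) <= c * f(n) for n in range(1, 10000))

    # Conclusion for Case 3: T(n) = Theta(f(n)) = Theta(n^2).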

Example 4

Careful - the Master Theorem does not apply here! This recurrence violates one of the assumptions of the theorem.

Example 5

Intuitively, you can tell right off the bat that the factorial term is probably going to dominate. Proving it mathematically:

Now we compare f(n) to n^(log_b(a)); the factorial grows faster than any polynomial, and in particular faster than n^(log_b(a) + ε) for any constant ε.

We have Case 3, so we check the regularity condition: a f(n/b) must be at most a constant fraction (< 1) of f(n).

This holds because the factorial of the smaller argument n/b is vastly smaller than the factorial of n, so a f(n/b) is a vanishing fraction of f(n) for large n.

Therefore, the final answer is Case 3:

T(n) = Θ(f(n)), i.e., the factorial term dominates.
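Again the original a and b are not shown; here is a hypothetical instance with the factorial driving term from the text, T(n) = 2 T(n/2) + n!, and a numerical check that the regularity condition holds:

    import math

    # Hypothetical instance for Example 5: a = 2, b = 2, f(n) = n!
    a, b = 2, 2
    f = lambda n: math.factorial(n)

    # n! outgrows every polynomial, in particular n^(log_b(a)) = n, so Case 3.
    # Regularity condition with c = 1/2, written in integer arithmetic:
    # a f(n/b) <= (1/2) f(n)  <=>  2 a f(n/b) <= f(n).
    print(all(2 * a * f(n // b) <= f(n) for n in range(4, 500)))   # True

    # Conclusion for Case 3: T(n) = Theta(f(n)) = Theta(n!).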


Example 6

Now we compare f(n) with the critical function n^(log_b(a)).

If we compare the graphs of these two functions, the square root grows slightly faster.

That means the square root, n^(log_b(a)), provides an upper bound on f(n), and f(n) = O(n^(log_b(a) - ε)).

Final answer:

Case 1: T(n) = Θ(n^(log_b(a)))
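As with the other examples, the original recurrence is not shown, so here is a hypothetical Case 1 instance in which n^(log_b(a)) is a square root, T(n) = 2 T(n/4) + log2(n), with a quick look at how small the driving term is relative to sqrt(n):

    import math

    # Hypothetical Case 1 instance: a = 2, b = 4, so n^(log_b(a)) = n^(1/2) = sqrt(n),
    # and f(n) = log2(n) = O(n^(1/2 - epsilon)) for, say, epsilon = 1/4.
    for k in (10, 20, 30, 40):
        n = 2**k
        print(n, math.log2(n) / math.sqrt(n))   # ratio f(n)/sqrt(n) shrinks toward 0

    # The driving term is polynomially smaller than sqrt(n), so Case 1 gives
    # T(n) = Theta(sqrt(n)).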

Example 7

Using the values of a and b, we compute the critical exponent log_b(a).

Now we compare f(n) with n^(log_b(a)); f(n) = O(n^(log_b(a) - ε)) for some ε > 0.

Therefore we have Case 1: T(n) = Θ(n^(log_b(a))).
