## Statement of Master Theorem

The Master Theorem solves divide-and-conquer recurrences of the form ${\displaystyle T(n)=aT(n/b)+f(n)}$ (with ${\displaystyle a\geq 1}$ subproblems of size ${\displaystyle n/b}$, ${\displaystyle b>1}$) by distinguishing three cases:

Case 1: Subproblems Dominate (Leaf-Heavy)

If ${\displaystyle f(n)=O(n^{\log _{b}{a}-\epsilon })}$ for some constant ${\displaystyle \epsilon >0}$, then ${\displaystyle T(n)=\Theta (n^{\log _{b}{a}})}$

Case 2: Split/Recombine ~ Subproblems

If ${\displaystyle f(n)=\Theta (n^{\log _{b}{a}}\log ^{k}{n})}$ for some constant ${\displaystyle k\geq 0}$, then ${\displaystyle T(n)=\Theta (n^{\log _{b}{a}}\log ^{k+1}{n})}$ (note that usually ${\displaystyle k=0}$)

Case 3: Split/Recombine Dominates (Root Heavy)

If ${\displaystyle f(n)=\Omega (n^{\log _{b}{a}+\epsilon })}$ for some constant ${\displaystyle \epsilon >0}$, and if ${\displaystyle af(n/b)\leq cf(n)}$ for some constant ${\displaystyle c<1}$ and all sufficiently large ${\displaystyle n}$ (the regularity condition), then ${\displaystyle T(n)=\Theta (f(n))}$
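For recurrences whose driving term is a plain polynomial, ${\displaystyle f(n)=\Theta (n^{d})}$, the case analysis above can be sketched as a small Python helper. This is a sketch under assumptions: the name `master_theorem` and the returned strings are my own choices, not from the notes, and for polynomial f the Case 3 regularity condition holds automatically (since ${\displaystyle a(n/b)^{d}=(a/b^{d})n^{d}}$ with ${\displaystyle a/b^{d}<1}$ when ${\displaystyle d>\log _{b}{a}}$).

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) for a >= 1, b > 1, d >= 0.

    Returns a string describing Theta(T(n)). Only handles polynomial
    f(n) = n^d; the exact float comparison below is fine for the
    textbook examples used here.
    """
    crit = math.log(a, b)          # critical exponent log_b(a)
    if d < crit:                   # Case 1: leaves (subproblems) dominate
        return f"Theta(n^{crit:.3f})"
    elif d == crit:                # Case 2 (k = 0): balanced
        return f"Theta(n^{d:g} log n)"
    else:                          # Case 3: root (split/recombine) dominates
        return f"Theta(n^{d:g})"

# merge sort: T(n) = 2T(n/2) + n  ->  Theta(n log n)
print(master_theorem(2, 2, 1))
```
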

## Revisiting Big O, Big Theta, Big Omega

### Big O

When we say

${\displaystyle f(n)=O(g(n))}$

what we mean is, there are some constants ${\displaystyle c>0,n_{0}>0}$ such that

${\displaystyle 0\leq f(n)\leq cg(n)}$

for all ${\displaystyle n\geq n_{0}}$

We say that f(n) is bounded above by a constant multiple of g(n).

Example: ${\displaystyle 2n^{2}=O(n^{3})}$

Big O corresponds roughly to "less than or equal to".

Note that this equal-sign notation is NOT symmetric: ${\displaystyle n^{3}\neq O(n^{2})}$, even though ${\displaystyle n^{2}=O(n^{3})}$.

We can also think about a set definition - f(n) is in the set of functions that grow no faster than g(n).

O(g(n)) can be defined as the set of functions h(n) such that:

${\displaystyle O(g(n))=\{h(n):\exists c>0,n_{0}>0,\ 0\leq h(n)\leq cg(n){\text{ for all }}n\geq n_{0}\}}$
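As a numerical sanity check of this definition for the earlier example ${\displaystyle 2n^{2}=O(n^{3})}$ - a finite spot check can only falsify an asymptotic claim, never prove it - the witnesses c = 2, n0 = 1 work:

```python
c, n0 = 2, 1                     # candidate witnesses
f = lambda n: 2 * n**2
g = lambda n: n**3

# 0 <= f(n) <= c*g(n) for every tested n >= n0
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
```
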

### Macro Convention

Macro convention - a set appearing inside a formula stands for an anonymous function belonging to that set.

(This use of anonymous placeholder functions is loosely analogous to anonymous functions in functional programming languages.)

Example:

${\displaystyle f(n)=n^{3}+O(n^{2})}$

This means there is a function h(n) in the set ${\displaystyle O(n^{2})}$ such that ${\displaystyle f(n)=n^{3}+h(n)}$.

The O-term represents SOME function in that set, not a specific one.

Can also read statements as having a lead "for all" and read the equal sign as "is":

${\displaystyle n^{2}+O(n)=O(n^{2})}$

is a true statement, but the reverse is not true.

This means, for any f(n) in O(n), there exists some function g(n) in O(n^2) such that if you add n^2 to the function f(n), you get g(n).
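To make this concrete with one hypothetical member of O(n), say f(n) = 5n: then ${\displaystyle n^{2}+5n\leq 6n^{2}}$ for all ${\displaystyle n\geq 1}$, so the sum lies in O(n²) with witnesses c = 6, n0 = 1. A quick spot check:

```python
f = lambda n: 5 * n   # an arbitrary member of O(n), chosen for illustration
c, n0 = 6, 1          # witnesses showing n^2 + f(n) = O(n^2)

assert all(0 <= n**2 + f(n) <= c * n**2 for n in range(n0, 10_000))
```
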

### Big Omega Notation

${\displaystyle f(n)=\Omega (g(n))}$

means f(n) grows AT LEAST as fast as g(n):

${\displaystyle \exists c>0,n_{0}>0:0\leq cg(n)\leq f(n)}$

for all ${\displaystyle n\geq n_{0}}$.

Example: ${\displaystyle {\sqrt {n}}=\Omega (\log {n})}$
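Spot-checking this example numerically with witnesses c = 1, n0 = 1, using the natural log (switching log bases only rescales the constant c):

```python
import math

# witnesses for sqrt(n) = Omega(log n): c = 1, n0 = 1
c, n0 = 1, 1
assert all(0 <= c * math.log(n) <= math.sqrt(n) for n in range(n0, 100_000))
```
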

### Big Theta Notation

${\displaystyle f(n)=\Theta (g(n))}$, where ${\displaystyle \Theta (g(n))=\Omega (g(n))\cap O(g(n))}$,

means f(n) grows at the same rate as g(n):

${\displaystyle \exists c_{1}>0,c_{2}>0,n_{0}>0:c_{1}g(n)\leq f(n)\leq c_{2}g(n)}$

for all ${\displaystyle n\geq n_{0}}$.
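For one concrete (hypothetical, not from the notes) instance: ${\displaystyle 3n^{2}+n=\Theta (n^{2})}$, with witnesses ${\displaystyle c_{1}=3,c_{2}=4,n_{0}=1}$. A quick spot check of both bounds:

```python
c1, c2, n0 = 3, 4, 1
f = lambda n: 3 * n**2 + n

# two-sided bound: c1*n^2 <= f(n) <= c2*n^2 for all tested n >= n0
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
```
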

## Examples

### Example 1

Suppose we have a recursion relation:

${\displaystyle T(n)=2T({\frac {n}{2}})+10n}$

For the master theorem, we start by evaluating log_b(a)

${\displaystyle c=\log _{b}{a}=\log _{2}{2}=1}$

${\displaystyle n^{c}=n}$

Comparing this with ${\displaystyle f(n)=10n}$ we find that these two functions have the same form. Specifically, we have:

${\displaystyle f(n)=\Theta (n^{c}\log ^{k}{n})}$

for c = 1 and k = 0.

Now our final statement is that we have case 2 (split/merge and subproblem solution steps are balanced):

${\displaystyle T(n)=\Theta (n\log n)}$

This corresponds to the case where the function ${\displaystyle n^{c}}$, which represents the computational cost of solving the subproblems, grows at the same rate as the merge step, f(n), which in this case was 10n.
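As a sanity check, we can unroll the recurrence numerically (with an assumed base case T(1) = 1, which does not affect the asymptotics) and watch ${\displaystyle T(n)/(n\log _{2}{n})}$ settle toward a constant, as ${\displaystyle \Theta (n\log n)}$ predicts:

```python
def T(n):
    # T(n) = 2*T(n/2) + 10n, with an assumed base case T(1) = 1
    return 1 if n == 1 else 2 * T(n // 2) + 10 * n

# For n = 2^k the ratio T(n) / (n log2 n) settles toward a constant (10,
# since the +10n term dominates), consistent with T(n) = Theta(n log n)
for k in (5, 10, 20):
    n = 2**k
    print(k, T(n) / (n * k))
```
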

### Example 2

${\displaystyle T(n)=2T\left({\frac {n}{2}}\right)+{\frac {n}{\log n}}}$

In this case, the Master Theorem does not apply - ${\displaystyle f(n)=n/\log n}$ differs from ${\displaystyle n^{\log _{b}{a}}=n}$ by only a ${\displaystyle \log }$ factor, which is smaller than any polynomial factor ${\displaystyle n^{\epsilon }}$, so none of the three cases fits.

### Example 3

${\displaystyle T(n)=2T({\frac {n}{4}})+n^{0.51}}$

${\displaystyle c=\log _{b}{a}=\log _{4}{2}={\frac {1}{2}}}$

So now we compare ${\displaystyle f(n)=n^{0.51}}$ to ${\displaystyle n^{c}={\sqrt {n}}}$

We can see that

${\displaystyle f(n)=\Omega (n^{0.5+\epsilon })}$ for ${\displaystyle \epsilon =0.01}$,

so we have Case 3.

We need to check the regularity condition:

${\displaystyle 2f({\frac {n}{4}})\leq cf(n)}$

${\displaystyle 2\left({\frac {n}{4}}\right)^{0.51}={\frac {2}{4^{0.51}}}n^{0.51}\leq cn^{0.51}}$

which holds for ${\displaystyle c={\frac {2}{4^{0.51}}}\approx 0.986<1}$.

So the final answer is, Case 3:

${\displaystyle T(n)=\Theta (n^{0.51})}$
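The regularity constant can also be checked numerically: the ratio ${\displaystyle 2f(n/4)/f(n)}$ is exactly ${\displaystyle 2/4^{0.51}\approx 0.986}$ for every n, so any ${\displaystyle c\in (0.986,1)}$ works:

```python
f = lambda n: n ** 0.51

# The ratio 2*f(n/4) / f(n) equals 2 / 4**0.51 ~ 0.986 for every n,
# so the regularity condition holds with c just below 1
c = 2 / 4 ** 0.51
assert c < 1
# small tolerance absorbs floating-point rounding in the two expressions
assert all(2 * f(n / 4) <= c * f(n) * (1 + 1e-12) for n in range(1, 10_000))
```
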

### Example 4

${\displaystyle T(n)=0.5T({\frac {n}{2}})+{\frac {1}{n}}}$

Careful - the Master Theorem does not apply here! ${\displaystyle a<1}$ (fewer than one subproblem) violates one of the theorem's assumptions.

### Example 5

${\displaystyle T(n)=16T({\frac {n}{4}})+n!}$

Intuitively, you can identify off the bat that the factorial is probably going to dominate. Proving it mathematically:

${\displaystyle a=16,\ b=4,\ c=\log _{b}{a}=\log _{4}{16}=2}$

Now we compare ${\displaystyle f(n)=n!}$ to ${\displaystyle n^{c}=n^{2}}$:

${\displaystyle f(n)=\Omega (n^{2})}$

We have case 3, so we check the regularity condition:

${\displaystyle 16f\left({\frac {n}{4}}\right)\leq cf(n)}$

${\displaystyle 16\left({\frac {n}{4}}\right)!\leq c\,n!}$

This holds for all sufficiently large ${\displaystyle n}$ (for example ${\displaystyle n\geq 8}$ with ${\displaystyle c={\frac {1}{2}}}$), since ${\displaystyle n!}$ grows far faster than ${\displaystyle \left({\frac {n}{4}}\right)!}$

So we have Case 3:

${\displaystyle T(n)=\Theta (n!)}$
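A quick numerical check of the regularity condition with c = 1/2 (integer division stands in for n/4 when n is not a multiple of 4):

```python
from math import factorial

# Regularity condition with c = 1/2: 16 * (n/4)! <= n!/2 holds from n = 8 on,
# and the gap only widens as n grows
assert all(16 * factorial(n // 4) <= factorial(n) // 2 for n in range(8, 200))
```
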

### Example 6

${\displaystyle T(n)={\sqrt {2}}\quad T\left({\dfrac {n}{2}}\right)+\log {n}}$

${\displaystyle a={\sqrt {2}}=2^{\frac {1}{2}},\ b=2,\ c=\log _{2}{2^{\frac {1}{2}}}={\frac {1}{2}}}$

Now we compare ${\displaystyle f(n)=\log {n}}$ with ${\displaystyle n^{c}=n^{\frac {1}{2}}}$

If we compare the graphs of these two functions, the square root grows faster: ${\displaystyle \log {n}=O({\sqrt {n}})}$

Moreover the gap is polynomial, as Case 1 requires: ${\displaystyle f(n)=\log {n}=O(n^{{\frac {1}{2}}-\epsilon })}$ for, say, ${\displaystyle \epsilon ={\frac {1}{4}}}$, so ${\displaystyle n^{c}}$ dominates.

Case 1

${\displaystyle T(n)=\Theta ({\sqrt {n}})}$
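As before, we can unroll the recurrence numerically (with an assumed base case T(1) = 1) and watch ${\displaystyle T(n)/{\sqrt {n}}}$ flatten toward a constant, as ${\displaystyle \Theta ({\sqrt {n}})}$ predicts:

```python
import math

def T(n):
    # T(n) = sqrt(2)*T(n/2) + log2(n), with an assumed base case T(1) = 1
    return 1.0 if n <= 1 else math.sqrt(2) * T(n // 2) + math.log2(n)

# For n = 2^k the ratio T(n)/sqrt(n) flattens out as n grows,
# consistent with T(n) = Theta(sqrt(n))
ratios = [T(2**k) / math.sqrt(2**k) for k in (10, 20, 40)]
print(ratios)
```
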

### Example 7

${\displaystyle T(n)=3T\left({\frac {n}{2}}\right)+n}$

Using the values of a and b, ${\displaystyle c=\log _{b}{a}=\log _{2}{3}\approx 1.58}$

Now we compare ${\displaystyle f(n)=n}$ with ${\displaystyle n^{\log _{2}{3}}}$

${\displaystyle f(n)=n=O(n^{\log _{2}{3}-\epsilon })}$ for, say, ${\displaystyle \epsilon =0.5}$

Therefore we have Case 1:

${\displaystyle T(n)=\Theta (n^{\log _{2}{3}})}$
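One last numerical check, again unrolling the recurrence with an assumed base case T(1) = 1: the ratio ${\displaystyle T(n)/n^{\log _{2}{3}}}$ tends to a constant, consistent with the ${\displaystyle \Theta (n^{\log _{2}{3}})}$ bound:

```python
import math

def T(n):
    # T(n) = 3*T(n/2) + n, with an assumed base case T(1) = 1
    return 1 if n == 1 else 3 * T(n // 2) + n

# For n = 2^k the ratio T(n) / n^(log2 3) tends to a constant (3),
# consistent with T(n) = Theta(n^(log2 3))
for k in (5, 10, 20):
    n = 2**k
    print(k, T(n) / n ** math.log2(3))
```
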