Week 2 - Asymptotic Notation

Assignment - Due 10/06/2018

Credit: the problem statements are taken from walkccc.

3.1-1

Let $f(n)$ and $g(n)$ be asymptotically nonnegative functions. Using the basic definition of $\Theta$-notation, prove that $\max(f(n), g(n)) = \Theta(f(n) + g(n))$.

Given $f(n) \geq 0$ and $g(n) \geq 0$:

\begin{align*} 0 \leq \frac{1}{2}\big(f(n)+g(n)\big) \leq \max\big(f(n), g(n)\big) \leq f(n) + g(n) \end{align*} The lower bound holds because the maximum of two nonnegative values is at least their average, and the upper bound holds because the maximum is at most their sum. So, let $c_1 = \dfrac{1}{2}$ and $c_2 = 1$: \begin{align*} 0 \leq c_1 \big(f(n) + g(n)\big) \leq \max\big(f(n), g(n)\big) \leq c_2 \big(f(n) + g(n)\big) \end{align*} which is, by definition, $\max\big(f(n), g(n)\big) = \Theta\big(f(n) + g(n)\big)$.
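A quick numerical sanity check (not a proof) of the two bounds, using arbitrary sample functions $f(n) = 3n$ and $g(n) = n^2$ chosen only for illustration:

```python
# Verify (f(n) + g(n)) / 2 <= max(f(n), g(n)) <= f(n) + g(n)
# over a range of n, for two sample nonnegative functions.

def f(n):
    return 3 * n      # sample nonnegative function

def g(n):
    return n * n      # sample nonnegative function

for n in range(0, 1000):
    total = f(n) + g(n)
    assert total / 2 <= max(f(n), g(n)) <= total
print("both Theta bounds hold for all tested n")
```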

3.1-2

Show that for any real constants $a$ and $b$, where $b > 0$,

$$(n + a)^b = \Theta(n^b). \tag{3.2}$$

We want to find $c_1$, $c_2$ and $n_0$ that make $0 < c_1 n^b \leq (n + a)^b \leq c_2 n^b$ hold. If $a = 0$ the claim is immediate with $c_1 = c_2 = 1$, so assume $a \neq 0$. Since $a$ and $b$ are real constants, pick a constant $d_1$ with $0 < d_1 < 1$ and let $n_0 = \left|\dfrac{a}{d_1}\right|$:

$$n + a \geq n - |a| = n - n_0|d_1| \geq n - n|d_1| = (1 - |d_1|)n \text{, when } n \geq n_0$$
$$n + a \leq n + |a| = n + n_0|d_1| \leq n + n|d_1| = (1 + |d_1|)n \text{, when } n \geq n_0$$

and the two can be combined when $n \geq n_0$ (the left-hand side is positive because $0 < |d_1| < 1$):

$$0 < (1 - |d_1|)n \leq n + a \leq (1 + |d_1|)n$$

Because $b > 0$, raising each term to the power $b$ preserves the inequalities:

$$0 < [(1 - |d_1|)n]^b \leq (n + a)^b \leq [(1 + |d_1|)n]^b$$

that is:

$$0 < (1 - |d_1|)^b n^b \leq (n + a)^b \leq (1 + |d_1|)^b n^b$$

Let $d_1 = \dfrac{1}{2}$, which gives $n_0 = 2|a|$, $c_1 = \left(\dfrac{1}{2}\right)^b$, and $c_2 = \left(\dfrac{3}{2}\right)^b$.

So for $n \geq n_0 = 2|a|$, $0 < c_1 n^b \leq (n + a)^b \leq c_2 n^b$ holds with $c_1 = \left(\dfrac{1}{2}\right)^b$ and $c_2 = \left(\dfrac{3}{2}\right)^b$, i.e. $(n + a)^b = \Theta(n^b)$.
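A numerical sanity check of the chosen constants. The sample values $a = -3$ and $b = 2$ are illustrative assumptions, not part of the proof (integer $b$ keeps the floating-point arithmetic exact at the boundary $n = n_0$):

```python
# Check 0 < c1 * n^b <= (n + a)^b <= c2 * n^b for all n >= n0 = 2|a|,
# with c1 = (1/2)^b and c2 = (3/2)^b as derived above.
a, b = -3, 2                    # sample real constants; b > 0
c1, c2 = 0.5 ** b, 1.5 ** b     # c1 = 0.25, c2 = 2.25
n0 = 2 * abs(a)                 # n0 = 6

for n in range(n0, 1001):
    assert 0 < c1 * n ** b <= (n + a) ** b <= c2 * n ** b
print("Theta bounds hold for all tested n >= n0")
```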

3.1-3

Explain why the statement, ''The running time of algorithm $A$ is at least $O(n^2)$,'' is meaningless.

The statement is meaningless because $O$-notation supplies only an upper bound. "At least $O(n^2)$" would mean the running time is at least some function in $O(n^2)$; but $O(n^2)$ contains arbitrarily small functions, including the zero function, so every running time satisfies the statement and it conveys no information.

3.1-4

Is $2^{n + 1} = O(2^n)$? Is $2^{2n} = O(2^n)$?

• Yes, $2^{n + 1} = 2\times{2^n} = O(2^n)$.

• No. $\lim_{n\to\infty}\frac{2^n}{2^{2n}} = \lim_{n\to\infty}\frac{1}{2^n} = 0$, which means $2^{2n} = \omega(2^n)$; no constant $c$ can satisfy $2^{2n} \leq c \times 2^n$ for all large $n$ (it would force $2^n \leq c$), so $2^{2n} \neq O(2^n)$.
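A numeric illustration (not a proof) of both answers, using Python's exact integer arithmetic:

```python
# 2^(n+1) = 2 * 2^n, so the constant c = 2 witnesses 2^(n+1) = O(2^n).
for n in range(1, 64):
    assert 2 ** (n + 1) == 2 * 2 ** n

# 2^(2n) / 2^n = 2^n is unbounded, so no constant c can satisfy
# 2^(2n) <= c * 2^n for all large n.
ratios = [2 ** (2 * n) // 2 ** n for n in (10, 20, 30)]
print(ratios)  # [1024, 1048576, 1073741824]
```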

3.1-6

Prove that the running time of an algorithm is $\Theta(g(n))$ if and only if its worst-case running time is $O(g(n))$ and its best-case running time is $\Omega(g(n))$.

Given that the worst-case running time is $O(g(n))$ and the best-case running time is $\Omega(g(n))$, there exist $c_1 > 0$, $c_2 > 0$ and $n_0$ such that for all $n \geq n_0$, the running time $f(n)$ on any input of size $n$ is sandwiched between the best and worst cases:

$$0 \leq c_1{g(n)} \leq f(n) \leq c_2{g(n)}$$

so $f(n) = \Theta(g(n))$.

Conversely, given that the running time satisfies

$$f(n) = \Theta(g(n))$$

there exist $c_3 > 0$ and $c_4 > 0$ such that for all sufficiently large $n$:

• $0 \leq c_3{g(n)} \leq f(n)$, so the best-case running time is $\Omega(g(n))$.
• $0 \leq f(n) \leq c_4{g(n)}$, so the worst-case running time is $O(g(n))$.

3.1-7

Prove $o(g(n)) \cap \omega(g(n))$ is the empty set.

Suppose $o(g(n))\cap\omega(g(n))$ is not the empty set; then

$$\exists f(n) \in o(g(n)) \cap \omega(g(n))$$

so both $f(n) = o(g(n))$ and $f(n) = \omega(g(n))$. By the definitions, for every $c > 0$ and $d > 0$ there exists an $n_0$ (take the larger of the two thresholds) such that for all $n \geq n_0$:

$$cg(n) < f(n) < dg(n)$$

Letting $c = d$ gives $cg(n) < f(n) < cg(n)$, which has no solution. Thus the assumption is wrong, and $o(g(n))\cap\omega(g(n))$ is the empty set.

3.1-8

We can extend our notation to the case of two parameters $n$ and $m$ that can go to infinity independently at different rates. For a given function $g(n, m)$, we denote by $O(g(n, m))$ the set of functions:

\begin{align} O(g(n, m)) = \{f(n, m): & \text{ there exist positive constants } c, n_0, \text{ and } m_0 \\ & \text{ such that } 0 \le f(n, m) \le cg(n, m) \\ & \text{ for all } n \ge n_0 \text{ or } m \ge m_0.\} \end{align}

Give corresponding definitions for $\Omega(g(n, m))$ and $\Theta(g(n, m))$.

\begin{align*} \Omega(g(n, m)) = \{f(n, m): &\text{ there exist positive constants }c, n_0, \text{ and } m_0\\ &\text{ such that } 0 \leq cg(n, m) \leq f(n, m)\\ &\text{ for all }n \geq n_0 \text{ or } m \geq m_0\}\\ \Theta(g(n, m)) = \{f(n, m): &\text{ there exist positive constants }c_1, c_2, n_0, \text{ and } m_0\\ &\text{ such that } 0 \leq c_1g(n, m) \leq f(n, m) \leq c_2g(n, m)\\ &\text{ for all }n \geq n_0 \text{ or } m \geq m_0\} \end{align*}

3-1

Let $p(n) = \sum_{i = 0}^d a_i n^i$, where $a_d > 0$, be a degree-$d$ polynomial in $n$, and let $k$ be a constant. Use the definitions of the asymptotic notations to prove the following properties.

a. If $k \ge d$, then $p(n) = O(n^k)$.

b. If $k \le d$, then $p(n) = \Omega(n^k)$.

c. If $k = d$, then $p(n) = \Theta(n^k)$.

d. If $k > d$, then $p(n) = o(n^k)$.

e. If $k < d$, then $p(n) = \omega(n^k)$.

3-2

Indicate for each pair of expressions $(A, B)$ in the table below, whether $A$ is $O$, $o$, $\Omega$, $\omega$, or $\Theta$ of $B$. Assume that $k \ge 1$, $\epsilon > 0$, and $c > 1$ are constants. Your answer should be in the form of the table with ''yes'' or ''no'' written in each box.

\begin{array}{ll|ccccc} \hline \hline A &B &A = O(B) &A = o(B) &A =\Omega(B) &A =\omega(B) &A =\Theta(B)\\ \hline \lg^k{n} &n^{\epsilon} & \text{Yes} & \text{Yes} & \text{No} & \text{No} & \text{No} \\ n^k &c^n & \text{Yes} & \text{Yes} & \text{No} & \text{No} & \text{No} \\ \sqrt{n} &n^{\sin{n}} & \text{No} & \text{No} & \text{No} & \text{No} & \text{No} \\ 2^n &2^{n/2} & \text{No} & \text{No} & \text{Yes} & \text{Yes} & \text{No} \\ n^{\lg{c}} &c^{\lg{n}} & \text{Yes} & \text{No} & \text{Yes} & \text{No} & \text{Yes} \\ \lg(n!) &\lg(n^n) & \text{Yes} & \text{No} & \text{Yes} & \text{No} & \text{Yes} \\ \hline \hline \end{array}
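For the last row, Stirling's approximation gives $\lg(n!) = n\lg n - n\lg e + O(\lg n)$, so the ratio $\lg(n!)/\lg(n^n)$ tends to $1$, which is why the answer is $\Theta$ (and neither $o$ nor $\omega$). A quick numerical check via `math.lgamma` (where `lgamma(n + 1)` equals $\ln n!$):

```python
import math

# lg(n!) = lgamma(n + 1) / ln 2 and lg(n^n) = n * lg n.
# The ratio rises toward 1 as n grows, consistent with
# lg(n!) = Theta(lg(n^n)) and ruling out o or omega.
prev = 0.0
for n in (10 ** 2, 10 ** 4, 10 ** 6, 10 ** 8):
    ratio = (math.lgamma(n + 1) / math.log(2)) / (n * math.log2(n))
    assert prev < ratio < 1.0   # increasing, bounded above by 1
    prev = ratio
print(f"lg(n!) / lg(n^n) at n = 1e8: {prev:.4f}")
```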

3-4

Let $f(n)$ and $g(n)$ be asymptotically positive functions. Prove or disprove each of the following conjectures.

a. $f(n) = O(g(n))$ implies $g(n) = O(f(n))$.

b. $f(n) + g(n) = \Theta(\min(f(n), g(n)))$.

c. $f(n) = O(g(n))$ implies $\lg(f(n)) = O(\lg(g(n)))$, where $\lg(g(n)) \ge 1$ and $f(n) \ge 1$ for all sufficiently large $n$.

d. $f(n) = O(g(n))$ implies $2^{f(n)} = O(2^{g(n)})$.

e. $f(n) = O((f(n))^2)$.

f. $f(n) = O(g(n))$ implies $g(n) = \Omega(f(n))$.

g. $f(n) = \Theta(f(n / 2))$.

h. $f(n) + o(f(n)) = \Theta(f(n))$.