Useful techniques to get $\Theta (f(N))$
Relevant pages in 61B Textbook
This is just a condensed(?) version of the textbook.
Ex 1: A For Loop
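A minimal sketch of the kind of nested loop analyzed here, modeled on the textbook's `dup1` duplicate check (the exact code may differ); the counted operation is the `==` comparison in the inner loop:

```java
/** Returns true if A contains any duplicate value.
 *  The operation we count is the == comparison in the inner loop:
 *  in the worst case it runs (N-1) + (N-2) + ... + 1 times. */
public static boolean dup1(int[] A) {
    int N = A.length;
    for (int i = 0; i < N; i += 1) {
        for (int j = i + 1; j < N; j += 1) {
            if (A[i] == A[j]) {
                return true;
            }
        }
    }
    return false;
}
```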
- Method 1: Count Number of Operations (exact count)
The total number of “==” operations in the worst case amounts to: $$ C(N)=1+2+3+\dots+(N-2)+(N-1) = N(N-1)/2 $$ Therefore, the runtime is $\Theta (N^2)$.
- Method 2: Geometric Visualization
(Picture each “==” comparison as a colored block: the blocks fill a triangle with side length ~N, and since a triangular area grows as $N^2$, the runtime is $\Theta (N^2)$.)
Ex 2: Another For Loop
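A sketch consistent with the $1+2+4+\dots+N$ count derived below, where the outer index doubles on every iteration (method name and body are my reconstruction):

```java
/** The outer index i takes the values 1, 2, 4, ..., N (assuming N is a
 *  power of 2), and the inner loop runs i times for each of them,
 *  so the total work is 1 + 2 + 4 + ... + N = 2N - 1. */
public static void printParty(int N) {
    for (int i = 1; i <= N; i = i * 2) {
        for (int j = 0; j < i; j += 1) {
            System.out.println("hello");
        }
    }
}
```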
- Method 1: Finding the Bound Visually
Graph the trajectories of $0.5N$ (lower dashed line) and $4N$ (upper dashed line).
Because $C(N)$ stays between these two lines, it is bounded above and below by linear functions.
Therefore, the runtime is $\Theta (N)$.
(Personally I think it’s not as straightforward as the mathematical method.)
- Method 2: Finding the Bound Mathematically
Imagine the boxes of work done at each value of the outer index: 1, then 2, then 4, and so on up to $N$.
So $C(N)$ would be: $$ C(N)=1+2+4+\dots+N= 2N-1 $$ (for $N$ a power of 2). Therefore, the runtime is $\Theta (N)$.
Ex 3: Tree Recursion
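A sketch of the standard tree-recursion shape this analysis assumes: each call does constant work plus two recursive calls on `n - 1` (names are my reconstruction):

```java
/** Each call does O(1) work and spawns two recursive calls on n - 1,
 *  so the cost satisfies C(N) = 2C(N-1) + 1 with C(1) = 1. */
public static int f3(int n) {
    if (n <= 1) {
        return 1;
    }
    return f3(n - 1) + f3(n - 1);
}
```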
- Method 1: Intuition
Every time we add one to `n`, we double the amount of work that has to be done… which gives the intuitive answer that the runtime is $2^N$.
- Method 2: Algebra
$C(N)$ amounts to: $$ C(N) = 1+2+4+\dots+2^{N-1}=2^N-1 $$ So the runtime is $\Theta (2^N)$.
- Method 3: Recurrence Relation (Out of Scope)
Starting from this: $$ C(N)=2C(N-1)+1 $$
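Unrolling this recurrence (with the base case $C(1)=1$, which matches Method 2) recovers the same answer:

$$ C(N) = 2C(N-1)+1 = 4C(N-2)+2+1 = \dots = 2^{N-1}C(1) + (2^{N-2}+\dots+2+1) = 2^{N-1} + 2^{N-1}-1 = 2^N-1 $$

So the runtime is again $\Theta (2^N)$.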
Ex 4: Binary Search
- Method 1: Intuition
We start with N options, then N/2, then N/4 … until we have just 1.
Each time, we cut the search range in half, so in total we must perform about $\log_{2}(N)$ operations…
So the overall runtime is of order $\log_{2}(N)$, i.e. $\Theta (\log N)$ (a code sketch follows this example).
- Method 2: Exact Count
So $C(N)=\lfloor \log_{2}(N) \rfloor+1$ (based on observation).
And the runtime is $\Theta(\log N)$.
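For concreteness, a minimal iterative binary search sketch (my own illustration; the variable names are not from the notes):

```java
/** Returns an index of key in the sorted array A, or -1 if it is absent.
 *  Each iteration halves the remaining range [lo, hi], so the loop runs
 *  at most floor(log2(N)) + 1 times. */
public static int binarySearch(int[] A, int key) {
    int lo = 0;
    int hi = A.length - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   // avoids overflow for large lo + hi
        if (A[mid] == key) {
            return mid;
        } else if (A[mid] < key) {
            lo = mid + 1;               // discard the left half
        } else {
            hi = mid - 1;               // discard the right half
        }
    }
    return -1;
}
```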
Ex 5: Mergesort
- Arbitrary units of time (AU): a relative sense of how much time is needed.
e.g. If we run selection sort on $N=6$ items, and its runtime is of order $N^2$, it will take ~36 AU to run.
- About Mergesort – a simple case (merging two sorted lists into one)
Merging two sorted lists with $N$ total elements takes $\Theta (N)$ time (see the sketch after this list).
- Mergesort – applying the merge operation recursively to sort a long list
The key idea is to keep splitting the array in half until the pieces have size 1, then merge them back together level by level, where each full level of merging is a $\Theta (N)$ operation.
So Mergesort has worst-case runtime $\Theta (N \log N)$, where each of the ~$\log_{2}(N)$ levels costs ~N AU:
The top level takes ~N AU.
Next level takes ~N/2 + ~N/2 = ~N.
One more level down: ~N/4 + ~N/4 + ~N/4 + ~N/4 = ~N.
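A sketch of both pieces, the $\Theta (N)$ merge step and the recursive sort built on top of it (a plain top-down mergesort, my own illustration):

```java
import java.util.Arrays;

/** Merges two sorted arrays into one sorted array in Theta(N) time,
 *  where N is the total number of elements. */
public static int[] merge(int[] a, int[] b) {
    int[] result = new int[a.length + b.length];
    int i = 0, j = 0, k = 0;
    while (i < a.length && j < b.length) {
        if (a[i] <= b[j]) {
            result[k++] = a[i++];
        } else {
            result[k++] = b[j++];
        }
    }
    while (i < a.length) { result[k++] = a[i++]; }   // copy any leftovers
    while (j < b.length) { result[k++] = b[j++]; }
    return result;
}

/** Splits the array in half, sorts each half recursively, then merges.
 *  There are ~log2(N) levels of splitting, and each level's merges cost
 *  ~N work in total, giving Theta(N log N). */
public static int[] mergesort(int[] A) {
    if (A.length <= 1) {
        return A;
    }
    int mid = A.length / 2;
    int[] left = mergesort(Arrays.copyOfRange(A, 0, mid));
    int[] right = mergesort(Arrays.copyOfRange(A, mid, A.length));
    return merge(left, right);
}
```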
Summary
- There is no magic. Just count.
- A more standardized method is shown in `andwelcome()` in disc07:
    - Step 1: Calculate the tree height (number of layers), e.g., for a binary tree, it is $\log N$.
    - Step 2: Calculate the number of nodes per layer (from the branching factor), e.g., for a binary tree, layer $i$ has $2^i$ nodes.
    - Step 3: Calculate the operations per node, e.g., for the binary tree here, a node at layer $i$ does $\frac{N}{2^i}$ work.
    - Finally, multiply the results per layer and sum over all layers: $\sum_{i=0}^{\log N} 2^i \cdot \frac{N}{2^i} = N(\log N + 1)$, which is $\Theta(N\log N)$.
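As a sanity check, applying the same three steps to the tree recursion from Ex 3 (height ~$N$, $2^i$ nodes at layer $i$, constant work per node) recovers the earlier answer:

$$ \sum_{i=0}^{N-1} 2^i \cdot 1 = 2^N - 1 \in \Theta(2^N) $$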