Time Complexity

Programmer
Sep 2, 2019


Time taken to execute a program for mere mortals

What is Time Complexity?

Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm are taken to differ by at most a constant factor.

Source: Wikipedia.
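As a quick illustration of counting elementary operations (a sketch; the per-line counts are a simplification): summing an array does one addition per element, so the operation count, and hence the running time, grows linearly with n.

def sum(xs: Array[Int]): Int = {
  var total = 0            // 1 operation
  for (x <- xs) total += x // n additions, one per element
  total                    // roughly n + 1 elementary operations => O(n)
}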

How to visualize time complexity?

O(1) — constant time: the running time does not depend on the input size.
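For example (a minimal sketch), reading the first element of an array takes the same time whether the array holds ten elements or ten million:

def first(xs: Array[Int]): Int = xs(0) // a single array access, regardless of xs.length => O(1)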

O(n) — running a loop n times.

for {i <- 1 to n} {...}

O(2n) — running a loop of n iterations, but twice. Since big-O notation drops constant factors, O(2n) is the same as O(n).

for {_ <- 1 to 2
     i <- 1 to n} {...}

O(n²) — running a loop of n iterations with another n-iteration loop nested inside it.

for {i <- 1 to n
     j <- 1 to n} {...}

O(n³) — similar to O(n²), but with one more nested loop of n iterations. Hence, one can generalize to O(n^k) for k nested loops.

for {i <- 1 to n
     j <- 1 to n
     k <- 1 to n} {...}

O(log n) — this is the case when the problem size is halved at each step, which is most common in algorithms that divide the problem in half. Binary search is a classic example.
💡Note: log base 2 is assumed here; the base only changes a constant factor.

def search(start: Int, end: Int): Boolean = {
  if (start > end) false // empty range: the value is not present
  else {
    val mid = start + (end - start) / 2 // written this way to avoid Int overflow
    if (arr(mid) == name) true
    else if (arr(mid) > name) search(start, mid - 1) // discard the right half
    else search(mid + 1, end)                        // discard the left half
  }
}
search(0, arr.length - 1)
// assume that:
// arr is an array of sorted values
// name is the search string
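To run the sketch as-is, arr and name need to be defined above it; here is a minimal hypothetical setup (the values are invented for illustration):

val arr = Array("ada", "alan", "edsger", "grace") // sorted values (hypothetical)
val name = "edsger"                               // the search string (hypothetical)

With those two lines placed before the function, search(0, arr.length - 1) returns true after only two calls, since each call halves the remaining range.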

O(n*log n) — this is the case when the problem is halved repeatedly (the log n part) and each level of halves is processed across all n elements, giving n * log n in total. The classical merge sort is an example; in fact, divide and conquer generally follows this paradigm, and the master theorem builds on this idea to derive the time complexity of this whole pattern of algorithms.

object MergeSort {
  // public entry point: clone() supplies the auxiliary array
  def sort[T: Ordering](values: Array[T]): Unit =
    sort(values, values.clone(), 0, values.length - 1)

  private def sort[T: Ordering](values: Array[T], aux: Array[T], low: Int, high: Int): Unit = {
    if (high <= low) return
    val mid = low + (high - low) / 2
    sort(values, aux, low, mid)        // sort the left half
    sort(values, aux, mid + 1, high)   // sort the right half
    merge(values, aux, low, mid, high) // merge the two sorted halves
  }

  private def merge[T: Ordering](values: Array[T], aux: Array[T], low: Int, mid: Int, high: Int): Unit = {
    import Ordering.Implicits._
    var i = low
    var j = mid + 1
    Array.copy(values, low, aux, low, (high - low) + 1) // snapshot the range into aux
    for { k <- low to high } {
      if (i > mid) {                // left half exhausted: take from the right
        values(k) = aux(j)
        j = j + 1
      } else if (j > high) {        // right half exhausted: take from the left
        values(k) = aux(i)
        i = i + 1
      } else if (aux(j) < aux(i)) { // otherwise take the smaller of the two heads
        values(k) = aux(j)
        j = j + 1
      } else {
        values(k) = aux(i)
        i = i + 1
      }
    }
  }
}
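A quick usage sketch for the listing above (the array contents are invented for illustration):

val xs = Array(5, 3, 8, 1, 9, 2)
MergeSort.sort(xs)
println(xs.mkString(", ")) // prints: 1, 2, 3, 5, 8, 9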

O(2^n) — this is the opposite of O(log n): instead of halving the remaining work at each step, the work doubles at each step.
Note: this is not the same as 2n. Doubling n once gives 2n; doubling at every one of n steps gives 2^n.
The classical Fibonacci recursion does not do exactly 2^n work, but it illustrates the idea. The following is a more accurate example, computing 2^n with two recursive calls per step.

object TwoPowerN {
  def main(args: Array[String]): Unit = {
    println(TwoPowerN(30)) // prints 1073741824, i.e. 2^30
  }

  def apply(n: Int): Int = {
    def twoPower(n: Int): Int = {
      if (n == 0) 1 // base case: 2^0 = 1
      else twoPower(n - 1) + twoPower(n - 1) // the work doubles at every level => O(2^n) calls
      // if the second call were twoPower(n - 2), this would be the Fibonacci recursion :)
    }
    twoPower(n) // execute
  }
}
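For comparison, here is the naive Fibonacci recursion mentioned above (a sketch): it also makes two recursive calls, but the second one shrinks by 2, so the call tree grows at roughly O(1.6^n) rather than O(2^n).

def fib(n: Int): Int =
  if (n <= 1) n                // base cases: fib(0) = 0, fib(1) = 1
  else fib(n - 1) + fib(n - 2) // two calls per step, the second shrinking faster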

More observations:

log₁₀(one billion) = 9, which means that a search on a balanced B-tree with a fanout of 10 takes about 9 hops to locate an element among a billion; in a binary tree it is log₂(10⁹) ≈ 30 hops.

2³⁰ ≈ 1 billion, meaning an O(2^n) algorithm reaches a billion operations by n = 30; exponential growth gets out of hand very quickly.
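Both observations are easy to sanity-check in Scala (a quick sketch):

println(math.log(1e9) / math.log(2)) // ≈ 29.9, so log₂(1 billion) ≈ 30
println(math.log10(1e9))             // 9.0, the fanout-10 case from above
println(1 << 30)                     // 1073741824, confirming 2³⁰ ≈ 1 billion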
