# Algorithm Analysis & Time Complexity Simplified

(Big-O, Big Theta, Big Omega)

An **algorithm** is any well-defined procedure that takes some value or set of values as input and produces some value or set of values as output. *Source: Thomas H. Cormen, Charles E. Leiserson (2009), Introduction to Algorithms, 3rd edition.* You can think of an algorithm like a recipe used to cook food; it’s just a set of instructions. In the case of making a cake from a recipe, your input may be eggs, sugar, flour, etc., and your output is a cake!

**Algorithm analysis** is an important part of the broader field of computational complexity theory, which provides theoretical estimates for the resources (space and time) needed by any **algorithm** that solves a given computational problem.

In computer science, the **time complexity** of an algorithm describes the amount of time it takes for an algorithm or program to complete its execution, and is usually expressed in **Big-O (O) notation**. Big-O notation excludes coefficients and lower-order terms; for example, 4n² + 1 in Big-O notation is O(n²). The time complexity can also be expressed with **Big Omega (Ω)** and **Big Theta (Θ)**.
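To see why coefficients and lower-order terms are dropped, here is a quick sketch in Python (the operation count 4n² + 1 is just the example above, not a real measurement): the ratio f(n)/n² settles toward a constant as n grows, so only the n² term drives the growth rate.

```python
def f(n):
    # Hypothetical operation count from the example: 4n^2 + 1
    return 4 * n**2 + 1

# f(n) / n^2 approaches the constant 4 as n grows,
# which is why 4n^2 + 1 is simply O(n^2).
for n in [10, 100, 1000]:
    print(n, f(n) / n**2)
```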

# What is Big-O (O), Big Omega (Ω) and Big Theta (Θ) ?

# Prove Big-O (O) Worst Case

Big-O, usually written as **O**, is an asymptotic notation for the worst case; it is the ceiling of growth for a given function. It provides us with an **asymptotic upper bound** for the growth rate of an algorithm’s runtime.

Suppose `f(n)` is your algorithm’s runtime, and `g(n)` is an arbitrary time complexity you are trying to relate to your algorithm. Then `f(n)` is O(g(n)) if, for some positive constants c and k, `f(n) <= c * g(n)` whenever n > k.
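This definition can be spot-checked numerically. A minimal sketch (the helper name and sample range are my own choices, and a finite check is not a proof, since the inequality must hold for *all* n > k):

```python
def satisfies_big_o(f, g, c, k, n_max=1000):
    """Check f(n) <= c * g(n) for every integer n with k < n <= n_max.

    A finite spot-check only; a real proof must cover all n > k.
    """
    return all(f(n) <= c * g(n) for n in range(k + 1, n_max + 1))

# Example: f(n) = n^2 + 2n + 1 versus g(n) = n^2 with c = 4, k = 1
print(satisfies_big_o(lambda n: n**2 + 2*n + 1, lambda n: n**2, c=4, k=1))
```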

Let’s take a look at an **example**:

We want to show, or prove, that f(n) is Big-O of g(n), where:

f(n) = n² + 2n + 1

g(n) = n²

So we want to show/prove that **n² + 2n + 1 is O(n²)**.

Let’s try to solve this. We have f(n) = n² + 2n + 1 and g(n) = n², so we must show that f(n) <= c * g(n) whenever n > k, that is,

n² + 2n + 1 ≤ c * n² whenever n > k.

Since k must be a positive constant, let’s set it equal to 1 to get the following: n² + 2n + 1 ≤ c * n² whenever n ≥ 1.

Next we can do one of two things: either solve for our constant c > 0, or find a constant c > 0 that works. Here we will solve for ‘c’ by rewriting our inequality to get: (n² + 2n + 1) / n² ≤ c whenever n ≥ 1.

NOTE: since 2n ≤ 2n² and 1 ≤ n² whenever n ≥ 1, we have (n² + 2n + 1) / n² ≤ (n² + 2n² + n²) / n² whenever n ≥ 1, therefore:

(n² + 2n + 1) / n² ≤ (n² + 2n² + n²) / n² ≤ c whenever n ≥ 1.

So (n² + 2n² + n²) / n² ≤ c whenever n ≥ 1. Let’s solve this inequality for our constant ‘c’:

(n² + 2n² + n²) / n² ≤ c whenever n ≥ 1

(n² / n²) + (2n² / n²) + (n² / n²) ≤ c whenever n ≥ 1

1 + 2 + 1 ≤ c

4 ≤ c, so we can take c = 4.

Okay, so now we have our constant c = 4. Let’s plug it back into our original inequality to get: n² + 2n + 1 ≤ 4 * n² whenever n ≥ 1, and you will find that for any value of n greater than or equal to 1 this inequality is always true.

**So f(n) is O(g(n)), that is, n² + 2n + 1 is O(n²).**
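As a sanity check (a finite spot-check in Python, not a substitute for the algebraic proof), we can verify n² + 2n + 1 ≤ 4n² over a range of n ≥ 1:

```python
f = lambda n: n**2 + 2*n + 1
g = lambda n: n**2
c = 4

# Verify f(n) <= c * g(n) for n = 1 .. 10_000
ok = all(f(n) <= c * g(n) for n in range(1, 10_001))
print(ok)  # True
```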

# Prove Big Omega (Ω) Best Case

Big-Omega, usually written as **Ω**, is an asymptotic notation for the best case; it is a floor on the growth rate of a given function. It provides us with an **asymptotic lower bound** for the growth rate of an algorithm’s runtime.

Suppose `f(n)` is your algorithm’s runtime. Then `f(n)` belongs to Ω(g(n)) if, for some positive constants c and k, `f(n) >= c * g(n)` whenever the input size n > k.
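For example, with f(n) = n² + 2n + 1 and g(n) = n² again, choosing c = 1 and k = 1 works, since n² + 2n + 1 ≥ n² for all n ≥ 1. A quick spot-check in Python:

```python
f = lambda n: n**2 + 2*n + 1
g = lambda n: n**2

# Lower bound: f(n) >= 1 * g(n) for all n >= 1,
# so f(n) is Omega(n^2).
ok = all(f(n) >= 1 * g(n) for n in range(1, 10_001))
print(ok)  # True
```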

# Prove Big Theta (Θ) Tight Bound “AKA Both Worst & Best Case”

Big Theta, or just Theta, usually written as **Θ**, is an asymptotic notation that denotes an **asymptotically tight bound** on the growth rate of an algorithm’s runtime.

Suppose `f(n)` is your algorithm’s runtime. Then `f(n)` is Θ(g(n)) if, for some positive constants c1, c2 and k, `c1 * g(n) <= f(n)` and `f(n) <= c2 * g(n)` whenever n > k.

∴ `f(n)` is Θ(g(n)) implies `f(n)` is O(g(n)) and `f(n)` is Ω(g(n)).
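Putting the two previous results together: since n² + 2n + 1 ≤ 4n² (the Big-O bound with c2 = 4) and n² + 2n + 1 ≥ 1 · n² (the Big-Ω bound with c1 = 1) for all n ≥ 1, it follows that n² + 2n + 1 is Θ(n²). A combined spot-check in Python:

```python
f = lambda n: n**2 + 2*n + 1
g = lambda n: n**2
c1, c2, k = 1, 4, 1

# Tight bound: c1 * g(n) <= f(n) <= c2 * g(n) for all n >= 1,
# so f(n) is Theta(n^2).
ok = all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(1, 10_001))
print(ok)  # True
```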

There are many places to learn more about recurrence relations, online and at school. If you would like to learn more about algorithm analysis, you can take my online course here. I also have a course on Udemy.com called Recurrence Relation Made Easy, where I help students understand how to solve recurrence relations and asymptotic terms such as Big-O, Big Omega, and Theta. I also have a YouTube channel of videos where I solve recurrence relations, which anyone can check out for free!

Thanks for reading this article, I hope it’s helpful to you all! Keep up the learning, and if you would like more computer science, programming, and algorithm analysis videos, please visit and subscribe to my YouTube channels (randerson112358 & compsci112358).