# Algorithm Analysis & Time Complexity Simplified

---

(Big-O, Big Theta, Big Omega)

An **algorithm** is any well-defined procedure that takes some value or set of values as input and produces some value or set of values as output. *Source: Thomas H. Cormen, Charles E. Leiserson (2009), Introduction to Algorithms, 3rd edition.* You can think of an algorithm like a recipe used to cook food: it's just a set of instructions. In the case of making a cake from a recipe, your input may be eggs, sugar, flour, etc., and your output is a cake!

**Algorithm analysis** is an important part of the broader field of computational complexity theory, which provides theoretical estimates for the resources (space and time) needed by any **algorithm** that solves a given computational problem.

In computer science, the **time complexity** of an algorithm gives the amount of time that it takes for an algorithm or program to complete its execution, and is usually expressed in **Big-O (O) notation**. Big-O notation excludes coefficients and lower-order terms: for example, 4n² + 1 in Big-O notation is O(n²). The time complexity can also be expressed with **Big Omega (Ω)** and **Big Theta (Θ)**.
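To see why it is safe to drop the coefficient, here is a minimal sketch (the function names `f` and `g` are my own, not from the text) showing that for f(n) = 4n² + 1, the ratio f(n)/n² stays bounded by a constant as n grows, which is exactly what O(n²) demands:

```python
def f(n):
    """The full runtime expression: 4n^2 + 1."""
    return 4 * n**2 + 1

def g(n):
    """The simplified Big-O term: n^2."""
    return n**2

# The ratio f(n) / g(n) equals 4 + 1/n^2, so it shrinks toward 4
# as n grows -- it never exceeds a fixed constant, which is why
# the coefficient 4 and the +1 term are dropped in Big-O notation.
for n in [10, 100, 1000, 10000]:
    print(n, f(n) / g(n))
```

The ratio approaching a constant (rather than growing without bound) is the informal signal that the simplified term captures the growth rate.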

# What are Big-O (O), Big Omega (Ω) and Big Theta (Θ)?

# Proving Big-O (O): Worst Case

Big-O, which is usually written as **O**, is an asymptotic notation for the worst case: the ceiling of growth for a given function. It provides us with what is called an **asymptotic upper bound** for the growth rate of an algorithm's runtime.

Suppose `f(n)` is your algorithm's runtime, and `g(n)` is an arbitrary time complexity you are trying to relate to your algorithm. Then `f(n)` is O(g(n)) if, for some positive constants c and k, `f(n) <= c * g(n)` whenever `n > k`.
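The definition above can be turned into a small spot-checker. This is a sketch under my own naming (`satisfies_big_o` is not from the text), and a finite check is not a proof — the real definition quantifies over *all* n > k — but it makes the roles of c and k concrete:

```python
def satisfies_big_o(f, g, c, k, upto=10_000):
    """Spot-check the Big-O definition: return True if
    f(n) <= c * g(n) holds for every n with k < n <= upto.

    This checks only a finite range, so True is evidence,
    not a proof; False is a genuine counterexample.
    """
    return all(f(n) <= c * g(n) for n in range(k + 1, upto + 1))
```

For example, `satisfies_big_o(lambda n: n**2 + 2*n + 1, lambda n: n**2, c=4, k=1)` returns True over the checked range, while no constant c can make n³ fit under c · n² for all large n.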

Let’s take a look at an **example**:

We want to show, that is, prove, that f(n) is Big-O of g(n), where:

f(n) = n² + 2n + 1

g(n) = n²

So we want to prove that **n² + 2n + 1 is O(n²)**.
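One common choice of constants finishes the argument (this is a standard derivation, not necessarily the one the original author used): for n ≥ 1 we have 2n ≤ 2n² and 1 ≤ n², so n² + 2n + 1 ≤ n² + 2n² + n² = 4n². That gives c = 4 and k = 0 in the definition. A quick numeric spot check of that bound:

```python
def f(n):
    """The runtime being bounded: n^2 + 2n + 1."""
    return n**2 + 2*n + 1

def g(n):
    """The bounding function: n^2."""
    return n**2

# Verify f(n) <= 4 * g(n) for n = 1 .. 10,000, i.e. the constants
# c = 4, k = 0 from the definition. A finite check is not a proof,
# but the algebra above (2n <= 2n^2 and 1 <= n^2 for n >= 1) is.
assert all(f(n) <= 4 * g(n) for n in range(1, 10_001))
print("n^2 + 2n + 1 <= 4 * n^2 holds for all checked n >= 1")
```

Note that (n + 1)² = n² + 2n + 1, so the bound also follows from (n + 1)² ≤ (2n)² for n ≥ 1.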