# Big O Algorithm Analysis

Big O Algorithm Analysis Using Code Snippets

In computer science, algorithm analysis determines the amount of resources (such as time and storage) necessary to execute a given algorithm or program. Usually, the running time or efficiency of an algorithm is represented as a function relating the input size to the number of steps (time complexity) or storage locations (space complexity).

Analysis of algorithms is an important part of computational complexity theory. It gives us theoretical estimates for the resources an algorithm needs to solve a problem, and these estimates give us insight into efficient algorithms for tasks like sorting and searching.

Big O notation, Big-Omega notation, and Big-Theta notation are asymptotic notations used in the theoretical analysis of algorithms to estimate the complexity of an algorithm for arbitrarily large input sizes.

Let us take a look at some common running times and source code snippets below.

# O(1)

The statement below takes a constant amount of time, so it is O(1).

```c
int y = n + 25;
```

# O(n)

If the for loop runs n times, then it is O(n).

```c
for (int i = 0; i < n; i++) {
    sum++;
}
```

# O(n²)

If two nested loops each run n times, the cost is O(n * n), or O(n²).

```c
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        sum++;
```

# O(log n)

If the loop variable is multiplied (or divided) by a constant factor on each iteration, the loop runs O(log n) times, so the cost is O(log n).

```c
for (int i = 1; i < n; i *= 2)
    sum++;
```

(Note that i must start at 1 here: starting at 0 and doubling would leave i at 0 forever.)

# O(1)

Your initial thought for this code snippet may be that it is O(n), but the loop checks "j < n" once, finds that n is not less than n, and exits immediately, so it runs in a constant amount of time.

```c
for (int j = n; j < n; j++)
    sum++;
```

# O(1)

Your initial thought for this code snippet may be that it is O(log n), but notice that the loop starts j at 0, so j is never greater than n and the body never executes; hence it runs in a constant amount of time. But what if n is a negative number? In Big O analysis the input size n is taken to be positive.

```c
for (int j = 0; j > n; j *= 2)
    sum++;
```

# O(n)

Your initial thought for this code snippet might be O(n * log(n)), but this is actually O(n): the inner loop runs n times, then n/2 times, then n/4 times, and so on, and the total n + n/2 + n/4 + … + 1 is a geometric series that sums to less than 2n.

```c
for (int i = n; i > 0; i /= 2)
    for (int j = 0; j < i; j++)
        count++;
```

# O(n⁴)

This code below looks crazy, and you may think its running time is O(n³) because of the three for loops, but it is actually O(n⁴). Notice the first loop runs n*n times, which is O(n²); the second loop runs i times, and i goes up to O(n²); the last loop runs 6 times, which is O(1). So we get O(n²) * O(n²) * O(1), which is O(n⁴).

```c
for (int i = 1; i < n * n; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 1; k <= 6; k++)
            sum++;
```

For more videos on Algorithm Analysis you can take an algorithm analysis course on Udemy.

I also have a book on Algorithm Analysis on Amazon Kindle.

Thanks for reading this article, I hope it's helpful to you all! Keep up the learning, and if you would like more computer science, programming, and algorithm analysis videos, please visit and subscribe to my YouTube channels (randerson112358 & compsci112358).

# Check Out the following for content / videos on Computer Science, Algorithm Analysis, Programming and Logic:

Video Tutorial on Algorithm Analysis:
https://www.udemy.com/algorithm-analysis/
