pooja dandir

Time Complexity in Java - Big O

Big O is a way of measuring how efficient your code is.


There are two ways we measure the complexity of code: Time and Space.

Time Complexity: Time complexity is measured as the number of operations an algorithm takes to complete a task.

Space Complexity: Space complexity is the total space taken by an algorithm with respect to the input size.


When dealing with time and space complexity, you will see three Greek letters.

They are Omega, Theta and Omicron. Omicron is better known as O, as in Big O.

To see how we use these, let's search for a number in an array of 7 elements, say {1, 2, 3, 4, 5, 6, 7}, by iterating over it with a loop.

Looking at how many times the loop has to run: our best case is if we're looking for the first number in the array, which is 1. Our worst case is if we're looking for the last number in the array, i.e., 7. And our average case is if we're looking for the middle number in the array, i.e., 4.


For the best case we use Omega.

For the average case we use Theta.

And for the worst case we use Omicron, or O.


Technically, Big O always refers to the worst case, but in practice you will hear people use it for all three cases when talking about time complexity. You will hear "What is your average Big O?" or "your best-case Big O?"
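The three cases above can be sketched with a linear search that counts its comparisons. This is a minimal illustration of my own (the class and method names are not from this post): finding 1 takes one comparison (best case), 4 takes four (average), and 7 takes seven (worst).

```java
public class SearchCases {
	// Count how many elements a linear search examines before finding target.
	static int comparisons(int[] arr, int target) {
		int count = 0;
		for (int value : arr) {
			count++;               // one comparison per element examined
			if (value == target) {
				return count;      // found: stop counting
			}
		}
		return count;              // not found: every element was checked
	}

	public static void main(String[] args) {
		int[] arr = {1, 2, 3, 4, 5, 6, 7};
		System.out.println(comparisons(arr, 1)); // best case (Omega): 1
		System.out.println(comparisons(arr, 4)); // average case (Theta): 4
		System.out.println(comparisons(arr, 7)); // worst case (O): 7
	}
}
```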


Now let's understand how to calculate the time complexity of code. This efficiency is expressed in the language of Big O.


O(n) - Proportional:


In the code below, to print n elements (here n = 7) we need to iterate the loop n times. That means the number of operations the code takes to print n numbers is equal to the number of inputs, so the time complexity is O(n).

public class CalculateBigO {
	public static void main(String[] args) {
		printElements(7);
	}
	private static void printElements(int n) {
		for(int i = 0; i < n; i++ ) {
			System.out.println(i);
		}
	}
}

Let's look at another piece of code:

public class CalculateBigO {
	public static void main(String[] args) {
		printElements(7);
	}
	private static void printElements(int n) {
		for(int i = 0; i < n; i++ ) {
			System.out.println(i);
		}
		for(int i = 0; i < n; i++ ) {
			System.out.println(i);
		}
	}
}

What will be the time complexity of the code above? O(2n)?


No. The time complexity in this scenario is still O(n), because we drop constants when calculating Big O, whether the constant is 2 or a thousand. "Drop constants" is one rule for simplifying the calculation.


O(n^2) - Loop within loop:


Let's look at code with a loop inside a loop:


public class CalculateBigO {
	public static void main(String[] args) {
		printElements(7);
	}
	private static void printElements(int n) {
		for(int i = 0; i < n; i++ ) {
			for(int j = 0; j < n; j++ ) {
				System.out.println(i+" "+j);
			}
		}
	}
}

In the code above, the inner loop runs n times for each iteration of the outer loop, which itself runs n times. So the time complexity is calculated as O(n*n), i.e. O(n^2).

It is way more than O(n).

So if you have a situation where your code is O(n^2), and there's a way of rewriting the code to make it O(n), that is a huge gain in efficiency.


Let's see another example:


private static void printElements(int n) {	
	for(int i = 0; i < n; i++ ) {
		for(int j = 0; j < n; j++ ) {
			System.out.println(i+" "+j);
		}
	}
	for(int i = 0; i < n; i++ ) {
		System.out.println(i);
	}
}

Here we have two complexities, O(n^2) and O(n), which combine to O(n^2 + n). The n^2 term is dominant over n: n^2 grows much faster than n as the input grows. So we apply another simplification rule, "drop non-dominant terms", while calculating time complexity. The time complexity of the code above is therefore O(n^2).


O(1) - Constant:


Let's see another code:


public static void main(String[] args) {
	System.out.println(addElements(1000000000));
}
private static int addElements(int n) {	
	return n + n;
}

In the code above, the time complexity of addElements() is O(1). The add operation is performed only once; it doesn't depend on the size of the input.


There is another example of O(1):


public static void main(String[] args) {
	System.out.println(addElements(1000000000));
}
private static int addElements(int n) {	
	return n + n + n;
}

As n grows, the number of operations stays the same. There are two add operations here regardless of n, so the time complexity works out to O(2), which simplifies further (drop constants) to O(1).


O(1) is also called constant time. It is the most efficient big O.


O(log n) - Divide and Conquer:


Let's look at another scenario: O(log n) time complexity.


For this, let's take a sorted array and assume we are searching for a number in it. That array could be as long as a million elements; for visualizing purposes I am taking an array with 8 elements: {1, 2, 3, 4, 5, 6, 7, 8}.

To search for 1 in the array above, let's divide the array into two parts: {1, 2, 3, 4} and {5, 6, 7, 8}.

Since 1 can't be in the second part, discard it and divide the first part again: {1, 2} and {3, 4}.

1 can't be in the second part again, so discard it and divide the first part one more time: {1} and {2}.

After dividing the array three times, we find 1 in the left of the remaining elements.


To calculate the time complexity mathematically: we did 3 operations on 8 elements, and it just so happens that two to the third power is eight (2^3 = 8). Converted to a logarithmic equation, that becomes log2 8 = 3: log base 2 of 8 is 3.
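The same count can be checked with a tiny sketch of my own: repeatedly halving the input until one element remains takes exactly log2 n steps (for n a power of two).

```java
public class LogSteps {
	// Count how many times n can be halved before reaching 1.
	// For powers of two this equals log base 2 of n.
	static int halvings(int n) {
		int steps = 0;
		while (n > 1) {
			n /= 2;    // discard half the elements, as in the search above
			steps++;
		}
		return steps;
	}

	public static void main(String[] args) {
		System.out.println(halvings(8)); // prints 3, since 2^3 = 8
	}
}
```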


So the time complexity of searching for an element in a sorted array is O(log n). It is the next best thing to O(1).


The Java code for the explanation above is written below, where arr[] is the array of elements and k is the element we are searching for:

boolean searchInSorted(int arr[], int k) {
	if (arr == null || arr.length == 0) { // check for null before accessing length
		return false;
	}

	int start = 0;
	int end = arr.length - 1;

	while (start <= end) {
		int mid = start + (end - start) / 2; // avoids overflow of (start + end) / 2

		if (arr[mid] == k) {         // exact match
			return true;
		} else if (k < arr[mid]) {   // search on the left
			end = mid - 1;
		} else {                     // search on the right
			start = mid + 1;
		}
	}

	return false;
}

O(a+b):


Now let's find the time complexity, the Big O, of the code below, which takes two different arguments:

public static void main(String[] args) {
	printElements(7, 100);
}
private static void printElements(int a, int b) {
	for(int i = 0; i < a; i++ ) {
		System.out.println(i);
	}
	for(int i = 0; i < b; i++ ) {
		System.out.println(i);
	}
}
	

What will be the Big O of the code above? The two different arguments can be tricky. It is O(a+b),

because a and b can take independent values: one can be very big, in the thousands, while the other is small, in the hundreds. Neither term can be simplified away.

O(a*b):


Similarly, the Big O of the code below will be O(a*b), since the inner loop runs b times for each of the a iterations of the outer loop.


private static void printElements(int a, int b) {	
	for(int i = 0; i < a; i++ ) {
		for(int j = 0; j < b; j++ ) {
			System.out.println(i+" "+j);
		}
	}
}

Now you know how to calculate the time complexity of code. Refer to https://www.bigocheatsheet.com/ for the Big O of different algorithms, and make decisions based on your requirements.
