### Time Complexity, Space Complexity, Asymptotic Notations - The Coding Shala


## Time Complexity, Space Complexity, and Asymptotic Notations

Time complexity describes the time an algorithm takes to run, and space complexity describes the memory it uses. Asymptotic notations are the mathematical language we use to express time and space complexity. Big O is the notation most commonly used in the analysis of algorithms.

## Time Complexity

In computer science, the time complexity of an algorithm quantifies the amount of time it takes to run as a function of the length of the input. The time complexity of an algorithm is commonly expressed using big O notation, which excludes coefficients and lower-order terms. [From Wikipedia] Put simply, it describes how much time our code takes to execute.

So, how do we calculate the running time of an algorithm?

We can have three cases to analyze an algorithm:

- Worst Case
- Average Case
- Best Case

Generally, we use the worst case to state the time complexity of an algorithm. In worst-case analysis, we calculate an upper bound on the running time of an algorithm. Here, an upper bound means our code won't take more time than the bound we have described for it.
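To make the three cases concrete, here is a small illustrative sketch (written for this explanation, not from the original post). A linear search shows how best and worst cases differ on the same code:

```c
#include <stddef.h>

/* Linear search: returns the index of target in arr[0..n-1], or -1 if absent.
   Best case: target is the first element, so the loop exits after one
   comparison -> O(1). Worst case: target is absent, so all n elements
   are compared -> O(n). */
int linear_search(const int *arr, size_t n, int target) {
    for (size_t i = 0; i < n; i++) {
        if (arr[i] == target)
            return (int)i;
    }
    return -1;
}
```

The worst-case bound O(n) is what we would normally report as this function's time complexity.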

### Asymptotic Notations

Asymptotic notations are mathematical tools for representing the time complexity of an algorithm. The following three notations are most commonly used.

#### Theta Notation (Θ)

The theta notation bounds a function from above and below, so it defines exact asymptotic behavior. For a given function g(n), Θ(g(n)) denotes the following set of functions:

Θ(g(n)) = { f(n): there exist positive constants c1, c2, and n0 such that 0 <= c1\*g(n) <= f(n) <= c2\*g(n) for all n >= n0 }

The above definition means that if f(n) is theta of g(n), then f(n) always lies between c1\*g(n) and c2\*g(n) for large values of n (n >= n0). The definition also requires that f(n) be non-negative for n >= n0.

#### Big O Notation (O)

The Big O notation defines an upper bound on an algorithm's growth; it bounds a function only from above.

O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 <= f(n) <= c\*g(n) for all n >= n0 }

#### Omega Notation (Ω)

Omega(Ω) notation provides an asymptotic lower bound.

Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 <= c\*g(n) <= f(n) for all n >= n0 }
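To see a definition in action, here is a small check (a sketch, with constants chosen by us for illustration) that f(n) = 3n + 2 is Θ(n): the constants c1 = 3, c2 = 4, and n0 = 2 satisfy the defining inequality for all n >= n0.

```c
/* f(n) = 3n + 2 is Θ(n): with c1 = 3, c2 = 4, and n0 = 2 we have
   c1*n <= 3n + 2 <= c2*n for all n >= n0. */
long f(long n) {
    return 3 * n + 2;
}

/* Returns 1 if the theta bounds hold for this n, 0 otherwise. */
int theta_bounds_hold(long n) {
    return 3 * n <= f(n) && f(n) <= 4 * n;
}
```

Below n0 = 2 the upper bound fails (3\*1 + 2 = 5 > 4\*1), which is exactly why the definition only requires the inequality for sufficiently large n.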

Let's see how to use these asymptotic notations.

#### Some rules for the analysis of loops

The following rules are useful to remember when analyzing loops:

- The time complexity of a function is O(1) if it doesn't contain a loop, recursion, or a call to any other non-constant-time function.
- The time complexity of a loop is O(n) if the loop variable is incremented/decremented by a constant amount.
- The time complexity of nested loops equals the number of times the innermost statement is executed.
- The time complexity of a loop is O(log n) if the loop variable is divided/multiplied by a constant factor.
- The time complexity of a loop is O(log log n) if the loop variable is reduced/increased exponentially (for example, i = i\*i each iteration).
- When there are consecutive loops, we calculate the time complexity as the sum of the time complexities of the individual loops.
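The rules above can be sketched with small counting loops (illustrative code written for this post, not taken from it); each function returns how many times its innermost statement runs:

```c
/* Loop variable incremented by a constant -> O(n) iterations. */
long linear_steps(long n) {
    long steps = 0;
    for (long i = 0; i < n; i++)
        steps++;
    return steps;
}

/* Loop variable multiplied by a constant -> O(log n) iterations. */
long log_steps(long n) {
    long steps = 0;
    for (long i = 1; i < n; i *= 2)
        steps++;
    return steps;
}

/* Nested loops: the innermost statement runs n * n times -> O(n^2). */
long nested_steps(long n) {
    long steps = 0;
    for (long i = 0; i < n; i++)
        for (long j = 0; j < n; j++)
            steps++;
    return steps;
}
```

Running two of these loops one after the other simply adds their step counts, which is the "consecutive loops" rule.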

## Space Complexity

The space complexity of an algorithm is the total space it uses with respect to the input size. Space complexity includes both the auxiliary space (the extra or temporary space used by the algorithm) and the space used by the input itself.
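As a sketch of the difference between O(1) and O(n) auxiliary space (example code written for this explanation, not from the original), both functions below compute the same sum, but the second one allocates a temporary copy of the input:

```c
#include <stdlib.h>
#include <stddef.h>

/* O(1) auxiliary space: a fixed number of variables, regardless of n. */
long sum_in_place(const int *arr, size_t n) {
    long total = 0;
    for (size_t i = 0; i < n; i++)
        total += arr[i];
    return total;
}

/* O(n) auxiliary space: allocates a temporary array proportional to n. */
long sum_with_copy(const int *arr, size_t n) {
    int *copy = malloc(n * sizeof *copy);
    if (copy == NULL)
        return 0;  /* allocation failed; nothing to sum */
    for (size_t i = 0; i < n; i++)
        copy[i] = arr[i];
    long total = 0;
    for (size_t i = 0; i < n; i++)
        total += copy[i];
    free(copy);
    return total;
}
```

Both versions take O(n) space overall once the input array is counted, but their auxiliary space differs: O(1) for the first, O(n) for the second.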

#### Time and Space Complexity Example

Example 1:

What are the time and space complexity of the following code?

```c
int a = 0, b = 0;
for (int i = 0; i < N; i++) {
    a = a + rand();
}
for (int j = 0; j < M; j++) {
    b = b + rand();
}
```

Assume that rand() is O(1) time, O(1) space function.

**Solution:** O(N+M) time (equivalently O(max(N,M)), since the two consecutive loops run independently and their costs add) and O(1) space.
