Jun 8, 2020

Asymptotic Analysis | Space-Time Complexity | Coding Winds | We care about the future of future

Asymptotic Analysis


Hey people, in this blog you are going to read about “Asymptotic notations”. We will go through Big-O notation, Theta notation, and Omega notation.

What is asymptotic notation? Does it really help us while coding?

Oh well, it plays a great role. When we analyse the complexity of an algorithm in terms of space and time, we don’t compute the exact figure required by the algorithm. Instead, we use asymptotic notation. The word asymptotic means approaching a value or curve arbitrarily closely. Asymptotic notations, in short, discard the less significant terms that really don’t tell us anything about the growth of an algorithm’s running time.

Let us take an example: suppose an algorithm has a time complexity

T(n) = n² + 2n + 5 (a quadratic polynomial).

For large values of n, 2n + 5 is insignificant compared to n².
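A quick sketch in Python makes this point numerically (the function name T and the sample sizes are just for illustration): as n grows, the ratio T(n) / n² approaches 1, so the 2n + 5 part contributes less and less.

```python
# Hypothetical running-time function T(n) = n^2 + 2n + 5 from above.
def T(n):
    return n**2 + 2*n + 5

# As n grows, the lower-order terms stop mattering: T(n) / n^2 -> 1.
for n in (10, 100, 10_000):
    print(n, T(n) / n**2)
```

At n = 10 the ratio is 1.25, but by n = 10,000 it is already within 0.03% of 1.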

Let’s look below,

a)    34n² + 2n + 7

b)    n³ + 7n + 8

When we compare these two time complexities, we compare them by the term of n with the largest power, i.e. for large n, n³ will take more time than 34n² (even with the constant 34).
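A small Python check (the names f and g are illustrative labels for the two expressions) shows why the highest power wins: for small n the large constant makes the quadratic bigger, but the cubic overtakes it for large n.

```python
def f(n):
    return 34*n**2 + 2*n + 7   # quadratic, with a big constant factor

def g(n):
    return n**3 + 7*n + 8      # cubic

print(f(10) > g(10))        # True: at n = 10 the constant 34 still dominates
print(g(1000) > f(1000))    # True: for large n the n^3 term wins
```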

The main idea behind casting aside the less important part is to make things manageable.

BIG THETA (Θ)

Often also called a tight bound, because the time complexity represented by Big-Θ gives a range within which the actual running time of the algorithm lies.

A complexity of Θ(n²) means that, for large enough input n, the running time stays between k1 * n² and k2 * n², where k1 and k2 are two positive constants, thereby tightly bounding the expression representing the growth of the algorithm.
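We can check such a tight bound numerically for the earlier example T(n) = n² + 2n + 5. One valid choice of constants (an illustrative assumption, not the only one) is k1 = 1 and k2 = 2, which works for all n ≥ 4:

```python
def T(n):
    return n**2 + 2*n + 5

k1, k2 = 1, 2  # illustrative constants; many other choices also work
for n in range(4, 10_000):
    # k1*n^2 <= T(n) <= k2*n^2 for every n in this range
    assert k1 * n**2 <= T(n) <= k2 * n**2
print("tight bound holds for n >= 4")
```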




BIG O (Worst Case)

It is also known as the upper bound of the algorithm. It tells us that a certain function will never exceed a particular time for any value of input.

Why do we need this representation when we already have Big Theta?

Consider a linear search, in which we traverse the array elements one by one to search for a given input.

In the worst case, we start the search from the beginning and find the number at the very end, which leads to a time complexity of n, where n is the total number of elements. But it can happen that the element we were searching for is the first element of the array, which leads to a time complexity of 1. In Big-Theta terms, the tight bound for linear search’s worst case is Θ(n): the time required grows in proportion to n. When we use Big-O notation, we are saying only that the time will never exceed some constant multiple of n.
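The linear search described above can be sketched like this (the array contents and the helper name linear_search are just for illustration):

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if it is absent."""
    for i, value in enumerate(arr):   # examine elements one by one
        if value == target:
            return i
    return -1

nums = [7, 3, 9, 1, 5]
print(linear_search(nums, 7))   # best case: first element, 1 comparison
print(linear_search(nums, 5))   # worst case: last element, n comparisons
```

The same function illustrates all three notations at once: O(n) in the worst case, Ω(1) in the best case.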




BIG OMEGA (Ω)(best case)

Omega notation tells us that the given function will take at least this much time to complete.

It always indicates the minimum time required by an algorithm over all input values, and therefore describes the best case of the algorithm.


