Asymptotic analysis measures an algorithm's order of growth with respect to its input size. Since the time, storage, and other resources an algorithm needs vary with the input, the primary purpose of asymptotic analysis is to evaluate an algorithm's efficiency in a way that does not depend on machine-specific constants and does not require implementing the algorithm or comparing program execution times. An algorithm's performance may vary for different types of inputs, and it changes as the size of the input increases. The study of how an algorithm's performance changes when the order of the input size changes is known as asymptotic analysis.

## How to Calculate Asymptotic Analysis?

To understand asymptotic analysis, let's work through an example. Suppose we are interested in the behaviour of a function **f(n)** as **n** grows large. If **f(n) = n^2 + 3n**, then as **n** grows, **3n** becomes negligible in comparison to **n^2**. So we can say that **f(n)** is asymptotically equivalent to **n^2** as **n → ∞**, which is expressed symbolically as **f(n) ~ n^2**.

### Rules for Asymptotic Analysis

**Loops:** The running time of a loop is calculated by multiplying the number of iterations by the running time of the statements within the loop.

**Nested Loops:** The total running time is equal to the product of the sizes of all the loops, since the innermost statements execute once for every combination of loop indices.

**Logarithmic complexity:** An algorithm is O(log n) if it takes a constant amount of time to reduce the problem size by a constant fraction (generally 1/2). For example:

```c
for (int i = 1; i < n; )
{
    i = i * 2;
}
```

(Note that **i** must start at 1, not 0: doubling 0 gives 0, and the loop would never terminate.)

In the code above, the value of **i** doubles on every iteration, so after **k** iterations **i = 2^k**, and the loop stops once **i** reaches **n**.

So we have **2^k = n**.

**log(2^k) = log n** (taking log on both sides)

**k log 2 = log n**

**k = log n** (assuming base 2)

Total time = O(log n)

## Asymptotic Notations

Asymptotic notations are the mathematical tools used to express the time complexity of algorithms in asymptotic analysis. The following three notations are used to indicate time complexity; each corresponds to a different case, namely the worst case, the best case, and the average case:

**Big – O (Big Oh)**

Big-O notation represents the upper bound of the running time of an algorithm. It is the maximum time required by an algorithm for all input values. It represents the worst case time complexity of an algorithm.

O(g(n)) = { f(n): there exist positive constants c and n_0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n_0 }

In other words, a function f(n) belongs to the set O(g(n)) if there exists a positive constant c such that f(n) lies between 0 and c·g(n) for sufficiently large n. The running time of the algorithm then does not exceed the time given by O(g(n)) for any value of n. Big-O is used extensively in the study of algorithms because it bounds the worst-case running time, which is usually the case of most interest.

**Big – Ω (Omega)**

The lower bound of an algorithm’s execution time is represented by the omega notation. It gives the minimum time required by an algorithm for all input values. It is the best case scenario for an algorithm’s time complexity.

Ω(g(n)) = { f(n): there exist positive constants c and n_0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n_0 }

In other words, a function f(n) belongs to the set Ω(g(n)) if there exists a positive constant c such that f(n) lies above c·g(n) for sufficiently large n. Ω(g(n)) is the minimum time required by the algorithm for any value of n.

**Big – Θ (Theta)**

Theta notation encloses the function from above and below. It denotes the upper and lower bounds of an algorithm’s execution time and represents the average case of an algorithm’s time complexity. For a function g(n), Θ(g(n)) is given by the relation:

Θ(g(n)) = { f(n): there exist positive constants c_1, c_2 and n_0 such that 0 ≤ c_1·g(n) ≤ f(n) ≤ c_2·g(n) for all n ≥ n_0 }

In other words, a function f(n) belongs to the set Θ(g(n)) if there exist positive constants c1 and c2 such that f(n) can be sandwiched between c1·g(n) and c2·g(n) for sufficiently large n. A function f(n) is said to be asymptotically tightly bounded by g(n) if it lies between c1·g(n) and c2·g(n) for all n ≥ n0.