Unit 3 Mathematical Aspects and
Analysis of Algorithms
Structure
3.1 Introduction
Objectives
3.2 Asymptotic Notations and Basic Efficiency Classes
Asymptotic notations
Basic asymptotic efficiency classes
3.3 Mathematical Analysis of Non-Recursive Algorithms
Analyzing efficiency of non-recursive algorithms
Matrix multiplication
Time efficiency of non-recursive algorithms
Tower of Hanoi puzzle
Conversion of recursion algorithm to non-recursion algorithm
3.4 Summary
3.5 Glossary
3.6 Terminal Questions
3.7 Answers
3.1 Introduction
In the earlier unit you were introduced to the concepts of the analysis framework. In this unit you will learn about the basic concepts of mathematical analysis of algorithms. It is essential to check the efficiency of each algorithm in order to select the best one. Efficiency is generally measured by calculating the time complexity of each algorithm, and the shorthand way to represent time complexity is asymptotic notation.
For simplicity, we can classify algorithms into two categories:
Non-recursive algorithms
Recursive algorithms
A non-recursive algorithm is computed only once to solve the problem. In this unit, we will mathematically analyze non-recursive algorithms.
Objectives:
After studying this unit you should be able to:
explain the types of asymptotic notations
list the basic asymptotic efficiency classes
describe the efficient analysis of non-recursive algorithms with
illustrations
3.2 Asymptotic Notations and Basic Efficiency Classes
To choose the best algorithm we need to check the efficiency of each algorithm. Asymptotic notations describe the rate-of-growth relation between a defining function and the set of functions it defines. Orders of growth can also be grouped into basic efficiency classes, each with its own characteristics.
Let us now discuss the asymptotic notations of algorithms.
3.2.1 Asymptotic notations
Asymptotic notation deals with the behavior of a function in the limit, i.e. for sufficiently large values of its parameter. While analyzing the running time of an algorithm, it is simpler for us to work with an approximate formula for the run-time.
The main characteristic of this approach is that we neglect constant factors and concentrate on the terms in the expression for T(n) that dominate the function's behavior when n becomes large. This allows us to divide run-time functions into broad efficiency classes.
To express time complexity as "fastest possible", "slowest possible" or "average time", asymptotic notations are used in algorithms. The notations Ω (omega), Θ (theta) and O (big oh) are known as asymptotic notations.
Big Oh notation (O)
'O' is the symbol for big oh notation. It is the method of denoting an upper bound on the running time of an algorithm. Big oh notation helps in estimating the longest amount of time taken for the completion of an algorithm.
A function T(n) is said to be in O(h(n)), denoted T(n) ∈ O(h(n)), if T(n) is bounded above by some constant multiple of h(n) for all large n, i.e., if there exist some positive constant C and some non-negative integer n0 such that
T(n) ≤ C·h(n) for all n ≥ n0
The graph of C·h(n) and T(n) can be seen in figure 3.1. As n becomes larger, the running time increases considerably. For example, consider T(n) = 13n³ + 42n² + 2n log n + 4n. Here, as the value of n increases, n³ grows much faster than n², n log n and n. Hence the n³ term dominates the function T(n), and we can consider the running time to grow on the order of n³. Therefore it can be written as T(n) = O(n³). The values of T(n) and C·h(n) for n less than n0 are not compared; therefore values less than n0 are considered not relevant.
[Figure: graph showing T(n) lying below C·h(n) for all n ≥ n0; the region n < n0 is marked "not relevant".]
Figure 3.1: Big Oh Notation T(n) ∈ O(h(n))
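To make the definition concrete, the bound in the example above can be checked numerically. The sketch below is our own illustration, not part of the text: the witnesses C = 61 and n0 = 1 are hypothetical choices obtained by bounding every term of T(n) by a multiple of n³.

```python
import math

def T(n):
    # Example running-time function from the text:
    # T(n) = 13n^3 + 42n^2 + 2n log n + 4n
    return 13 * n**3 + 42 * n**2 + 2 * n * math.log2(n) + 4 * n

def h(n):
    # Dominating term: h(n) = n^3
    return n**3

# Hypothetical witnesses for the big oh definition: for n >= 1 we have
# 42n^2 <= 42n^3, 2n log n <= 2n^3 and 4n <= 4n^3, so
# C = 13 + 42 + 2 + 4 = 61 and n0 = 1 satisfy T(n) <= C*h(n).
C, n0 = 61, 1

assert all(T(n) <= C * h(n) for n in range(n0, 1000))
print("T(n) <= 61 * n^3 holds for all n in [1, 1000)")
```

Any larger C (or larger n0) would work equally well; the definition only requires that some pair of constants exists.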
Example
Consider the functions T(n) = n + 2 and h(n) = n². Determine some constant C so that T(n) ≤ C·h(n).
Take C = 1 and check small values of n.
For n = 1:
T(1) = 1 + 2 = 3 and h(1) = 1² = 1
T(n) > h(n)
If n = 3:
Then T(3) = 3 + 2 = 5 and h(3) = 3² = 9
T(n) < h(n)
Hence for n > 2, T(n) < h(n), so with C = 1 and n0 = 2 the definition is satisfied. Therefore big oh notation always gives an upper bound that holds for all sufficiently large n.
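The claim at the end of the example can also be checked mechanically. A minimal sketch (the function names are ours, chosen to mirror the text):

```python
def T(n):
    return n + 2      # the function being bounded

def h(n):
    return n ** 2     # the proposed bounding function

# With C = 1: the bound fails at n = 1 but holds from n = 2 onward,
# since n^2 - (n + 2) = (n - 2)(n + 1) >= 0 whenever n >= 2.
assert T(1) > h(1)
assert all(T(n) <= h(n) for n in range(2, 1000))
print("n + 2 <= n^2 for all n >= 2, so T(n) is in O(n^2) with C = 1, n0 = 2")
```

The few small values where T(n) exceeds h(n) do not matter: the definition of O(h(n)) only concerns n ≥ n0.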
Let us next discuss another asymptotic notation, the omega notation.