# Algorithms analysis and design - PowerPoint PPT Presentation


### Algorithms analysis and design

BY

Lecturer: Aisha Dawood

Asymptotic notation
• The notations we use to describe the asymptotic running time of an algorithm are defined in terms of functions whose domains are the set of natural numbers N = {0, 1, 2, ...}.
• We will use asymptotic notation primarily to describe the running times of algorithms,
• as when we wrote that insertion sort’s worst-case running time is Θ(n²).
Order of growth notations
• O-notation (Big-Oh)
• o-notation (Little-oh)
• Ω-notation (Big-Omega)
• ω-notation (Little-Omega)
• Θ-notation (Theta)
(Big-Oh) notation
• O-notation (Big-Oh)
• For a given function g(n), we denote by O(g(n)) (pronounced “big-oh of g of n” or sometimes just “oh of g of n”) the set of functions:
• O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.
• For all values n at and to the right of n0, the value of the function f(n) is on or below c·g(n).
• We write f(n) = O(g(n)) to indicate that a function f(n) is a member of the set O(g(n)).
• O-notation gives an upper bound on a function: c·g(n) is an upper bound on f(n).
(Big-Oh) notation
• When we say “the running time is O(n²)”, what does it mean?
• We mean that there is a function f(n) in the set O(n²) such that for any value of n, no matter what particular input of size n is chosen, the running time on that input is bounded from above by the value f(n). Equivalently, we mean that the worst-case running time is O(n²).
Big-Oh notation examples
• The following functions are O(n²):
• f(n) = n² + n
• f(n) = an² + b
• f(n) = bn + c
• f(n) = an
• f(n) = n²
• f(n) = log(n)
• But not:
• f(n) = n³
• f(n) = n⁴ + log n
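As a sanity check, the membership claims above can be spot-checked numerically over a finite range. The helper name `bounded_above` and the witness constants c and n0 below are illustrative choices, not part of the slides:

```python
def bounded_above(f, g, c, n0, n_max=10_000):
    """Finite spot check of the O definition: 0 <= f(n) <= c*g(n) for n0 <= n <= n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# n^2 + n = O(n^2): the witness c = 2, n0 = 1 works, since n <= n^2 for n >= 1.
print(bounded_above(lambda n: n * n + n, lambda n: n * n, c=2, n0=1))   # True

# n^3 is not O(n^2): any fixed c fails once n > c; here c = 100 fails at n = 101.
print(bounded_above(lambda n: n ** 3, lambda n: n ** 2, c=100, n0=1))   # False
```

A finite check like this cannot prove a bound for all n, but it quickly exposes a bad witness.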
(Big-Oh) notation
• Using O-notation, we can often describe the running time of an algorithm merely by inspecting the algorithm’s overall structure.
• For example, the nested loop structure of the insertion sort algorithm immediately yields an O(n²) upper bound on the worst-case running time. Why?
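One way to see the bound is to count how many times the inner loop can execute. Below is a standard insertion sort, sketched for illustration, with a `steps` counter added by us: the outer loop runs n−1 times and the inner loop at most i times, so the total work is at most n(n−1)/2 = O(n²), achieved on reverse-sorted input.

```python
def insertion_sort(a):
    """Sort a list; also count inner-loop shifts to expose the O(n^2) bound."""
    a = list(a)
    steps = 0                       # number of element shifts in the inner loop
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]         # shift larger elements one slot right
            j -= 1
            steps += 1
        a[j + 1] = key
    return a, steps

# Worst case: reverse-sorted input forces every possible shift.
result, steps = insertion_sort(list(range(10, 0, -1)))
print(steps)   # 45, which equals 10*9/2, matching the n(n-1)/2 bound
```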
o-notation
• The asymptotic upper bound provided by O-notation may or may not be asymptotically tight, e.g. the bound 2n² = O(n²) is asymptotically tight, but the bound 2n = O(n²) is not.
• We use o-notation to denote an upper bound that is not asymptotically tight. For example, 2n = o(n²), but 2n² ≠ o(n²).
• We formally define o(g(n)) (“little-oh of g of n”) as the set:
• o(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }.
• The definitions of O-notation and o-notation are similar. The main difference is that in f(n) = O(g(n)), the bound 0 ≤ f(n) ≤ c·g(n) holds for some constant c > 0,
• but in f(n) = o(g(n)), the bound 0 ≤ f(n) < c·g(n) holds for all constants c > 0.
o-notation
• Intuitively, in o-notation, the function f(n) becomes insignificant relative to g(n) as n approaches infinity; that is: lim (n→∞) f(n)/g(n) = 0.
Little-oh notation examples
• The following functions are o(n²):
• f(n) = bn + c
• f(n) = a
• f(n) = log(n)
• But not:
• f(n) = n²
• f(n) = n⁴ + log n
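The limit intuition behind little-oh can be seen numerically: for f(n) = 2n and g(n) = n², the ratio f(n)/g(n) shrinks toward 0 as n grows. The values below are just a finite illustration, not a proof:

```python
# Ratio f(n)/g(n) for f(n) = 2n, g(n) = n^2: tends to 0, so 2n = o(n^2).
for n in (10, 100, 1000, 10_000):
    print(n, (2 * n) / (n * n))   # 0.2, 0.02, 0.002, 0.0002
```

By contrast, for f(n) = n² the same ratio is constant (1), which is why n² ≠ o(n²).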
Ω-notation
• O-notation provides an asymptotic upper bound on a function; Ω-notation provides an asymptotic lower bound.
• For a given function g(n), we denote by Ω(g(n)) (pronounced “big-omega of g of n” or sometimes just “omega of g of n”) the set of functions:
• Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.
• Ω-notation gives a lower bound for a function.
• We write f(n) = Ω(g(n)) if there are positive constants n0 and c such that at and to the right of n0, the value of f(n) always lies on or above c·g(n).
Ω-notation
• When we say that the running time of an algorithm is Ω(n²), what does it mean?
• We mean that for each value of n, no matter what particular input of size n is chosen, the running time on that input is at least a constant times n², for sufficiently large n.
Ω-notation
• Using Big-Omega we are giving a lower bound on the best-case running time of an algorithm. For example, the best-case running time of insertion sort is Ω(n), which implies that the running time of insertion sort is Ω(n).
• The running time of insertion sort belongs to both Ω(n) and O(n²).
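Counting key comparisons on a sorted versus a reverse-sorted input illustrates both bounds at once. This is a standard insertion sort with a comparison counter added by us for illustration:

```python
def comparisons(a):
    """Run insertion sort on a copy of a and return the number of key comparisons."""
    a = list(a)
    count = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            count += 1              # one comparison of a[j] against key
            if a[j] <= key:
                break
            a[j + 1] = a[j]         # shift and keep scanning left
            j -= 1
        a[j + 1] = key
    return count

n = 100
print(comparisons(list(range(n))))         # sorted input: n-1 = 99, linear -> Omega(n)
print(comparisons(list(range(n, 0, -1))))  # reverse input: n(n-1)/2 = 4950 -> O(n^2)
```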
ω-notation
• ω-notation is to Ω-notation as o-notation is to O-notation.
• We use ω-notation to denote a lower bound that is not asymptotically tight.
• Formally, we define ω(g(n)) (“little-omega of g of n”) as the set:
• ω(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }.
• For example, n²/2 = ω(n), but n²/2 ≠ ω(n²).
ω-notation
• The relation f(n) = ω(g(n)) implies that lim (n→∞) f(n)/g(n) = ∞,
• if the limit exists. That is, f(n) becomes arbitrarily large relative to g(n) as n approaches infinity.
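Again the limit can be seen numerically: for f(n) = n²/2 and g(n) = n, the ratio f(n)/g(n) grows without bound, matching n²/2 = ω(n). A finite illustration:

```python
# Ratio f(n)/g(n) for f(n) = n^2/2, g(n) = n: grows without bound.
for n in (10, 100, 1000, 10_000):
    print(n, (n * n / 2) / n)   # 5.0, 50.0, 500.0, 5000.0
```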
Θ-notation
• Θ-notation (Theta)
• For a given function g(n), we denote by Θ(g(n)) the set of functions:
• Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.
• We write f(n) = Θ(g(n)) if there exist positive constants n0, c1, and c2 such that at and to the right of n0, the value of f(n) always lies between c1·g(n) and c2·g(n) inclusive.
• In other words, for all n ≥ n0, the function f(n) is equal to g(n) to within a constant factor.
• We say that g(n) is an asymptotically tight bound for f(n).
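The two-sided bound can be spot-checked over a finite range, e.g. for f(n) = n²/2 − 3n and g(n) = n². The helper name `theta_holds` and the constants c1 = 1/14, c2 = 1/2, n0 = 8 are illustrative choices, not derived here:

```python
def theta_holds(f, g, c1, c2, n0, n_max=10_000):
    """Finite spot check of the Theta sandwich: 0 <= c1*g(n) <= f(n) <= c2*g(n)."""
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

# n^2/2 - 3n = Theta(n^2): both the lower and upper bound hold from n0 = 8 on.
print(theta_holds(lambda n: n * n / 2 - 3 * n,
                  lambda n: n * n,
                  c1=1/14, c2=1/2, n0=8))   # True
```

As with the O check earlier, a finite scan cannot prove the bound for all n, but it is a quick way to test candidate constants.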