
Introduction to Algorithms (2nd edition)


Presentation Transcript


  1. Introduction to Algorithms (2nd edition) by Cormen, Leiserson, Rivest & Stein Chapter 3: Growth of Functions (slides enhanced by N. Adlai A. DePano)

  2. Overview • The order of growth of functions provides a simple characterization of efficiency • Allows for comparison of the relative performance of alternative algorithms • Concerned with the asymptotic efficiency of algorithms • The algorithm with the best asymptotic efficiency is usually the best choice, except for small inputs • Several standard methods simplify the asymptotic analysis of algorithms

  3. Asymptotic Notation • Applies to functions whose domains are the set of natural numbers: N = {0, 1, 2, …} • If a time resource T(n) is being analyzed, the function’s range is usually the set of non-negative real numbers: T(n) ∈ R+ • If a space resource S(n) is being analyzed, the function’s range is usually also the set of natural numbers: S(n) ∈ N

  4. Asymptotic Notation • Depending on the textbook, asymptotic categories may be expressed in terms of -- • (a) set membership (our textbook): functions belong to a family of functions that exhibit some property; or • (b) function property (other textbooks): functions exhibit the property • Caveat: we will formally use (a) and informally use (b)

  5. The Θ-Notation [Figure: f(n) lies between c1⋅g(n) and c2⋅g(n) for all n ≥ n0] Θ(g(n)) = { f(n) : ∃ c1, c2 > 0, n0 > 0 s.t. ∀ n ≥ n0: c1⋅g(n) ≤ f(n) ≤ c2⋅g(n) }
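For instance, (1/2)n^2 − 3n = Θ(n^2): following the worked example in Section 3.1 of the textbook, the constants c1 = 1/14, c2 = 1/2, and n0 = 7 give (1/14)n^2 ≤ (1/2)n^2 − 3n ≤ (1/2)n^2 for all n ≥ 7.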

  6. The O-Notation [Figure: f(n) lies below c⋅g(n) for all n ≥ n0] O(g(n)) = { f(n) : ∃ c > 0, n0 > 0 s.t. ∀ n ≥ n0: f(n) ≤ c⋅g(n) }
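For example, 2n^2 + 3n = O(n^2): taking c = 5 and n0 = 1, we have 2n^2 + 3n ≤ 2n^2 + 3n^2 = 5n^2 for all n ≥ 1. Since the O bound is one-sided (an upper bound only), Θ(g(n)) ⊆ O(g(n)).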

  7. The Ω-Notation [Figure: f(n) lies above c⋅g(n) for all n ≥ n0] Ω(g(n)) = { f(n) : ∃ c > 0, n0 > 0 s.t. ∀ n ≥ n0: f(n) ≥ c⋅g(n) }
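For example, 2n^2 + 3n = Ω(n^2): taking c = 2 and n0 = 1, we have 2n^2 + 3n ≥ 2n^2 for all n ≥ 1. Ω is the lower-bound counterpart of O, and Θ(g(n)) ⊆ Ω(g(n)).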

  8. The o-Notation [Figure: f(n) eventually falls below c⋅g(n) for every constant c > 0, with the threshold n0 depending on c] o(g(n)) = { f(n) : ∀ c > 0 ∃ n0 > 0 s.t. ∀ n ≥ n0: f(n) < c⋅g(n) }
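For example, 2n = o(n^2), since for any c > 0 we have 2n < c⋅n^2 once n > 2/c; but 2n^2 ≠ o(n^2), since for c = 1 no n0 makes 2n^2 < n^2 hold. Intuitively, f(n) = o(g(n)) means f becomes insignificant relative to g as n grows.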

  9. The ω-Notation [Figure: f(n) eventually rises above c⋅g(n) for every constant c > 0, with the threshold n0 depending on c] ω(g(n)) = { f(n) : ∀ c > 0 ∃ n0 > 0 s.t. ∀ n ≥ n0: f(n) > c⋅g(n) }
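For example, n^2/2 = ω(n), since for any c > 0 we have n^2/2 > c⋅n once n > 2c; but n^2/2 ≠ ω(n^2), since for c = 1 the inequality n^2/2 > n^2 never holds.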

  10. Comparison of Functions • Transitivity: f(n) = O(g(n)) and g(n) = O(h(n)) ⇒ f(n) = O(h(n)); f(n) = Ω(g(n)) and g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n)); f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n)) • Reflexivity: f(n) = O(f(n)), f(n) = Ω(f(n)), f(n) = Θ(f(n))

  11. Comparison of Functions • Symmetry: f(n) = Θ(g(n)) ⇔ g(n) = Θ(f(n)) • Transpose symmetry: f(n) = O(g(n)) ⇔ g(n) = Ω(f(n)); f(n) = o(g(n)) ⇔ g(n) = ω(f(n)) • Theorem 3.1: f(n) = Θ(g(n)) ⇔ f(n) = O(g(n)) and f(n) = Ω(g(n))

  12. Asymptotic Analysis and Limits
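As a rough illustration of the standard limit test (not taken from the original slides): if f(n)/g(n) tends to 0 then f(n) = o(g(n)), if it tends to a finite positive constant then f(n) = Θ(g(n)), and if it tends to infinity then f(n) = ω(g(n)). The short Python sketch below (the function pairs are chosen here purely for illustration) prints the ratio f(n)/g(n) at increasing n:

```python
import math

# Illustrative pairs (f, g); watch f(n)/g(n) as n grows.
# Ratio -> 0 suggests f = o(g); -> positive constant suggests f = Θ(g); -> infinity suggests f = ω(g).
pairs = [
    ("n lg n vs n^2",    lambda n: n * math.log2(n),   lambda n: n ** 2),
    ("5n^2 + 3n vs n^2", lambda n: 5 * n ** 2 + 3 * n, lambda n: n ** 2),
    ("n^3 vs n^2 lg n",  lambda n: n ** 3,             lambda n: n ** 2 * math.log2(n)),
]

for name, f, g in pairs:
    ratios = ["%.3g" % (f(n) / g(n)) for n in (10, 10**3, 10**6, 10**9)]
    print(name, ratios)
```

For the three pairs the printed ratios shrink toward 0, settle near 5, and grow without bound, matching o, Θ, and ω respectively.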

  13. Comparison of Functions • f1(n) = O(g1(n)) and f2(n) = O(g2(n)) ⇒ f1(n) + f2(n) = O(g1(n) + g2(n)) • f(n) = O(g(n)) ⇒ f(n) + g(n) = O(g(n))
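For example, taking f(n) = n lg n and g(n) = n^2 in the second rule gives n lg n + n^2 = O(n^2): the sum is absorbed by its asymptotically dominant term.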

  14. Standard Notation and Common Functions • Monotonicity A function f(n) is monotonically increasing if m ≤ n implies f(m) ≤ f(n). A function f(n) is monotonically decreasing if m ≤ n implies f(m) ≥ f(n). A function f(n) is strictly increasing if m < n implies f(m) < f(n). A function f(n) is strictly decreasing if m < n implies f(m) > f(n).
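For example, f(n) = 2n is strictly (and hence monotonically) increasing, while f(n) = ⌊n/2⌋ is monotonically but not strictly increasing, since f(2) = f(3) = 1.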

  15. Standard Notation and Common Functions • Floors and ceilings For any real number x, the greatest integer less than or equal to x is denoted by ⌊x⌋. For any real number x, the least integer greater than or equal to x is denoted by ⌈x⌉. For all real numbers x, x − 1 < ⌊x⌋ ≤ x ≤ ⌈x⌉ < x + 1. Both functions are monotonically increasing.
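For example, ⌊2.5⌋ = 2 and ⌈2.5⌉ = 3, while ⌊−2.5⌋ = −3 and ⌈−2.5⌉ = −2. For any integer n, ⌊n/2⌋ + ⌈n/2⌉ = n.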

  16. Standard Notation and Common Functions • Exponentials For all n and all real a ≥ 1, the function a^n is the exponential function with base a and is monotonically increasing in n. • Logarithms The textbook adopts the following conventions: lg n = log_2 n (binary logarithm), ln n = log_e n (natural logarithm), lg^k n = (lg n)^k (exponentiation), lg lg n = lg(lg n) (composition), lg n + k = (lg n) + k (precedence of lg).
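For example, under these conventions lg^2 n = (lg n)^2, which is different from lg(n^2) = 2 lg n, and lg lg 65536 = lg 16 = 4.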

  17. Standard Notation and Common Functions • Important relationships For all real constants a and b such that a > 1, n^b = o(a^n); that is, any exponential function with a base strictly greater than 1 grows faster than any polynomial function. For all real constants a and b such that a > 0, lg^b n = o(n^a); that is, any positive polynomial function grows faster than any polylogarithmic function.
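For example, n^1000 = o(1.001^n) and lg^1000 n = o(n^0.001), even though the polynomial and polylogarithmic sides dominate for small and moderate n; the statements are purely asymptotic.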

  18. Standard Notation and Common Functions • Factorials For n ≥ 1 the function n!, read “n factorial”, is given by n! = n ⋅ (n − 1) ⋅ (n − 2) ⋅ (n − 3) ⋅ … ⋅ 2 ⋅ 1, with 0! = 1. It can be established that n! = o(n^n), n! = ω(2^n), and lg(n!) = Θ(n lg n).
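A sharper statement is Stirling’s approximation, n! = √(2πn) (n/e)^n (1 + Θ(1/n)), from which all three bounds above follow.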

  19. Standard Notation and Common Functions • Functional iteration The notation f^(i)(n) denotes the function f applied iteratively i times to an initial value of n; recursively, f^(i)(n) = n if i = 0, and f^(i)(n) = f(f^(i−1)(n)) if i > 0. Example: if f(n) = 2n, then f^(2)(n) = f(2n) = 2(2n) = 2^2 n, f^(3)(n) = f(f^(2)(n)) = 2(2^2 n) = 2^3 n, and in general f^(i)(n) = 2^i n.
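A minimal Python sketch of this definition (the helper name iterate is not from the slides):

```python
def iterate(f, i, n):
    """Apply f to n exactly i times: f^(0)(n) = n, f^(i)(n) = f(f^(i-1)(n))."""
    result = n
    for _ in range(i):
        result = f(result)
    return result

# With f(n) = 2n, f^(i)(n) = (2^i) * n, matching the slide's example.
double = lambda n: 2 * n
assert iterate(double, 3, 5) == (2 ** 3) * 5   # f^(3)(5) = 40
assert iterate(double, 0, 7) == 7              # f^(0)(n) = n
```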

  20. Standard Notation and Common Functions • Iterated logarithm function The notation lg* n, read “log star of n”, is defined as lg* n = min { i ≥ 0 : lg^(i) n ≤ 1 } Example: lg* 2 = 1, lg* 4 = 2, lg* 16 = 3, lg* 65536 = 4, lg* (2^65536) = 5
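A small Python sketch that computes lg* n directly from the definition (the function name lg_star is illustrative) and reproduces the slide’s values:

```python
import math

def lg_star(n):
    """Iterated logarithm: how many times lg must be applied before the value drops to <= 1."""
    count = 0
    while n > 1:
        n = math.log2(n)
        count += 1
    return count

# Values from the slide. 2**65536 does not fit in a double, so the last value is checked
# indirectly: lg(2^65536) = 65536, hence lg*(2^65536) = 1 + lg*(65536) = 5.
assert [lg_star(x) for x in (2, 4, 16, 65536)] == [1, 2, 3, 4]
assert 1 + lg_star(65536) == 5
```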

  21. Asymptotic Running Time of Algorithms • We consider algorithm A better than algorithm B if T_A(n) = o(T_B(n)) • Why is it acceptable to ignore the behavior of algorithms for small inputs? • Why is it acceptable to ignore the constants? • What do we gain by using asymptotic notation?

  22. Things to Remember • Asymptotic analysis studies how the values of functions compare as their arguments grow without bound. • It ignores constant factors and the behavior of the function for small arguments. • This is acceptable because all algorithms are fast for small inputs, and the growth of the running time matters more than constant factors.

  23. Things to Remember • Ignoring the usually unimportant details, we obtain a representation that succinctly describes the growth of a function as its argument grows and thus allows us to make comparisons between algorithms in terms of their efficiency.
