
Growth of Functions



Presentation Transcript


  1. Growth of Functions Jeff Chastine

  2. Asymptotic Notation • When we look at inputs n that are large enough, we are studying asymptotic efficiency • Concerned with how the running time grows as the input size increases without bound • Usually, an algorithm that is asymptotically more efficient is the best choice (except for small n)

  3. Θ-notation (‘Theta’) • Θ(g(n)) = {ƒ(n): there exist positive constants c1, c2, and n0 such that 0 ≤ c1g(n) ≤ ƒ(n) ≤ c2g(n) for all n ≥ n0} • This means that ƒ(n) and g(n) grow at the same rate; they differ by at most a constant factor • ƒ(n) is "sandwiched" between c1g(n) and c2g(n) for sufficiently large n • We say g(n) is an asymptotically tight bound for ƒ(n)

  4. Big-Θ

  5. A Formal Definition To show ½n² - 3n = Θ(n²), we must find c1, c2, and n0 such that c1n² ≤ ½n² - 3n ≤ c2n² for all n ≥ n0. Dividing by n² yields c1 ≤ ½ - 3/n ≤ c2, which holds with c1 = 1/14, c2 = ½, and n0 = 7
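The witness constants above can be checked numerically. The sketch below (a check over a finite range, not a proof) uses exact rational arithmetic so the equality at the boundary n = 7 is not disturbed by floating-point rounding:

```python
# Check c1*n^2 <= n^2/2 - 3n <= c2*n^2 with c1 = 1/14, c2 = 1/2, n0 = 7.
# Fractions keep the comparison exact at n = 7, where c1*n^2 equals f(n).
from fractions import Fraction

c1, c2, n0 = Fraction(1, 14), Fraction(1, 2), 7
ok = all(
    c1 * n**2 <= Fraction(n**2, 2) - 3 * n <= c2 * n**2
    for n in range(n0, 2000)
)
print(ok)  # True
```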

  6. Why 6n³ ≠ Θ(n²) Suppose 6n³ ≤ c2n² for all n ≥ n0 Dividing both sides by 6n² gives n ≤ c2/6 But c2 is a constant and n grows without bound, so this can't hold for all n ≥ n0!
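The same argument can be illustrated concretely: for any candidate constant c2, the first integer past c2/6 already breaks the inequality. A small sketch (the candidate values of c2 are arbitrary):

```python
# No constant c2 makes 6n^3 <= c2 * n^2 hold for all n, because the
# inequality is equivalent to n <= c2/6; any n beyond that violates it.
for c2 in (10, 100, 1000):
    n = c2 // 6 + 1                   # first integer larger than c2/6
    print(c2, 6 * n**3 > c2 * n**2)   # True every time
```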

  7. Practice • Do this now! • Prove 3n² + n = Θ(n²)

  8. Note • Now you can see: • Why constant coefficients don't matter • Why lower-order terms don't matter • In general, for a polynomial of degree d with leading coefficient ad > 0: a0 + a1n + … + adnᵈ = Θ(nᵈ)
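One way to see why only the leading term survives: the ratio p(n)/nᵈ approaches the leading coefficient as n grows. A sketch with an arbitrary example polynomial:

```python
# For the example polynomial p(n) = 5n^3 + 2n^2 + 7, the ratio p(n)/n^3
# approaches the leading coefficient 5, so the constant coefficient and
# lower-order terms vanish asymptotically and p(n) = Θ(n^3).
def p(n):
    return 5 * n**3 + 2 * n**2 + 7

for n in (10, 1_000, 100_000):
    print(n, p(n) / n**3)  # 5.207, then ever closer to 5
```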

  9. Ο-notation (Big-Oh) • Ο(g(n)) = {ƒ(n): there exist positive constants c and n0 such that 0 ≤ ƒ(n) ≤ cg(n) for all n ≥ n0} • This means that cg(n) is at least ƒ(n) for all sufficiently large n • We say g(n) is an asymptotic upper bound for ƒ(n) • Often used to bound worst-case running time

  10. Big-Oh

  11. A Formal Definition To show 10n² - 3n = Ο(n³), we must find c and n0 such that 10n² - 3n ≤ cn³ for all n ≥ n0. Dividing by n² yields 10 - 3/n ≤ cn, which holds with c = 10 and n0 = 1
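As with the Θ example, the witness pair can be checked numerically over a finite range (a sketch, not a proof):

```python
# One witness pair for 10n^2 - 3n = O(n^3): c = 10, n0 = 1, since
# 10n^2 - 3n <= 10n^2 <= 10n^3 for every n >= 1.
ok = all(10 * n**2 - 3 * n <= 10 * n**3 for n in range(1, 2000))
print(ok)  # True
```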

  12. Ω-notation (‘Omega’) • Ω(g(n)) = {ƒ(n): there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ ƒ(n) for all n ≥ n0} • This means that cg(n) is at most ƒ(n) for all sufficiently large n • We say g(n) is an asymptotic lower bound for ƒ(n) • Often used to bound best-case running time

  13. Big-Ω

  14. Insertion Sort (revisited) • Running time falls between Ω(n) (best case: input already sorted) and Ο(n²) (worst case: input in reverse order) • Is the worst case of insertion sort Ω(n²)? Yes: a reverse-sorted input forces about n²/2 element shifts, so the worst case is in fact Θ(n²)
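The Ω(n)/Ο(n²) gap can be made concrete by counting key comparisons on the two extreme inputs. A minimal sketch (the comparison counter is instrumentation added for illustration, not part of the standard algorithm):

```python
# Insertion sort with a comparison counter: sorted input triggers ~n
# comparisons (one per element), reverse-sorted input ~n^2/2.
def insertion_sort(a):
    a = list(a)
    comparisons = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0:
            comparisons += 1
            if a[i] <= key:       # key already in place: stop early
                break
            a[i + 1] = a[i]       # shift larger element right
            i -= 1
        a[i + 1] = key
    return a, comparisons

n = 100
_, best = insertion_sort(range(n))           # already sorted
_, worst = insertion_sort(range(n, 0, -1))   # reverse sorted
print(best, worst)  # 99 4950
```

The best case does n - 1 comparisons; the worst case does 1 + 2 + … + (n - 1) = n(n - 1)/2.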

  15. ο-notation (little-oh) • ο(g(n)) = {ƒ(n): for any positive constant c, there exists a constant n0 > 0 such that 0 ≤ ƒ(n) < cg(n) for all n ≥ n0} • Example: 2n = ο(n²), but 2n² ≠ ο(n²) • Denotes an upper bound that is not asymptotically tight: ƒ(n)/g(n) → 0 as n → ∞
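The slide's two examples can be contrasted by watching the ratio ƒ(n)/g(n): it vanishes in the little-oh case but stays constant in the other. A quick sketch:

```python
# f(n) = 2n vs g(n) = n^2: the ratio 2/n tends to 0, the defining
# behaviour of f(n) = o(g(n)). By contrast 2n^2 / n^2 stays at 2,
# so 2n^2 is not o(n^2).
for n in (10, 1_000, 100_000):
    print(n, (2 * n) / n**2, (2 * n**2) / n**2)
```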

  16. ω-notation (little-omega) • ω(g(n)) = {ƒ(n): for any positive constant c, there exists a constant n0 > 0 such that 0 ≤ cg(n) < ƒ(n) for all n ≥ n0} • Example: n²/2 = ω(n), but n²/2 ≠ ω(n²) • Denotes a lower bound that is not asymptotically tight: ƒ(n)/g(n) → ∞ as n → ∞

  17. Standard Notation • A function ƒ is monotonically increasing if m ≤ n implies ƒ(m) ≤ ƒ(n) • It is monotonically decreasing if m ≤ n implies ƒ(m) ≥ ƒ(n) • The floor ⌊x⌋ rounds x down to the nearest integer • The ceiling ⌈x⌉ rounds x up to the nearest integer
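Floor and ceiling map directly onto Python's `math.floor` and `math.ceil`. One detail worth noting: floor rounds toward negative infinity, not toward zero, which shows up on negative arguments:

```python
import math

# floor rounds toward -infinity, ceiling toward +infinity;
# on negatives this differs from truncation toward zero.
print(math.floor(2.7), math.ceil(2.7))    # 2 3
print(math.floor(-2.7), math.ceil(-2.7))  # -3 -2
```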

