
Growth Rates of Functions




  1. Growth Rates of Functions

  2. Asymptotic Equivalence Def: f ∼ g ("f is asymptotically equal to g") iff lim_{n→∞} f(n)/g(n) = 1. For example, n^2 + 1 ∼ n^2. Note that n^2 + 1 is being used to name the function f such that f(n) = n^2 + 1 for every n.
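
A quick numerical check (added here, not part of the original slides) makes the definition concrete: the ratio (n^2 + 1)/n^2 heads to 1 as n grows, which is exactly what n^2 + 1 ∼ n^2 says. A minimal Python sketch:

    # Illustration: the ratio (n^2 + 1) / n^2 tends to 1, so n^2 + 1 ~ n^2.
    for n in [10, 100, 1_000, 10_000]:
        print(n, (n**2 + 1) / n**2)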

  3. An example: Stirling's formula: n! ∼ √(2πn) · (n/e)^n
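
As a sanity check (added here, not from the slides), the ratio n! / (√(2πn)·(n/e)^n) can be computed directly and watched approach 1:

    import math

    # Stirling's formula: n! ~ sqrt(2*pi*n) * (n/e)^n; the ratio tends to 1.
    for n in [5, 10, 50, 100]:
        approx = math.sqrt(2 * math.pi * n) * (n / math.e) ** n
        print(n, math.factorial(n) / approx)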

  4. Little-Oh: f = o(g) Def: f(n) = o(g(n)) iff lim_{n→∞} f(n)/g(n) = 0. • For example, n^2 = o(n^3) since lim_{n→∞} n^2/n^3 = lim_{n→∞} 1/n = 0.
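
To see the role of the limit (an added illustration, not from the slides): n^2/n^3 shrinks to 0, so n^2 = o(n^3), whereas n^2/(2n^2) is stuck at 1/2, so n^2 is O(2n^2) but not o(2n^2).

    # n^2 / n^3 tends to 0 (so n^2 = o(n^3)),
    # while n^2 / (2 n^2) stays at 0.5 (so n^2 is O(2 n^2) but NOT o(2 n^2)).
    for n in [10, 100, 1_000, 10_000]:
        print(n, n**2 / n**3, n**2 / (2 * n**2))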

  5. “= o(∙)” is “all one symbol”. “f = o(g)” is really a strict partial order on functions. NEVER write “o(g) = f”, etc.

  6. Big-Oh: O(∙) Asymptotic Order of Growth. Def: f = O(g) iff limsup_{n→∞} f(n)/g(n) < ∞: “f grows no faster than g.” A weak partial order.

  7. Growth Order

  8. f = o(g) implies f = O(g) (if f(n)/g(n) → 0, then from some point on the ratio is certainly bounded by a constant).

  9. Big-Omega: Ω(∙) f = Ω(g) means g = O(f): “f grows at least as quickly as g.”

  10. Big-Theta: Θ(∙) Def: f = Θ(g) iff f = O(g) and g = O(f): “same order of growth.”

  11. Rough Paraphrase • f ∼ g: f and g grow to be roughly equal • f = o(g): f grows more slowly than g • f = O(g): f grows at most as quickly as g • f = Ω(g): f grows at least as quickly as g • f = Θ(g): f and g grow at the same rate

  12. Equivalent Defn of O(∙): f = O(g) iff there are constants c > 0 and n0 such that f(n) ≤ c·g(n) for all n ≥ n0. “From some point on, the value of f is at most a constant multiple of the value of g.”
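
This witness-style definition can be checked empirically on a finite range. The helper below is a hypothetical sketch (the name is_big_o_witness and the particular constants are mine, not the deck's): given c and n0, it verifies f(n) ≤ c·g(n) for every sampled n ≥ n0.

    # Hypothetical helper: check the witness definition of O(.) on a finite range.
    def is_big_o_witness(f, g, c, n0, n_max=10_000):
        return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

    # Example: 3n^2 + 10n = O(n^2), witnessed by c = 4 and n0 = 10,
    # since 3n^2 + 10n <= 4n^2 exactly when n >= 10.
    print(is_big_o_witness(lambda n: 3 * n**2 + 10 * n, lambda n: n**2, c=4, n0=10))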

  13. Three Concrete Examples • Polynomials • Logarithmic functions • Exponential functions

  14. Polynomials • A (univariate) polynomial is a function such as f(n) = 3n^5 + 2n^2 - n + 2 (for all natural numbers n) • This is a polynomial of degree 5 (the largest exponent) • Or in general f(n) = a_d·n^d + a_{d-1}·n^{d-1} + … + a_1·n + a_0, a polynomial of degree d (when a_d ≠ 0) • Theorem: • If a < b then any polynomial of degree a is o(any polynomial of degree b) • If a ≤ b then any polynomial of degree a is O(any polynomial of degree b)
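
A small numeric illustration of the degree theorem (added here; the particular polynomials are arbitrary): a degree-2 polynomial divided by a degree-3 polynomial tends to 0.

    # Degree 2 over degree 3: the ratio tends to 0, so 3n^2 + 2n + 7 = o(n^3).
    def p(n): return 3 * n**2 + 2 * n + 7   # degree 2
    def q(n): return n**3                   # degree 3
    for n in [10, 100, 1_000, 10_000]:
        print(n, p(n) / q(n))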

  15. Logarithmic Functions A function f is logarithmic if it is Θ(log_b n) for some constant base b > 1. Theorem: All logarithmic functions are Θ of each other, and are Θ(any logarithmic function of a polynomial), e.g. log(n^2) = 2·log n = Θ(log n). Theorem: Any logarithmic function is o(any polynomial).
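
Both theorems can be seen numerically (an added illustration, not from the slides): log_2 n / log_10 n is the constant ln 10 / ln 2 ≈ 3.32 by change of base, while log n / n drops toward 0.

    import math

    # All logs are Theta of each other (constant ratio by change of base),
    # and any log is o(any polynomial): here log(n) / n tends to 0.
    for n in [10**3, 10**6, 10**9, 10**12]:
        print(n, math.log2(n) / math.log10(n), math.log(n) / n)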

  16. Exponential Functions A function is exponential if it is Θ(c^n) for some constant c > 1. Theorem: Any polynomial is o(any exponential). If c < d then c^n = o(d^n).
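
Numerically (an added illustration): n^10 / 1.1^n first grows, because the polynomial dominates for small n, but eventually collapses toward 0 once the exponential takes over; likewise (2/3)^n, i.e. 2^n / 3^n, tends to 0.

    # Any polynomial is o(any exponential): n^10 / 1.1^n eventually tends to 0
    # (after an initial rise while the polynomial still dominates).
    # Also c^n = o(d^n) for c < d: here (2/3)^n, i.e. 2^n / 3^n, tends to 0.
    for n in [10, 100, 500, 1_000]:
        print(n, n**10 / 1.1**n, (2 / 3) ** n)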

  17. Growth Rates and Analysis of Algorithms Let f(n) measure the amount of time taken by an algorithm to solve a problem of size n. Most practical algorithms have polynomial running times. E.g. sorting algorithms generally have running times that are quadratic (polynomial of degree 2) or less (for example, O(n log n)). Exhaustive search over an exponentially growing set of possible answers requires exponential time.
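
A rough operation-count comparison (an added sketch, not a real benchmark) shows why this matters: n·log2(n) stays tame while 2^n, the size of an exhaustive search space over n binary choices, explodes.

    import math

    # Operation counts: an n log n sort versus exhaustive search over 2^n subsets.
    for n in [10, 20, 40, 80]:
        print(n, round(n * math.log2(n)), 2**n)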

  18. Another way to look at it Suppose an algorithm can solve a problem of size S in time T and you give it twice as much time. If the running time is f(n) = n^2, so that T = S^2, then in time 2T you can solve a problem of size 2^(1/2)·S. If the running time is f(n) = 2^n, so that T = 2^S, then in time 2T you can solve a problem of size S + 1. In general, doubling the time available to a polynomial algorithm results in a MULTIPLICATIVE increase in the size of the problem that can be solved, but doubling the time available to an exponential algorithm results in an ADDITIVE increase to the size of the problem that can be solved.
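
Put into numbers (an added sketch using the slide's two running times): doubling the time multiplies the solvable size by √2 when T = S^2, but only adds 1 when T = 2^S.

    import math

    # Doubling the time budget:
    #   quadratic,   T = S^2:  new size solves x^2 = 2 * S^2  =>  x = sqrt(2) * S
    #   exponential, T = 2^S:  new size solves 2^x = 2 * 2^S  =>  x = S + 1
    for S in [10, 100, 1_000]:
        print(S, round(math.sqrt(2) * S, 1), S + 1)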

  19. FINIS
