
Algorithm Efficiency





  1. Algorithm Efficiency CS 110: Data Structures and Algorithms First Semester, 2010-2011

  2. Learning Objectives • To analyze the efficiency of algorithms in terms of counting operations and Big-Oh notation

  3. Algorithm Efficiency • An algorithm should not use any more of the computer’s resources than necessary • Two options • Benchmarking • Analysis

  4. Algorithm Efficiency • Benchmarking • Measure execution time of an algorithm using System.currentTimeMillis() or other methods • Pitfalls • Limited set of test inputs – may not be indicative of running time on other inputs • Comparing two algorithms requires the same machine setup • Algorithm must first be implemented
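
As a concrete illustration, here is a minimal benchmarking sketch in Java using System.currentTimeMillis(). The workload (summing an array) is a hypothetical stand-in for whatever algorithm is under test:

    // Benchmarking sketch: measure wall-clock time of one run on one input.
    public class Benchmark {
        public static void main(String[] args) {
            int[] data = new int[10_000_000];
            for (int i = 0; i < data.length; i++) data[i] = i % 100;

            long start = System.currentTimeMillis();
            long sum = 0;
            for (int x : data) sum += x;   // the "algorithm" being timed
            long elapsed = System.currentTimeMillis() - start;

            System.out.println("sum = " + sum + ", elapsed = " + elapsed + " ms");
            // Pitfall in action: this measures a single input size on a
            // single machine, so it says little about other inputs or setups.
        }
    }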

  5. Algorithm Analysis • Define primitive operations • Assigning a value to a variable • Calling a method • Performing an arithmetic operation • Comparing two values • Indexing into an array • Following an object reference • Returning from a method

  6. Algorithm Analysis • Count the total number of operations:

    Algorithm arrayMax(A, n):
        Input: An array A storing n integers.
        Output: The maximum element in A.
        max ← A[0]
        for i ← 1 to (n − 1) do
            if max < A[i] then
                max ← A[i]
        return max
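
A direct Java rendering of the pseudocode above (a sketch; the comments map each line back to the algorithm):

    static int arrayMax(int[] a) {
        int max = a[0];                           // max ← A[0]
        for (int i = 1; i <= a.length - 1; i++) { // for i ← 1 to (n − 1) do
            if (max < a[i]) {                     //   if max < A[i] then
                max = a[i];                       //     max ← A[i]
            }
        }
        return max;                               // return max
    }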

  7. Algorithm Analysis • Some notes • The for statement implies assignments, comparisons, subtractions and increments • The statement max ← A[i] will sometimes not be carried out

  8. What to count?

    max ← A[0]                   assignment, array access
    for i ← 1 to (n − 1) do      assignment; per iteration: comparison, subtraction, increment (2 operations)
        if max < A[i] then       comparison, array access
            max ← A[i]           assignment, array access
    return max                   return

  9. Algorithm Analysis • Counting operations:

    Statement                    Worst case           Best case
    max ← A[0]                   2                    2
    for i ← 1 to (n − 1) do      1 + 2n + 2(n−1)      1 + 2n + 2(n−1)
        if max < A[i] then       2(n−1)               2(n−1)
            max ← A[i]           2(n−1)               0
    return max                   1                    1
    Total                        8n − 2               6n

  • The statement max ← A[i] executes anywhere from 0 to n − 1 times, so the running time ranges from 6n (best case) to 8n − 2 (worst case); the average case lies in between and depends on the input distribution

  10. Algorithm Analysis • Exercise: What if there are nested loops? (See the Java sketch below.)

    for i ← 1 to n do
        for j ← 1 to n do
            print i, j

  • Note: the inner loop executes n times for each of the n outer iterations, so the print statement runs n² times
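
A Java sketch of the exercise (the method name printPairs is just illustrative):

    // The inner loop body runs n times per outer iteration,
    // so the print statement executes n × n = n² times in total.
    static void printPairs(int n) {
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= n; j++) {
                System.out.println(i + ", " + j);
            }
        }
    }

Counting primitive operations therefore gives a total proportional to n², so the algorithm runs in O(n²) time.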

  11. Worst Case vs. Average Case • Worst Case: maximum number of operations executed • Average Case: average number of operations on a typical run • Need to define the range of inputs • Need to determine the probability distribution over that range of inputs

  12. Algorithm Analysis • Considerations for Counting Primitive Steps • Implicit Operations in the Execution Path • Worst-case vs average-case vs best-case • Arbitrariness of Measurement • Compare algorithms by looking at growth rates • Linear vs polynomial vs exponential • Goal • To simplify analysis by getting rid of irrelevant information

  13. Asymptotic Behavior • How important is the exact number of primitive operations? • Example: arrayMax • It’s enough to say: “The running time of arrayMax grows proportionally to n”

  14. Big-Oh Notation • Big-Oh notation provides a way to compare two functions • “f(n) is O(g(n))” means:f(n) is less than or equal to g(n) up to a constant factor for large values of n

  15. Formal Definition of Big-Oh • Let f(n) and g(n) be functions mapping the input size n to the running times of algorithms F and G, respectively • f(n) is O(g(n)), or “f(n) is Big-Oh of g(n)”, if there exist: • a real constant c > 0 • an integer constant n₀ ≥ 1 such that • f(n) ≤ c · g(n) for all n ≥ n₀
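
Restated compactly in LaTeX (the same definition as above):

    f(n) \in O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \ge 1 \ \text{such that} \ f(n) \le c \cdot g(n) \ \text{for all } n \ge n_0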

  16. Example • f(n) = 2n + 5 and g(n) = n • Consider the condition 2n + 5 ≤ n. Will this condition ever hold? No! • Suppose we multiply n by a constant: 2n + 5 ≤ 3n. The condition holds for values of n ≥ 5 • Thus, we can select c = 3 and n₀ = 5

  17. Example • (Graph slide illustrating the previous example; the image is not included in the transcript)

  18. Big-Oh Notation • 6n − 3 is O(n) • c = 6 • 6n − 3 ≤ 6n for all n ≥ 1 • 3n² + 2n is O(n²) • c = 4 • Is there an n₀ such that 3n² + 2n ≤ 4n² for all n ≥ n₀? Yes: 3n² + 2n ≤ 4n² ⇔ 2n ≤ n² ⇔ n ≥ 2, so n₀ = 2 works • 4n³ + 8n² + 2 is O(n⁴)

  19. Big-Oh Notation • nᵃ ∈ O(nᵇ) whenever a ≤ b • Suppose there is a c such that: • nᵃ ≤ c·nᵇ ⇔ 1 ≤ c·nᵇ⁻ᵃ • Since a ≤ b, b − a ≥ 0 • If b = a then 1 ≤ c·nᵇ⁻ᵃ ⇒ 1 ≤ c·n⁰ ⇒ 1 ≤ c • So 1 ≤ c·nᵈ (where d ≥ 0) always holds for all n ≥ 1 and c ≥ 1

  20. Big-Oh Notation • Usually, to prove that f(n) ∈ O(g(n)), we give a c that can work and then solve for n₀ • However, this usually involves factoring polynomials, which is hard for degrees greater than 2 • An alternative is to prove the bound by giving an n₀ and then solving for c

  21. Big-Oh Notation • 3n² + 26n + 34 ∈ O(n²) • Let n₀ = 1, so n ≥ 1. Thus: • n² ≥ n², n² ≥ n, n² ≥ 1 • 3n² ≥ 3n², 26n² ≥ 26n, 34n² ≥ 34 • Adding all terms: • 3n² + 26n² + 34n² ≥ 3n² + 26n + 34 • 63n² ≥ 3n² + 26n + 34 • So c = 63
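
The bound can also be spot-checked mechanically. A small Java sketch (the test range 1..1000 is an arbitrary choice; the proof above covers all n ≥ 1):

    // Verify 63n² ≥ 3n² + 26n + 34 for n = 1..1000.
    public class BoundCheck {
        public static void main(String[] args) {
            boolean holds = true;
            for (long n = 1; n <= 1000; n++) {
                if (63 * n * n < 3 * n * n + 26 * n + 34) {
                    holds = false;
                    System.out.println("bound fails at n = " + n);
                }
            }
            if (holds) System.out.println("63n² bounds 3n² + 26n + 34 on [1, 1000]");
        }
    }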

  22. Big-Oh Notation • n² ∉ O(n) • Proof by contradiction • Suppose ∃ c > 0 and n₀ ≥ 1 such that n² ≤ cn for all n ≥ n₀ • n² ≤ cn ⇒ n ≤ c (since n > 0, we can safely divide by n) • This would mean n ≤ c for all n ≥ n₀, but n grows without bound, so the inequality must eventually fail, which contradicts the definition

  23. Big-Oh Notation • 3ⁿ ∉ O(2ⁿ) • Suppose 3ⁿ ∈ O(2ⁿ); then 3ⁿ ≤ c·2ⁿ for some c > 0 and all n ≥ n₀ ≥ 1 • Note that log is a monotonically increasing function, so • 0 < a ≤ b if and only if log(a) ≤ log(b) • log 3ⁿ ≤ log(c·2ⁿ) ⇒ n log 3 ≤ log c + n log 2 • n (log 3 − log 2) ≤ log c

  24. Big-Oh Notation • 3ⁿ ∉ O(2ⁿ) (continued) • n (log 3 − log 2) ≤ log c • To make the inequality easier to read, let • b = log 3 − log 2 (note that b > 0) • a = log c • Thus nb ≤ a, i.e. n ≤ a/b, which is a contradiction since n cannot have an upper bound

  25. Big-Oh Properties/Identities • g(O(f(n))) = { g(h(n)) for all h(n) in O(f(n)) } • c⋅O(f(n)) = O(f(n)) • O(O(f(n))) = O(f(n)) • O(f(n))⋅O(g(n)) = O(f(n)⋅g(n)) • f(n)⋅O(g(n)) = O(f(n)⋅g(n)) • O(f(n)) + O(g(n)) = O(|f(n)| + |g(n)|)

  26. Big-Oh Notation • Big-Oh allows us to ignore constant factors and lower-order (less dominant) terms • Rule: Drop lower-order terms and constant factors • 5n + 2 is O(n) • 4n³ log n + 6n³ + 1 is O(n³ log n) • Allows us to classify functions into categories

  27. Function Categories • The constant function: f(n) = 1 • The linear function: f(n) = n • The quadratic function: f(n) = n² • The cubic function: f(n) = n³ • The exponential function: f(n) = 2ⁿ • The logarithm function: f(n) = log n • The n log n function: f(n) = n log n

  28. Comparing Function Categories • Linear (n) is better than quadratic (n²), which is better than exponential (2ⁿ) • Are there any function categories better than linear? Yes! • Constant (1) • Logarithmic (log n) • “Better” means the resulting values are smaller (slower growth rates)

  29. Functions by Increasing Growth Rate • The constant function: f(n) = 1 • The logarithm function: f(n) = log n • The linear function: f(n) = n • The n log n function: f(n) = n log n • The quadratic function: f(n) = n² • The cubic function: f(n) = n³ • The exponential function: f(n) = 2ⁿ
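
To see the ordering concretely, a small Java sketch that tabulates a few of the categories at sample input sizes (the sizes 4, 16, 64 are arbitrary):

    public class GrowthTable {
        public static void main(String[] args) {
            System.out.printf("%6s %8s %10s %10s %14s%n",
                    "n", "log n", "n log n", "n^2", "2^n");
            for (int n : new int[] {4, 16, 64}) {
                double log2n = Math.log(n) / Math.log(2);  // log base 2
                System.out.printf("%6d %8.1f %10.1f %10d %14.3e%n",
                        n, log2n, n * log2n, (long) n * n, Math.pow(2, n));
            }
        }
    }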

  30. Big-Oh in this Course • For this course, you will be expected to assess the running time of an algorithm and classify it under one of the categories, using Big-Oh notation • You should be able to recognize, for instance, that, most of the time (not always): • Algorithms with single loops are O(n) • Algorithms with double-nested loops are O(n²)

  31. Big-Oh as an Upper Bound • The statement "f(n) is O(g(n))" indicates that g(n) is an upper bound for f(n) • This means it is also correct to make statements like: • 3n + 5 is O(n²) • 3n + 5 is O(2ⁿ) • 3n + 5 is O(5n + log n − 2) • But the statement "3n + 5 is O(n)" is the “tightest” statement one can make

  32. Other Ways of Analysis • Aside from Big-Oh, there are other ways of analyzing algorithm running time • Big-Omega Ω(g(n)) • Specifies an asymptotic lower bound rather than an upper bound • Big-Theta Θ(g(n)) • Specifies both asymptotic upper and lower bounds, i.e. f(n) ∈ Θ(g(n)) if f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))
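
In the same style as the formal Big-Oh definition on slide 15, these can be restated in LaTeX as:

    f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 \ge 1 \ \text{such that} \ f(n) \ge c \cdot g(n) \ \text{for all } n \ge n_0
    f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \ \text{and} \ f(n) \in \Omega(g(n))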

  33. Big-Omega and Big-Theta • 6n − 3 is Ω(n) • c = 3 • 6n − 3 ≥ 3n for all n ≥ 1 • 3n² + 2n is Θ(n²) • c = 3 • 3n² + 2n ≥ 3n² for all n ≥ 1 • Therefore, it is Ω(n²); combined with 3n² + 2n ∈ O(n²) from slide 18, it is Θ(n²) • 4n³ + 8n² + 2 is not Θ(n⁴): it is O(n⁴) but not Ω(n⁴)
