
CSE 326 Asymptotic Analysis



  1. CSE 326 Asymptotic Analysis David Kaplan, Dept of Computer Science & Engineering, Autumn 2001

  2. Housekeeping • Homework 1 status • Join cse326@cs • Poll: • Slide format(s)? • Office hours today? CSE 326 Autumn 2001 2

  3. Analyzing Algorithms Analyze algorithms to gauge: • Time complexity (running time) • Space complexity (memory use) Input size is indicated by a number n • sometimes have multiple inputs, e.g. m and n Running time is a function of n, e.g. n, n^2, n log n, 18 + 3n(log n^2) + 5n^3

  4. RAM Model RAM (random access machine) • Ideal single-processor machine (serialized operations) • “Standard” instruction set (load, add, store, etc.) • All operations take 1 time unit (including, for our purposes, each C++ statement)

  5. Order Notation (aka Big-O) f(n) = O(g(n)) if there are constants c > 0 and n0 such that f(n) ≤ c·g(n) for all n ≥ n0. Similarly, f(n) = Ω(g(n)) if f(n) ≥ c·g(n) for all n ≥ n0, and f(n) = Θ(g(n)) if both hold.

  6. Simplifying with Big-O By definition, Big-O allows us to: Eliminate low order terms • 4n + 5 → 4n • 0.5 n log n - 2n + 7 → 0.5 n log n Eliminate constant coefficients • 4n → n • 0.5 n log n → n log n • log n^2 = 2 log n → log n • log3 n = (log3 2) log n → log n But when might constants or low-order terms matter?

  7. Big-O Examples n^2 + 100n = O(n^2) follows from … (n^2 + 100n) ≤ 2n^2 for n ≥ 100 n^2 + 100n = Ω(n^2) follows from … (n^2 + 100n) ≥ 1·n^2 for n ≥ 0 n^2 + 100n = Θ(n^2) by definition n log n = O(n^2) n log n = Θ(n log n) n log n = Ω(n)
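The first bound above can be spot-checked numerically. This is a sketch, not a proof: it only samples values, and with the witness constant c = 2 the inequality first holds at n = 100 (exactly where 100n ≤ n^2):

```python
# Numeric spot-check (not a proof) of n^2 + 100n <= 2*n^2.
# With c = 2, the inequality holds precisely when 100n <= n^2, i.e. n >= 100.
def holds(n):
    return n**2 + 100*n <= 2 * n**2

assert not holds(99)                          # just below the threshold it fails
assert all(holds(n) for n in range(100, 1000))  # from n = 100 on, it holds
```

A different witness constant would shift the threshold; Big-O only requires that *some* pair (c, n0) works.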

  8. Big-O Usage Order notation is not symmetric: • we can say 2n^2 + 4n = O(n^2) • … but never O(n^2) = 2n^2 + 4n Right-hand side is a crudification of the left Order expressions on left can produce unusual-looking, but true, statements: O(n^2) = O(n^3), Ω(n^3) = Ω(n^2)

  9. Big-O Comparisons Function A vs. Function #2 • n^3 + 2n^2 vs. 100n^2 + 1000 • n^0.1 vs. log n • n + 100n^0.1 vs. 2n + 10 log n • 5n^5 vs. n! • n^-15 · 2^n/100 vs. 1000n^15 • 8^(2 log n) vs. 3n^7 + 7n

  10. Race 1 n^3 + 2n^2 vs. 100n^2 + 1000

  11. Race 2 n^0.1 vs. log n In this one, crossover point is very late! So, which algorithm is really better???
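The crossover can be located numerically. A small sketch (the function names a and b are just local labels, not from the slides) showing that log n is still ahead at n = 2^32 but has lost by n = 2^64:

```python
import math

# Illustration only: n**0.1 eventually overtakes log2(n), but the crossover
# is astronomically late -- asymptotics say nothing about small inputs.
def a(n): return n ** 0.1          # the "asymptotically worse" racer here
def b(n): return math.log2(n)      # the "asymptotically better" racer

assert a(2**32) < b(2**32)   # at n = 2^32, log n is still larger
assert a(2**64) > b(2**64)   # by n = 2^64, n^0.1 has finally won
```

So for any input that fits in practice, the O(n^0.1) algorithm may well be the better choice despite its worse growth rate.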

  12. Race 3 n + 100n^0.1 vs. 2n + 10 log n Is the “better” algorithm asymptotically better???

  13. Race 4 5n^5 vs. n!

  14. Race 5 n^-15 · 2^n/100 vs. 1000n^15

  15. Race 6 8^(2 log n) vs. 3n^7 + 7n

  16. Big-O Winners (i.e. losers) • n^3 + 2n^2 vs. 100n^2 + 1000: winner O(n^2) • n^0.1 vs. log n: winner O(log n) • n + 100n^0.1 vs. 2n + 10 log n: winner O(n), a TIE • 5n^5 vs. n!: winner O(n^5) • n^-15 · 2^n/100 vs. 1000n^15: winner O(n^15) • 8^(2 log n) vs. 3n^7 + 7n: winner O(n^6) why???
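The last winner earns its "why???". Assuming log means log base 2 (the usual convention in this setting), 8^(2 log n) = (2^3)^(2 log n) = 2^(6 log n) = n^6, so the scary-looking exponential is really a polynomial. A quick numeric check:

```python
import math

# 8**(2*log2(n)) = 2**(6*log2(n)) = n**6, so this "exponential-looking"
# function is actually polynomial. (Assumes log base 2.)
def f(n): return 8 ** (2 * math.log2(n))

for n in [2, 16, 1024]:
    assert math.isclose(f(n), n ** 6)
```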

  17. Big-O Common Names constant: O(1) logarithmic: O(log n) linear: O(n) log-linear: O(n log n) superlinear: O(n^(1+c)) (c is a constant > 0) quadratic: O(n^2) polynomial: O(n^k) (k is a constant) exponential: O(c^n) (c is a constant > 1)

  18. Kinds of Analysis Running time may depend on actual input, not just length of input Distinguish • Worst case • Your worst enemy is choosing input • Average case • Assume probability distribution of inputs • Amortized • Average time over many runs • Best case (not too useful)

  19. Analyzing Code • C++ operations: constant time • Consecutive stmts: sum of times • Conditionals: larger branch plus test • Loops: sum of iterations • Function calls: cost of function body • Recursive functions: solve recursive equation

  20. Nested Loops for i = 1 to n do for j = 1 to n do sum = sum + 1
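Counting iterations directly confirms the bound for the loops above: the body runs exactly n·n times, so the slide's rule "loops: sum of iterations" gives O(n^2). A throwaway helper (nested_loop_count is not from the slides) mirroring the pseudocode:

```python
# Mirror of the pseudocode: the inner body executes exactly n*n times,
# so the running time is O(n^2).
def nested_loop_count(n):
    count = 0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            count += 1          # stands in for "sum = sum + 1"
    return count

assert nested_loop_count(10) == 100   # n*n iterations
```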

  21. Dependent Nested Loops for i = 1 to n do for j = i to n do sum = sum + 1
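Here the inner loop runs n − i + 1 times for each i, so the body executes n + (n−1) + … + 1 = n(n+1)/2 times: still O(n^2), just with a smaller constant. A throwaway helper (dependent_loop_count is not from the slides) mirroring the pseudocode:

```python
# Mirror of the dependent-loop pseudocode: the body executes
# n + (n-1) + ... + 1 = n*(n+1)/2 times, which is still O(n^2).
def dependent_loop_count(n):
    count = 0
    for i in range(1, n + 1):
        for j in range(i, n + 1):   # inner loop starts at i, not 1
            count += 1
    return count

assert dependent_loop_count(10) == 10 * 11 // 2   # 55
```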

  22. Recursion • A recursive procedure can often be analyzed by solving a recursive equation • Basic form: T(n) = base case: some constant recursive case: T(subproblems) + T(combine) • Result depends upon • how many subproblems • how much smaller are subproblems • how costly to combine solutions (coefficients)

  23. Sum of Queue SumQueue(Q) if (Q.length == 0) return 0 else return Q.dequeue() + SumQueue(Q) One subproblem Linear reduction in size (decrease by 1) Combining: constant (cost of 1 add) T(0) ≤ b T(n) ≤ c + T(n – 1) for n > 0
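SumQueue transcribes almost verbatim into Python; using collections.deque as the queue is a choice of convenience here, not part of the original pseudocode. Each call does constant work plus one recursive call on a queue one element shorter, which is exactly the shape T(n) ≤ c + T(n − 1):

```python
from collections import deque

# Direct transcription of SumQueue. Each call: one constant-time dequeue,
# one add, one recursive call on a queue of length n-1.
def sum_queue(q):
    if len(q) == 0:      # Q.length == 0
        return 0
    return q.popleft() + sum_queue(q)   # Q.dequeue() + SumQueue(Q)

assert sum_queue(deque([1, 2, 3, 4])) == 10
```

Note the call empties the queue as a side effect, just as the pseudocode's dequeue does.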

  24. Sum of Queue Solution Equation: T(0) ≤ b T(n) ≤ c + T(n – 1) for n > 0 Solution: T(n) ≤ c + c + T(n – 2) ≤ c + c + c + T(n – 3) ≤ kc + T(n – k) for all k ≤ nc + T(0) for k = n ≤ cn + b = O(n)

  25. Binary Search BinarySearch(A, x) Search A, a sorted array, for item x Example: 7 12 30 35 75 83 87 90 97 99 One subproblem, half as large Equation: T(1) ≤ b T(n) ≤ T(n/2) + c for n > 1

  26. Binary Search: Solution Equation: T(1) ≤ b T(n) ≤ T(n/2) + c for n > 1 Solution: T(n) ≤ T(n/2) + c ≤ T(n/4) + c + c ≤ T(n/8) + c + c + c ≤ T(n/2^k) + kc ≤ T(1) + c log n where k = log n ≤ b + c log n = O(log n)
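The O(log n) solution can be seen concretely in an iterative sketch of BinarySearch that also counts probes; the probe count never exceeds floor(log2 n) + 1, matching the unrolled recurrence. (The probe counter and the sample array are illustrations, not part of the slide's pseudocode.)

```python
import math

# Iterative binary search over a sorted list, instrumented with a probe
# counter. Each probe halves the remaining range, so probes <= floor(log2 n) + 1.
def binary_search(a, x):
    lo, hi, probes = 0, len(a) - 1, 0
    while lo <= hi:
        probes += 1
        mid = (lo + hi) // 2
        if a[mid] == x:
            return mid, probes
        if a[mid] < x:
            lo = mid + 1      # discard the lower half
        else:
            hi = mid - 1      # discard the upper half
    return -1, probes         # not found

a = [7, 12, 30, 35, 75, 83, 87, 90, 97, 99]   # the slide's example array
idx, probes = binary_search(a, 87)
assert a[idx] == 87
assert probes <= math.floor(math.log2(len(a))) + 1
```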
