
Chapter Two Algorithm Analysis



1. Chapter Two: Algorithm Analysis
  • Empirical vs. theoretical
  • Space vs. time
  • Worst case vs. average case
  • Upper, lower, or tight bound
  • Determining the runtime of programs
  • What about recursive programs?

2. What’s the runtime?
  int n;
  cin >> n;
  for (int i=0; i<n; i++)
    for (int j=0; j<n; j++)
      for (int k=0; k<n; k++)
        cout << "Hello world!\n";
  2n^3 + n^2 + n + 2 steps? O(n^3) runtime.
  What if the last line is replaced by:
    string *s = new string("Hello world!\n");
  O(n^3) time and space.
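To connect the theoretical count with an empirical one, here is a minimal sketch (not from the slides) that counts how many times the innermost statement runs; the counter variable steps stands in for the cout call:

  #include <iostream>
  using namespace std;

  int main() {
      int n;
      cin >> n;
      long long steps = 0;                 // counts executions of the innermost statement
      for (int i = 0; i < n; i++)
          for (int j = 0; j < n; j++)
              for (int k = 0; k < n; k++)
                  steps++;                 // stands in for the cout statement
      cout << "inner statements executed: " << steps << "\n";   // n^3, e.g. 1000000 for n = 100
      return 0;
  }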

3. Resource Analysis
  • Runtime: we’d like to count the steps – but that would be machine dependent
    → count steps equivalent to machine language instructions
    → ignore constant factors: use O() notation
  • Space: we may also be interested in space usage
    → count the bytes used

4. Asymptotic notation
  • g(n) is said to be O(f(n)) if there exist constants c and n0 such that g(n) < c·f(n) for all n > n0
  • g(n) is said to be Ω(f(n)) if there exist positive constants c and n0 such that 0 <= c·f(n) < g(n) for all n > n0
  • g(n) is said to be Θ(f(n)) if g(n) = O(f(n)) and g(n) = Ω(f(n))
  • O: like <= for functions (asymptotically speaking)
  • Ω: like >=
  • Θ: like =
    (asymptotically speaking: for all n > n0, ignoring constant factors and lower-order terms)
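A worked instance of these definitions (added for illustration, not from the slides): g(n) = 3n^2 + 10n is O(n^2), since with c = 4 and n0 = 10 we have 3n^2 + 10n < 3n^2 + n·n = 4n^2 for all n > 10. It is also Ω(n^2) (take c = 1, n0 = 1), and therefore Θ(n^2).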

5. Asymptotic notation: examples
  • Asymptotic runtime, in terms of O, Ω, Θ?
  • Suppose the runtime for a function is
    • n^2 + 2n log n + 40
    • 0.0000001 n^2 + 1000000 n^1.999
    • n^3 + n^2 log n
    • n^2.0001 + n^2 log n
    • 2^n + 100 n^2
    • 1.00001^n + 100 n^97

6. Asymptotic comparisons
  • 0.0000001 n^2 = O(1000000 n^1.999)?
    No – a polynomial with a higher power dominates one with a lower power
  • n^1.000001 = O(n log n)?
    No – any polynomial (even n^0.000001) dominates any polylog (like log n)
  • 1.0001^n = O(n^943)?
    No – any exponential dominates any polynomial
  • lg n = Θ(ln n)?
    Yes – different bases are just a constant factor difference
  (Compare the limit of the quotient of the functions.)
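As a worked instance of the limit test for the last comparison (added for illustration): lg n / ln n = (ln n / ln 2) / ln n = 1 / ln 2 ≈ 1.44, a positive constant, so lg n = Θ(ln n).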

7. What’s the runtime?
  int n;
  cin >> n;
  for (int i=0; i<n; i++)
    for (int j=0; j<n; j++)
      for (int k=0; k<n; k++)
        cout << "Hello world!\n";
  for (int i=0; i<n; i++)
    for (int j=0; j<n; j++)
      for (int k=0; k<n; k++)
        cout << "Hello world!\n";
  Statements or blocks in sequence: add.
  Θ(n^3) + Θ(n^3) = Θ(n^3)

8. What’s the runtime?
  int n;
  cin >> n;
  for (int i=0; i<n; i++)
    for (int j=n; j>1; j/=2)
      cout << "Hello world!\n";
  Loops: add up the cost of each iteration (or multiply the per-iteration cost by the number of iterations if they all take the same time).
  Here the outer loop runs n times and the inner loop runs about lg n times per outer iteration → Θ(n log n).

9. What’s the runtime?
  int n;
  cin >> n;
  for (int i=0; i<n; i++)
    for (int j=0; j<i; j++)
      cout << "Hello world!\n";
  Loops: add up the cost of each iteration:
  0 + 1 + 2 + … + (n-1) = n(n-1)/2 = O(n^2)

10. What’s the runtime?
  template <class Item>
  void insert(Item a[], int l, int r) {
    int i;
    // first pass: move the smallest element to a[l] as a sentinel
    for (i=r; i>l; i--) compexch(a[i-1], a[i]);
    // insertion sort the rest: shift larger elements right, then drop v into place
    for (i=l+2; i<=r; i++) {
      int j=i; Item v=a[i];
      while (v < a[j-1]) { a[j] = a[j-1]; j--; }
      a[j] = v;
    }
  }
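The routine above is Sedgewick-style insertion sort; compexch is a compare-and-exchange helper that the slide does not show. A self-contained sketch under that assumption (the main and sample data are added for illustration):

  #include <iostream>
  using namespace std;

  // Assumed helper: swap a and b if they are out of order (not shown on the slide).
  template <class Item>
  void compexch(Item &a, Item &b) { if (b < a) { Item t = a; a = b; b = t; } }

  template <class Item>
  void insert(Item a[], int l, int r) {
      int i;
      for (i = r; i > l; i--) compexch(a[i-1], a[i]);    // move the minimum to a[l] as a sentinel
      for (i = l+2; i <= r; i++) {                       // insert a[i] into the sorted prefix
          int j = i; Item v = a[i];
          while (v < a[j-1]) { a[j] = a[j-1]; j--; }
          a[j] = v;
      }
  }

  int main() {
      int a[] = {5, 2, 9, 1, 7};
      insert(a, 0, 4);
      for (int x : a) cout << x << " ";                  // prints: 1 2 5 7 9
      cout << "\n";
      return 0;
  }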

11. What’s the runtime?
  void myst(int n) {
    if (n < 100)
      for (int i=0; i<n; i++)
        for (int j=0; j<n; j++)
          for (int k=0; k<n; k++)
            cout << "Hello world!\n";
    else
      for (int i=0; i<n; i++)
        for (int j=0; j<n; j++)
          cout << "Hello world!\n";
  }

12. Estimate the runtime
  • Suppose an algorithm has runtime Θ(n^3), and solving a problem of size 1000 takes 10 seconds. How long to solve a problem of size 10000?
    Runtime ≈ 10^-8 n^3; if n = 10000, runtime ≈ 10000 s ≈ 2.7 hours
  • Suppose an algorithm has runtime Θ(n log n), and solving a problem of size 1000 takes 10 seconds. How long to solve a problem of size 10000?
    Runtime ≈ 10^-3 n lg n; if n = 10000, runtime ≈ 133 s

13. Worst vs. average case
  • You might be interested in worst, best, or average case analysis of an algorithm.
  • You can have upper, lower, or tight bounds on each of those functions.
  • E.g., suppose that for each n, some problem instances of size n have runtime n and some have runtime n^2.
    • Worst case: Θ(n^2); also Ω(n), Ω(log n), O(n^2), O(n^3)
    • Best case: Θ(n); also Ω(n), Ω(log n), O(n^2)
    • Average case: Ω(n), Ω(log n), O(n^2), O(n^3) – need to know the distribution of inputs for a tight bound

14. The Taxpayer Problem
  • Tax time is coming up. The IRS needs to process tax forms. How do we access and update each taxpayer’s info? What ADT fits?
  • ADT Dictionary: find(x), insert(x), delete(x)
  • Implementation?
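One way to state the Dictionary ADT as a C++ interface; a minimal sketch whose class and member names are illustrative (delete is a C++ keyword, so the deletion operation is called remove here):

  // A minimal Dictionary ADT sketch; Record and its key type are assumptions.
  struct Record {
      int key;
      // ... other taxpayer fields ...
  };

  class Dictionary {
  public:
      virtual int  find(int key) = 0;            // return index of the record with this key, or -1
      virtual void insert(const Record &x) = 0;  // add a record
      virtual void remove(int index) = 0;        // delete the record at this index
      virtual ~Dictionary() {}
  };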

15. Array Implementation
  • Insert(x): records[numRecs++] = x;
    Runtime: O(1)
  • Find(k): for (int i=0; i<numRecs; i++) if (records[i].key == k) return i;
    Runtime: O(n)
  • Delete(i): records[i] = records[--numRecs];
    Runtime: O(1)
  • Time for n operations? O(n^2)
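Putting those fragments together into something runnable; a sketch in which the Record type, the fixed capacity, and the main are assumptions added for illustration:

  #include <iostream>
  using namespace std;

  struct Record { int key; };                      // assumed minimal record type

  const int MAX_RECS = 1000;                       // assumed fixed capacity
  Record records[MAX_RECS];
  int numRecs = 0;

  void insert(const Record &x) { records[numRecs++] = x; }          // O(1)

  int find(int k) {                                                  // O(n)
      for (int i = 0; i < numRecs; i++)
          if (records[i].key == k) return i;
      return -1;
  }

  void remove(int i) { records[i] = records[--numRecs]; }            // O(1): overwrite with the last record

  int main() {
      insert(Record{7}); insert(Record{3}); insert(Record{9});
      cout << find(3) << "\n";    // prints 1
      remove(find(3));            // deletes key 3 by overwriting it with key 9
      cout << find(3) << "\n";    // prints -1
      return 0;
  }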

16. Sorted Array Implementation
  • Find(x):
    int bot=0, top=numRecs-1, mid;
    while (bot <= top) {
      mid = (bot + top)/2;
      if (data[mid]==x) return mid;
      if (data[mid]<x) bot=mid+1;
      else top=mid-1;
    }
    return -1;
  • Runtime?

17. Analysis of Binary Search
  • How many steps to search among n items?
  • Number of items eliminated at each step?
  • Definition of lg(x)?
  • Runtime? O(log n)
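Filling in the reasoning behind that answer: each step eliminates about half of the remaining items, so after k steps at most n/2^k items remain; the search stops once n/2^k < 1, i.e. k > lg n (where lg x means log base 2 of x), giving O(log n) steps.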

18. Sorted Array, cont.
  • Insert(x)? O(n)
  • Delete(x)? O(n)
  • Time for n insert, delete, and find ops? O(n^2)

19. Which implementation is better?
                  find(x)     insert(x)   delete(x)
    Array         O(n)        O(1)        O(1)
    Sorted Array  O(log n)    O(n)        O(n)
  Worst case for n operations?
    Array: O(n^2)    Sorted Array: O(n^2)
  What if some operations are more frequent than others?

20. Molecule viewer example
  • Java demos: molecule viewer
  • Example 1
  • Example 2
  • Example 3

21. Molecule Viewer Source Snippet
  /*
   * I use a bubble sort since from one iteration to the next, the sort
   * order is pretty stable, so I just use what I had last time as a
   * "guess" of the sorted order. With luck, this reduces O(N log N)
   * to O(N)
   */
  for (int i = nvert - 1; --i >= 0;) {
      boolean flipped = false;
      for (int j = 0; j <= i; j++) {
          int a = zs[j];
          int b = zs[j + 1];
          if (v[a + 2] > v[b + 2]) {
              zs[j + 1] = a;
              zs[j] = b;
              flipped = true;
          }
      }
      if (!flipped) break;
  }

22. Merge sort runtime?
  void mergesort(first, last) {
    if (last - first >= 1) {
      mid = (last - first)/2 + first;
      mergesort(first, mid);
      mergesort(mid+1, last);
      merge(first, mid, last);
    }
  }
  T(n) = 2T(n/2) + cn,  T(1) = b
  This is called a recurrence relation.
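The pseudocode above omits types and the merge step; the following is one self-contained C++ sketch (the vector-based merge with a temporary buffer is an assumption, not necessarily the course's version):

  #include <iostream>
  #include <vector>
  using namespace std;

  // Merge the sorted halves a[first..mid] and a[mid+1..last]: O(n) work.
  void merge(vector<int> &a, int first, int mid, int last) {
      vector<int> tmp;
      int i = first, j = mid + 1;
      while (i <= mid && j <= last)
          tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
      while (i <= mid)  tmp.push_back(a[i++]);
      while (j <= last) tmp.push_back(a[j++]);
      for (int k = 0; k < (int)tmp.size(); k++) a[first + k] = tmp[k];
  }

  void mergesort(vector<int> &a, int first, int last) {
      if (last - first >= 1) {
          int mid = (last - first) / 2 + first;
          mergesort(a, first, mid);      // T(n/2)
          mergesort(a, mid + 1, last);   // T(n/2)
          merge(a, first, mid, last);    // + cn
      }
  }

  int main() {
      vector<int> a = {5, 2, 9, 1, 7, 3};
      mergesort(a, 0, (int)a.size() - 1);
      for (int x : a) cout << x << " ";   // prints: 1 2 3 5 7 9
      cout << "\n";
      return 0;
  }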

23. Recurrence relations
  In Discrete Math you’ll learn how to solve these. In this class we’ll say “look it up.” But you will be responsible for knowing how to write down a recurrence relation for the runtime of a program.
  Divide-and-conquer algorithms like merge sort that divide the problem size by 2 and use O(n) time to conquer, i.e.
    T(n) = 2T(n/2) + cn,
  have runtime O(n log n).
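A quick check of that claim by unrolling the recurrence (assuming n is a power of 2): T(n) = 2T(n/2) + cn = 4T(n/4) + 2cn = 8T(n/8) + 3cn = … = 2^k T(n/2^k) + k·cn. When n/2^k = 1 we have k = lg n, so T(n) = b·n + c·n·lg n = O(n log n).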

24. Hanoi runtime?
  void hanoi(int n, int from, int to, int spare) {
    if (n > 0) {
      hanoi(n-1, from, spare, to);
      cout << from << " - " << to << endl;
      hanoi(n-1, spare, to, from);
    }
  }
  T(n) = 2T(n-1) + c,  T(0) = b
  Look it up: T(n) = O(2^n)
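A small usage note (added for illustration, with pegs labeled 1 to 3): calling hanoi(3, 1, 3, 2) prints 7 = 2^3 - 1 move lines, the first being "1 - 3"; each extra disk doubles the number of moves plus one, which is exactly the recurrence above and matches its solution on the next slide.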

25. Hanoi recurrence solution
  T(n)   = 2T(n-1) + c
  T(n-1) = 2T(n-2) + c
  T(n-2) = 2T(n-3) + c
  ...
  T(n) = 2T(n-1) + c
       = 2[2T(n-2) + c] + c
       = 2^2 T(n-2) + 2c + c
       = 2^3 T(n-3) + 2^2 c + 2^1 c + 2^0 c
       …
       = 2^k T(n-k) + 2^(k-1) c + 2^(k-2) c + … + 2^1 c + 2^0 c
       = 2^k T(n-k) + c(2^k - 1)
  Done when n-k = 0, since we know T(0):
  T(n) = 2^n b + c(2^n - 1) = Θ(2^n)

26. Binary Search recurrence?
  Recurrence relation? T(n) = T(n/2) + c,  T(1) = b
  Look it up: Θ(log n)
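Unrolling it the same way as the Hanoi recurrence (assuming n is a power of 2): T(n) = T(n/2) + c = T(n/4) + 2c = … = T(n/2^k) + k·c; when n/2^k = 1, k = lg n, so T(n) = b + c·lg n = Θ(log n).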
