# CS 3343: Analysis of Algorithms


1. CS 3343: Analysis of Algorithms Review for final

2. Final Exam • Closed book exam • Coverage: the whole semester • Cheat sheet: you are allowed one letter-size sheet, both sides • Monday, May 5, 9:45 – 12:15pm • Basic calculator (no graphing) allowed • No cell phones!

3. Final Exam: Study Tips
   • Study each lecture
   • Study the homework and homework solutions
   • Study the midterm exams
   • Re-make your previous cheat sheets

4. Topics covered (1)
   In reverse chronological order:
   • Graph algorithms
     • Representations
     • MST (Prim's, Kruskal's)
     • Shortest path (Dijkstra's)
     • Running time analysis with different implementations
   • Greedy algorithms
     • Unit-profit restaurant location problem
     • Fractional knapsack problem
     • Prim's and Kruskal's are also examples of greedy algorithms
     • How to show that certain greedy choices are optimal
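As a refresher on the greedy pattern, here is a quick Python sketch of the fractional knapsack problem from the list above. The function name and the example items are mine, not from the slides; the greedy choice (highest value-per-weight ratio first) is the standard one.

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: take items in decreasing
    value/weight ratio, splitting the last item if needed.
    items is a list of (value, weight) pairs."""
    # Greedy choice: best value-per-weight ratio first.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)       # whole item, or remaining fraction
        total += value * (take / weight)
        capacity -= take
    return total

# Example: capacity 50 with items of ratios 6, 5, 4 yields 60 + 100 + 80 = 240.
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))
```

The exchange argument for why this greedy choice is optimal (any solution carrying a lower-ratio item can be improved by swapping in a higher-ratio fraction) is exactly the kind of proof the "greedy choices are optimal" bullet refers to.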

5. Topics covered (2)
   • Dynamic programming
     • LCS
     • Restaurant location problem
     • Shortest path problem on a grid
     • Other problems
     • How to define a recurrence solution and use dynamic programming to solve it
   • Binary heap and priority queue
     • Heapify, buildHeap, insert, extractMax, changeKey
     • Running time
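The LCS bullet above is a good test of defining a recurrence and filling a table bottom-up. A minimal Python sketch of the standard O(mn) dynamic program (the function name is mine):

```python
def lcs_length(x, y):
    """Length of the longest common subsequence of x and y.
    dp[i][j] = LCS length of the prefixes x[:i] and y[:j]."""
    m, n = len(x), len(y)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                # Matching last characters extend the LCS of both prefixes.
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                # Otherwise drop one character from either string.
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
```

The recurrence behind the table is the one to remember for the exam: match the last characters and recurse on both prefixes, or skip a character from one string.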

6. Topics covered (3)
   • Order statistics
     • Rand-Select
     • Worst-case linear-time selection
     • Running time analysis
   • Sorting algorithms
     • Insertion sort
     • Merge sort
     • Quick sort
     • Heap sort
     • Linear-time sorting: counting sort, radix sort
     • Stability of sorting algorithms
     • Worst-case and expected running time analysis
     • Memory requirement of sorting algorithms
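Counting sort and stability, both listed above, can be shown in one short sketch. This is the textbook Θ(n + k) algorithm; sorting (key, payload) pairs makes the stability visible (names are mine):

```python
def counting_sort(pairs, key_range):
    """Stable counting sort of (key, payload) pairs with integer keys
    in range(key_range); runs in Θ(n + k) time and Θ(n + k) space."""
    count = [0] * key_range
    for k, _ in pairs:
        count[k] += 1
    # Prefix sums: count[k] becomes the number of elements with key <= k.
    for k in range(1, key_range):
        count[k] += count[k - 1]
    out = [None] * len(pairs)
    # Walk the input right-to-left so equal keys keep their input order.
    for item in reversed(pairs):
        count[item[0]] -= 1
        out[count[item[0]]] = item
    return out
```

Stability is what makes radix sort work: sorting digit by digit only gives the right answer because ties on the current digit preserve the order established by earlier passes.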

7. Topics covered (4)
   • Analysis
     • Order of growth
     • Asymptotic notation, basic definitions
     • Limit method
       • L'Hôpital's rule
       • Stirling's formula
     • Best case, worst case, average case
   • Analyzing non-recursive algorithms
     • Arithmetic series
     • Geometric series
   • Analyzing recursive algorithms
     • Defining the recurrence
     • Solving the recurrence
       • Recursion tree (iteration) method
       • Substitution method
       • Master theorem

8. Review for the final
   • In chronological order
   • Only the more important concepts
   • Very likely to appear on your final
   • Not meant to be exhaustive

9. Asymptotic notations
   • O: Big-Oh
   • Ω: Big-Omega
   • Θ: Theta
   • o: Small-oh
   • ω: Small-omega
   • Intuitively: O is like ≤, o is like <, Ω is like ≥, ω is like >, Θ is like =

10. Big-Oh
   • Math:
     • O(g(n)) = { f(n) : ∃ positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) ∀ n > n0 }
     • Or: lim n→∞ g(n)/f(n) > 0 (if the limit exists)
   • Engineering:
     • g(n) grows at least as fast as f(n)
     • g(n) is an asymptotic upper bound of f(n)
     • Intuitively it is like f(n) ≤ g(n)

11. Big-Oh
   • Claim: f(n) = 3n² + 10n + 5 ∈ O(n²)
   • Proof: 3n² + 10n + 5 ≤ 3n² + 10n² + 5n² = 18n² when n > 1
   • Therefore:
     • Let c = 18 and n0 = 1
     • We have f(n) ≤ c·n², ∀ n > n0
     • By definition, f(n) ∈ O(n²)
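The witness constants in a Big-Oh proof can always be sanity-checked numerically. A one-liner spot-check of c = 18, n0 = 1 from the proof above:

```python
# Spot-check the constants from the proof: f(n) = 3n^2 + 10n + 5
# should satisfy f(n) <= 18 * n^2 for every n > 1.
def f(n):
    return 3 * n**2 + 10 * n + 5

assert all(f(n) <= 18 * n**2 for n in range(2, 10_000))
```

A check like this cannot replace the proof (it only tests finitely many n), but it will catch a wrong constant immediately.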

12. Big-Omega
   • Math:
     • Ω(g(n)) = { f(n) : ∃ positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) ∀ n > n0 }
     • Or: lim n→∞ f(n)/g(n) > 0 (if the limit exists)
   • Engineering:
     • f(n) grows at least as fast as g(n)
     • g(n) is an asymptotic lower bound of f(n)
     • Intuitively it is like g(n) ≤ f(n)

13. Big-Omega
   • Claim: f(n) = n²/10 ∈ Ω(n)
   • Proof: f(n) = n²/10, g(n) = n
     • g(n) = n ≤ n²/10 = f(n) when n > 10
     • Therefore, c = 1 and n0 = 10

14. Theta
   • Math:
     • Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) ∀ n > n0 }
     • Or: lim n→∞ f(n)/g(n) = c, with 0 < c < ∞
     • Or: f(n) = O(g(n)) and f(n) = Ω(g(n))
   • Engineering:
     • f(n) grows in the same order as g(n)
     • g(n) is an asymptotically tight bound of f(n)
     • Intuitively it is like f(n) = g(n)
   • Θ(1) means constant time.

15. Theta
   • Claim: f(n) = 2n² + n = Θ(n²)
   • Proof: We just need to find three constants c1, c2, and n0 such that
     • c1·n² ≤ 2n² + n ≤ c2·n² for all n > n0
     • A simple solution is c1 = 2, c2 = 3, and n0 = 1

16. Using limits to compare orders of growth
   • If lim n→∞ f(n)/g(n) = 0: f(n) ∈ o(g(n)), and f(n) ∈ O(g(n))
   • If lim n→∞ f(n)/g(n) = c with 0 < c < ∞: f(n) ∈ Θ(g(n))
   • If lim n→∞ f(n)/g(n) = ∞: f(n) ∈ ω(g(n)), and f(n) ∈ Ω(g(n))

17. Compare 2ⁿ and 3ⁿ
   • lim n→∞ 2ⁿ/3ⁿ = lim n→∞ (2/3)ⁿ = 0
   • Therefore, 2ⁿ ∈ o(3ⁿ), and 3ⁿ ∈ ω(2ⁿ)

18. L’ Hopital’s rule lim f(n) / g(n) = lim f(n)’ / g(n)’ If both lim f(n) and lim g(n) goes to ∞ n→∞ n→∞

19. Compare n^0.5 and log n
   • lim n→∞ n^0.5 / log n = ?
   • (n^0.5)′ = 0.5·n^(−0.5)
   • (log n)′ = 1/n
   • lim n→∞ (0.5·n^(−0.5)) / (1/n) = lim n→∞ 0.5·n^0.5 = ∞
   • Therefore, log n ∈ o(n^0.5)
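The limit-method conclusion above is easy to see numerically: the ratio log(n)/√n keeps shrinking as n grows.

```python
import math

# The ratio log(n) / sqrt(n) should shrink toward 0 as n grows,
# matching the L'Hopital argument: log n is in o(n^0.5).
ratios = [math.log(n) / math.sqrt(n) for n in (10**2, 10**4, 10**6, 10**8)]
assert all(a > b for a, b in zip(ratios, ratios[1:]))  # strictly decreasing
assert ratios[-1] < 0.01                               # already tiny at n = 10^8
```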

20. Stirling's formula
   • n! ≈ √(2πn) · (n/e)ⁿ  (exact up to a 1 + Θ(1/n) factor)
   • Hence log(n!) = Θ(n log n)
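A quick numerical check of Stirling's approximation, showing the relative error is already small for modest n:

```python
import math

# Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)**n.
# The relative error is roughly 1/(12n), so it shrinks as n grows.
def stirling(n):
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 10, 20):
    exact = math.factorial(n)
    assert abs(stirling(n) - exact) / exact < 0.02  # under 2% even at n = 5
```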

21. Compare 2ⁿ and n!
   • By Stirling's formula, n!/2ⁿ ≈ √(2πn) · (n/(2e))ⁿ → ∞
   • Therefore, 2ⁿ = o(n!)
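The intuition is that each new factor of n! multiplies by n, while 2ⁿ only multiplies by 2, so the ratio 2ⁿ/n! collapses quickly:

```python
import math

# 2**n / n! tends to 0: going from n to n+1 multiplies the ratio by
# 2/(n+1) < 1 once n >= 2, so 2**n is in o(n!).
ratios = [2**n / math.factorial(n) for n in (5, 10, 20, 40)]
assert all(a > b for a, b in zip(ratios, ratios[1:]))  # strictly decreasing
assert ratios[-1] < 1e-30                              # essentially zero by n = 40
```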

23. General plan for analyzing the time efficiency of a non-recursive algorithm
   • Decide on a parameter (input size)
   • Identify the most-executed line (basic operation)
   • worst-case = average-case?
   • T(n) = Σᵢ tᵢ
   • T(n) = Θ(f(n))

24. Analysis of insertion sort
                                               cost   times
   InsertionSort(A, n) {
     for j = 2 to n {                          c1     n
       key = A[j]                              c2     n−1
       i = j − 1                               c3     n−1
       while (i > 0) and (A[i] > key) {        c4     S
         A[i+1] = A[i]                         c5     S−(n−1)
         i = i − 1                             c6     S−(n−1)
       }
       A[i+1] = key                            c7     n−1
     }
   }
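A runnable Python version of the pseudocode above; it also returns the number of element shifts, which is the quantity (S minus the loop-exit tests) that drives the best/worst/average-case analysis on the next slides. The function name and the returned counter are my additions.

```python
def insertion_sort(a):
    """In-place insertion sort following the pseudocode above;
    returns the number of element shifts performed by the inner loop."""
    shifts = 0
    for j in range(1, len(a)):        # "for j = 2 to n", 0-indexed here
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:  # shift larger elements right
            a[i + 1] = a[i]
            i -= 1
            shifts += 1
        a[i + 1] = key                # drop key into its slot
    return shifts
```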

25. Best case
   • Array already sorted
   • Inner loop stops immediately: A[i] ≤ key, or i = 0
   • T(n) = Θ(n)

26. Worst case
   • Array originally in reverse order
   • Inner loop shifts the entire sorted prefix before stopping
   • T(n) = Θ(n²)

27. Average case
   • Array in random order
   • Inner loop stops when A[i] ≤ key, on average halfway through the sorted prefix
   • T(n) = Θ(n²)
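The three cases can be demonstrated empirically by counting inner-loop shifts (self-contained sketch; the helper name is mine):

```python
import random

def shifts(a):
    """Count element shifts performed by insertion sort on a copy of a."""
    a = list(a)
    count = 0
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i, count = i - 1, count + 1
        a[i + 1] = key
    return count

n = 200
asc = list(range(n))
assert shifts(asc) == 0                       # best case: already sorted
assert shifts(asc[::-1]) == n * (n - 1) // 2  # worst case: reverse order
random.seed(1)
avg = shifts(random.sample(range(n), n))      # random order: roughly n^2/4
assert 0 < avg < n * (n - 1) // 2
```

The reverse-order count is exactly the arithmetic series 1 + 2 + … + (n−1) = n(n−1)/2, which is where the Θ(n²) worst case comes from.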

28. Find the order of growth for sums • How to find out the actual order of growth? • Remember some formulas • Learn how to guess and prove

29. Arithmetic series
   • An arithmetic series is a sequence of numbers such that the difference of any two successive members of the sequence is a constant.
     e.g.: 1, 2, 3, 4, 5 or 10, 12, 14, 16, 18, 20
   • In general:
     • Recursive definition: aₙ = aₙ₋₁ + d
     • Closed form (explicit formula): aₙ = a₁ + (n−1)·d

30. Sum of an arithmetic series
   • If a1, a2, …, an is an arithmetic series, then
     a1 + a2 + … + an = n·(a1 + an)/2
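A quick check of the closed form on one of the example series above (10, 12, …, 20):

```python
# Sum of an arithmetic series: n * (a1 + an) / 2.
a1, d, n = 10, 2, 6                 # the series 10, 12, 14, 16, 18, 20
terms = [a1 + d * i for i in range(n)]
assert sum(terms) == n * (terms[0] + terms[-1]) // 2   # 6 * (10 + 20) / 2 = 90
```

This formula is why a loop whose body runs 1, 2, …, n times (like insertion sort's inner loop in the worst case) totals Θ(n²) work.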

31. Geometric series
   • A geometric series is a sequence of numbers such that the ratio between any two successive members of the sequence is a constant.
     e.g.: 1, 2, 4, 8, 16, 32 or 10, 20, 40, 80, 160 or 1, ½, ¼, 1/8, 1/16
   • In general:
     • Recursive definition: aₙ = r·aₙ₋₁
     • Closed form (explicit formula): aₙ = a₁·rⁿ⁻¹

32. Sum of a geometric series
   • For r ≠ 1: Σᵢ₌₀ⁿ rⁱ = (rⁿ⁺¹ − 1)/(r − 1)
   • If r < 1: the sum is bounded by 1/(1−r), i.e. Θ(1)
   • If r > 1: the sum is Θ(rⁿ)
   • If r = 1: the sum is n + 1, i.e. Θ(n)
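The three regimes are easy to verify directly (helper name is mine):

```python
# Finite geometric series: sum_{i=0}^{n} r**i = (r**(n+1) - 1) / (r - 1) for r != 1.
def geom_sum(r, n):
    return sum(r**i for i in range(n + 1))

assert geom_sum(2, 10) == 2**11 - 1              # r > 1: dominated by the last term, Θ(r^n)
assert abs(geom_sum(0.5, 50) - 2.0) < 1e-9       # r < 1: bounded by 1/(1-r) = 2, Θ(1)
assert geom_sum(1, 10) == 11                     # r = 1: just n + 1 terms, Θ(n)
```

These regimes are exactly what the recursion-tree method sums level by level: the dominant case (root, leaves, or all levels equal) mirrors the three cases of the master theorem.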

33. Important formulas

34. Sum manipulation rules

35. Recursive algorithms (Divide and Conquer)
   • General idea:
     • Divide a large problem into smaller ones
       • By a constant ratio
       • By a constant or some variable
     • Solve each smaller one recursively or explicitly
     • Combine the solutions of the smaller ones to form a solution for the original problem

36. How to analyze the time-efficiency of a recursive algorithm? • Express the running time on input of size n as a function of the running time on smaller problems

37. Analyzing merge sort
   MERGE-SORT A[1 . . n]
   1. If n = 1, done.  [Θ(1)]
   2. Recursively sort A[1 . . ⌈n/2⌉] and A[⌈n/2⌉+1 . . n].  [2T(n/2)]
   3. "Merge" the 2 sorted lists.  [f(n)]
   • Sloppiness: step 2 should be T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter asymptotically.

38. Analyzing merge sort
   • Divide: trivial
   • Conquer: recursively sort 2 subarrays
   • Combine: merge two sorted subarrays
   T(n) = 2T(n/2) + f(n) + Θ(1)
     (2: number of subproblems; n/2: subproblem size; f(n): work dividing and combining)
   • What is the time for the base case? Constant.
   • What is f(n)? The Θ(n) cost of merging.
   • What is the growth order of T(n)?
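A compact Python version of the divide/conquer/combine outline above, with the two recursive calls and the Θ(n) merge visible (names are mine):

```python
def merge_sort(a):
    """Merge sort: T(n) = 2*T(n/2) + Θ(n) for the merge step."""
    if len(a) <= 1:                  # base case: Θ(1)
        return a
    mid = len(a) // 2                # divide: trivial
    left = merge_sort(a[:mid])       # conquer: two recursive sorts
    right = merge_sort(a[mid:])
    # Combine: merge two sorted halves in Θ(n) time.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```

Using `<=` in the merge keeps equal elements in their original order, which is what makes merge sort stable.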

39. Solving recurrences
   • The running time of many algorithms can be expressed in one of two recursive forms:
     T(n) = aT(n − b) + f(n)  or  T(n) = aT(n/b) + f(n)
   • Challenge: how to solve the recurrence to get a closed form, e.g. T(n) = Θ(n²) or T(n) = Θ(n lg n), or at least some bound such as T(n) = O(n²)?

40. Solving recurrences
   • Recursion tree (iteration) method
     • Good for guessing an answer
   • Substitution method
     • Generic and rigorous, but may be hard to apply
   • Master method
     • Easy to learn, useful in limited cases only
     • Some tricks may help in other cases

41. The master method
   The master method applies to recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f is asymptotically positive.
   • Divide the problem into a subproblems, each of size n/b
   • Conquer the subproblems by solving them recursively
   • Combine the subproblem solutions
   • Divide + combine takes f(n) time

42. Master theorem
   T(n) = aT(n/b) + f(n)
   Key: compare f(n) with n^(log_b a)
   • CASE 1: f(n) = O(n^(log_b a − ε))  ⇒  T(n) = Θ(n^(log_b a))
   • CASE 2: f(n) = Θ(n^(log_b a))  ⇒  T(n) = Θ(n^(log_b a) · log n)
   • CASE 3: f(n) = Ω(n^(log_b a + ε)) and a·f(n/b) ≤ c·f(n) for some c < 1  ⇒  T(n) = Θ(f(n))
   • e.g., merge sort: T(n) = 2T(n/2) + Θ(n)
     • a = 2, b = 2 ⇒ n^(log_b a) = n
     • ⇒ CASE 2 ⇒ T(n) = Θ(n log n)
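For the common special case where the driving function is a polynomial f(n) = Θ(n^d), the case test reduces to comparing d with log_b(a), and the case-3 regularity condition holds automatically. A small classifier sketch (the function is mine, for practice only; it does not handle non-polynomial f like log factors):

```python
import math

def master_case(a, b, d):
    """Classify T(n) = a*T(n/b) + Θ(n**d) under the master theorem,
    assuming a >= 1, b > 1, and a polynomial driving function n**d.
    Returns the case number 1, 2, or 3."""
    crit = math.log(a, b)          # the critical exponent log_b(a)
    if math.isclose(d, crit):
        return 2                   # T(n) = Θ(n**crit * log n)
    if d < crit:
        return 1                   # recursion dominates: T(n) = Θ(n**crit)
    return 3                       # driving function dominates: T(n) = Θ(n**d)

print(master_case(2, 2, 1))        # merge sort: case 2
```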

43. Case 1
   Compare f(n) with n^(log_b a):
   • f(n) = O(n^(log_b a − ε)) for some constant ε > 0
   • i.e., f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor)
   Solution: T(n) = Θ(n^(log_b a)), i.e., the aT(n/b) term dominates
   e.g.:
   • T(n) = 2T(n/2) + 1
   • T(n) = 4T(n/2) + n
   • T(n) = 2T(n/2) + log n
   • T(n) = 8T(n/2) + n²

44. Case 3
   Compare f(n) with n^(log_b a):
   • f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0
   • i.e., f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor)
   Solution: T(n) = Θ(f(n)), i.e., f(n) dominates
   e.g.:
   • T(n) = T(n/2) + n
   • T(n) = 2T(n/2) + n²
   • T(n) = 4T(n/2) + n³
   • T(n) = 8T(n/2) + n⁴

45. Case 2
   Compare f(n) with n^(log_b a):
   • f(n) = Θ(n^(log_b a))
   • i.e., f(n) and n^(log_b a) grow at a similar rate
   Solution: T(n) = Θ(n^(log_b a) · log n)
   e.g.:
   • T(n) = T(n/2) + 1
   • T(n) = 2T(n/2) + n
   • T(n) = 4T(n/2) + n²
   • T(n) = 8T(n/2) + n³

46–50. Recursion tree
   Solve T(n) = 2T(n/2) + dn, where d > 0 is constant.
                      dn
                   /      \
               dn/2        dn/2
              /    \      /    \
           dn/4  dn/4  dn/4  dn/4
            …                  …
          Θ(1)  …  …  …  …  Θ(1)
   • Each level of the tree sums to dn, and there are log₂n + 1 levels.
   • Total: T(n) = dn·log₂n + Θ(n) = Θ(n log n).
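The recursion-tree total can be confirmed by evaluating the recurrence directly. With d = 1 and T(1) = 1, the tree has k + 1 levels at n = 2^k, each summing to n, so T(n) = n·(k + 1) exactly:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Evaluate the recurrence T(n) = 2*T(n//2) + n with T(1) = 1
    (i.e., taking d = 1)."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# At n = 2**k the recursion tree has k + 1 levels, each summing to n,
# so T(n) = n * (k + 1): exactly Θ(n log n).
for k in range(15):
    n = 2**k
    assert T(n) == n * (k + 1)
```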