COSC 3101A - Design and Analysis of Algorithms 2

1. COSC 3101A - Design and Analysis of Algorithms 2 • Asymptotic Notations Continued • Proof of Correctness: Loop Invariant • Designing Algorithms: Divide and Conquer

2. Typical Running Time Functions • 1 (constant running time): instructions are executed once or a few times • log N (logarithmic): a big problem is solved by cutting the original problem down by a constant fraction at each step (see the halving sketch below) • N (linear): a small amount of processing is done on each input element • N log N: a problem is solved by dividing it into smaller problems, solving them independently, and combining the solutions
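
A small illustration of the logarithmic case (a Python sketch added here, not part of the slides): repeatedly cutting the problem size in half means the loop body runs only about log₂ N times.

    def halving_steps(n):
        """Count how many halvings it takes to shrink a problem of size n down to 1."""
        steps = 0
        while n > 1:
            n //= 2        # cut the remaining problem size by a constant fraction
            steps += 1
        return steps

    print(halving_steps(1_000_000))   # 19, roughly log2 of one million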

3. Typical Running Time Functions • N² (quadratic): typical for algorithms that process all pairs of data items (double nested loops; see the pair-counting sketch below) • N³ (cubic): processing of triples of data items (triple nested loops) • N^k (polynomial) • 2^N (exponential): few exponential algorithms are appropriate for practical use
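
A matching illustration of the quadratic case (again an added Python sketch, not from the slides): a double nested loop over all pairs executes its body N(N − 1)/2 times, so the work grows as N².

    def count_pairs(items):
        """Visit every unordered pair of elements with a double nested loop."""
        n = len(items)
        pairs = 0
        for i in range(n):
            for j in range(i + 1, n):
                pairs += 1     # a real algorithm would process the pair (items[i], items[j])
        return pairs

    print(count_pairs(list(range(10))))   # 45 = 10 * 9 / 2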

4. Logarithms • In algorithm analysis we often use the notation "log n" without specifying the base • Binary logarithm: lg n = log_2 n • Natural logarithm: ln n = log_e n
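
The base can be left unspecified because of the change-of-base identity: switching bases only changes the function by a constant factor, which asymptotic notation ignores. In LaTeX form:

    \log_a n \;=\; \frac{\log_b n}{\log_b a} \;=\; \underbrace{\frac{1}{\log_b a}}_{\text{a constant}} \cdot \log_b n
    \quad\Longrightarrow\quad \log_a n = \Theta(\log_b n) \text{ for any constant bases } a, b > 1.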

5. Review: Asymptotic Notations (1) [asymptotic-notation definitions shown as formulas on the slide]

6. Review: Asymptotic Notations (2) [two if-and-only-if characterizations shown as formulas on the slide]

7. Review: Asymptotic Notations (3) • A way to describe the behavior of functions in the limit • How we indicate the running times of algorithms • Describe the running time of an algorithm as n grows to ∞ • O notation: asymptotic "less than": f(n) "≤" g(n) • Ω notation: asymptotic "greater than": f(n) "≥" g(n) • Θ notation: asymptotic "equality": f(n) "=" g(n)
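
For reference, the standard formal definitions behind these three notations (restated here; presumably what the formula-only review slides above displayed):

    O(g(n)) = \{\, f(n) : \exists\, c > 0,\ n_0 > 0 \text{ such that } 0 \le f(n) \le c\,g(n) \text{ for all } n \ge n_0 \,\}
    \Omega(g(n)) = \{\, f(n) : \exists\, c > 0,\ n_0 > 0 \text{ such that } 0 \le c\,g(n) \le f(n) \text{ for all } n \ge n_0 \,\}
    \Theta(g(n)) = \{\, f(n) : \exists\, c_1, c_2 > 0,\ n_0 > 0 \text{ such that } 0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0 \,\}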

8. Big-O Examples (1)
• 2n² = O(n³): 2n² ≤ cn³ ⇒ 2 ≤ cn ⇒ c = 1 and n0 = 2
• n² = O(n²): n² ≤ cn² ⇒ c ≥ 1 ⇒ c = 1 and n0 = 1
• 1000n² + 1000n = O(n²): 1000n² + 1000n ≤ cn² ⇒ 1000n + 1000 ≤ cn ⇒ c = 2000 and n0 = 1
• n = O(n²): n ≤ cn² ⇒ cn ≥ 1 ⇒ c = 1 and n0 = 1

9. Big-O Examples (2)
• E.g.: prove that n² ≠ O(n)
• Assume ∃ c and n0 such that ∀ n ≥ n0: n² ≤ cn
• Choose n = max(n0, c) + 1, so that n > c
• Then n² = n · n > n · c = cn ⇒ n² > cn, a contradiction

10. More on Asymptotic Notations
• There is no unique set of values for n0 and c in proving the asymptotic bounds
• Prove that 100n + 5 = O(n²):
• 100n + 5 ≤ 100n + n = 101n ≤ 101n² for all n ≥ 5 ⇒ n0 = 5 and c = 101 is a solution
• 100n + 5 ≤ 100n + 5n = 105n ≤ 105n² for all n ≥ 1 ⇒ n0 = 1 and c = 105 is also a solution
• We must find SOME constants c and n0 that satisfy the asymptotic-notation relation

11. Big-Ω Examples
• 5n² = Ω(n): we need c and n0 such that 0 ≤ cn ≤ 5n²; cn ≤ 5n² ⇒ c = 1 and n0 = 1
• 100n + 5 ≠ Ω(n²): assume ∃ c and n0 such that 0 ≤ cn² ≤ 100n + 5; since 100n + 5 ≤ 100n + 5n = 105n (for n ≥ 1), we would need cn² ≤ 105n ⇒ n(cn − 105) ≤ 0; since n is positive, cn − 105 ≤ 0 ⇒ n ≤ 105/c, a contradiction: n cannot be bounded above by a constant
• n = Ω(2n), n³ = Ω(n²), n = Ω(log n)

12. Θ Examples
• n²/2 − n/2 = Θ(n²):
• ½n² − ½n ≤ ½n² for all n ≥ 0 ⇒ c2 = ½
• ½n² − ½n ≥ ½n² − ½n · ½n = ¼n² for all n ≥ 2 ⇒ c1 = ¼
• n ≠ Θ(n²): c1n² ≤ n ≤ c2n² only holds for n ≤ 1/c1
• 6n³ ≠ Θ(n²): c1n² ≤ 6n³ ≤ c2n² only holds for n ≤ c2/6
• n ≠ Θ(log n): c1 log n ≤ n ≤ c2 log n ⇒ c2 ≥ n/log n for all n ≥ n0, which is impossible

13. Comparisons of Functions
• Theorem: f(n) = Θ(g(n)) ⇔ f(n) = O(g(n)) and f(n) = Ω(g(n))
• Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n)); the same holds for O and Ω
• Reflexivity: f(n) = Θ(f(n)); the same holds for O and Ω
• Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
• Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
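
A tiny worked instance of two of these properties (added for illustration, not on the slide):

    n = O(n^2) \ \text{ and } \ n^2 = O(n^3) \ \Longrightarrow\ n = O(n^3) \qquad \text{(transitivity)}
    n = O(n^2) \iff n^2 = \Omega(n) \qquad \text{(transpose symmetry)}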

14. More Examples (1)
• For each of the following pairs of functions, either f(n) is O(g(n)), f(n) is Ω(g(n)), or f(n) = Θ(g(n)). Determine which relationship is correct.
• f(n) = log n²; g(n) = log n + 5 ⇒ f(n) = Θ(g(n))
• f(n) = n; g(n) = log n² ⇒ f(n) = Ω(g(n))
• f(n) = log log n; g(n) = log n ⇒ f(n) = O(g(n))
• f(n) = n; g(n) = log² n ⇒ f(n) = Ω(g(n))
• f(n) = n log n + n; g(n) = log n ⇒ f(n) = Ω(g(n))
• f(n) = 10; g(n) = log 10 ⇒ f(n) = Θ(g(n))
• f(n) = 2^n; g(n) = 10n² ⇒ f(n) = Ω(g(n))
• f(n) = 2^n; g(n) = 3^n ⇒ f(n) = O(g(n))

15. More Examples (2)
• Θ notation
• n²/2 − n/2 = Θ(n²)
• (6n³ + 1) lg n / (n + 1) = Θ(n² lg n)
• n vs. n²: n ≠ Θ(n²)
• Ω notation
• n vs. 2n: n = Ω(2n)
• n³ vs. n²: n³ = Ω(n²)
• n vs. log n: n = Ω(log n)
• n vs. n²: n ≠ Ω(n²)
• O notation
• 2n² vs. n³: 2n² = O(n³)
• n² vs. n²: n² = O(n²)
• n³ vs. n lg n: n³ ≠ O(n lg n)

16. Asymptotic Notations in Equations • On the right-hand side: Θ(n²) stands for some anonymous function in the set Θ(n²); 2n² + 3n + 1 = 2n² + Θ(n) means there exists a function f(n) ∈ Θ(n) such that 2n² + 3n + 1 = 2n² + f(n) • On the left-hand side: 2n² + Θ(n) = Θ(n²) means that no matter how the anonymous function is chosen on the left-hand side, there is a way to choose the anonymous function on the right-hand side to make the equation valid

17. Limits and Comparisons of Functions • Using limits for comparing orders of growth • Example: compare ½ n(n − 1) and n² (worked out below)
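
Working this limit out:

    \lim_{n \to \infty} \frac{\tfrac{1}{2}\,n(n-1)}{n^2}
    \;=\; \lim_{n \to \infty} \left( \frac{1}{2} - \frac{1}{2n} \right)
    \;=\; \frac{1}{2}

Because the limit is a finite positive constant, the two functions have the same order of growth: ½ n(n − 1) = Θ(n²).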

18. Limits and Comparisons of Functions • L'Hôpital's rule: if f(n) and g(n) both tend to ∞, then lim f(n)/g(n) = lim f′(n)/g′(n) • Example: compare [the slide's pair of functions, given as formulas]
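
A worked illustration with an assumed pair of functions, ln n and √n (chosen here for illustration, not necessarily the slide's own example), where the ratio gives the indeterminate form ∞/∞:

    \lim_{n \to \infty} \frac{\ln n}{\sqrt{n}}
    \;=\; \lim_{n \to \infty} \frac{1/n}{1/(2\sqrt{n})}
    \;=\; \lim_{n \to \infty} \frac{2}{\sqrt{n}} \;=\; 0

so ln n grows strictly more slowly than √n; in particular ln n = O(√n), but √n ≠ O(ln n).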

19. Loop Invariant • A loop invariant is a relation among program variables that • is true when control enters a loop, • remains true each time the program executes the body of the loop, • and is still true when control exits the loop. • Understanding loop invariants can help us • analyze algorithms, • check for errors, • and derive algorithms from specifications.

20. Proving Loop Invariants • Proving loop invariants works like mathematical induction • Initialization (base case): the invariant is true prior to the first iteration of the loop • Maintenance (inductive step): if it is true before an iteration of the loop, it remains true before the next iteration • Termination: when the loop terminates, the invariant, usually along with the reason that the loop terminated, gives us a useful property that helps show that the algorithm is correct; the induction stops when the loop terminates

21. Loop Invariant for Insertion Sort (1)
Alg.: INSERTION-SORT(A)
    for j ← 2 to n
        do key ← A[j]
            // Insert A[j] into the sorted sequence A[1 . . j − 1]
            i ← j − 1
            while i > 0 and A[i] > key
                do A[i + 1] ← A[i]
                    i ← i − 1
            A[i + 1] ← key
Invariant: at the start of the for loop, the elements in A[1 . . j − 1] are in sorted order
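
A direct Python transcription of this pseudocode (an illustrative sketch; the 1-indexed A[1 . . n] of the pseudocode becomes a 0-indexed Python list):

    def insertion_sort(a):
        """Sort the list a in place, mirroring INSERTION-SORT(A) above."""
        for j in range(1, len(a)):           # pseudocode: for j <- 2 to n
            key = a[j]
            # Invariant: a[0 .. j-1] holds the original first j elements, in sorted order.
            i = j - 1
            while i >= 0 and a[i] > key:     # pseudocode: while i > 0 and A[i] > key
                a[i + 1] = a[i]              # shift the larger element one slot to the right
                i -= 1
            a[i + 1] = key                   # drop key into its proper position
        return a

    print(insertion_sort([5, 2, 4, 7, 1, 3, 2, 6]))   # [1, 2, 2, 3, 4, 5, 6, 7]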

22. Loop Invariant for Insertion Sort (2) • Initialization: just before the first iteration, j = 2, so the subarray A[1 . . j − 1] = A[1] (the element originally in A[1]), which is sorted

23. Loop Invariant for Insertion Sort (3) • Maintenance: the inner while loop moves A[j − 1], A[j − 2], A[j − 3], and so on, one position to the right until the proper position for key (which holds the value that started out in A[j]) is found • At that point, the value of key is placed into this position

24. Loop Invariant for Insertion Sort (4) • Termination: the outer for loop ends when j > n (i.e., j = n + 1) ⇒ j − 1 = n • Substituting n for j − 1 in the loop invariant: the subarray A[1 . . n] consists of the elements originally in A[1 . . n], but in sorted order • The entire array is sorted!

25. Steps in Designing Algorithms (1) • Understand the problem • Specify the range of inputs the algorithm should handle • Learn about the model of the implementation technology • RAM (random-access machine), sequential execution • Choose between an exact and an approximate solution • Some problems cannot be solved exactly: nonlinear equations, evaluating definite integrals • Exact solutions may be unacceptably slow • Choose the appropriate data structures

26. Steps in Designing Algorithms (2) • Choose an algorithm design technique • A general approach to solving problems algorithmically that is applicable to a variety of computational problems • Provides guidance for developing solutions to new problems • Specify the algorithm • Pseudocode: a mixture of natural and programming language • Prove the algorithm's correctness • The algorithm yields the correct result for any legitimate input, in a finite amount of time • Mathematical induction, loop invariants

27. Steps in Designing Algorithms (3) • Analyze the algorithm • Predict the amount of resources required: • memory: how much space is needed? • computational time: how fast does the algorithm run? • FACT: running time grows with the size of the input • Input size (number of elements in the input): size of an array, polynomial degree, # of elements in a matrix, # of bits in the binary representation of the input, vertices and edges in a graph • Def: running time = the number of primitive operations (steps) executed before termination • Arithmetic operations (+, −, *), data movement, control, decision making (if, while), comparison

28. Steps in Designing Algorithms (4) • Code the algorithm • Verify the ranges of the input • Efficient/inefficient implementation • It is hard to prove the correctness of a program (correctness is typically checked by testing)

29. Classification of Algorithms • By problem type: sorting, searching, string processing, graph problems, combinatorial problems, geometric problems, numerical problems • By design paradigm: divide-and-conquer, incremental, dynamic programming, greedy algorithms, randomized/probabilistic

30. Divide-and-Conquer • Divide the problem into a number of sub-problems • Similar sub-problems of smaller size • Conquer the sub-problems • Solve the sub-problems recursively • If the sub-problem sizes are small enough ⇒ solve the sub-problems in a straightforward manner • Combine the solutions to the sub-problems • Obtain the solution for the original problem (a generic sketch follows below)
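
A toy, concrete instance of the three steps (a Python sketch added for illustration; finding a maximum this way is not an example from the slides, merge sort below is the real one):

    def dc_max(a, lo, hi):
        """Largest element of a[lo..hi], found by divide-and-conquer."""
        if lo == hi:                      # sub-problem small enough: solve directly
            return a[lo]
        mid = (lo + hi) // 2              # Divide into two smaller sub-problems
        left = dc_max(a, lo, mid)         # Conquer each half recursively
        right = dc_max(a, mid + 1, hi)
        return max(left, right)           # Combine the two partial solutions

    print(dc_max([5, 2, 4, 7, 1, 3, 2, 6], 0, 7))   # 7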

31. Merge Sort Approach • To sort an array A[p . . r]: • Divide: divide the n-element sequence to be sorted into two subsequences of n/2 elements each • Conquer: sort the subsequences recursively using merge sort • When the size of the sequences is 1 there is nothing more to do • Combine: merge the two sorted subsequences

32. Merge Sort
Alg.: MERGE-SORT(A, p, r)
    if p < r                        // check for base case
        then q ← ⌊(p + r)/2⌋        // Divide
            MERGE-SORT(A, p, q)     // Conquer
            MERGE-SORT(A, q + 1, r) // Conquer
            MERGE(A, p, q, r)       // Combine
• Initial call: MERGE-SORT(A, 1, n)

33. Example – n Power of 2 (divide phase): A = [5, 2, 4, 7, 1, 3, 2, 6] is split at q = 4 into two halves, each half is split again, and so on down to single-element subsequences.

34. Example – n Power of 2 (combine phase): the single elements are merged pairwise back up into [2, 5], [4, 7], [1, 3], [2, 6], then [2, 4, 5, 7] and [1, 2, 3, 6], and finally [1, 2, 2, 3, 4, 5, 6, 7].

35. Example – n Not a Power of 2 (divide phase): A = [4, 7, 2, 6, 1, 4, 7, 3, 5, 2, 6], with n = 11, is split at q = 6, its two parts at q = 3 and q = 9, and so on down to single-element subsequences.

36. Example – n Not a Power of 2 (combine phase): the pieces are merged back up into runs such as [2, 4, 7], [1, 4, 6], [3, 5, 7], [2, 6], then [1, 2, 4, 4, 6, 7] and [2, 3, 5, 6, 7], and finally [1, 2, 2, 3, 4, 4, 5, 6, 6, 7, 7].

37. Merging • Input: array A and indices p, q, r such that p ≤ q < r, where the subarrays A[p . . q] and A[q + 1 . . r] are sorted (e.g., A[p . . q] = [2, 4, 5, 7] and A[q + 1 . . r] = [1, 2, 3, 6]) • Output: one single sorted subarray A[p . . r]

38. Merging • Idea for merging: • Two piles of sorted cards • Choose the smaller of the two top cards • Remove it and place it in the output pile • Repeat the process until one pile is empty • Take the remaining input pile and place it face-down onto the output pile

39. Merge - Pseudocode
Alg.: MERGE(A, p, q, r)
    Compute n1 = q − p + 1 and n2 = r − q
    Copy the first n1 elements into L[1 . . n1 + 1] and the next n2 elements into R[1 . . n2 + 1]
    L[n1 + 1] ← ∞; R[n2 + 1] ← ∞    // sentinels
    i ← 1; j ← 1
    for k ← p to r
        do if L[i] ≤ R[j]
            then A[k] ← L[i]
                i ← i + 1
            else A[k] ← R[j]
                j ← j + 1
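
Putting slides 32 and 39 together, a runnable Python sketch (0-indexed lists; float('inf') plays the role of the ∞ sentinels):

    def merge(a, p, q, r):
        """Merge the sorted runs a[p..q] and a[q+1..r] (inclusive bounds) back into a."""
        left = a[p:q + 1] + [float('inf')]        # L[1 .. n1 + 1], sentinel at the end
        right = a[q + 1:r + 1] + [float('inf')]   # R[1 .. n2 + 1], sentinel at the end
        i = j = 0
        for k in range(p, r + 1):
            if left[i] <= right[j]:               # copy the smaller of the two front elements
                a[k] = left[i]
                i += 1
            else:
                a[k] = right[j]
                j += 1

    def merge_sort(a, p, r):
        """MERGE-SORT(A, p, r); initial call: merge_sort(a, 0, len(a) - 1)."""
        if p < r:                                 # check for base case
            q = (p + r) // 2                      # Divide
            merge_sort(a, p, q)                   # Conquer
            merge_sort(a, q + 1, r)               # Conquer
            merge(a, p, q, r)                     # Combine

    data = [5, 2, 4, 7, 1, 3, 2, 6]
    merge_sort(data, 0, len(data) - 1)
    print(data)   # [1, 2, 2, 3, 4, 5, 6, 7]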

40.–44. Example: MERGE(A, 9, 12, 16): the merge of the sorted subarrays A[9 . . 12] and A[13 . . 16] is worked step by step in figures.

45. Running Time of Merge • Initialization (copying into temporary arrays): Θ(n1 + n2) = Θ(n) • Adding the elements to the final array (the last for loop): n iterations, each taking constant time ⇒ Θ(n) • Total time for MERGE: Θ(n)

46. Analyzing Divide-and-Conquer Algorithms • The recurrence is based on the three steps of the paradigm • T(n): running time on a problem of size n • Divide the problem into a subproblems, each of size n/b: takes D(n) time • Conquer (solve) the subproblems: aT(n/b) • Combine the solutions: C(n) • T(n) = Θ(1) if n ≤ c, and T(n) = aT(n/b) + D(n) + C(n) otherwise

47. MERGE-SORT Running Time • Divide: compute q as the average of p and r ⇒ D(n) = Θ(1) • Conquer: recursively solve 2 subproblems, each of size n/2 ⇒ 2T(n/2) • Combine: MERGE on an n-element subarray takes Θ(n) time ⇒ C(n) = Θ(n) • T(n) = Θ(1) if n = 1, and T(n) = 2T(n/2) + Θ(n) if n > 1

48. Correctness of Merge Sort • Loop invariant (at the start of the for loop in MERGE): A[p . . k − 1] contains the k − p smallest elements of L[1 . . n1 + 1] and R[1 . . n2 + 1], in sorted order; L[i] and R[j] are the smallest elements of their arrays that have not yet been copied back into A

49. Proof of the Loop Invariant • Initialization: prior to the first iteration, k = p ⇒ the subarray A[p . . k − 1] is empty • A[p . . k − 1] therefore contains the k − p = 0 smallest elements of L and R • L and R are sorted arrays and i = j = 1 ⇒ L[1] and R[1] are the smallest elements in L and R

50. Proof of the Loop Invariant • Maintenance: assume L[i] ≤ R[j] ⇒ L[i] is the smallest element not yet copied back to A • After copying L[i] into A[k], A[p . . k] contains the k − p + 1 smallest elements of L and R • Incrementing k (in the for loop) and i re-establishes the loop invariant
