
Presentation Transcript


  1. UMass Lowell Computer Science 91.503, Analysis of Algorithms. Prof. Karen Daniels. Fall, 2006. Lecture 1 (Part 3): Design Patterns for Optimization Problems: Dynamic Programming & Greedy Algorithms

  2. Algorithmic Paradigm Context: Divide & Conquer vs. Dynamic Programming • Both paradigms view the problem as a collection of subproblems and have a "recursive" nature. • Divide & Conquer: independent subproblems; number of subproblems depends on partitioning factors and is typically small; characteristic running time is typically a log function of n. • Dynamic Programming: overlapping subproblems; preprocessing (tables of subproblem solutions); running time depends on the number and difficulty of subproblems; primarily for optimization problems. • Optimal substructure: an optimal solution to the problem contains within it optimal solutions to subproblems.

  3. Dynamic Programming Approach to Optimization Problems • Characterize structure of an optimal solution. • Recursively define value of an optimal solution. • Compute value of an optimal solution in bottom-up fashion. • Construct an optimal solution from computed information. source: 91.503 textbook Cormen, et al.

  4. Matrix Parenthesization Dynamic Programming

  5. Example: Matrix Parenthesization Definitions • Given "chain" of n matrices: <A1, A2, …, An> • Compute the product A1A2…An efficiently • Minimize "cost" = number of scalar multiplications • Multiplication order matters! source: 91.503 textbook Cormen, et al.

  6. Example: Matrix Parenthesization Step 1: Characterizing an Optimal Solution • Observation: Any parenthesization of AiAi+1…Aj must split it between Ak and Ak+1 for some k. • THM: Optimal Matrix Parenthesization: If an optimal parenthesization of AiAi+1…Aj splits at k, then the parenthesization of the prefix AiAi+1…Ak must itself be an optimal parenthesization. • Why? If there existed a less costly way to parenthesize the prefix, then substituting that parenthesization would yield a less costly way to parenthesize AiAi+1…Aj, contradicting the optimality of the original parenthesization. common DP proof technique: "cut-and-paste" proof by contradiction source: 91.503 textbook Cormen, et al.

  7. Example: Matrix Parenthesization Step 2: A Recursive Solution • Recursive definition of minimum parenthesization cost (each matrix Ai has dimensions p_{i-1} x p_i):
m[i,j] = 0, if i = j
m[i,j] = min{ m[i,k] + m[k+1,j] + p_{i-1} p_k p_j : i <= k < j }, if i < j
How many distinct subproblems? One per pair (i,j) with 1 <= i <= j <= n, so Θ(n²) in all. source: 91.503 textbook Cormen, et al.

  8. Example: Matrix Parenthesization Step 3: Computing Optimal Costs [figure: m and s tables filled in bottom-up] s: value of k that achieves optimal cost in computing m[i, j] source: 91.503 textbook Cormen, et al.
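A minimal Python sketch of this Step 3 bottom-up table fill, assuming 1-indexed matrices; the function name and the example dimension list are illustrative, not from the slides:

```python
import math

def matrix_chain_order(p):
    """Bottom-up DP fill of the m (cost) and s (split) tables.
    Matrix A_i has dimensions p[i-1] x p[i], so len(p) = n + 1."""
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]  # m[i][j]: min cost for A_i..A_j
    s = [[0] * (n + 1) for _ in range(n + 1)]  # s[i][j]: best split point k
    for length in range(2, n + 1):             # chain length, shortest first
        for i in range(1, n - length + 2):
            j = i + length - 1
            m[i][j] = math.inf
            for k in range(i, j):              # try every split point
                q = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if q < m[i][j]:
                    m[i][j] = q
                    s[i][j] = k                # remember the best split
    return m, s

# Illustrative chain: A1: 30x35, A2: 35x15, A3: 15x5,
# A4: 5x10, A5: 10x20, A6: 20x25
m, s = matrix_chain_order([30, 35, 15, 5, 10, 20, 25])
print(m[1][6])  # minimum number of scalar multiplications
```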

  9. Example: Matrix Parenthesization Step 4: Constructing an Optimal Solution
PRINT-OPTIMAL-PARENS(s, i, j)
  if i = j
    then print "A"i
    else print "("
      PRINT-OPTIMAL-PARENS(s, i, s[i, j])
      PRINT-OPTIMAL-PARENS(s, s[i, j] + 1, j)
      print ")"
source: 91.503 textbook Cormen, et al.
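A hedged Python equivalent of PRINT-OPTIMAL-PARENS, reusing the s table from the sketch above; returning a string instead of printing is a stylistic choice:

```python
def print_optimal_parens(s, i, j):
    """Recursively reconstruct an optimal parenthesization from
    the split table s produced by matrix_chain_order."""
    if i == j:
        return f"A{i}"
    k = s[i][j]  # optimal split: (A_i..A_k)(A_{k+1}..A_j)
    return ("(" + print_optimal_parens(s, i, k)
                + print_optimal_parens(s, k + 1, j) + ")")

print(print_optimal_parens(s, 1, 6))  # e.g. ((A1(A2A3))((A4A5)A6))
```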

  10. Example: Matrix Parenthesization Memoization • Provides Dynamic Programming efficiency, but with a top-down strategy • Uses recursion • Fills in the table "on demand" • Example (memoized version of RECURSIVE-MATRIX-CHAIN):
MEMOIZED-MATRIX-CHAIN(p)
1 n ← length[p] - 1
2 for i ← 1 to n
3   do for j ← i to n
4     do m[i,j] ← ∞
5 return LOOKUP-CHAIN(p, 1, n)
LOOKUP-CHAIN(p, i, j)
1 if m[i,j] < ∞
2   then return m[i,j]
3 if i = j
4   then m[i,j] ← 0
5   else for k ← i to j - 1
6     do q ← LOOKUP-CHAIN(p, i, k) + LOOKUP-CHAIN(p, k+1, j) + p_{i-1} p_k p_j
7       if q < m[i,j]
8         then m[i,j] ← q
9 return m[i,j]
source: 91.503 textbook Cormen, et al.
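A compact top-down sketch of the same memoization idea in Python, using a cache as the on-demand table; the function names are illustrative:

```python
from functools import lru_cache

def memoized_matrix_chain(p):
    """Top-down memoized matrix-chain cost; p holds the dimensions
    as a tuple, with A_i of size p[i-1] x p[i]."""
    @lru_cache(maxsize=None)          # plays the role of the m table
    def lookup_chain(i, j):
        if i == j:
            return 0
        # fill the entry on demand: try every split point k
        return min(lookup_chain(i, k) + lookup_chain(k + 1, j)
                   + p[i - 1] * p[k] * p[j]
                   for k in range(i, j))
    return lookup_chain(1, len(p) - 1)

print(memoized_matrix_chain((30, 35, 15, 5, 10, 20, 25)))
```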

  11. Longest Common Subsequence Dynamic Programming

  12. Example: Longest Common Subsequence (LCS): Motivation • Strand of DNA: string over the finite set {A,C,G,T} • each element of the set is a base: adenine, cytosine, guanine, or thymine • Compare DNA similarities • S1 = ACCGGTCGAGTGCGCGGAAGCCGGCCGAA • S2 = GTCGTTCGGAATGCCGTTGCTCTGTAAA • One measure of similarity: • find the longest string S3 containing bases that also appear (not necessarily consecutively) in both S1 and S2 • S3 = GTCGTCGGAAGCCGGCCGAA source: 91.503 textbook Cormen, et al.

  13. Example: LCS Definitions • Sequence Z = <z1, z2, …, zk> is a subsequence of X = <x1, x2, …, xm> if there exists a strictly increasing sequence <i1, i2, …, ik> of indices of X such that x_{i_j} = z_j for j = 1, 2, …, k • example: <B, C, D, B> is a subsequence of <A, B, C, B, D, A, B> with index sequence <2, 3, 5, 7> • Z is a common subsequence of X and Y if Z is a subsequence of both X and Y • example, for X = <A, B, C, B, D, A, B> and Y = <B, D, C, A, B, A>: • <B, C, A> is a common subsequence but not longest • <B, C, B, A> is a common subsequence. Longest? Longest Common Subsequence Problem: Given 2 sequences X, Y, find a maximum-length common subsequence Z. source: 91.503 textbook Cormen, et al.
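The subsequence definition translates directly into a short check; a minimal Python sketch with an illustrative function name:

```python
def is_subsequence(z, x):
    """True iff z is a subsequence of x: scan x once, matching the
    elements of z in order at strictly increasing indices."""
    it = iter(x)
    return all(any(c == xi for xi in it) for c in z)

print(is_subsequence("BCDB", "ABCBDAB"))  # True, via indices 2, 3, 5, 7
```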

  14. Example: LCS Step 1: Characterize an LCS THM 15.1: Optimal LCS Substructure Given sequences X = <x1, …, xm> and Y = <y1, …, yn>, for any LCS Z = <z1, …, zk> of X and Y: 1 if xm = yn, then zk = xm = yn and Zk-1 is an LCS of Xm-1 and Yn-1 2 if xm ≠ yn and zk ≠ xm, then Z is an LCS of Xm-1 and Y 3 if xm ≠ yn and zk ≠ yn, then Z is an LCS of X and Yn-1 PROOF: based on producing contradictions 1 a) Suppose zk ≠ xm = yn. Appending xm = yn to Z yields a longer common subsequence, contradicting the longest nature of Z. b) To establish the longest nature of Zk-1, suppose a common subsequence W of Xm-1 and Yn-1 has length > k-1. Appending xm = yn to W yields a common subsequence of length > k = contradiction. 2 A common subsequence W of Xm-1 and Y of length > k would also be a common subsequence of Xm, Y, contradicting the longest nature of Z. 3 Similar to proof of (2). source: 91.503 textbook Cormen, et al.

  15. Example: LCS Step 2: A Recursive Solution • Implications of Thm 15.1: is xm = yn? • yes: find LCS(Xm-1, Yn-1); then LCS1(X, Y) = LCS(Xm-1, Yn-1) + xm • no: find LCS(Xm-1, Y) and LCS(X, Yn-1); then LCS2(X, Y) = max(LCS(Xm-1, Y), LCS(X, Yn-1))

  16. Example: LCS Step 2: A Recursive Solution (continued) • Overlapping subproblem structure: Θ(mn) distinct subproblems; conditions of the problem can exclude some subproblems! • Recurrence for length of optimal solution:
c[i,j] = 0, if i = 0 or j = 0
c[i,j] = c[i-1,j-1] + 1, if i, j > 0 and xi = yj
c[i,j] = max(c[i,j-1], c[i-1,j]), if i, j > 0 and xi ≠ yj
source: 91.503 textbook Cormen, et al.

  17. Example: LCS Step 3: Compute Length of an LCS What is the asymptotic worst-case time complexity? Θ(mn): each of the (m+1)(n+1) table entries is filled in O(1) time. [figure: the c table for the example sequences, with arrows representing the b table] source: 91.503 textbook Cormen, et al.
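A bottom-up Python sketch of this Step 3 table fill; the b "arrows" table is left implicit here (see the note in comments), and the function name is illustrative:

```python
def lcs_length(x, y):
    """Fill the c table bottom-up: c[i][j] = length of an LCS of
    x[:i] and y[:j]. Theta(mn) time and space."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1            # "diagonal" arrow
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])  # "up" / "left" arrow
    return c

c = lcs_length("ABCBDAB", "BDCABA")
print(c[7][6])  # 4: one LCS is BCBA
```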

  18. Example: LCS Step 4: Construct an LCS
PRINT-LCS(b, X, i, j)
  if i = 0 or j = 0
    then return
  if b[i, j] = "↖"
    then PRINT-LCS(b, X, i-1, j-1)
      print xi
  elseif b[i, j] = "↑"
    then PRINT-LCS(b, X, i-1, j)
  else PRINT-LCS(b, X, i, j-1)
source: 91.503 textbook Cormen, et al.
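A Python sketch of Step 4 that reconstructs one LCS from the c table built by lcs_length above. It works without the b table, anticipating the improvement on the next slide; the name is illustrative:

```python
def reconstruct_lcs(c, x, y):
    """Walk back from c[m][n] toward c[0][0], emitting matched
    characters. O(m + n) steps: each iteration decrements i or j."""
    i, j = len(x), len(y)
    out = []
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:          # diagonal move: part of the LCS
            out.append(x[i - 1])
            i, j = i - 1, j - 1
        elif c[i - 1][j] >= c[i][j - 1]:  # move toward a larger neighbor
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(reconstruct_lcs(c, "ABCBDAB", "BDCABA"))  # "BCBA", an LCS of length 4
```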

  19. Example: LCS Improve the Code • Can eliminate the b table • c[i,j] depends only on 3 other c table entries: c[i-1,j-1], c[i-1,j], c[i,j-1] • given the value of c[i,j], can pick the right one in O(1) time • reconstruct an LCS in O(m+n) time, similar to PRINT-LCS • same Θ(mn) space asymptotically, but Θ(mn) was needed for the c table anyway... • Asymptotic space reduction • leverage: need only 2 rows of the c table at a time (the row being computed and the previous row); see the sketch below • can also do it with space for ~1 row of the c table • but this does not preserve the LCS reconstruction data source: 91.503 textbook Cormen, et al.
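A minimal Python sketch of the two-row space reduction; as the slide notes, it returns the length only, since the reconstruction data is not preserved:

```python
def lcs_length_two_rows(x, y):
    """LCS length using Theta(n) extra space: keep only two rows
    of the c table, the previous row and the row being computed."""
    prev = [0] * (len(y) + 1)    # row i-1 of the c table
    curr = [0] * (len(y) + 1)    # row i, being computed
    for i in range(1, len(x) + 1):
        for j in range(1, len(y) + 1):
            if x[i - 1] == y[j - 1]:
                curr[j] = prev[j - 1] + 1
            else:
                curr[j] = max(prev[j], curr[j - 1])
        prev, curr = curr, prev  # reuse the old row on the next pass
    return prev[len(y)]

print(lcs_length_two_rows("ABCBDAB", "BDCABA"))  # 4
```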

  20. Activity Selection Dynamic Programming

  21. Activity Selection Optimization Problem • Problem Instance: • Set S = {1, 2, ..., n} of n activities • Each activity i has: • start time: si • finish time: fi • Activities i, j are compatible iff non-overlapping: the intervals [si, fi) and [sj, fj) do not intersect, i.e., si ≥ fj or sj ≥ fi • Objective: • select a maximum-sized set of mutually compatible activities source: 91.503 textbook Cormen, et al.
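The compatibility test in Python, a one-line sketch under the half-open interval convention stated above; the representation of activities as (start, finish) pairs is an assumption for illustration:

```python
def compatible(a, b):
    """Activities as (start, finish) pairs over half-open intervals
    [s, f): non-overlapping iff one finishes before the other starts."""
    (s1, f1), (s2, f2) = a, b
    return s1 >= f2 or s2 >= f1

print(compatible((1, 4), (4, 7)))  # True: [1,4) and [4,7) share no time
print(compatible((1, 5), (4, 7)))  # False: both contain time 4.5
```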

  22. [figure: example instance of the activity-selection problem] source: 91.503 textbook Cormen, et al.

  23. Algorithmic Progression • "Brute-Force" (board work) • Dynamic Programming #1: exponential number of subproblems (board work) • Dynamic Programming #2: quadratic number of subproblems (board work) • Greedy Algorithm (board work: next week; a preview sketch follows below)
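As a preview of next week's board work, a minimal Python sketch of the standard greedy approach: repeatedly pick the compatible activity with the earliest finish time. The technique is the textbook's; the code and the example instance are illustrative:

```python
def greedy_activity_selector(activities):
    """Select a maximum-size set of mutually compatible activities:
    sort by finish time, then take each activity that starts no
    earlier than the last selected finish time."""
    selected = []
    last_finish = float("-inf")
    for s, f in sorted(activities, key=lambda a: a[1]):  # by finish time
        if s >= last_finish:     # compatible with all chosen so far
            selected.append((s, f))
            last_finish = f
    return selected

print(greedy_activity_selector(
    [(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9),
     (6, 10), (8, 11), (8, 12), (2, 13), (12, 14)]))
# [(1, 4), (5, 7), (8, 11), (12, 14)]: a maximum set of size 4
```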
