
Dynamic Programming



  1. Dynamic Programming Heejin Park College of Information and Communications Hanyang University

  2. Content • Introduction • Assembly-line scheduling • Matrix-chain multiplication • Elements of dynamic programming • Longest common subsequence

  3. Matrix-chain multiplication • Matrix-chain multiplication • Given a chain of n matrices <A1, A2, ..., An>, compute the product A1A2…An. • The number of ways to compute the product A1A2…An equals the number of ways to fully parenthesize the product. • The number of ways to compute the product A1A2A3 is 2: • (A1A2)A3 • A1(A2A3)

  4. Matrix-chain multiplication • The order of multiplications • The order of multiplications does not change the value of the product because matrix multiplication is associative. • For example, it does not matter whether the left multiplication or the right multiplication is done first: (A1·A2)·A3 = A1·(A2·A3) • However, the order of multiplication affects the number of scalar multiplications needed to compute the product.

  5. Matrix-chain multiplication • Multiplying two matrices A and B • We can multiply them if they are compatible: the number of columns of A must equal the number of rows of B. • If A is a p×q matrix and B is a q×r matrix, the resulting matrix is a p×r matrix. • (Figure: a 2×3 matrix A times a 3×2 matrix B yields a 2×2 matrix C.)

  6. Matrix-chain multiplication • The number of scalar multiplications to multiply A and B • It is pqr, because we compute pr elements and computing each element needs q scalar multiplications.
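As a concrete sketch of where the pqr count comes from, the triple loop below (function and variable names are illustrative, not from the slides) computes each of the p·r result entries with q scalar multiplications:

```python
def matrix_multiply(A, B):
    """Multiply a p x q matrix A by a q x r matrix B using p*q*r scalar multiplications."""
    p, q, r = len(A), len(B), len(B[0])
    assert len(A[0]) == q, "incompatible matrices: columns of A must equal rows of B"
    C = [[0] * r for _ in range(p)]
    for i in range(p):                      # p rows of the result
        for j in range(r):                  # r columns of the result -> p*r entries
            for k in range(q):              # q scalar multiplications per entry
                C[i][j] += A[i][k] * B[k][j]
    return C
```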

  7. Matrix-chain multiplication • The order of multiplications affects the number of scalar multiplications. • Computing A1A2A3 where A1: 10×100, A2: 100×5, A3: 5×50 • (A1A2)A3: A1A2 costs 10·100·5 = 5,000; multiplying the resulting 10×5 matrix by A3 costs 10·5·50 = 2,500 => 5,000 + 2,500 = 7,500 • A1(A2A3): A2A3 costs 100·5·50 = 25,000; multiplying A1 by the resulting 100×50 matrix costs 10·100·50 = 50,000 => 25,000 + 50,000 = 75,000 • Computing (A1A2)A3 is 10 times faster.
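The two totals can be checked directly with the dimensions from the slide:

```python
# A1: 10x100, A2: 100x5, A3: 5x50
cost_left  = 10 * 100 * 5 + 10 * 5 * 50    # (A1 A2) A3 = 5,000 + 2,500
cost_right = 100 * 5 * 50 + 10 * 100 * 50  # A1 (A2 A3) = 25,000 + 50,000
print(cost_left, cost_right)               # 7500 75000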

  8. Matrix-chain multiplication • Matrix-chain multiplication problem • Given a chain A1, A2, ..., An of n matrices, where matrix Ai has dimension pi-1×pi, find the order of matrix multiplications minimizing the number of scalar multiplications needed to compute the product. • That is, fully parenthesize the product of the matrices so as to minimize the number of scalar multiplications. • For example, for the product A1A2A3A4, one full parenthesization is ((A1A2)A3)A4.

  9. Matrix-chain multiplication • Solutions of the matrix-chain multiplication problem • Brute-force approach • Enumerate all possible parenthesizations. • Compute the number of scalar multiplications of each parenthesization. • Select the parenthesization needing the least number of scalar multiplications.
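A minimal sketch of the enumeration step in the brute-force approach, assuming a recursive generator over all split points (the function name is mine):

```python
def parenthesizations(i, j):
    """Yield every full parenthesization of A_i ... A_j as a string."""
    if i == j:
        yield f"A{i}"
        return
    for k in range(i, j):                       # split between A_k and A_(k+1)
        for left in parenthesizations(i, k):
            for right in parenthesizations(k + 1, j):
                yield f"({left}{right})"

print(list(parenthesizations(1, 3)))  # ['(A1(A2A3))', '((A1A2)A3)'] -- the two ways above
```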

  10. Matrix-chain multiplication • The brute-force approach is inefficient. • The number of enumerated parenthesizations is Ω(4^n / n^(3/2)). • The number of parenthesizations of a product of n matrices, denoted by P(n), satisfies: P(n) = 1 if n = 1; P(n) = Σk=1..n-1 P(k)·P(n-k) if n ≥ 2.

  11. Matrix-chain multiplication • The product A1A2A3A4 can be fully parenthesized in five distinct ways. A1(A2(A3A4)), A1((A2A3) A4), (A1A2)(A3A4), (A1(A2A3))A4, ((A1A2) A3) A4.

  12. Matrix-chain multiplication • P(n): the number of alternative parenthesizations of a sequence of n matrices. • The split between the two subproducts may occur between the kth and (k+1)st matrices for any k = 1, 2, ..., n-1. • Hence P(n) = 1 if n = 1; P(n) = Σk=1..n-1 P(k)·P(n-k) if n ≥ 2.
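The recurrence translates directly into code; a memoized sketch (the name P comes from the slide, the rest is my choice):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def P(n):
    """Number of full parenthesizations of a chain of n matrices."""
    if n == 1:
        return 1
    # The outermost split is between the k-th and (k+1)-st matrices, k = 1, ..., n-1.
    return sum(P(k) * P(n - k) for k in range(1, n))

print(P(3), P(4))  # 2 5 -- matching the two and five parenthesizations shown earlier
```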

  13. Matrix-chain multiplication • Dynamic programming • Optimal substructure • m[i,j]: the minimum number of scalar multiplications for computing AiAi+1…Aj. • Matrix Ai has dimension pi-1 × pi, and computing Ai..k Ak+1..j takes pi-1·pk·pj scalar multiplications. • Recurrence: m[i,j] = 0 if i = j; m[i,j] = min{ m[i,k] + m[k+1,j] + pi-1·pk·pj : i ≤ k < j } if i < j. • s[i,j] stores the optimal k for tracing the optimal solution.
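A bottom-up sketch of this recurrence, in the spirit of the textbook MATRIX-CHAIN-ORDER procedure (1-based indexing as on the slide; the Python names are assumptions):

```python
def matrix_chain_order(p):
    """p[0..n] holds the dimensions: matrix A_i is p[i-1] x p[i].
    Returns m (minimum scalar multiplications) and s (optimal split points)."""
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]
    s = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):               # chain length
        for i in range(1, n - length + 2):
            j = i + length - 1
            m[i][j] = float("inf")
            for k in range(i, j):                # try every split A_i..k * A_(k+1)..j
                cost = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if cost < m[i][j]:
                    m[i][j] = cost
                    s[i][j] = k                  # remember the best split
    return m, s
```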

  14. Matrix-chain multiplication • Example: a chain of six matrices with dimensions p = <30, 35, 15, 5, 10, 20, 25>.
  • m table (minimum numbers of scalar multiplications):
  m[1,2]=15750   m[2,3]=2625   m[3,4]=750    m[4,5]=1000   m[5,6]=5000
  m[1,3]=7875    m[2,4]=4375   m[3,5]=2500   m[4,6]=3500
  m[1,4]=9375    m[2,5]=7125   m[3,6]=5375
  m[1,5]=11875   m[2,6]=10500
  m[1,6]=15125
  • s table (optimal split points k): s[1,2..6] = 1, 1, 3, 3, 3; s[2,3..6] = 2, 3, 3, 3; s[3,4..6] = 3, 3, 3; s[4,5..6] = 4, 5; s[5,6] = 5.
  • Example entry (i = 2, j = 5, i ≤ k < j):
  m[2,5] = min( 0 + 2500 + 35·15·20 = 13000,
                2625 + 1000 + 35·5·20 = 7125,
                4375 + 0 + 35·10·20 = 11375 ) = 7125
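Assuming the dimensions above, the sketch from the previous slide reproduces these table entries:

```python
m, s = matrix_chain_order([30, 35, 15, 5, 10, 20, 25])
print(m[2][5])  # 7125, as in the worked entry above
print(m[1][6])  # 15125, the minimum cost for the whole chain
print(s[1][6])  # 3, i.e. the outermost split is (A1 A2 A3)(A4 A5 A6)
```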

  15. Matrix-chain multiplication • Running time • O(n^3) time in total • Θ(n^2) subproblems • O(n) time for each subproblem • Space consumption • Θ(n^2) space to store the m and s tables.

  16. Content • Introduction • Assembly-line scheduling • Matrix-chain multiplication • Elements of dynamic programming • Longest common subsequence

  17. Elements of dynamic programming • Elements of dynamic programming • Optimal substructure • Overlapping subproblems

  18. Elements of dynamic programming • Subtleties • Unweighted longest simple path problem • Does it have optimal substructure? • (Figure: a path p from u to v through w, decomposed into subpaths p1 from u to w and p2 from w to v.)

  19. Elements of dynamic programming • (Figure: a four-vertex graph on q, r, s, t giving a counterexample: subpaths of a longest simple path need not themselves be longest simple paths, so the problem lacks optimal substructure.)

  20. Elements of dynamic programming • Overlapping subproblems • When a recursive algorithm revisits the same subproblem over and over again, we say that the optimization problem has overlapping subproblems.

  21. Elements of dynamic programming • Matrix-chain multiplication: top-down vs. bottom-up • (Figure: the recursion tree for solving subproblem 1...4 top-down. The root 1...4 branches into the pairs (1...1, 2...4), (1...2, 3...4), and (1...3, 4...4), and subproblems such as 1...2, 2...3, and 3...4 appear repeatedly, illustrating overlapping subproblems.)

  22. Elements of dynamic programming • Memoization • Recursive solution, but each subproblem is solved only once. • Fills the table in a recursive (top-down) way. • In most cases it is slower than bottom-up dynamic programming. • It is useful when only some of the subproblems need to be solved.
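A memoized top-down sketch of the matrix-chain recurrence, assuming the same dimension array p as before (roughly the textbook MEMOIZED-MATRIX-CHAIN / LOOKUP-CHAIN pair, with names of my choosing):

```python
def memoized_matrix_chain(p):
    n = len(p) - 1
    m = {}  # memo table: (i, j) -> minimum cost of computing A_i..j

    def lookup_chain(i, j):
        if (i, j) in m:              # reuse an already-solved subproblem
            return m[(i, j)]
        if i == j:
            m[(i, j)] = 0
        else:
            m[(i, j)] = min(
                lookup_chain(i, k) + lookup_chain(k + 1, j) + p[i - 1] * p[k] * p[j]
                for k in range(i, j)
            )
        return m[(i, j)]

    return lookup_chain(1, n)

print(memoized_matrix_chain([30, 35, 15, 5, 10, 20, 25]))  # 15125, same as bottom-up
```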

  23. Elements of dynamic programming • The running time of a dynamic-programming algorithm depends on the product of two factors: • the number of subproblems overall, and • how many choices each subproblem has. • Assembly-line scheduling • Θ(n) subproblems · 2 choices = Θ(n) • Matrix-chain multiplication • Θ(n^2) subproblems · at most (n-1) choices = O(n^3)
