
Dynamic Programming





  1. Dynamic Programming Nithya Tarek

  2. Dynamic Programming • Dynamic programming solves problems by combining the solutions to subproblems. • Paradigms: • Divide and conquer • Greedy algorithms • Dynamic programming

  3. String Matching • Longest common subsequence • Knuth-Morris-Pratt pattern matching

  4. Example – Fibonacci Numbers using recursion
  Function f(n)
    if n = 0 output 0
    else if n = 1 output 1
    else output f(n-1) + f(n-2)

  5. Example – Fibonacci Numbers using recursion • Run time: T(n) = T(n-1) + T(n-2) • This run time grows exponentially in n • It roughly doubles each time n increases, and is of order O(2^n). • This is a bad algorithm, as it performs numerous repetitive calculations.

  6. Example – Fibonacci Numbers using Dynamic programming
  [Recursion tree for f(5): f(5) calls f(4) and f(3); f(4) calls f(3) and f(2); each f(3) calls f(2) and f(1). The subtree for f(3) appears twice and f(2) three times, showing the repeated work.]

  7. Example – Fibonacci Numbers using Dynamic programming • Dynamic programming calculates from the bottom up. • Values are stored for later use. • This eliminates repetitive calculation.
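A minimal bottom-up sketch of this idea in Python (the function name `fib` and the two-variable rolling store are illustrative, not from the slides):

```python
def fib(n):
    # Bottom-up: compute f(0), f(1), ..., f(n) once each,
    # keeping only the two most recent values.
    if n < 2:
        return n
    prev, curr = 0, 1              # f(0), f(1)
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr
```

Each value is computed exactly once, so the run time drops from O(2^n) to O(n).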

  8. Longest Common Subsequence • The input to this problem is two sequences, S1 = abcdace and S2 = badcabe. The problem is to find the longest sequence that is a subsequence of both S1 and S2. • The distance between S1 and S2 is defined as the number of characters we have to remove from one string plus the number we have to add to it to make S1 and S2 equal.

  9. Longest Common Subsequence S1: a b c d a c e S2: b a d c a b e • Length of LCSS = 4 (for example, bdce) • Edit distance = 3 (remove) + 3 (add) = 6

  10. Longest Common Subsequence • Theorem: if |S1| = m, |S2| = n, and the length of the LCSS is L, then the edit distance is m + n − 2L.
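The theorem can be checked on the slides' example; the helper name below is hypothetical:

```python
def edit_distance_from_lcss(m, n, L):
    # Theorem: edit distance = m + n - 2L. Each of the m - L characters
    # of S1 outside the LCSS is removed, and each of the n - L characters
    # of S2 outside the LCSS is added.
    return m + n - 2 * L

# Slides' example: |S1| = |S2| = 7 and L = 4.
print(edit_distance_from_lcss(7, 7, 4))  # 6 = 3 removals + 3 additions
```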

  11. Longest Common Subsequence • Notation: S1_i denotes the prefix consisting of the first i characters of S1, and S2_j the prefix consisting of the first j characters of S2.

  12. Longest Common Subsequence
  To find LCSS(S1_i, S2_j):
    If S1[i] = S2[j]
      return LCSS(S1_{i-1}, S2_{j-1}) + 1
    Else
      return max{ LCSS(S1_{i-1}, S2_j), LCSS(S1_i, S2_{j-1}) }
  • Implemented as plain recursion, this algorithm is very slow, since the same subproblems are solved repeatedly
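The recurrence translates directly into recursive Python (a sketch; the name `lcss` is assumed, and because Python strings are 0-indexed, S1[i] on the slide is `s1[i - 1]` here):

```python
def lcss(s1, s2, i, j):
    # LCSS(S1_i, S2_j): LCS length of the first i characters of s1
    # and the first j characters of s2, straight from the recurrence.
    if i == 0 or j == 0:
        return 0
    if s1[i - 1] == s2[j - 1]:
        return lcss(s1, s2, i - 1, j - 1) + 1
    return max(lcss(s1, s2, i - 1, j), lcss(s1, s2, i, j - 1))
```

For example, `lcss("abcdace", "badcabe", 7, 7)` returns 4, but only after re-solving many of the same subproblems, which is why the direct recursion is slow.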

  13. Solving LCSS using Dynamic programming • LCSS Matrix: fill in a table of LCSS(S1_i, S2_j) values, one entry per pair (i, j) • The last (bottom-right) entry in the matrix gives the length of the LCSS

  14. Solving LCSS using Dynamic programming • Runtime for filling this matrix using Dynamic programming is of order O(mn) • O(1) time to fill each entry • There are mn entries. • Space required: O(mn)
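A sketch of the matrix-filling version in Python (the function name is assumed):

```python
def lcss_matrix(s1, s2):
    # Fill an (m+1) x (n+1) matrix bottom-up; entry [i][j] holds the
    # LCSS length of the first i chars of s1 and the first j chars of s2.
    m, n = len(s1), len(s2)
    L = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if s1[i - 1] == s2[j - 1]:
                L[i][j] = L[i - 1][j - 1] + 1
            else:
                L[i][j] = max(L[i - 1][j], L[i][j - 1])
    return L[m][n]  # the last entry is the LCSS length
```

Each of the mn entries is filled in O(1) time from already-computed neighbors, giving the O(mn) run time the slide states.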

  15. Advantages of Dynamic programming • A recursive procedure has no memory of earlier calls, so it recomputes the same subproblems • Dynamic programming stores previous values to avoid multiple calculations

  16. Space and Time • The space can be reduced to order O(min{m, n}) • It is enough to keep only two rows, j and j−1 • After calculating the entries for row j, that row takes the place of row j−1, the old row j−1 is discarded, and the next row j is computed • The time cannot be reduced this way
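The two-row idea can be sketched as follows (names are illustrative; swapping the inputs keeps the rows at length min{m, n} + 1):

```python
def lcss_two_rows(s1, s2):
    # Keep only the previous and current rows of the DP matrix,
    # reducing space from O(mn) to O(min(m, n)).
    if len(s1) < len(s2):
        s1, s2 = s2, s1            # make s2 the shorter string
    prev = [0] * (len(s2) + 1)
    for ch in s1:
        curr = [0] * (len(s2) + 1)
        for j in range(1, len(s2) + 1):
            if ch == s2[j - 1]:
                curr[j] = prev[j - 1] + 1
            else:
                curr[j] = max(prev[j], curr[j - 1])
        prev = curr                # row j becomes row j-1 for the next pass
    return prev[-1]
```

The time is unchanged (every entry is still computed); only the storage shrinks.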

  17. Space reduction • Comparing only within a window of size w around the diagonal reduces the portion of the matrix that must be filled S1: a b c d a c e S2: b a d c a b e [Figure: a diagonal band of width w in the DP matrix.]

  18. Space reduction • Specifying the window size reduces the number of calculations • The runtime of this algorithm is O(2w · min{m, n}) • For fixed w it is a linear-time algorithm, and the space required is O(w)
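A banded-DP sketch of the windowed computation, under the assumption that entries outside the band |i − j| ≤ w are treated as 0. For clarity it still allocates the full matrix, though only O(w) entries per row are touched, and the result matches the true LCSS only when the optimal alignment stays inside the window:

```python
def lcss_windowed(s1, s2, w):
    # Fill only the diagonal band |i - j| <= w; cells outside the band
    # keep their initial value 0. Roughly 2w entries per row are computed.
    m, n = len(s1), len(s2)
    L = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(max(1, i - w), min(n, i + w) + 1):
            if s1[i - 1] == s2[j - 1]:
                L[i][j] = L[i - 1][j - 1] + 1
            else:
                L[i][j] = max(L[i - 1][j], L[i][j - 1])
    return L[m][n]
```

On the slides' example a window of w = 3 already contains the optimal alignment, so it returns the same length, 4, as the full computation.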
