
Dynamic Programming



Presentation Transcript


  1. Dynamic Programming 15-211 Fundamental Data Structures and Algorithms Peter Lee March 18, 2003

  2. Plan • Today • Dynamic programming • Homework • HW5 available later today! • Reading • Section 7.6

  3. From Exponential Time to Linear Time

  4. Fibonacci • Leonardo Pisano • aka Leonardo Fibonacci • How many rabbits can be produced from a single pair in a year’s time? (1202) • Assume • New pair of offspring each month • Each pair becomes fertile after one month • Rabbits never die

  5. Fibonacci • The Fibonacci sequence, inductively • f(0) = 0 • f(1) = 1 • f(n) = f(n-1) + f(n-2), for n ≥ 2

  6. Fibonacci – the recursive program public static long fib(int n) { if (n<=1) return n; long u = fib(n-1); long v = fib(n-2); return u + v; }

  7. Recursive Fibonacci • What is the running time of recursive Fibonacci?

  8. Improving Fibonacci • A simple idea: • Let’s save all of the previous results • When computing the value of fib(n-1) and fib(n-2), return the saved result rather than calculating it again

  9. But beware… • It is important that fib() is a “pure” function! • No side effects • such as I/O, changes to global variables, etc. • Returns the same value each time • given the same arguments, fib() must return the same result

  10. Memoization • Name coined by Donald Michie, Univ of Edinburgh (1960s) • fib() is the perfect example • Also useful for • game searches • evaluation functions • web caching

  11. Fibonacci – the recursive program public static long fib(int n) { if (memo[n] != -1) return memo[n]; if (n<=1) return n; long u = fib(n-1); long v = fib(n-2); memo[n] = u + v; return u + v; }

  12. Fibonacci – the recursive program public static long fib(int n) { if (memo[n] != -1) return memo[n]; if (n<=1) return n; long u = fib(n-1); long v = fib(n-2); memo[n] = u + v; return u + v; } The memo table is initialized before the first call: memo = new long[n+1]; for(int i=0; i<=n; i++) memo[i] = -1;

  13. Recursive Fibonacci • What is the running time of our new version of Fibonacci?

  14. Fibonacci – optimizing the memo table public static long fib(int n) { if (n<=1) return n; long last = 1; long prev = 0; long t = -1; for (int i = 2; i<=n; i++) { t = last + prev; prev = last; last = t; } return t; } Reduce memo table to two entries

  15. Fibonacci – static table Amortize table building cost against multiple calls to fib. static long[] tab; public static void buildTable(int n) { tab = new long[n+1]; tab[0] = 0; tab[1] = 1; for (int i = 2; i<=n; i++) tab[i] = tab[i-1] + tab[i-2]; return; } public static long fib(int n) { return tab[n]; }

  16. Memoization • Works best with “pure” functions • No side effects • Returns the same value each time • Lots of ways to save the values • Arrays • Hashtables • … • Trade-offs • Time to retrieve vs. time to compute • Storage space vs. time to compute
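As a sketch of the array-vs-hashtable trade-off mentioned above, memoization can also use a HashMap when the set of arguments is sparse or not known in advance (the class name here is illustrative, not from the slides):

```java
import java.util.HashMap;
import java.util.Map;

// Memoized Fibonacci using a HashMap instead of a fixed-size array:
// no table size must be chosen up front, at some retrieval cost.
public class MemoFib {
    private static final Map<Integer, Long> memo = new HashMap<>();

    public static long fib(int n) {
        if (n <= 1) return n;
        Long cached = memo.get(n);     // time to retrieve vs. time to compute
        if (cached != null) return cached;
        long result = fib(n - 1) + fib(n - 2);
        memo.put(n, result);           // storage space vs. time to compute
        return result;
    }

    public static void main(String[] args) {
        System.out.println(fib(40));   // 102334155, in linear time
    }
}
```

Because fib() is pure, caching a result and returning it on later calls is always safe.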

  17. Recall: Greedy Algorithms

  18. The Fractional Knapsack Problem (FKP) • You rob a store: find n kinds of items • Gold dust. Wheat. Beer. • The total inventory for the ith kind of item: • Weight: wi pounds • Value: vi dollars • Knapsack can hold a maximum of W pounds. • Q: how much of each kind of item should you take? (Can take fractional weight)

  19. FKP: a greedy solution • Fill knapsack with the “most valuable” item until all is taken. • Most valuable = vi/wi (dollars per pound) • Then next “most valuable” item, etc. • Repeat until knapsack is full.
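The greedy strategy can be sketched as follows (the item values and weights are illustrative; they match the $60/$100/$120 cart example used later for BKP, where the fractional answer is $240):

```java
import java.util.Arrays;
import java.util.Comparator;

// Greedy fractional knapsack: repeatedly take the remaining item with the
// highest value density (dollars per pound) until the knapsack is full.
public class FractionalKnapsack {
    static double maxValue(double[] value, double[] weight, double W) {
        Integer[] order = new Integer[value.length];
        for (int i = 0; i < order.length; i++) order[i] = i;
        // sort item indices by value density, highest first
        Arrays.sort(order, Comparator.comparingDouble(i -> -value[i] / weight[i]));
        double remaining = W, total = 0;
        for (int i : order) {
            double take = Math.min(weight[i], remaining); // fractions allowed
            total += value[i] * (take / weight[i]);
            remaining -= take;
            if (remaining == 0) break;
        }
        return total;
    }

    public static void main(String[] args) {
        // $60/10 lb, $100/20 lb, $120/30 lb, 50 lb knapsack
        System.out.println(maxValue(new double[]{60, 100, 120},
                                    new double[]{10, 20, 30}, 50));  // 240.0
    }
}
```

Because fractions are allowed, this greedy choice is optimal for FKP; the next slides show it fails once items become all-or-nothing.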

  20. The Binary Knapsack Problem • You win the Supermarket Shopping Spree contest. • You are given a shopping cart with capacity C. • You are allowed to fill it with any items you want from Giant Eagle. • Giant Eagle has items 1, 2, … n, which have values v1, v2, …, vn, and sizes s1, s2, …, sn. • How do you (efficiently) maximize the value of the items in your cart?

  21. BKP is not greedy • The obvious greedy strategy of taking the maximum value item that still fits in the cart does not work. • Consider: • Suppose item i has size si = C and value vi. • It can happen that there are items j and k with combined size sj + sk ≤ C but vj + vk > vi.

  22. $220 $160 $180 Maximum weight = 50 lbs $120, 30 lbs $100, 20 lbs $60, 10 lbs (optimal) item 1 item 2 item 3 cart BKP: Greedy approach fails BKP has optimal substructure, but not greedy-choice property: optimal solution does not contain greedy choice.

  23. A Solution: Dynamic Programming

  24. Consider the brute-force solution • The brute-force solution: • Try all possible combinations of items and compute the max value that fits in C.

  25. But isn’t this impractical? • Although seemingly impractical, the brute-force approach usually has the advantage that it can be formulated simply and precisely.

  26. Precise formulation • Let V(k, A) be the max value when choosing among items 1, 2, …, k and with a shopping cart of size A. • V(k, A) is a subproblem of the final problem of V(n, C). • (where n is the number of items in Giant Eagle)

  27. Subproblem formulation, cont’d • V(k, A) = max of • V(k-1, A) • [that is, don’t take the kth item] • V(k-1, A – sk) + vk • [or else take the kth item] Note the recursive structure (and similarity to Fibonacci). Can you write down the recurrence equations?

  28. The memoization table • Let v1=10, s1=3, v2=2, s2=1, v3=9, s3=2 • V(k, A) is stored in a table with rows k = 0…n (items considered) and columns A = 0…C (cart capacity); the table starts empty.

  29. The memoization table • Row k=0 (no items): V(0, A) = 0 for every A.
      A:   0  1  2  3  …  C
      k=0: 0  0  0  0  …  0

  30. The memoization table • Row k=1 (item 1 only, v1=10, s1=3):
      k=1: 0  0  0  10 …  10

  31. The memoization table • Row k=2 (items 1–2, adding v2=2, s2=1):
      k=2: 0  2  2  10 …  12

  32. The memoization table • Row k=3 (items 1–3, adding v3=9, s3=2):
      k=3: 0  2  9  11 …
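The table above can be reproduced with a short bottom-up sketch of the V(k, A) recurrence (item values and sizes are from the slides; the capacity C=5 is an assumed example, since the slides leave C symbolic):

```java
// Bottom-up binary knapsack: V[k][A] = max value choosing among items 1..k
// with a cart of size A. Row k=0 and column A=0 stay 0, as on the slides.
public class Knapsack {
    static long[][] knapsackTable(int[] v, int[] s, int C) {
        int n = v.length;
        long[][] V = new long[n + 1][C + 1];
        for (int k = 1; k <= n; k++) {
            for (int A = 0; A <= C; A++) {
                V[k][A] = V[k - 1][A];               // don't take the kth item
                if (s[k - 1] <= A)                   // or else take it, if it fits
                    V[k][A] = Math.max(V[k][A],
                                       V[k - 1][A - s[k - 1]] + v[k - 1]);
            }
        }
        return V;
    }

    public static void main(String[] args) {
        // v1=10,s1=3  v2=2,s2=1  v3=9,s3=2, with assumed capacity C=5
        long[][] V = knapsackTable(new int[]{10, 2, 9}, new int[]{3, 1, 2}, 5);
        System.out.println(V[3][3]);  // 11, matching the table's last row
        System.out.println(V[3][5]);  // 19: items 1 and 3 together
    }
}
```

The final answer is the bottom-right entry V(n, C); each entry is computed once, so the running time is O(nC).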

  33. Observations • Inefficient, or “brute-force,” solutions to dynamic programming problems often have simple recursive definitions.

  34. Quiz Break

  35. Counting change, revisited • How would you solve the change-counting problem (when you have a 12-cent coin) by using dynamic programming?
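One possible answer to the quiz, sketched bottom-up (the full coin set {1, 5, 10, 12, 25} is an assumption; the slide only says a 12-cent coin exists):

```java
import java.util.Arrays;

// Bottom-up change counting: best[a] = fewest coins that make amount a.
// With a 12-cent coin, greedy fails: 15 = 12+1+1+1 (4 coins),
// while the DP finds 10+5 (2 coins).
public class CoinChange {
    static int minCoins(int[] coins, int amount) {
        int[] best = new int[amount + 1];
        Arrays.fill(best, Integer.MAX_VALUE);  // MAX_VALUE = "not makeable"
        best[0] = 0;                           // zero coins make zero cents
        for (int a = 1; a <= amount; a++)
            for (int c : coins)
                if (c <= a && best[a - c] != Integer.MAX_VALUE)
                    best[a] = Math.min(best[a], best[a - c] + 1);
        return best[amount];
    }

    public static void main(String[] args) {
        System.out.println(minCoins(new int[]{1, 5, 10, 12, 25}, 15));  // 2
    }
}
```

Every smaller amount is solved before the larger one needs it, which is exactly the build-up order described on the next slide.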

  36. Dynamic programming • Build up to a solution. • Solve all smaller subproblems first. • Typically start with smallest subproblems and then work up to larger ones. • Hope that there is substantial overlap between subproblems. • Combine solutions to get answers to larger subproblems.

  37. Dynamic programming • Underlying idea: • Use memoization so that overlap can be exploited. • As smaller subproblems are solved, solving the larger subproblems might get easier. • Can sometimes reduce seemingly exponential problems to polynomial time.

  38. Using dynamic programming • Key ingredients: • Simple subproblems. • Problem can be broken into subproblems, typically with solutions that are easy to store in a table/array. • Subproblem optimization. • Optimal solution is composed of optimal subproblem solutions. • Subproblem overlap. • Optimal solutions to separate subproblems can have subproblems in common.

  39. Longest Common Subsequence Problem

  40. String subsequences • Consider DNA sequences. • Strings over the alphabet {A,C,G,T}. • Given a string X = x0x1x2…xn-1: • a subsequence of X is any string of the form xi1xi2…xik where ij < ij+1. • Example: AAAG is a subsequence of CGATAATTGAGA.

  41. LCS problem • The longest common subsequence problem: • Given two strings X and Y, find the longest string S that is a subsequence of both X and Y. • Example: for X = CGATAATTGAGA and Y = GTTCCTAATA, LCS = 6.

  42. A brute-force approach • A brute-force algorithm: • for each subsequence S of X: • determine whether S is a subsequence of Y. • if yes, then remember if it is the longest encountered so far. • Return the longest subsequence found. • Requires O(m·2^n) time! • There are 2^n different subsequences of X. • Determining whether S is a subsequence of Y requires O(m) time.

  43. DP ingredients • Simple subproblems? • (that are easy to store in a table?) • Yes! • Let L(i,j) = LCS for the first i+1 letters of X and first j+1 letters of Y. • L(-1,j) = L(i,-1) = 0. • Example: X = GTTCCTAATA (positions 0–9), Y = CGATAATTGAGA (positions 0–11), L(8,9) = 5.

  44. Storing solutions in a table • A memo table L with rows i = -1…9 (letters of X) and columns j = -1…11 (letters of Y). • Row L(-1,·) and column L(·,-1) are filled with 0s; the remaining entries, such as L(8,9) = 5, are filled in as subproblems are solved.

  45. DP ingredients, cont’d • Subproblem optimization? • Yes! Optimal solution for L(i,j) is a combination of optimal solutions for L(k,l) for some k,l such that k ≤ i and l ≤ j. • Let X = x0x1x2…xn-1 and Y = y0y1y2…ym-1.

  46. DP ingredients, cont’d • Subproblem optimization, Case 1: • If xi = yj, then • L(i,j) = L(i-1,j-1) + 1 • Example: for X = GTTCCTAATA and Y = CGATAATTGAGA, suppose i=9 and j=11. Then x9 = y11 = A, so L(9,11) = L(8,10) + 1 = 5 + 1 = 6.

  47. Ingredients, cont’d • Subproblem optimization, Case 2: • If xi ≠ yj, then • L(i,j) = max(L(i-1,j), L(i,j-1)) • Example: suppose i=9 and j=10. Then x9 = A ≠ G = y10, so L(9,10) = max(L(8,10), L(9,9)) = max(5, 6) = 6.

  48. DP ingredients, cont’d • Subproblem optimization? • Yes! Optimal solution for L(i,j) is a combination of optimal solutions for L(k,l) for some k,l such that k ≤ i and l ≤ j. • Let X = x0x1x2…xn-1 and Y = y0y1y2…ym-1. • Case 1: If xi = yj, then • L(i,j) = L(i-1,j-1) + 1. • Case 2: If xi ≠ yj, then • L(i,j) = max(L(i-1,j), L(i,j-1)).
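The two cases translate directly into a bottom-up table fill; a sketch, using the strings from the slides:

```java
// Bottom-up LCS: L[i][j] = LCS length of the first i letters of X and the
// first j letters of Y. This code is 1-indexed, so row 0 and column 0 play
// the role of the slides' base case L(-1,j) = L(i,-1) = 0.
public class Lcs {
    static int lcs(String X, String Y) {
        int n = X.length(), m = Y.length();
        int[][] L = new int[n + 1][m + 1];
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= m; j++)
                if (X.charAt(i - 1) == Y.charAt(j - 1))
                    L[i][j] = L[i - 1][j - 1] + 1;       // Case 1: letters match
                else
                    L[i][j] = Math.max(L[i - 1][j],      // Case 2: drop a letter
                                       L[i][j - 1]);     //   from X or from Y
        return L[n][m];
    }

    public static void main(String[] args) {
        System.out.println(lcs("GTTCCTAATA", "CGATAATTGAGA"));  // 6
    }
}
```

Each of the (n+1)(m+1) entries is computed once from earlier entries, so the brute-force O(m·2^n) drops to O(nm).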
