
Presentation Transcript


  1. DP (not Daniel Park's dance party)

  2. Dynamic programming
  • Can speed up many problems. Basically, it's like magic. :D
  • Overlapping subproblems
    • The number of distinct subproblems needs to be small.
    • Don't confuse this with divide and conquer (NON-overlapping subproblems).
  • Optimal substructure
    • Solve a larger subproblem using the solutions to smaller subproblems.
  • On USACO, if your naive method's runtime is exponential, the problem is likely to be DP.

  3. Two Approaches
  • Top-Down (systematic recursion)
    • If the subproblem has already been solved, return the saved answer.
    • If it hasn't been solved, solve it and save the answer.
    • Also called "memoizing" (no 'r').
    • "Easier" to code :((
    • Russian way
  • Bottom-Up
    • Solve all subproblems, starting from the most trivial ones.
    • Useful when trying to optimize memory (ex: sliding window, heap, etc.)
  • Pick whichever is easier for you. It doesn't really matter.

  4. Fibonacci
  • Classic first example: find the nth Fibonacci number.
  • f(n) = f(n-1) + f(n-2)
  • Straightforward recursion takes exponential time:

    int fib(int n) {
        if (n == 1 || n == 2) return 1;
        return fib(n - 1) + fib(n - 2);
    }

  • fib(5) = fib(4) + fib(3)
           = (fib(3) + fib(2)) + fib(3)
           = (fib(2) + fib(1) + fib(2)) + fib(3)
           = fib(2) + fib(1) + fib(2) + fib(2) + fib(1)
  • fib(1) and fib(2) are computed many, many times (in fact, fib(5) times for this example).
  • This algorithm would take years to compute just fib(1000).

  5. Fibonacci: Two Faster Approaches
  • Bottom-Up Way (a sketch of this follows below):
    • Keep an array fib[1...n].
    • fib[1] = fib[2] = 1
    • fib[n] = fib[n-1] + fib[n-2]; loop from 3 to n.
    • Only takes linear time.
    • Only store the last 2 values to reduce memory usage.
  • Top-Down Way (recursive): just modify the original code.

    int[] dp; // big array initialized to 0
    int fib(int n) {
        if (n == 1 || n == 2) return 1;
        if (dp[n] > 0) return dp[n]; // already know the answer
        return dp[n] = fib(n - 1) + fib(n - 2);
    }
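  Below is a minimal Java sketch of the bottom-up way described above, since the slide only shows the top-down code. It keeps just the last two values, as suggested; the class and method names are illustrative, not from the original material.

    public class FibBottomUp {
        // Iterative Fibonacci: O(n) time, O(1) memory by keeping only the last two values.
        // (long overflows past fib(92); this is just a sketch.)
        static long fib(int n) {
            if (n == 1 || n == 2) return 1;
            long prev = 1, cur = 1;          // fib(1), fib(2)
            for (int i = 3; i <= n; i++) {
                long next = prev + cur;      // fib(i) = fib(i-1) + fib(i-2)
                prev = cur;
                cur = next;
            }
            return cur;
        }

        public static void main(String[] args) {
            System.out.println(fib(10)); // 55
        }
    }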

  6. Number Triangles
  • Given a triangle of numbers, find a path from the top to the bottom such that the sum of the numbers on the path is maximized.
  • Top-Down approach: define a state, find the recursion.
  • A subproblem is finding the maximum-sum path from any location in the triangle to the bottom.
  • Our state is the location: row and column, f(r, c).
  • Our recursion: we have two choices, go down-left or down-right; pick the larger.
    • f(r, c) = array[r][c] + max(f(r+1, c), f(r+1, c+1))
  • We want to find f(0, 0). [0-based indexing]
  • Don't forget to memoize when implementing! (A memoized sketch follows below.)
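  A minimal Java sketch of the memoized recursion above, assuming the triangle is stored as a ragged int[][] where row r has r+1 entries. The names and the sample triangle are illustrative.

    public class NumberTriangles {
        static int[][] tri;      // the triangle of numbers
        static int[][] memo;     // memo[r][c] = best sum from (r, c) down to the bottom
        static boolean[][] seen; // whether memo[r][c] has been computed

        static int best(int r, int c) {
            if (r == tri.length - 1) return tri[r][c]; // bottom row: the path ends here
            if (seen[r][c]) return memo[r][c];         // state already solved
            seen[r][c] = true;
            // go down-left (same column) or down-right (column + 1); take the larger
            return memo[r][c] = tri[r][c] + Math.max(best(r + 1, c), best(r + 1, c + 1));
        }

        public static void main(String[] args) {
            tri = new int[][]{{7}, {3, 8}, {8, 1, 0}, {2, 7, 4, 4}, {4, 5, 2, 6, 5}};
            int n = tri.length;
            memo = new int[n][n];
            seen = new boolean[n][n];
            System.out.println(best(0, 0)); // 30 for this sample triangle
        }
    }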

  7. Knapsack
  • You have n items, each with weight w_i and value v_i.
  • You want the maximum total value, given the knapsack can hold at most a total weight C.
  • Variation 1: unlimited amount of each item
    • dp[0...C], where dp[i] is the max value achievable with total weight at most i.
    • dp[i] = max(dp[i - w[j]] + v[j]) over all items j with w[j] <= i.
  • Variation 2: one of each item
    • dp[1...n][0...C], where dp[i][j] is the max value using only the first i items and total weight at most j.
    • dp[i][j] = max(dp[i-1][j], dp[i-1][j - w[i]] + v[i])
  • Knapsack problems come up a lot. (A sketch of both variations follows below.)
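  A minimal Java sketch of both variations above. The array names and sample data are illustrative.

    public class Knapsack {
        // Variation 1: unlimited copies of each item (unbounded knapsack).
        static int unbounded(int C, int[] w, int[] v) {
            int[] dp = new int[C + 1];                   // dp[i] = best value with capacity i
            for (int i = 1; i <= C; i++)
                for (int j = 0; j < w.length; j++)
                    if (w[j] <= i)
                        dp[i] = Math.max(dp[i], dp[i - w[j]] + v[j]);
            return dp[C];
        }

        // Variation 2: at most one copy of each item (0/1 knapsack).
        static int zeroOne(int C, int[] w, int[] v) {
            int n = w.length;
            int[][] dp = new int[n + 1][C + 1];          // dp[i][j] = best value, first i items, capacity j
            for (int i = 1; i <= n; i++)
                for (int j = 0; j <= C; j++) {
                    dp[i][j] = dp[i - 1][j];             // skip item i
                    if (w[i - 1] <= j)                   // or take it (the arrays are 0-indexed)
                        dp[i][j] = Math.max(dp[i][j], dp[i - 1][j - w[i - 1]] + v[i - 1]);
                }
            return dp[n][C];
        }

        public static void main(String[] args) {
            int[] w = {2, 3, 4}, v = {3, 4, 6};
            System.out.println(unbounded(10, w, v)); // 15 (five copies of the weight-2 item)
            System.out.println(zeroOne(10, w, v));   // 13 (take all three items, total weight 9)
        }
    }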

  8. Interval DP
  • Matrix chain multiplication: find the minimum number of multiplications needed.
  • Matrix multiplication is associative, but the order makes a huge difference in the number of multiplications needed.
  • Example: 10 x 100, 100 x 5, and 5 x 50 matrices (A, B, C)
    • Multiply A and B first: 7,500 multiplications.
    • Multiply B and C first: 75,000 multiplications.
  • Let dp[i, j] = min multiplications for matrices i...j.
  • Must divide the chain into two parts and multiply each:
    • dp[i, j] = min(dp[i, k] + dp[k+1, j] + cost(i, k, j)) for k = i...j-1
    • cost(i, k, j) = rows(i) * cols(k) * cols(j), the cost of multiplying the product A_(i...k) by A_(k+1...j).
  • Note that at each point you keep track of an interval and divide it into subintervals. (A sketch follows below.)
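  A minimal Java sketch of the interval DP above, filled in bottom-up by increasing interval length. It uses the usual dimension-array convention (matrix i is p[i-1] x p[i]); names are illustrative.

    public class MatrixChain {
        static long minMultiplications(int[] p) {
            int n = p.length - 1;                        // number of matrices
            long[][] dp = new long[n + 1][n + 1];        // dp[i][j] = min cost for matrices i..j (1-indexed)
            for (int len = 2; len <= n; len++)           // interval length
                for (int i = 1; i + len - 1 <= n; i++) {
                    int j = i + len - 1;
                    dp[i][j] = Long.MAX_VALUE;
                    for (int k = i; k < j; k++) {        // split point: (i..k) * (k+1..j)
                        long cost = dp[i][k] + dp[k + 1][j]
                                  + (long) p[i - 1] * p[k] * p[j];   // rows(i) * cols(k) * cols(j)
                        dp[i][j] = Math.min(dp[i][j], cost);
                    }
                }
            return dp[1][n];
        }

        public static void main(String[] args) {
            // The slide's example: 10x100, 100x5, 5x50 -> the best order costs 7500
            System.out.println(minMultiplications(new int[]{10, 100, 5, 50}));
        }
    }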

  9. Using bitset states
  • Only when the dimension is small enough that 2^n states is OK.
  • Example: Planting
    • Farmer John has a field 9 by 250.
    • Some of the squares have trees in them, and you can't plant there.
    • Farmer John also cannot plant crops in adjacent squares.
    • How many ways are there to plant?
  • Note that 2^9 is small.
  • Let dp[i = 1...250][j = 0...2^9 - 1], where i is the row and j is a bitmask over the row's 9 squares: a 1 bit means crop, a 0 bit means no crop.
  • dp[i][j] = sum(dp[i-1][k]) over all valid k: neither mask puts a crop on a tree or next to another crop within its row, and j & k == 0 so no crop sits directly above another crop. (A sketch follows below.)
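  A minimal Java sketch of the bitmask DP above on a tiny made-up field. The field data, the modulus, and the names are illustrative assumptions; each row fits in a small bitmask, as in the slide.

    public class Planting {
        public static void main(String[] args) {
            int rows = 4, cols = 3;
            // tree[i] has a 1 bit wherever row i contains a tree (can't plant there)
            int[] tree = {0b000, 0b010, 0b000, 0b100};
            long MOD = 1_000_000_007L;                   // illustrative modulus

            int full = 1 << cols;
            long[] dp = new long[full];                  // dp[mask] = ways for the rows processed so far
            // row 0: any mask with no two adjacent crops and no crop on a tree
            for (int m = 0; m < full; m++)
                if ((m & (m >> 1)) == 0 && (m & tree[0]) == 0) dp[m] = 1;

            for (int i = 1; i < rows; i++) {
                long[] next = new long[full];
                for (int j = 0; j < full; j++) {                     // mask for row i
                    if ((j & (j >> 1)) != 0 || (j & tree[i]) != 0) continue;
                    for (int k = 0; k < full; k++)                   // mask for row i-1
                        if ((j & k) == 0)                            // no crop directly above another crop
                            next[j] = (next[j] + dp[k]) % MOD;
                }
                dp = next;
            }

            long total = 0;
            for (long ways : dp) total = (total + ways) % MOD;
            System.out.println(total);                   // number of valid plantings
        }
    }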

  10. Other DPs for your enjoyment
  • DP + data structure (BIT/segment tree)
  • DP + math
  • DP + pre-computation

  11. POTW: Factoring Numbers
  • Daniel Park is playing a game. He starts out by writing down a single number, N.
  • Each turn, he can erase a number N he has written and replace it with two numbers a and b such that a * b = N (a, b > 1).
  • In the end, he remembers all the numbers he has ever written and sums up their digits.
  • Because Daniel Park likes big numbers, what is the largest possible sum he can make?
  • Example: N = 24. Answer: 27.
    • Split 24 into 8, 3. Split 8 into 4, 2. Split 4 into 2, 2.
    • Sum = 2 + 2 + 2 + 4 + 8 + 3 + (2 + 4, the digits of 24) = 27.

  12. POTW: Constraints
  • 10 points: N <= 50
  • 20 points: N <= 1,000,000
  • 25 points: N <= 1,000,000,000 (hashtable DP)
  (One possible sketch of the hinted approach follows below.)
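  A hedged Java sketch of one way the "hashtable DP" hint could be implemented (an assumption about the intended solution, not taken from the slides): best(N) is digitsum(N) plus, whenever N factors as a * b with a, b > 1, the best max(best(a) + best(b)). Memoizing in a HashMap handles N up to 1,000,000,000 because only numbers reachable by repeated division ever become states.

    import java.util.HashMap;

    public class FactoringNumbers {
        static HashMap<Long, Long> memo = new HashMap<>();

        static long digitSum(long n) {
            long s = 0;
            for (; n > 0; n /= 10) s += n % 10;
            return s;
        }

        // best(n) = digitSum(n) + max(0, max over a*b = n with a, b > 1 of best(a) + best(n/a))
        static long best(long n) {
            Long cached = memo.get(n);
            if (cached != null) return cached;
            long ans = digitSum(n);                       // n itself is always written down
            for (long a = 2; a * a <= n; a++)
                if (n % a == 0)                           // split n into a and n / a, then keep splitting
                    ans = Math.max(ans, digitSum(n) + best(a) + best(n / a));
            memo.put(n, ans);
            return ans;
        }

        public static void main(String[] args) {
            System.out.println(best(24)); // 27, matching the example on the previous slide
        }
    }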
