
Recursive Back Tracking & Dynamic Programming



Presentation Transcript


  1. Thinking about Algorithms Abstractly Recursive Back Tracking & Dynamic Programming Jeff Edmonds York University Lecture 7 COSC 3101

  2. Techniques: Optimization Problems, A Sequence of Decisions, The Little Bird & Friend, Optimal Substructure, Memoization, Set of Sub-Instances, Tracing the Dyn. Prog. Alg., Reversing Code, Speeding Up the Running Time, Multiple Opt Solutions, Review Question for Little Bird, Review & Don'ts. Problems: Best Path, Printing Neatly, Longest Common Subsequence, Knapsack Problem, The Event Scheduling Problem, Parsing, Satisfiability.

  3. Dynamic Programming • A hard topic. • I try to provide a unified way to think of it and a fixed set of steps to follow. • Even if you don’t get the details of the algorithm correct, at least get the right structure. • I provide analogies (little bird) to make it hopefully more fun & easier to follow.

  4. Optimization Problems • An important and practical class of computational problems. • For most of these, the best known algorithm runs in exponential time. • Industry would pay dearly to have faster algorithms. • Heuristics • Some have quick Greedy or Dynamic Programming algorithms • For the rest, Recursive Back Tracking is the best option.

  5. Optimization Problems Ingredients: • Instances: The possible inputs to the problem. • Solutions for Instance: Each instance has an exponentially large set of solutions. • Cost of Solution: Each solution has an easy to compute cost or value.

  6. Optimization Problems Specification of an Optimization Problem • <preCond>: The input is one instance. • <postCond>: The output is one of the valid solutions for this instance with optimal cost. (minimum or maximum) • The solution might not be unique. • Be clear about these ingredients!

  7. Search Graph For Best Path We use it because it nicely demonstrates the concepts in a graphical way.

  8. Search Graph For Best Path An instance (input) consists of <G,s,t>: G is a weighted directed layered graph, s is the source node, and t is the sink node. (The slide's figure shows the example graph with its edge weights.)

  9. Search Graph For Best Path An instance (input) consists of <G,s,t>. A solution for an instance is a path from s to t. The cost of a solution is the sum of the weights, e.g. 2+6+3+7 = 18 or 4+2+1+5 = 12. The goal is to find a path with minimum total cost.

  10. Brute Force Algorithm Try all paths, return the best. But there may be an exponential number of paths!

  11. An Algorithm As A Sequence of Decisions I ask a question about the solution. “Which edge should we take first?” Somehow I decide <s,v3>. My friend asks the next question. “Which edge do we take second?” Somehow he decides <v3,v5>. His friend asks the next question. “Which edge do we take third?” Somehow he decides <v5,v8>.

  12. Taking the best first edge. An Algorithm As A Sequence of Decisions I ask a question about the solution. “Which edge should we take first?” How do I decide? The greedy algorithm? Does not work!

  13. Local vs Global Considerations • We are able to make local observations and choices. • E.g. which edge out of s is cheapest? • But it is hard to see the global consequences: • Which path is the overall cheapest? • Sometimes a local initial sacrifice can globally lead to a better overall solution.

  14. But let's skip this part by pretending that we have a little bird to answer this little question. An Algorithm As A Sequence of Decisions I ask a question about the solution. “Which edge should we take first?” How do I decide? • In reality we will try all possible first edges.

  15. "Little Bird" Abstraction Recall: Non-deterministic Finite Automata Non-deterministic Turing Machine 0 These have a higher power to tell them which way to go. The little bird is a little higher power, answering a little question about an optimal solution. (It is up to you whether or not to use it)

  16. But we don’t want to worry about how our friend solves his problem. Little Bird & Friend Alg I ask a question about the solution. “Which edge should we take first?” The bird answers <s,v1>. My friend asks the next question. “Which edge do we take second?” The bird answers <v1,v4>.

  17. Sub-Instance for Friend Our instance is <G,s,t>: Find best path from s to t. Our friend is recursion • i.e. he is a smaller version of ourselves • we can trust him to give us a correct answer • as long as we give him • a smaller instance • of the same problem. • What sub-instance do we give him?

  18. Little Bird & Friend Alg The bird answers <s,v1>. If I trust the little bird, I take a step along edge <s,v1> and ask my friend, “Which is the best path from v1 to t?” Friend answers <v1,v6,t> with weight 10. To get my solution I tack on the bird’s edge, making the path <s,v1,v6,t> with weight 10+3=13.
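This trusting-the-bird step can be sketched in code. A minimal sketch in Python, assuming the graph is a dictionary mapping each node to its weighted out-edges and that `bird_answer` is some hypothetical oracle for the first edge (neither encoding is fixed by the slides):

```python
# One bird-and-friend step, assuming a trusted bird.
# G maps each node to a dict {neighbor: edge_weight} (hypothetical layout).

def best_path_with_bird(G, s, t, bird_answer):
    """Return (path, cost) from s to t, trusting the bird's first edge."""
    if s == t:                       # base case: already at the sink
        return [t], 0
    v = bird_answer(G, s, t)         # bird: "which edge should we take first?"
    sub_path, sub_cost = best_path_with_bird(G, v, t, bird_answer)  # friend
    # Tack the bird's edge onto the friend's solution.
    return [s] + sub_path, G[s][v] + sub_cost
```

If the bird is truthful, each recursive call solves a strictly smaller sub-instance of the same problem, so the friend can be trusted by induction.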

  19. Faulty Bird But what if we do not have a bird that we trust? This work is not wasted, because we have found • the best solution to our instance from amongst those consistent with this bird's answer, • i.e. the best path from s to t from amongst those starting with <s,v1>. Define optS<I,k> to be: the optimum solution for instance I consistent with the bird's kth answer.

  20. Faulty Bird But what if we do not have a bird that we trust? This work is not wasted, because we have found • the best solution to our instance from amongst those consistent with this bird's answer, • i.e. the best path from s to t from amongst those starting with <s,v1>. In reality we will try all possible first edges, giving …..

  21. Faulty Bird …the best path from amongst those starting with <s,v1>.

  22. Faulty Bird … and the best path from amongst those starting with <s,v2>.

  23. Faulty Bird … and the best path from amongst those starting with <s,v3>.

  24. Faulty Bird … and the best path from amongst those starting with <s,v4>.

  25. I give the best of the best as the best path. Faulty Bird At least one of these four paths must be an overall best path.

  26. Bird/Friend - Best of the Best Consider our instance I. Consider the set of solutions. A sequence of questions to a little bird about a solution forms a tree of possible answers.

  27. Bird/Friend - Best of the Best Consider our instance I. Consider the set of solutions. But we only care about the first bird answer. The answers classify the possible solutions: each class contains the solutions consistent with the bird's kth answer.

  28. Bird/Friend - Best of the Best Consider our instance I. Consider the set of solutions. Define optS<I,k> to be: the optimum solution for instance I consistent with the bird's kth answer. Do this for each k.

  29. Bird/Friend - Best of the Best Consider our instance I. Consider the set of solutions. Define optS<I,k> to be: the optimum solution for instance I consistent with the bird's kth answer. Do this for each k. Let kmax be the bird's answer giving the best optS<I,k>. Then optS[I] = optS<I,kmax> = Best_k optS<I,k>.

  30. Bird/Friend - Best of the Best Constructing optS<I,k>: the optimum solution for instance I consistent with the bird's kth answer. Given my instance I, I ask my little bird for an answer k. I ask my friend for his solution. I combine them.

  31. Recursive backtracking code always has this same basic structure.

  32. Be clear what are • the instances, • its solutions, • the cost of a solution.

  33. Loop through the bird answers. Be clear which is the current one being tried.

  34. Give the bird & friend algorithm as a comment. (Unless it is in an earlier question.)

  35. What is the bird asked? What does she answer?

  36. Get help from friend. Be clear what sub-instance you give him. Store the solution & cost he gives you.

  37. How do you form your solution from the friend’s and from the bird’s?

  38. How do you form your cost from the friend’s and from the bird’s?

  39. optSolk is a best solution for our instance from amongst those consistent with the bird's kth answer. Take the best of the best.

  40. Return the solution and cost for the original instance.

  41. Base Cases: Instances that are too small to have smaller instances to give to friends. What are these? What are their solutions and costs?
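Slides 31–41 describe the pieces of the code; assembled for the Best Path problem they might look as follows (the adjacency-dictionary encoding of G is an assumption, not something the slides fix):

```python
import math

# Recursive backtracking for the minimum-weight path from s to t.
# Instance: <G, s, t>; solution: a path from s to t; cost: sum of edge weights.
# G is assumed to map each node to a dict {neighbor: edge_weight}.

def best_path(G, s, t):
    # Base case: an instance too small to have sub-instances for friends.
    if s == t:
        return [t], 0
    best_sol, best_cost = None, math.inf
    # Loop through the bird answers: every possible first edge <s, v>.
    for v, weight in G[s].items():
        # Give the friend the sub-instance <G, v, t>; store his solution & cost.
        sub_sol, sub_cost = best_path(G, v, t)
        if sub_sol is None:          # no path from v to t along this branch
            continue
        # Form our solution and cost from the bird's edge and the friend's.
        sol, cost = [s] + sub_sol, weight + sub_cost
        # Take the best of the best.
        if cost < best_cost:
            best_sol, best_cost = sol, cost
    # Return the solution and cost for the original instance.
    return best_sol, best_cost
```

Note that this still tries every path, just as the brute-force algorithm does, so it can take exponential time.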

  42. Optimal Substructure In order to be able to design a recursive backtracking algorithm for a computational problem, the problem needs to have a recursive structure, i.e. for a path from s to t to be optimal, the sub-path from vi to t must be optimal: if there were a shorter path from vi to t, there would be a shorter path from s to t.

  43. Optimal Substructure In order to be able to design a recursive backtracking algorithm for a computational problem, the problem needs to have an optimal substructure, i.e. for a path from s to t to be optimal, the sub-path from vi to t must be optimal. And finding such a sub-path is a sub-instance of the same computational problem.

  44. Optimal Substructure • Optimal substructure means that • Every optimal solution to a problem contains... • ...optimal solutions to subproblems • Optimal substructure does not mean that • If you have optimal solutions to all subproblems... • ...then you can combine any of them to get an optimal solution to a larger problem. • Example: In Canadian coinage, • The optimal solution to 7¢ is 5¢ + 1¢ + 1¢, and • The optimal solution to 6¢ is 5¢ + 1¢, but • The optimal solution to 13¢ is not 5¢ + 1¢ + 1¢ + 5¢ + 1¢ • But there is some way of dividing up 13¢ into subsets with optimal solutions (say, 11¢ + 2¢) that will give an optimal solution for 13¢ • Hence, the making change problem exhibits optimal substructure.
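The bird-and-friend view of making change: the bird names one coin of an optimal solution, the friend optimally makes change for the remainder, and we take the best over all possible first coins. A minimal sketch (Canadian denominations as in the slide; the code itself is an illustration, not from the slides):

```python
# Fewest coins summing to `amount`, by trying every possible first coin.
COINS = (1, 5, 10, 25)  # Canadian coinage, as in the slide

def min_coins(amount):
    if amount == 0:                  # base case: nothing left to pay
        return 0
    # Bird: which coin is in an optimal solution? Try them all;
    # friend: optimally make change for the remainder.
    return 1 + min(min_coins(amount - c) for c in COINS if c <= amount)
```

Here min_coins(7) is 3 (5+1+1), min_coins(6) is 2 (5+1), and min_coins(13) is 4 (10+1+1+1): the optimal solution for 13¢ contains an optimal solution for some remainder, even though not every combination of optimal sub-solutions works.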

  45. Optimal Substructure Don’t all problems have this optimal substructure property?

  46. Optimal Substructure: Longest Simple Path Consider the graph in the slide's figure (nodes A, B, C, D with weighted edges): The longest simple path (path not containing a cycle) from A to D is A B C D. However, the subpath A B is not the longest simple path from A to B (A C B is longer). The principle of optimality is not satisfied for this problem. Hence, the longest simple path problem (which is NP-Complete) cannot be solved by a dynamic programming approach.

  47. Same as Brute Force Algorithm I try each edge out of s. A friend tries each edge out of these. A friend tries each edge out of these. Time? Same as the brute force algorithm that tries each path.

  48. Same as Brute Force Algorithm But there may be an exponential number of paths!

  49. Speeding Up the Time Why do all this work with birds & friends? • How else would you iterate through all paths? • But sometimes we can exploit the structure to speed up the algorithm.
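One way to exploit the structure is memoization (it appears later in the outline): in a layered graph the same sub-instance "best path from v to t" is reached along many different prefixes, so caching each friend's answer means each node is solved only once. A sketch, assuming an adjacency-dictionary encoding of G (an assumption, not fixed by the slides):

```python
import math

def best_path_memo(G, s, t, memo=None):
    """Minimum-weight path from s to t, caching sub-answers keyed by node."""
    if memo is None:
        memo = {}
    if s == t:                        # base case
        return [t], 0
    if s in memo:                     # this sub-instance was already solved
        return memo[s]
    best_sol, best_cost = None, math.inf
    for v, weight in G[s].items():    # still try all possible first edges
        sub_sol, sub_cost = best_path_memo(G, v, t, memo)
        if sub_sol is not None and weight + sub_cost < best_cost:
            best_sol, best_cost = [s] + sub_sol, weight + sub_cost
    memo[s] = (best_sol, best_cost)   # remember the answer for this node
    return memo[s]
```

The plain recursion tree has exponentially many leaves; with the cache the work is proportional to the number of edges, which is exactly the step from recursive backtracking to dynamic programming.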

  50. Speeding Up the Time Sometimes an entire branch can be pruned off. • Perhaps because these solutions are not valid or not highly valued. • Or because there is at least one optimal solution elsewhere in the tree. • A Greedy algorithm prunes off all branches except the one that looks best.
