
Chapter 7- Local Search part 2



  1. Chapter 7- Local Search part 2 Ryan Kinworthy CSCE 990-06 Advanced Constraint Processing

  2. Outline • Chapter Introduction • Greedy Local Search (SLS) • Random Walk • Properties of Local Search • Empirical Evaluation • Hybrids of Local Search and Inference • Effects of Constraint Propagation on SLS • Local Search on Cycle-Cutset • Chapter Summary

  3. Example (2): Simulated Annealing • Uses a noise model from statistical mechanics • At each step, the algorithm computes the change Δ in the cost function that would result from changing the chosen variable to the value picked • If the change improves or doesn't affect the cost function, the change is made • Otherwise the change is made with probability e^(-Δ/T), where T is the temperature • T can be held constant, or slowly reduced from a high temperature to a low one according to some schedule • The algorithm converges to an exact solution if the temperature is reduced gradually enough
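
  A minimal Python sketch of this acceptance rule, assuming the cost of an assignment is the number of violated constraints; the helper names (violations, domains, etc.) are illustrative, not from the chapter:

      import math
      import random

      def violations(assignment, constraints):
          # cost function: number of constraints the assignment violates
          return sum(1 for check in constraints if not check(assignment))

      def simulated_annealing(variables, domains, constraints,
                              t_start=2.0, cooling=0.995, max_steps=100000):
          a = {v: random.choice(domains[v]) for v in variables}
          cost, t = violations(a, constraints), t_start
          for _ in range(max_steps):
              if cost == 0:
                  return a                       # all constraints satisfied
              v = random.choice(variables)
              old = a[v]
              a[v] = random.choice(domains[v])   # propose a random value change
              delta = violations(a, constraints) - cost
              if delta <= 0 or random.random() < math.exp(-delta / t):
                  cost += delta                  # accept: always if not worse,
              else:                              # else with probability e^(-Δ/T)
                  a[v] = old                     # reject the worsening move
              t *= cooling                       # geometric cooling schedule
          return None                            # budget exhausted, no solution

  With cooling = 1.0 the temperature is held constant; the more slowly it decays, the closer the run is to the gradual-reduction regime mentioned above.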

  4. Empirical Evaluation • Evaluating local search algorithms • Done empirically, using either benchmarks or randomly generated problems • The recent trend is to generate hard random problems from the phase-transition region • Empirical evaluation of GSAT algorithms • As the number of variables and clauses increases, the running time of the algorithms increases • SLS algorithms listed in order of efficiency (most to least): • Simulated annealing • GSAT with random walk and noise • GSAT with random walk • Basic GSAT

  5. Hybrids of Local Search & Inference • Inference combined with general (backtracking) search is very effective; how well does inference work with local search? • Effects of constraint propagation on SLS • Certain classes of problems that are easy for BT are very hard for SLS • Example: certain variations of 3SAT are extremely hard for SLS • However, when inference methods are combined with SLS, these 3SAT problems become trivial

  6. Why Is Inference so Effective for Local Search? • Not definitively explained • Conjectures given • Enforcing local consistency eliminates many "near-solutions" and thus reduces the search space • That is, assignments that satisfy almost all clauses, and would otherwise have a cost near zero, become high-cost once local consistency is enforced • Doesn't always hold • Problems with uniform structure perform worse with inference • Application of combining inference and SLS • Cycle-cutset

  7. What is a Cycle-Cutset? • First described in Chapter 5 (p. 146) • Definition: given an undirected graph, a subset of nodes is a cycle-cutset if its removal results in a graph with no cycles • The cycle-cutset scheme alternates between two algorithms • BT search on the cutset portion • Tree inference on the rest • Key benefits • Once a variable is instantiated, it can be removed from the constraint graph • If the set of instantiated variables forms a cycle-cutset, then the remaining nodes form a tree, and we can use directional rather than full consistency algorithms to solve it • Applicable to local search/inference hybrids • If we can guarantee that the constraint graph is a tree, we can use directional arc-consistency as an inference method for solving it • Example on page 206
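
  The definition can be checked directly: a node set is a cycle-cutset iff the edges surviving its removal form a forest, which a union-find detects. A small sketch, assuming nodes labeled 0..n-1 and an edge list (names illustrative):

      def is_cycle_cutset(n, edges, cutset):
          # cutset is a cycle-cutset iff deleting its nodes (and their
          # incident edges) leaves an acyclic graph; test via union-find
          parent = list(range(n))
          def find(x):
              while parent[x] != x:
                  parent[x] = parent[parent[x]]  # path halving
                  x = parent[x]
              return x
          for u, v in edges:
              if u in cutset or v in cutset:
                  continue                       # edge removed with the cutset
              ru, rv = find(u), find(v)
              if ru == rv:
                  return False                   # this edge closes a cycle
              parent[ru] = rv
          return True

      # the 4-cycle 0-1-2-3-0: any single node is a cycle-cutset
      assert is_cycle_cutset(4, [(0, 1), (1, 2), (2, 3), (3, 0)], {0})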

  8. Local Search on a Cycle-Cutset • Instantiated variables cut the flow of information on any path they are on • In other words, the network is equivalent to one in which the instantiated variable is deleted and the influence of its value is propagated to all neighboring nodes • So when the set of instantiated variables removes all cycles in the graph, the remaining network can be viewed as a tree and can therefore be solved by a tree-inference algorithm (e.g., arc-consistency) • Complexity • Bounded exponentially in the size of the cutset (the tree part is solved in polynomial time) • Note: finding a minimum cycle-cutset is itself NP-hard
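
  A sketch of the conditioning step described above, under the assumption that binary constraints are stored as predicates keyed by ordered variable pairs (a hypothetical representation): fixing a variable propagates its value into each neighbor's domain, after which the variable can be dropped from the graph:

      def condition(domains, constraints, var, value):
          # fix var = value: prune neighbor values inconsistent with it,
          # then treat var as deleted from the constraint graph
          new_domains = {}
          for v, dom in domains.items():
              if v == var:
                  continue                        # var leaves the network
              check = constraints.get((var, v))   # binary constraint, if any
              if check is None:
                  new_domains[v] = list(dom)
              else:
                  new_domains[v] = [b for b in dom if check(value, b)]
          return new_domains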

  9. Hybrid Local Search on a Cycle-Cutset • Where does SLS fit in? • Since SLS approximates search, it replaces BT search on the cutset • A mechanism for collaboration in hybrids • The tree algorithm works even for networks with cycles • Any assignment it produces minimizes the number of violated constraints across all its tree subnetworks

  10. Tree Algorithm • Input: • An arc-consistent network R • Variables X partitioned, X = Z ∪ Y, • into cycle-cutset Y and • tree variables Z • An assignment Y = y • Output: • An assignment Z = z that minimizes the number of violated constraints of the entire network when Y = y

  11. Tree Algorithm (cont.) • Initialization • For any value y[i] of a cutset variable y_i, the cost C_y_i(y[i], y) is 0 • Algorithm body • 1) Going from leaves to root in the tree: for every variable z_i and every value a_i in D_z_i, compute the cost of each assignment • 2) Going from root to leaves: compute new assignments for every tree variable z_i • For a tree variable z_i, let D_z_i be its values consistent with v_p_i, the value assigned to its parent p_i; assign each variable a value based on the costs computed in the first pass
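
  A minimal Python sketch of the two passes, assuming the tree is rooted and represented by children lists, and that constraints between a tree variable and the fixed cutset assignment y are folded into a unary cost; the names (tree_min, violated, unary) are assumptions, not the book's notation:

      def tree_min(root, children, domains, violated, unary):
          # children[v]: v's children in the rooted tree
          # violated(p, a, c, b): 1 if p=a and child c=b break their
          #                       constraint, else 0
          # unary(v, a): violations between v=a and the fixed cutset values y
          cost = {}  # cost[v][a] = min violations in v's subtree given v=a

          def up(v):                              # pass 1: leaves to root
              for c in children[v]:
                  up(c)
              cost[v] = {a: unary(v, a) +
                            sum(min(cost[c][b] + violated(v, a, c, b)
                                    for b in domains[c])
                                for c in children[v])
                         for a in domains[v]}

          def down(v, a, z):                      # pass 2: root to leaves
              z[v] = a                            # fix v, then give each child
              for c in children[v]:               # its best value given v = a
                  best = min(domains[c],
                             key=lambda b: cost[c][b] + violated(v, a, c, b))
                  down(c, best, z)

          up(root)
          z = {}
          down(root, min(domains[root], key=lambda a: cost[root][a]), z)
          return z  # assignment minimizing #violations given Y = y

  On a forest, the two passes are run from each root; the structure is the min-sum counterpart of solving a tree by directional arc-consistency.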

  12. Tree Algorithm (AAAI'96) • More clearly explained in Dechter and Kask's paper (today's handout) • "A Graph-Based Method for Improving GSAT" • Appeared in the AAAI'96 conference proceedings • Tree Algorithm • Generalization of Mackworth & Freuder '85 • Generalized to work with cyclic networks • In acyclic (tree) networks it functions the same as M&F • In cyclic networks it finds an assignment that minimizes the sum of unsatisfied constraints over all its tree subnetworks

  13. Tree Algorithm (AAAI'96) cont. • Example on board • Still unclear how the weight of a constraint is calculated • Discussion

  14. SLS With Cycle-Cutset • Benefit of using tree algorithm • Minimizes the cost of tree subnetworks given a cycle-cutset assignment • This means we can replace BT search with an SLS search • Need to combine the tree algorithm with SLS • Results in a concrete algorithm: • SLS + CC

  15. Overview of SLS + CC • The algorithm executes a fixed number of tries • For each try • Start from a random initialization • Alternate between SLS and the tree algorithm: • SLS chooses an initial assignment for the Y variables • TA finds a min-cost assignment to the Z variables • SLS fixes z, chooses the best y, fixes y • TA finds the best assignment for Z, fixes z • SLS on Y… • TA on Z… (a sketch of this loop follows slide 17) • Note: • Only adjacent tree variables should affect the behavior of SLS • The algorithm must enforce this property • Otherwise the performance of SLS + CC deteriorates by several orders of magnitude • The cycle-cutset idea can be generalized • SLS + CC is specific to the case w* = 1 (a tree) • In cases where w* > 1, SLS cannot be used and a general backtracking search must be used instead

  16. SLS + CC • Input: • An arc-consistent network R • Variables X partitioned, X = Z ∪ Y, • into cycle-cutset Y and • tree variables Z • Output: • An assignment Z = z, Y = y that is a local minimum of the number of violated constraints C(z, y)

  17. SLS + CC (cont.) • Repeat MAX_TRIES times • Algorithm body • 1) Random initial assignment for all variables • 2) Alternate between these steps until the problem is solved, the TA no longer changes any variable's value, or no other progress is made • a) While the values of the cycle-cutset variables are fixed, run the TA on the Z variables • b) While the values of the tree variables are fixed, run SLS on the Y variables
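
  A schematic Python sketch of this loop, assuming hypothetical helpers: tree_optimize(a) returns optimal values for the tree variables Z given the current cutset values (e.g., a wrapper around the tree_min routine sketched after slide 11), sls_step(a, cutset) performs one SLS improvement pass over Y, and solved(a) tests whether all constraints hold:

      import random

      def sls_cc(variables, cutset, tree_vars, domains,
                 solved, tree_optimize, sls_step,
                 max_tries=100, max_flips=10000):
          for _ in range(max_tries):
              # 1) random initial assignment for all variables
              a = {v: random.choice(domains[v]) for v in variables}
              for _ in range(max_flips):
                  # a) cutset values fixed: TA sets the tree variables Z
                  z_new = tree_optimize(a)
                  unchanged = all(a[v] == z_new[v] for v in tree_vars)
                  a.update(z_new)
                  if solved(a):
                      return a
                  if unchanged:
                      break                  # TA made no change: restart try
                  # b) tree values fixed: SLS flips cutset variables Y
                  a.update(sls_step(a, cutset))
                  if solved(a):
                      return a
          return None                        # every try ended in a local minimum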

  18. SLS + CC Performance • SLS + CC vs. SLS • Empirical evaluation (pp. 210-211) • For problems where the cycle-cutset is < 30% of the variables: • SLS + CC can solve 3-4 times more problems than SLS alone given equal CPU time • When the cycle-cutset is ≈ 30% of the variables: • SLS + CC performs about the same as plain SLS • For problems where the cycle-cutset is > 30% of the variables: • SLS alone is better than SLS + CC

  19. Summary of Local Search • The good • Significantly faster in some problem domains than BT • Can solve previously unsolvable problems • Does more with less CPU time • Hybrid algorithms are even more efficient • The bad • Not complete (doesn't guarantee finding a solution, and cannot prove that none exists) • Not applicable to all domains • Can get stuck in local minima • The ugly • If applied to the wrong domain, it can be a waste of time • Code carefully or performance will deteriorate rapidly!

  20. Discussion • Questions? • Thoughts? • Opinions?
