
Soft constraint processing


Presentation Transcript


  1. These slides are provided as a teaching support for the community. They can be freely modified and used as long as the original authors' (T. Schiex and J. Larrosa) contribution is clearly mentioned and visible, and as long as any modification is acknowledged by its author. Soft constraint processing. Thomas Schiex, INRA – Toulouse, France. Javier Larrosa, UPC – Barcelona, Spain

  2. Overview • Frameworks • Generic and specific • Algorithms • Search: complete and incomplete • Inference: complete and incomplete • Integration with CP • Soft as hard • Soft as global constraint

  3. Parallel mini-tutorial • CSP and SAT are strongly related • Along the presentation, we will highlight the connections with SAT • Multimedia trick: SAT slides have a yellow background

  4. Why soft constraints? • CSP framework: natural for decision problems • SAT framework: natural for decision problems with boolean variables • Many problems are constrained optimization problems and the difficulty is in the optimization part

  5. Why soft constraints? • Earth Observation Satellite Scheduling • Given a set of requested pictures (of different importance)… • … select the best subset of compatible pictures … • … subject to available resources: • 3 on-board cameras • Data-bus bandwidth, setup times, orbiting • Best = maximize sum of importance

  6. Why soft constraints? • Frequency assignment • Given a telecommunication network • …find the best frequency for each communication link avoiding interferences • Best can be: • Minimize the maximum frequency (max) • Minimize the global interference (sum)

  7. Why soft constraints? • Combinatorial auctions • Given a set G of goods and a set B of bids… • Bid (bi,vi): bi requested goods, vi value • … find the best subset of compatible bids • Best = maximize revenue (sum) [Figure: bids b1…b4 with values v1…v4 over goods G1…G8]

  8. Why soft constraints? • Probabilistic inference (Bayesian nets) • Given a probability distribution defined by a DAG of conditional probability tables • and some evidence … • …find the most probable explanation for the evidence (product)

  9. Why soft constraints? • Even in decision problems, users may have preferences among solutions • Experiment: give users a few solutions and they will find reasons to prefer some of them

  10. Observation • Optimization problems are harder than satisfaction problems: CSP vs. Max-CSP

  11. Why is it so hard? • Consider the problem P(α): is there an assignment of cost lower than α? • Proving optimality means proving that P(opt) has no solution, i.e. a proof of inconsistency • This is harder than just finding an optimum

  12. Notation • X={x1,..., xn} variables (n variables) • D={D1,..., Dn} finite domains (max size d) • Z ⊆ Y ⊆ X • tY is a tuple on Y • tY[Z] is its projection on Z • tY[-x] = tY[Y-{x}] is projecting out variable x • fY: ∏xi∊Y Di → E is a cost function with scope Y
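The notation can be made concrete with a small Python sketch (illustrative only; the dict-based tuples and the helper names are not from the tutorial):

```python
# Illustrative encoding of the notation: a tuple on Y is a dict
# {variable: value}; projection keeps only the variables of Z.

def project_tuple(t_Y, Z):
    """t_Y[Z]: restriction of the tuple t_Y to the variables in Z."""
    return {x: v for x, v in t_Y.items() if x in Z}

def project_out_var(t_Y, x):
    """t_Y[-x]: projection of t_Y on Y - {x}."""
    return {y: v for y, v in t_Y.items() if y != x}

# A cost function f_Y maps every tuple on Y to a cost in E
# (here E = natural numbers, as in weighted CNs).
f_12 = {('b', 'b'): 0, ('b', 'g'): 1, ('g', 'b'): 1, ('g', 'g'): 0}

t = {'x1': 'b', 'x2': 'g', 'x3': 'r'}
print(project_tuple(t, {'x1', 'x2'}))   # {'x1': 'b', 'x2': 'g'}
print(project_out_var(t, 'x3'))         # {'x1': 'b', 'x2': 'g'}
```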

  13. Generic and specific frameworks: valued CN, weighted CN, semiring CN, fuzzy CN, …

  14. Costs (preferences) • E: set of costs (preferences), ordered by ≼ • if a ≼ b then a is better than b • Costs are associated to tuples • Combined with a dedicated operator ⊕ • max: priorities → fuzzy/possibilistic CN • +: additive costs → weighted CN • *: factorized probabilities → probabilistic CN, BN

  15. Soft constraint network (CN) • (X,D,C) • X={x1,..., xn} variables • D={D1,..., Dn} finite domains • C={fS,...} cost functions • fS, fij, fi, f∅ have scope S, {xi,xj}, {xi}, ∅ • fS(t) ∈ E (ordered by ≼, with maximum T) • Obj. function: F(X) = ⊕ fS(X[S]) • Solution: t such that F(t) ≺ T • Task: find an optimal solution • ⊕ is commutative, associative and monotonic, with identity ⊥ and annihilator T
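A minimal sketch of such a network and its objective function, assuming a weighted CN (⊕ = bounded addition) and an illustrative (scope, table) representation:

```python
# Sketch of a soft CN evaluation with ⊕ = bounded addition (weighted CN).
# The (scope, table) representation is illustrative.

TOP = 1000  # T: maximum acceptable cost (annihilator of the combination)

def plus(a, b):
    """Bounded addition: costs never exceed T."""
    return min(TOP, a + b)

def evaluate(cost_functions, assignment):
    """F(X): combination over all f_S of f_S(X[S])."""
    total = 0
    for scope, table in cost_functions:
        total = plus(total, table[tuple(assignment[x] for x in scope)])
    return total

# Two unary and one binary cost function on x1, x2
C = [(('x1',), {('b',): 0, ('g',): 1}),
     (('x2',), {('b',): 0, ('g',): 1}),
     (('x1', 'x2'), {('b', 'b'): TOP, ('b', 'g'): 0,
                     ('g', 'b'): 0, ('g', 'g'): TOP})]

t = {'x1': 'b', 'x2': 'g'}
print(evaluate(C, t))  # 1: t is a solution since 1 < TOP
```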

  16. Specific frameworks

  17. Weighted clauses • (C,w) weighted clause • C disjunction of literals • w cost of violation • w ∈ E (ordered by ≼, with maximum T) • ⊕ combinator of costs • Cost functions = weighted clauses: (xi ∨ xj, 6), (¬xi ∨ xj, 2), (¬xi ∨ ¬xj, 3)

  18. Soft CNF formula • F={(C,w),…} set of weighted clauses • (C, T) mandatory clause • (C, w≺T) non-mandatory clause • Valuation: F(X) = ⊕ of the weights of the unsatisfied clauses • Model: t such that F(t) ≺ T • Task: find an optimal model
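A hedged sketch of the soft CNF valuation for the weighted case (⊕ = +); the literal encoding is illustrative:

```python
# Sketch of soft CNF evaluation: a literal is a (variable, sign) pair,
# a weighted clause is (literals, weight); mandatory clauses get weight TOP.

TOP = float('inf')

def clause_satisfied(literals, model):
    return any(model[var] == sign for var, sign in literals)

def valuation(formula, model):
    """F(X): sum of the weights of the unsatisfied clauses."""
    return sum(w for lits, w in formula if not clause_satisfied(lits, model))

F = [([('x', True), ('y', True)], 6),     # (x ∨ y, 6)
     ([('x', False), ('y', True)], 2),    # (¬x ∨ y, 2)
     ([('x', False), ('y', False)], 3)]   # (¬x ∨ ¬y, 3)

m = {'x': True, 'y': False}
print(valuation(F, m))   # 2: only (¬x ∨ y, 2) is unsatisfied
# m is a model iff its valuation is strictly below TOP
```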

  19. Specific weighted prop. logics

  20. CSP example (3-coloring) • Variables x1,…,x5, one per vertex of the graph • For each edge: a hard difference constraint [Figure: the constraint graph on x1,…,x5]

  21. Weighted CSP example (⊕ = +) • For each vertex: a unary cost function (cost 1 if the vertex is not blue) • F(X): number of non-blue vertices [Figure: the same graph on x1,…,x5]
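As a tiny illustration of this objective (the colour names are assumptions, not from the slide):

```python
# Sketch of the slide's objective (⊕ = +): one unary cost per vertex,
# cost 1 when the vertex is not blue; the edges stay hard constraints.

def non_blue_cost(colouring):
    """F(X): number of non-blue vertices."""
    return sum(1 for colour in colouring.values() if colour != 'blue')

print(non_blue_cost({'x1': 'blue', 'x2': 'green', 'x3': 'blue',
                     'x4': 'red', 'x5': 'blue'}))   # 2
```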

  22. Possibilistic CSP example (⊕ = max) • For each vertex: a unary cost function over the colors • F(X): highest color used (b < g < r) [Figure: the same graph on x1,…,x5]

  23. Some important details • T = maximum acceptable violation • Empty scope soft constraint f∅ (a constant) • Gives an obvious lower bound on the optimum • If you do not like it: set f∅ = ⊥ (e.g. 0 in weighted CNs) • Additional expressive power

  24. Weighted CSP example (⊕ = +) • For each vertex: a unary cost (1 if not blue) • For each edge: a hard difference constraint • f∅ = 0 • F(X): number of non-blue vertices • With T = 3 (instead of T = 6): find an optimal coloration with fewer than 3 non-blue vertices

  25. General frameworks and cost structures • Semiring CSP: lattice-ordered costs, multi criteria • Valued CSP: totally ordered costs • Special cases: idempotent, fair, classic hard ({⊥,T}) [Diagram: inclusion relations between these cost structures]

  26. Idempotency • a ⊕ a = a (for any a) • For any fS implied by (X,D,C): (X,D,C) ≡ (X,D,C ∪ {fS}) • Classic CN: ⊕ = and • Possibilistic CN: ⊕ = max • Fuzzy CN: ⊕ = max≼ • … • (+ is not idempotent: a + a ≠ a as soon as a > 0)

  27. Fairness • Ability to compensate for cost increases by subtraction, using a pseudo-difference ⊖: for b ≼ a, (a ⊖ b) ⊕ b = a • Classic CN: ⊖ = or (max) • Fuzzy CN: ⊖ = max≼ • Weighted CN: a ⊖ b = a - b if a ≠ T, else T • Bayes nets: ⊖ = division • …
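A small sketch checking the fairness equation for the weighted CN pseudo-difference (TOP models T; names are illustrative):

```python
# Sketch of the weighted CN pseudo-difference: ordinary subtraction, except
# that a cost of TOP stays TOP (it cannot be compensated).

TOP = float('inf')

def pseudo_diff(a, b):
    """a ⊖ b, defined for b ≼ a (here b <= a)."""
    assert b <= a
    return a - b if a != TOP else TOP

def plus(a, b):
    """Bounded addition ⊕."""
    return min(TOP, a + b)

# Fairness: (a ⊖ b) ⊕ b == a
for a, b in [(7, 3), (TOP, 5)]:
    assert plus(pseudo_diff(a, b), b) == a
print("fairness checked")
```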

  28. Processing soft constraints • Search: complete (systematic), incomplete (local) • Inference: complete (variable elimination), incomplete (local consistency)

  29. Systematic search Branch and bound(s)

  30. I - Assignment (conditioning) • Assigning xi = b in a cost function f yields the cost function f[xi=b] on the remaining variables of its scope • Example in the figure: f(xi,xj) conditioned to g(xj) = f[xi=b], then to the constant h = g[xj=r]
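A sketch of conditioning on the (scope, table) representation used in the earlier sketches (the values 'b', 'r' and the cost entries are made up for illustration):

```python
# Sketch of conditioning: assigning var = value in a cost function yields a
# cost function on the remaining variables of its scope.

def condition(scope, table, var, value):
    """Return the scope and table of f[var = value]."""
    i = scope.index(var)
    new_scope = scope[:i] + scope[i + 1:]
    new_table = {t[:i] + t[i + 1:]: cost
                 for t, cost in table.items() if t[i] == value}
    return new_scope, new_table

f = (('xi', 'xj'), {('b', 'b'): 0, ('b', 'r'): 2, ('r', 'b'): 1, ('r', 'r'): 0})
g = condition(*f, 'xi', 'b')         # g(xj) = f[xi = b]
h = condition(*g, 'xj', 'r')         # a constant: empty scope
print(g)   # (('xj',), {('b',): 0, ('r',): 2})
print(h)   # ((), {(): 2})
```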

  31. I - Assignment (conditioning) • {(x∨y∨z, 3), (¬x∨y, 2)} • x=true satisfies the first clause and reduces the second to (y, 2) • y=false reduces it to the empty clause (□, 2) • The empty clause cannot be satisfied: 2 is a necessary cost

  32. Systematic search • Each node is a soft constraint subproblem (obtained by conditioning on the assigned variables) • (LB) Lower bound = f∅, an under-estimation of the best solution in the sub-tree • (UB) Upper bound = best solution found so far (initially T) • If LB ≥ UB then prune
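A minimal depth-first branch and bound in this spirit (illustrative; the lower bound used here is only the cost of the fully assigned cost functions, much weaker than the bounds of the next slides):

```python
# Minimal depth-first branch and bound sketch for a weighted CN (⊕ = +).

def cost_of(assignment, cost_functions):
    """Cost of the cost functions whose scope is fully assigned."""
    total = 0
    for scope, table in cost_functions:
        if all(x in assignment for x in scope):
            total += table[tuple(assignment[x] for x in scope)]
    return total

def branch_and_bound(variables, domains, cost_functions, top):
    best, ub = None, top                            # UB starts at T
    def dfs(assignment):
        nonlocal best, ub
        lb = cost_of(assignment, cost_functions)    # LB for the subtree
        if lb >= ub:                                # prune
            return
        if len(assignment) == len(variables):
            best, ub = dict(assignment), lb         # improve UB
            return
        x = variables[len(assignment)]
        for v in domains[x]:
            assignment[x] = v
            dfs(assignment)
            del assignment[x]
    dfs({})
    return best, ub

doms = {'x1': ['b', 'g'], 'x2': ['b', 'g']}
C = [(('x1',), {('b',): 0, ('g',): 1}),
     (('x2',), {('b',): 0, ('g',): 1}),
     (('x1', 'x2'), {('b', 'b'): 10, ('b', 'g'): 0, ('g', 'b'): 0, ('g', 'g'): 10})]
print(branch_and_bound(['x1', 'x2'], doms, C, top=10))  # ({'x1': 'b', 'x2': 'g'}, 1)
```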

  33. Improving the lower bound (WCSP) • Sum up costs that will necessarily occur (no matter what values are assigned to the variables) • PFC-DAC (Wallace et al. 1994) • PFC-MRDAC (Larrosa et al. 1999…) • Russian Doll Search (Verfaillie et al. 1996) • Mini-buckets (Dechter et al. 1998)

  34. Improving the lower bound (Max-SAT) • Detect independent subsets of mutually inconsistent clauses • LB4a (Shen and Zhang, 2004) • UP (Li et al, 2005) • Max Solver (Xing and Zhang, 2005) • MaxSatz (Li et al, 2006) • …

  35. Local search Nothing really specific

  36. Local search • Based on perturbation of solutions in a local neighborhood • Simulated annealing • Tabu search • Variable neighborhood search • Greedy rand. adapt. search (GRASP) • Evolutionary computation (GA) • Ant colony optimization… • See: Blum & Roli, ACM comp. surveys, 35(3), 2003 • For Boolean variables: GSAT, …

  37. Boosting systematic search with local search • Run local search prior to systematic search (under a time limit), producing a sub-optimal solution of (X,D,C) • Use the best cost found as the initial T • If that solution is optimal, the systematic search just proves optimality • In all cases, we may improve pruning
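A sketch of the boosting scheme, assuming a `local_search` routine (here a naive restart/greedy loop written only for illustration) and the `branch_and_bound` sketch given earlier:

```python
# Sketch of the boosting idea: run (any) local search under a time limit,
# then start branch and bound with its cost as the initial upper bound.

import random
import time

def local_search(variables, domains, evaluate, time_limit=1.0):
    """Random restarts plus greedy one-variable moves (illustrative only)."""
    best_t, best_cost = None, float('inf')
    deadline = time.time() + time_limit
    while time.time() < deadline:
        t = {x: random.choice(domains[x]) for x in variables}
        improved = True
        while improved and time.time() < deadline:
            improved = False
            for x in variables:
                for v in domains[x]:
                    u = dict(t)
                    u[x] = v
                    if evaluate(u) < evaluate(t):
                        t, improved = u, True
        if evaluate(t) < best_cost:
            best_t, best_cost = t, evaluate(t)
    return best_t, best_cost

# Usage (with the branch_and_bound sketch from slide 32):
# t, ub = local_search(variables, domains, evaluate)
# best, opt = branch_and_bound(variables, domains, cost_functions, top=ub)
# top=ub either proves the local-search solution optimal or improves on it.
```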

  38. Boosting systematic search with local search • Ex: frequency assignment problem • Instance: CELAR6-sub4 • #var: 22, #val: 44, optimum: 3230 • Solver: toolbar 2.2 with default options • T initialized to 100000: 3 hours • T initialized to 3230: 1 hour • An optimized local search (INCOP) can find the optimum in less than 30 seconds

  39. Complete inference Variable (bucket) elimination Graph structural parameters

  40. II - Combination (join, with ⊕ = + here) • Two cost functions are combined into one on the union of their scopes by adding their costs on every tuple • Example in the figure: two binary cost tables joined into a ternary one
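A sketch of combination on the (scope, table) representation, with ⊕ = + (the example tables are made up):

```python
# Sketch of combination (join) with ⊕ = +: the result is a cost function
# on the union of the two scopes.

from itertools import product

def combine(f, g, domains):
    (sf, tf), (sg, tg) = f, g
    scope = tuple(dict.fromkeys(sf + sg))          # union, keeping order
    table = {}
    for values in product(*(domains[x] for x in scope)):
        t = dict(zip(scope, values))
        table[values] = (tf[tuple(t[x] for x in sf)]
                         + tg[tuple(t[x] for x in sg)])
    return scope, table

doms = {'x': ['b', 'g'], 'y': ['b', 'g'], 'z': ['b', 'g']}
f = (('x', 'y'), {('b', 'b'): 0, ('b', 'g'): 2, ('g', 'b'): 2, ('g', 'g'): 0})
g = (('y', 'z'), {('b', 'b'): 0, ('b', 'g'): 4, ('g', 'b'): 4, ('g', 'g'): 0})
h = combine(f, g, doms)        # scope ('x', 'y', 'z')
print(h[1][('b', 'b', 'g')])   # 0 + 4 = 4
```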

  41. III - Projection (elimination) • f[-xi] projects xi out of f by keeping, for each tuple on the remaining variables, the minimum (best) cost over the values of xi • Projecting every variable out yields a constant • Example in the figure: a binary cost table projected onto xj, then onto the empty scope
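A sketch of projection with min on the same representation:

```python
# Sketch of projection (elimination) with min: f[-x] keeps, for every tuple
# on the remaining variables, the best (minimum) cost over the values of x.

def project_out(f, var):
    scope, table = f
    i = scope.index(var)
    new_scope = scope[:i] + scope[i + 1:]
    new_table = {}
    for t, cost in table.items():
        key = t[:i] + t[i + 1:]
        new_table[key] = min(cost, new_table.get(key, float('inf')))
    return new_scope, new_table

f = (('xi', 'xj'), {('b', 'b'): 0, ('b', 'r'): 2, ('r', 'b'): 3, ('r', 'r'): 0})
print(project_out(f, 'xi'))   # (('xj',), {('b',): 0, ('r',): 0})
```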

  42. Properties • Replacing two functions by their combination preserves the problem • If f is the only function involving variable x, replacing f by f[-x] preserves the optimum

  43. Variable elimination • Select a variable • Sum all functions that mention it • Project the variable out • Complexity • Time: O(exp(deg+1)) • Space: O(exp(deg))

  44. Variable elimination (aka bucket elimination) • Eliminate variables one by one • When all variables have been eliminated, the problem is solved • Optimal solutions of the original problem can be recomputed • Complexity: exponential in the induced width
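Putting combination and projection together, a bucket elimination sketch for a weighted CN (the elimination order is simply the order of the `variables` list; representation as in the earlier sketches):

```python
# Sketch of bucket (variable) elimination for a weighted CN (⊕ = +,
# min projection). The optimum is the constant left after the last step.

from itertools import product

def combine_all(functions, domains):
    scope = tuple(dict.fromkeys(x for s, _ in functions for x in s))
    table = {}
    for values in product(*(domains[x] for x in scope)):
        t = dict(zip(scope, values))
        table[values] = sum(tab[tuple(t[x] for x in s)] for s, tab in functions)
    return scope, table

def project_out(f, var):
    scope, table = f
    i = scope.index(var)
    new = {}
    for t, cost in table.items():
        key = t[:i] + t[i + 1:]
        new[key] = min(cost, new.get(key, float('inf')))
    return scope[:i] + scope[i + 1:], new

def bucket_elimination(variables, domains, cost_functions):
    functions = list(cost_functions)
    for x in variables:                                  # elimination order
        bucket = [f for f in functions if x in f[0]]     # functions mentioning x
        rest = [f for f in functions if x not in f[0]]
        functions = rest + [project_out(combine_all(bucket, domains), x)]
    # only constants (empty scopes) remain: their sum is the optimum
    return sum(tab[()] for _, tab in functions)

doms = {'x1': ['b', 'g'], 'x2': ['b', 'g']}
C = [(('x1',), {('b',): 0, ('g',): 1}),
     (('x2',), {('b',): 0, ('g',): 1}),
     (('x1', 'x2'), {('b', 'b'): 10, ('b', 'g'): 0, ('g', 'b'): 0, ('g', 'g'): 10})]
print(bucket_elimination(['x1', 'x2'], doms, C))   # 1, as found by branch and bound
```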

  45.-46. Elimination order influence • {f(x,r), f(x,z), …, f(x,y)} • Order: r, z, …, y, x [Figure: the constraint graph, a star centred on x with leaves r, z, …, y]

  47.-48. Elimination order influence • After eliminating r: {f(x), f(x,z), …, f(x,y)} • Order: z, …, y, x [Figure: the star on x with r removed and a unary function on x added]

  49.-50. Elimination order influence • After further eliminations: {f(x), f(x), f(x,y)} • Order: y, x [Figure: only x and y remain linked, plus unary functions on x]
