
Soft constraint processing

These slides are provided as a teaching support for the community. They can be freely modified and used as long as the original authors' (T. Schiex and J. Larrosa) contribution is clearly mentioned and visible and any modification is acknowledged by its author.



  1. These slides are provided as a teaching support for the community. They can be freely modified and used as long as the original authors' (T. Schiex and J. Larrosa) contribution is clearly mentioned and visible and any modification is acknowledged by its author. Soft constraint processing. Thomas Schiex, INRA – Toulouse, France. Thanks to… Javier Larrosa, UPC – Barcelona, Spain, and the school organizers Francesca, Michela, Krzysztof…

  2. Overview • Introduction and definitions • Why soft constraints and dedicated algorithms? • Generic and specific models • Handling soft constraint problems • Fundamental operations on soft CN • Solving by search (systematic and local) • Complete and incomplete inference • Hybrid (search + inference) • Polynomial classes • Resources

  3. Why soft constraints? • CSP framework: for decision problems • Many problems are overconstrained or optimization problems • Economics (combinatorial auctions) • Given a set G of goods and a set B of bids… • Bid (Bi, Vi): Bi the requested goods, Vi its value • …find the best subset of compatible bids • Best = maximize revenue (sum)

  4. Why soft constraints? • Satellite scheduling • Spot 5 is an Earth observation satellite • It has 3 on-board cameras • Given a set of requested pictures (of different importance)… • Resources, data-bus bandwidth, setup times, orbiting • …select a subset of compatible pictures with maximum importance (sum)

  5. Why soft constraints? • Probabilistic inference (Bayesian nets) • Given a probability distribution defined by a DAG of conditional probability tables • And some evidence • …find the most probable explanation for the evidence (product)

  6. Why soft constraints? • Resource allocation (frequency assignment) • Given a telecommunication network • …find the best frequency for each communication link, avoiding interferences • Best can be: • Minimize the maximum frequency (max) • Minimize the global interference (sum)

  7. Why soft constraints? • Even in decision problems: • the problem may be infeasible • users may have preferences among solutions • This happens in most real problems. Experiment: give users a few solutions and they will find reasons to prefer some of them.

  8. Notations and definitions • X = {x1,..., xn} variables (n variables) • D = {D1,..., Dn} finite domains (max size d) • For Y ⊆ X, ℓ(Y) = ∏xi∈Y Di is the set of tuples over Y; t ∈ ℓ(Y) is written tY • For Z ⊆ Y, tY[Z] = projection of tY on Z • tY[-xi] = tY[Y − {xi}] • A relation R ⊆ ℓ(Y) with scope Y is denoted RY • Projection extends to relations
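
This notation can be made concrete with a small Python sketch (an illustration of mine, not part of the original slides): a tuple over a scope Y is represented as a dict, and tY[Z] simply keeps the variables of Z.

from itertools import product

# Domains D; for a scope Y, l(Y) is the Cartesian product of the domains of Y.
D = {"x1": ["b", "g", "r"], "x2": ["b", "g", "r"]}

def tuples_over(Y, D):
    """Enumerate l(Y): every assignment of the variables of Y, as a dict."""
    for values in product(*(D[x] for x in Y)):
        yield dict(zip(Y, values))

def project(t, Z):
    """t[Z]: restriction of the tuple t to the variables of Z."""
    return {x: v for x, v in t.items() if x in Z}

t = {"x1": "b", "x2": "r"}                 # a tuple t over Y = {x1, x2}
assert project(t, ["x1"]) == {"x1": "b"}   # t[{x1}]
assert project(t, []) == {}                # projection on the empty scope
print(len(list(tuples_over(["x1", "x2"], D))))   # 9 tuples in l({x1, x2})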

  9. Generic and specific models Valued CN Semiring CN Soft as Hard

  10. Combined local preferences • Constraints are local cost functions • Costs combine with a dedicated operator ⊕ • max: priorities • +: additive costs • *: factorized probabilities… • Goal: find an assignment with the « best » combined cost → Fuzzy/possibilistic CN, Weighted CN, Probabilistic CN, BN

  11. Soft constraint network (CN) • (X,D,C) • X = {x1,..., xn} variables • D = {D1,..., Dn} finite domains • C = {fS,...} cost functions • fS, fij, fi, f∅ have scope S, {xi,xj}, {xi}, ∅ • fS(t) ∈ E, a set of costs ordered by ≼, with minimum ⊥ (identity for ⊕) and maximum T (annihilator) • ⊕ is commutative, associative and monotonic • Objective function: F(X) = ⊕S fS(X[S]) • Solution: F(t) ≺ T • Soft CN: find a solution of minimal cost
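
As a purely illustrative reading of this definition, here is a minimal weighted CN in Python; the dict-based encoding, the names and the toy instance are assumptions of this sketch, with ⊕ = + bounded by Top.

from itertools import product

Top = 6  # maximum cost T: the annihilator of the (bounded) addition

def combine(a, b):
    """a ⊕ b: addition bounded by Top (the weighted CSP combination)."""
    return min(Top, a + b)

def evaluate(cost_functions, t):
    """F(t) = ⊕_S f_S(t[S]) for a complete assignment t given as a dict."""
    total = 0                       # 0 is the identity of the combination
    for scope, table in cost_functions:
        total = combine(total, table[tuple(t[x] for x in scope)])
        if total == Top:            # Top is absorbing, no need to continue
            break
    return total

# Hypothetical toy instance: prefer blue, forbid equal colors on edge (x1, x2).
D = {"x1": ["b", "g"], "x2": ["b", "g"]}
prefer_blue = {("b",): 0, ("g",): 1}
different = {(a, b): (Top if a == b else 0) for a in "bg" for b in "bg"}
C = [(("x1",), prefer_blue), (("x2",), prefer_blue), (("x1", "x2"), different)]

best = min((evaluate(C, dict(zip(D, values))), values)
           for values in product(*D.values()))
print(best)                         # optimal cost 1, e.g. x1 = b, x2 = g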

  12. Specific frameworks Lexicographic CN, probabilistic CN…

  13. Weighted CSP example (⊕ = +) • Graph coloring over x1,…,x5 (figure) • For each vertex: a unary cost function • For each edge: a binary cost function • F(X): number of non-blue vertices

  14. Possibilistic constraint network (⊕ = max) • Same coloring graph x1,…,x5 (figure) • Connected vertices must have different colors • F(X): highest color used (b < g < r < y)

  15. Some important details • T = maximum violation • It can be set to a bounded maximum violation k • Solution: F(t) ≺ k = T (Top) • Empty-scope soft constraint f∅ (a constant) • It gives an obvious lower bound on the optimum • If you do not like it: f∅ = ⊥ • Additional expressive power

  16. Weighted CSP example (⊕ = +) • Same coloring graph x1,…,x5 (figure), now with T = 3 instead of 6 and f∅ = 0 • F(X): number of non-blue vertices • Goal: an optimal coloration with fewer than 3 non-blue vertices

  17. Microstructural graph (binary network) • (figure: values b, g, r for each of x1,…,x5, unary costs 0/1 and binary costs 6, with T = 6 and f∅ = 0)

  18. Valued CSP and Semiring CSP • Order specified by an ACI operator +: a + b = b ⇔ b preferred • Semiring CN (c-semiring): lattice order • Valued CN: total order

  19. Examples of P.O. structures • Multiple criteria E1, E2 • (v1, v2) ≼ (w1, w2) ⇔ v1 ≼ w1 and v2 ≼ w2 • If E1, E2 are c-semirings then E1 × E2 is too • If each Ei is totally ordered, then multiple criteria = multiple valued CN • Set lattice order • Criterion = set of satisfied constraints • × = ∩, + = ∪ (order = ⊇) • c-semiring only

  20. Soft constraints as hard constraints • one extra variable xS per cost function fS • all with domain E • fS ➙ cS∪{xS} allowing (t, fS(t)) for all t ∈ ℓ(S) • one variable xC = ⊕ xS (global constraint)
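
A sketch of this reformulation on the same toy encoding (my own illustration, not the authors' code): each cost function fS becomes a hard relation over S ∪ {xS} whose allowed tuples pair every assignment with its cost.

# Example: a unary cost function on x1 (0 for blue, 1 otherwise).
f1 = {("b",): 0, ("g",): 1, ("r",): 1}

def soft_as_hard(table):
    """Turn a cost function f_S into a hard relation over S + {x_S}:
    the allowed tuples are exactly (t, f_S(t))."""
    return {t + (cost,) for t, cost in table.items()}

c1 = soft_as_hard(f1)
print(sorted(c1))   # [('b', 0), ('g', 1), ('r', 1)]: a classic (hard) relation

# The criterion itself becomes a variable: one global hard constraint enforces
# x_C = x_S1 ⊕ x_S2 ⊕ ... (e.g. a sum constraint in the weighted case).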

  21. Soft as Hard (SaH) • Criterion represented as a variable • Multiple criteria = multiple variables • Constraints on/between criteria • Can be used also in soft CN • Extra variables (domains), increased arities • SaH constraints give weak propagation

  22. A (more) general picture • (figure: a diagram relating the frameworks — Soft as Hard, semiring (lattice, set, criteria), valued, idempotent ⊕ (a ⊕ a = a), and the specific classes: classic, fuzzy, possibilistic, lexicographic, weighted, probabilistic, Bayes net, HCLP, Partial CSP, general fuzzy CN, …)

  23. Cost function implication • fP is implied by gQ (fP ≼ gQ, gQ tighter than fP) iff for all t ∈ ℓ(P∪Q), fP(t[P]) ≼ gQ(t[Q]) • Strong implication implies implication • In the classic case (T = 1): implied = redundant
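
Under the weighted reading of ≼ (comparison of costs), the definition can be checked by brute force over ℓ(P∪Q); the following sketch and its toy functions are illustrative assumptions, not part of the slides.

from itertools import product

def implied(fP, P, gQ, Q, D):
    """f_P is implied by g_Q iff f_P(t[P]) <= g_Q(t[Q]) for all t over P ∪ Q."""
    union = list(dict.fromkeys(P + Q))               # P ∪ Q (order preserved)
    for values in product(*(D[x] for x in union)):
        t = dict(zip(union, values))
        if fP[tuple(t[x] for x in P)] > gQ[tuple(t[x] for x in Q)]:
            return False
    return True

# Hypothetical example: the binary g12 is tighter than the unary f1.
D = {"x1": ["b", "g"], "x2": ["b", "g"]}
f1 = {("b",): 0, ("g",): 1}
g12 = {(a, b): (1 if a == "g" else 0) for a in "bg" for b in "bg"}
print(implied(f1, ("x1",), g12, ("x1", "x2"), D))    # True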

  24. Idempotent soft CN • a ⊕ a = a (for any a) • For any fS implied by the CN (X,D,C): (X,D,C) ≡ (X,D,C ∪ {fS}), where ≡ means ≼ and ≽ • Idempotent ⊕: implied = redundant (good) • On a total order, the only idempotent ⊕ is max (bad)

  25. Handling soft CN Fundamental operations - assignment - combination/join - projection

  26. Assignment (conditioning) • (figure: conditioning examples such as f[xi = b] and g[xj = r], producing cost functions of smaller arity)

  27. Combination (join with ⊕, + here) • (figure: two cost functions combined into one over the union of their scopes, adding costs)

  28. Projection (elimination) • (figure: min-projections of cost functions onto a smaller scope, e.g. f[{xi}] and g[∅])
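
The three fundamental operations of slides 26–28, sketched in Python on the same dict encoding (an illustrative sketch for the weighted case, not a reference implementation): conditioning fixes a variable, combination adds costs on the union of scopes, projection eliminates a variable by minimization.

from itertools import product

def assign(scope, table, var, value):
    """f[var = value]: conditioning; the assigned variable leaves the scope."""
    i = scope.index(var)
    new_scope = scope[:i] + scope[i + 1:]
    return new_scope, {t[:i] + t[i + 1:]: c for t, c in table.items() if t[i] == value}

def combine(scope_f, f, scope_g, g, D):
    """f ⊕ g: join on the union of the scopes, adding costs (weighted case)."""
    scope = tuple(dict.fromkeys(scope_f + scope_g))
    table = {}
    for values in product(*(D[x] for x in scope)):
        t = dict(zip(scope, values))
        table[values] = f[tuple(t[x] for x in scope_f)] + g[tuple(t[x] for x in scope_g)]
    return scope, table

def project_out(scope, table, var):
    """f[-var]: eliminate var by taking the minimum over its values."""
    i = scope.index(var)
    new_scope, new_table = scope[:i] + scope[i + 1:], {}
    for t, c in table.items():
        key = t[:i] + t[i + 1:]
        new_table[key] = min(c, new_table.get(key, float("inf")))
    return new_scope, new_table

# Toy usage: combine a unary and a binary cost function, then eliminate x2.
D = {"x1": ["b", "g"], "x2": ["b", "g"]}
scope2, f2 = ("x2",), {("b",): 0, ("g",): 1}
scope12, f12 = ("x1", "x2"), {(a, b): (6 if a == b else 0) for a in "bg" for b in "bg"}
s, t = combine(scope12, f12, scope2, f2, D)
print(assign(scope12, f12, "x2", "b"))    # conditioning: a unary function on x1
print(project_out(s, t, "x2"))            # elimination: a unary function on x1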

  29. Pure systematic search Branch and bound(s) A pause?

  30. Systematic search • Each node is a soft constraint subproblem on the remaining variables • (LB) Lower bound = f∅, an under-estimation of the best solution in the subtree • (UB) Upper bound = best solution found so far, initially T • If LB ≥ UB then prune

  31. Assigning x3 • (figure: the search node over x1,…,x5 branches into one subproblem per value assigned to x3)

  32. Assign g to x3 • (figure: the microstructure after the assignment, with updated unary and binary costs and T = 6; f∅ increases until the search must backtrack)

  33. Depth First Search (DFS)
  BT(X, D, C):
    if X = ∅ then Top := f∅
    else
      xj := selectVar(X)                           (variable heuristics)
      for all a ∈ Dj do                            (value heuristics)
        for all fS ∈ C such that xj ∈ S: fS := fS[xj = a]
        f∅ := ⊕ { gS ∈ C such that S = ∅ }
        if LB < Top then BT(X − {xj}, D − {Dj}, C)
  Keys: improve the LB, get a good UB as soon as possible.
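
A runnable and deliberately naive Python version of this branch and bound, under the same encoding as the earlier sketches (my own code, not toolbar); here LB is simply the cost of the constraints that are already fully assigned, i.e. f∅ after conditioning.

def bb_solve(variables, D, cost_functions, top):
    """Depth-first branch and bound for a weighted CN.
    cost_functions: list of (scope, table); returns (best cost, best assignment)."""
    best = {"cost": top, "assignment": None}

    def cost_of(assignment):
        """Cost of the already fully assigned cost functions: f_empty after
        conditioning, used both as LB and as the cost of complete assignments."""
        return sum(table[tuple(assignment[x] for x in scope)]
                   for scope, table in cost_functions
                   if all(x in assignment for x in scope))

    def bt(unassigned, assignment):
        if not unassigned:                               # complete assignment
            best["cost"], best["assignment"] = cost_of(assignment), dict(assignment)
            return
        x = unassigned[0]                                # static variable ordering
        for a in D[x]:
            assignment[x] = a
            if cost_of(assignment) < best["cost"]:       # prune when LB >= UB
                bt(unassigned[1:], assignment)
            del assignment[x]

    bt(list(variables), {})
    return best["cost"], best["assignment"]

# Toy usage: 2-variable coloring, unary "prefer blue" costs, edge cost 6 if equal.
D = {"x1": ["b", "g"], "x2": ["b", "g"]}
C = [(("x1",), {("b",): 0, ("g",): 1}),
     (("x2",), {("b",): 0, ("g",): 1}),
     (("x1", "x2"), {(a, b): (6 if a == b else 0) for a in "bg" for b in "bg"})]
print(bb_solve(["x1", "x2"], D, C, top=6))               # (1, {'x1': 'b', 'x2': 'g'})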

  34. The three queens

  35. Improving the LB • Current node (X,D,C): • Unassigned constraints: ignored • Assigned constraints: arity 0 gives f∅, arity one gives the unary fi • lbfc = f∅ ⊕ (⊕xi∈X mina∈Di fi(a)) • Gives a stronger LB
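
Assuming the unary functions fi on the future variables have already been produced by conditioning the assigned constraints, the bound can be sketched as follows (illustrative code, not from the slides).

def lb_fc(f_empty, unary, future_vars):
    """lb_fc = f_empty + the sum, over future variables xi, of min_a fi(a)."""
    return f_empty + sum(min(unary[x].values()) for x in future_vars)

# Hypothetical state: cost 2 already paid (f_empty), two future variables whose
# unary cost functions were obtained by conditioning the assigned constraints.
unary = {"x4": {"b": 0, "g": 1}, "x5": {"b": 1, "g": 1}}
print(lb_fc(2, unary, ["x4", "x5"]))   # 3: at least one more unit must be paid on x5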

  36. Three queens • Unary assigned constraints: • pruning • guidance (value ordering) • value deletion

  37. Still improving the LB • Assigned constraints give (fi, f∅) • The original (unassigned) constraints form a subproblem: solve it • Its optimal cost ⊕ lbfc gives a stronger LB • It can be solved beforehand

  38. Russian Doll Search • Static variable ordering • Solves increasingly large subproblems (figure: nested subproblems over x1,…,x5) • Uses an improved version of the previous LB recursively • May speed up search by several orders of magnitude • LightRDS: a nice CP implementation for unary costs
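
An illustrative sketch of the RDS principle (not the actual RDS or LightRDS code): with a static ordering, each suffix subproblem is solved once and its optimum strengthens the lower bound used when solving the next, larger suffix.

def russian_doll(variables, D, cost_functions, top):
    """Russian Doll Search sketch: with a static ordering x1..xn, solve the
    subproblems on the suffixes {xn}, {xn-1, xn}, ..., {x1..xn}; the optimum
    of each solved suffix strengthens the LB used for the next, larger one."""
    rds = {len(variables): 0}                   # optimum of the empty suffix

    def cost_of(assignment):
        return sum(table[tuple(assignment[x] for x in scope)]
                   for scope, table in cost_functions
                   if all(x in assignment for x in scope))

    def bt(i, assignment, ub):
        """Branch and bound on variables[i:], using rds[i+1] inside the LB."""
        if i == len(variables):
            return cost_of(assignment)
        best = ub
        for a in D[variables[i]]:
            assignment[variables[i]] = a
            lb = cost_of(assignment) + rds[i + 1]    # assigned part + solved doll
            if lb < best:
                best = min(best, bt(i + 1, assignment, best))
            del assignment[variables[i]]
        return best

    for i in range(len(variables) - 1, -1, -1):  # increasingly large subproblems
        rds[i] = bt(i, {}, top)
    return rds[0]                                # optimum of the whole problem

D = {"x1": ["b", "g"], "x2": ["b", "g"]}
C = [(("x1",), {("b",): 0, ("g",): 1}),
     (("x2",), {("b",): 0, ("g",): 1}),
     (("x1", "x2"), {(a, b): (6 if a == b else 0) for a in "bg" for b in "bg"})]
print(russian_doll(["x1", "x2"], D, C, top=6))   # 1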

  39. Other approaches • Taking into account unassigned constraints: • PFC-DAC (Wallace et al. 1994) • PFC-MRDAC (Larrosa et al. 1999…) • Integration into the « SaH » schema (Régin et al. 2001)

  40. Local search Nothing really specific

  41. Local search • Based on the perturbation of solutions in a local neighborhood • Simulated annealing • Tabu search • Variable neighborhood search • Greedy randomized adaptive search (GRASP) • Evolutionary computation (GA) • Ant colony optimization… • See: Blum & Roli, ACM Computing Surveys, 35(3), 2003

  42. Boosting Systematic Search with Local Search • Do local search prior to systematic search • Use the best cost found as initial T • If it is optimal, we just prove optimality • In all cases, we may improve pruning • (figure: local search on (X,D,C) under a time limit yields a sub-optimal solution)
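
A crude sketch of this combination (the local search here is just random restarts with greedy one-variable flips, an assumption of the example, not a method from the slides): its best cost is a valid upper bound that can be passed as the initial Top of the branch and bound.

import random

def evaluate(assignment, cost_functions):
    return sum(table[tuple(assignment[x] for x in scope)]
               for scope, table in cost_functions)

def local_search_ub(D, cost_functions, restarts=20, seed=0):
    """Random restarts + greedy one-variable flips. The best cost found is a
    valid upper bound: use it as the initial Top of the branch and bound."""
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(restarts):
        t = {x: rng.choice(vals) for x, vals in D.items()}
        improved = True
        while improved:
            improved = False
            for x, vals in D.items():
                for v in vals:                   # try flipping one variable
                    cand = dict(t, **{x: v})
                    if evaluate(cand, cost_functions) < evaluate(t, cost_functions):
                        t, improved = cand, True
        best = min(best, evaluate(t, cost_functions))
    return best

D = {"x1": ["b", "g"], "x2": ["b", "g"]}
C = [(("x1",), {("b",): 0, ("g",): 1}),
     (("x2",), {("b",): 0, ("g",): 1}),
     (("x1", "x2"), {(a, b): (6 if a == b else 0) for a in "bg" for b in "bg"})]
print(local_search_ub(D, C))   # an upper bound (here 1), used to initialize Top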

  43. Boosting Systematic Search with Local Search • Ex: Frequency assignment problem • Instance: CELAR6-sub4 • #var: 22, #val: 44, Optimum: 3230 • Solver: toolbar with default options • UB initialized to 100000 → 3 hours • UB initialized to 3230 → 1 hour • Optimized local search can find the optimum in a few minutes • Other combination: use systematic search in large neighborhoods

  44. Complete inference What is it Variable (bucket) elimination Graph structural parameters

  45. Inference… • Automated production of implied constraints (cost function implication) • Complete inference: able to produce the strongest implied constraint on the empty scope • Gives the optimal cost • Classic case: detects infeasibility

  46. Variable elimination (aka bucket elimination) • Solves the problem by a sequence of problem transformations • Each step yields a problem with one less variable and the same optimum. • When all variables have been eliminated, the problem is solved • Optimal solutions of the original problem can be recomputed

  47. Variable/bucket elimination • Select a variable xi • Compute the set Ki of constraints that involve this variable • Add the new cost function (⊕fS∈Ki fS)[−xi] • Remove xi and Ki • Time: O(exp(degi + 1)) • Space: O(exp(degi)) • (figure: constraint graph on x1,…,x5)
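
One elimination step can be sketched as follows on the dict encoding used earlier (illustrative code for the weighted case, not the authors' implementation): the bucket of xi is combined and xi is projected out by minimization.

from itertools import product

def eliminate(var, D, cost_functions):
    """One bucket-elimination step: combine all cost functions mentioning var,
    project var out by minimization, and return the updated set of functions."""
    bucket = [(s, t) for s, t in cost_functions if var in s]
    rest = [(s, t) for s, t in cost_functions if var not in s]
    # Scope of the new function: every other variable appearing in the bucket.
    new_scope = tuple(dict.fromkeys(x for s, _ in bucket for x in s if x != var))
    new_table = {}
    for values in product(*(D[x] for x in new_scope)):
        t = dict(zip(new_scope, values))
        best = float("inf")
        for a in D[var]:                     # min over the eliminated variable
            t[var] = a
            best = min(best, sum(table[tuple(t[x] for x in s)] for s, table in bucket))
        new_table[values] = best
    return rest + [(new_scope, new_table)]

D = {"x1": ["b", "g"], "x2": ["b", "g"]}
C = [(("x1",), {("b",): 0, ("g",): 1}),
     (("x2",), {("b",): 0, ("g",): 1}),
     (("x1", "x2"), {(a, b): (6 if a == b else 0) for a in "bg" for b in "bg"})]
C = eliminate("x2", D, C)   # one variable less, same optimum
C = eliminate("x1", D, C)   # only a constant (empty scope) remains: the optimum
print(C)                    # [((), {(): 1})]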

  48. Three queens… combine

  49. Three queens… eliminate, add

  50. Three queens
