
These slides are provided as a teaching support for the community. They can be freely modified and used as long as the original authors' (T. Schiex and J. Larrosa) contribution is clearly mentioned and visible, and any modification is acknowledged by its author.

Soft constraint processing

Thomas Schiex

INRA – Toulouse

France

Thanks to…

Javier Larrosa

UPC – Barcelona

Spain

School organizers

Francesca, Michela, Krzysztof…

Overview
  • Introduction and definitions
    • Why soft constraints and dedicated algorithms?
    • Generic and specific models
  • Handling soft constraint problems
    • Fundamental operations on soft CN
    • Solving by search (systematic and local)
    • Complete and incomplete inference
    • Hybrid (search+inference)
    • Polynomial classes
  • Resources

Why soft constraints?
  • CSP framework: for decision problems
  • Many problems are overconstrained or optimization problems
  • Economics (combinatorial auctions)
    • Given a set G of goods and a set B of bids…
      • Bid (Bi,Vi), Bi requested goods, Vi value
    • … find the best subset of compatible bids
    • Best = maximize revenue (sum)

Why soft constraints?
  • Satellite scheduling
    • Spot 5 is an earth observation satellite
    • It has 3 on-board cameras
    • Given a set of requested pictures (of different importance)…
      • Resources, data-bus bandwidth, setup-times, orbiting
    • …select a subset of compatible pictures with max. importance (sum)

Why soft constraints?
  • Probabilistic inference (Bayesian nets)
    • Given a probability distribution defined by a DAG of conditional probability tables
    • And some evidence
    • …find the most probable explanation for the evidence (product)

Why soft constraints?
  • Resource allocation (frequency assignment)
    • Given a telecommunication network
    • …find the best frequency for each communication link avoiding interferences
    • Best can be:
      • Minimize the maximum frequency (max)
      • Minimize the global interference (sum)

Why soft constraints?
  • Even in decision problems:
    • the problem may be infeasible
    • users may have preferences among solutions
  • It happens in most real problems.

Experiment: give users a few solutions and they will find reasons to prefer some of them.

Notations and definitions
  • X={x1,..., xn} variables (n variables)
  • D={D1,..., Dn} finite domains (max size d)
  • For Y ⊆ X, ℓ(Y) = ∏xi∊Y Di is the set of tuples over Y; a tuple t ∊ ℓ(Y) is written tY
  • For Z ⊆ Y, tY[Z] is the projection of tY on Z
  • tY[-xi] = tY[Y-{xi}]
  • Relation R ⊆ ℓ(Y), scope Y, denoted RY
  • Projection extended to relations


Generic and specific models

Valued CN

Semiring CN

Soft as Hard

Combined local preferences
  • Constraints are local cost functions
  • Costs combine with a dedicated operator
    • max: priorities (fuzzy/possibilistic CN)
    • +: additive costs (weighted CN)
    • ×: factorized probabilities (probabilistic CN, Bayesian nets)
  • Goal: find an assignment with the « best » combined cost
Soft constraint network (CN)
  • (X,D,C)
    • X={x1,..., xn} variables
    • D={D1,..., Dn} finite domains
    • C = {fS, …}: e cost functions
      • fS, fij, fi, f∅ have scopes S, {xi,xj}, {xi}, ∅
      • fS(t) ∊ E (ordered by ≼, with maximum element T)
  • Objective function: F(X) = ⊕S fS(X[S])
  • Solution: a complete assignment t with F(t) ≺ T
  • Soft CN: find a solution of minimum cost

The combination operator ⊕ is commutative, associative and monotonic, with identity ⊥ and annihilator T. (A small sketch follows.)
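To make these definitions concrete, here is a minimal Python sketch (not the authors' code) of a weighted CN with ⊕ = bounded addition; the coloring instance, its edge set and all names are illustrative assumptions.

```python
# Minimal sketch of a weighted constraint network (illustrative, not toolbar).
# E = {0, ..., T}, combination is bounded addition a (+) b = min(a + b, T).
from itertools import product

T = 6  # maximum cost (complete violation)

def combine(a, b):
    """The weighted-CSP combination operator: bounded addition."""
    return min(a + b, T)

# Variables and domains of a small coloring instance (hypothetical edge set).
domains = {f"x{i}": ["b", "g", "r"] for i in range(1, 6)}
edges = [("x1", "x2"), ("x1", "x3"), ("x2", "x3"), ("x2", "x4"),
         ("x3", "x4"), ("x3", "x5"), ("x4", "x5")]

# Cost functions: (scope, function from value tuple to cost).
# Unary: cost 1 if the vertex is not blue; binary: cost T if both ends are equal.
cost_functions = [((v,), lambda t: 0 if t[0] == "b" else 1) for v in domains]
cost_functions += [(e, lambda t: T if t[0] == t[1] else 0) for e in edges]

def objective(assignment):
    """F(X) = combination over all cost functions of f_S(X[S])."""
    total = 0
    for scope, f in cost_functions:
        total = combine(total, f(tuple(assignment[x] for x in scope)))
    return total

# Brute-force optimum, only to illustrate the semantics on a tiny instance.
best = min((dict(zip(domains, vals)) for vals in product(*domains.values())),
           key=objective)
print(objective(best), best)
```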
Specific frameworks

Lexicographic CN, probabilistic CN…

Weighted CSP example (⊕ = +)

(Figure: a graph-coloring instance over x1…x5. For each vertex, a unary cost function: 0 if the vertex is blue, 1 otherwise. For each edge, connected vertices must have different colors.)

F(X): number of non-blue vertices
Possibilistic constraint network (⊕ = max)

(Figure: the same graph over x1…x5. For each vertex, a unary cost function on its color; connected vertices must have different colors.)

F(X): highest color used (b < g < r < y)
Some important details
  • T = maximum violation.
    • Can be set to a bounded maximum violation k
    • Solution: F(t) < k = Top
  • Empty-scope soft constraint f∅ (a constant)
    • Gives an obvious lower bound on the optimum
    • If you do not like it: f∅ = ⊥

Additional expressive power
Weighted CSP example (⊕ = +)

(Figure: the same coloring instance with f∅ = 0 and either T = 6 or T = 3.)

F(X): number of non-blue vertices

With T = 3: an optimal coloration with fewer than 3 non-blue vertices
Microstructural graph (binary network)

(Figure: the microstructure of the instance: each variable x1…x5 has values b, g, r with unary costs 0, 1, 1; edges between incompatible values carry cost 6; T = 6 and f∅ = 0.)
Valued CSP and Semiring CSP

  • Semiring CN
    • Order specified by the ACI operator +: a + b = b ⟺ b preferred
    • Lattice order (c-semiring)
  • Valued CN
    • Total order
Examples of P.O. structures
  • Multiple criteria E1, E2
    • (v1, v2) ≼ (w1, w2) ⟺ v1 ≼ w1 and v2 ≼ w2
    • If E1, E2 are c-semirings then E1 × E2 is too
    • If each Ei is totally ordered, then multiple criteria = multiple valued CN
  • Set lattice order
    • Criteria = set of satisfied constraints
    • × = ∩, + = ∪ (order = ⊇)
    • c-semiring only
Soft constraints as hard constraints
  • one extra variable xs per cost function fS
  • all with domain E
  • fS➙ cS∪{xS} allowing (t,fS(t)) for all t∊ℓ(S)
  • one variable xC = ⊕ xS (global constraint); a sketch of this encoding follows
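An illustrative sketch of this encoding in Python (hypothetical names; a real model would use a CP solver's table and sum constraints): each cost function fS becomes a hard table constraint over S ∪ {xS}, and xC collects the combined cost.

```python
# Illustrative Soft-as-Hard encoding (weighted case); hypothetical helper names.
# Each soft constraint (scope S, table f_S) becomes a hard table constraint over
# S + [x_S] allowing exactly the tuples (t, f_S(t)); a final (global) constraint
# states x_C = x_S1 + x_S2 + ... and x_C < Top.

def soft_as_hard(cost_functions, top):
    hard_tables = []           # (scope, set of allowed tuples)
    cost_vars = {}             # cost variable -> its domain
    for k, (scope, table) in enumerate(cost_functions):
        xs = f"cost_{k}"
        cost_vars[xs] = sorted(set(table.values()))
        hard_tables.append((scope + [xs], {t + (c,) for t, c in table.items()}))
    # The aggregation is left to a solver-level sum constraint:
    #   x_C = sum(cost_vars) and x_C < top
    return hard_tables, cost_vars

# Usage: one binary soft constraint over (x1, x2).
f12 = (["x1", "x2"], {("b", "b"): 0, ("b", "g"): 1, ("g", "b"): 1, ("g", "g"): 0})
tables, extra_vars = soft_as_hard([f12], top=6)
print(extra_vars)   # {'cost_0': [0, 1]}
print(tables)
```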

Soft as Hard (SaH)
  • Criterion represented as a variable
  • Multiple criteria = multiple variables
  • Constraints on/between criteria
  • Can be used also in soft CN
  • Extra variables (domains), increased arities
  • SaH constraints give weak propagation

A (more) general picture

(Figure: a map relating the frameworks: Soft as Hard, semiring (lattice), set criteria, probabilistic, fuzzy, idempotent ⊕ (a ⊕ a = a), lexicographic, possibilistic, weighted, classic, Bayes net, HCLP, Partial CSP, general fuzzy CN, valued (total order).)
Cost function implication

fP implied by gQ (fP≼gQ, gQ tighter than fP)

iff for all t∊ℓ(P∪Q), fP(t[P])≼gQ(t[Q])

(Figure: an example of a cost function implying, or strongly implying, another.)

T=1 (classic case): implied = redundant
Idempotent soft CN

a ⊕ a = a (for any a)

For any fS implied by the CN (X,D,C):

(X,D,C) ≡ (X,D,C∪{fS})

where ≡ means ≼ and ≽

  • ⊕ idempotent: implied = redundant (good)
  • ⊕ idempotent on a total order ⇒ ⊕ = max (bad)

Handling soft CN

Fundamental operations

- assignment

- combination/join

- projection

Assignment (conditioning)

(Figure: conditioning examples: a binary f restricted by xi = b gives f[xi=b]; a unary g(xj) restricted by xj = r gives the constant g[xj=r].)
Combination (join with ⊕, + here)

(Figure: two cost functions are joined and their costs added, bounded by T = 6.)
Projection (elimination)

(Figure: projection with min: a binary f projected onto xi gives the unary f[xi]; projecting further onto the empty scope gives the constant g[∅]. A sketch of the three operations follows.)
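For concreteness, an illustrative Python sketch of the three operations on table-based cost functions in the weighted case (⊕ = addition bounded by T); the representation and names are assumptions, not the slides' implementation.

```python
# Illustrative sketch of the three fundamental operations on table cost
# functions (weighted case): assignment, combination and projection.
# A cost function is (scope, table) where table maps value tuples to costs.

T = 6

def assign(scope, table, var, value):
    """f[var = value]: restrict f to the tuples where var takes `value`."""
    i = scope.index(var)
    new_scope = scope[:i] + scope[i + 1:]
    return new_scope, {t[:i] + t[i + 1:]: c for t, c in table.items() if t[i] == value}

def combine(scope_f, f, scope_g, g):
    """f (+) g: join on the shared variables, add costs (bounded by T)."""
    scope = list(dict.fromkeys(scope_f + scope_g))
    out = {}
    for tf, cf in f.items():
        for tg, cg in g.items():
            merged = dict(zip(scope_f, tf))
            if all(merged.get(x, v) == v for x, v in zip(scope_g, tg)):
                merged.update(zip(scope_g, tg))
                out[tuple(merged[x] for x in scope)] = min(cf + cg, T)
    return scope, out

def project(scope, table, keep):
    """f[keep]: eliminate the other variables by taking the minimum cost."""
    idx = [scope.index(x) for x in keep]
    out = {}
    for t, c in table.items():
        key = tuple(t[i] for i in idx)
        out[key] = min(out.get(key, T), c)
    return list(keep), out

# Tiny usage example on two cost functions over {xi, xj}.
fij = (["xi", "xj"], {("v", "v"): 0, ("v", "w"): 1, ("w", "v"): 1, ("w", "w"): 0})
fj = (["xj"], {("v",): 2, ("w",): 0})
sc, tab = combine(*fij, *fj)
print(project(sc, tab, ["xi"]))   # unary cost function on xi
print(assign(*fij, "xj", "v"))    # fij[xj = v]
```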

Pure systematic search

Branch and bound(s)

A pause ?

Systematic search

  • Each node is a soft constraint subproblem on the remaining variables
  • (LB) Lower bound = f∅: an under-estimation of the best solution in the sub-tree
  • (UB) Upper bound = best solution found so far (initially T)
  • If LB ≥ UB then prune
Assigning x3

(Figure: assigning x3 splits the problem on x1…x5 into one subproblem on x1, x2, x4, x5 per value of x3.)
Assign g to x3

(Figure: the microstructure after assigning g to x3: some costs reach T = 6 and the lower bound forces a backtrack.)
Depth First Search (DFS)

BT(X,D,C)
  if (X = ∅) then Top := f∅
  else
    xj := selectVar(X)                                   (variable heuristics)
    for all a ∊ Dj do                                    (value heuristics: good UB ASAP)
      for all fS ∊ C s.t. xj ∊ S: fS := fS[xj = a]
      f∅ := ⊕ { gS ∊ C s.t. S = ∅ }                      (improve the LB)
      if (LB < Top) then BT(X − {xj}, D − {Dj}, C)

A runnable sketch of this branch and bound follows.
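Below is a runnable sketch of this branch and bound (illustrative Python on table-based cost functions, not toolbar's implementation); the node lower bound is just the combination of the fully assigned cost functions, i.e. the current f∅.

```python
# Illustrative DFS branch and bound for a weighted CN (table cost functions).
def bb(variables, domains, cost_functions, top):
    """Return the optimal cost < top, or top if no solution is better."""
    best = top  # (UB) upper bound: best solution found so far

    def lower_bound(assignment):
        # Combination of the cost functions that are fully assigned (this is f0).
        lb = 0
        for scope, table in cost_functions:
            if all(x in assignment for x in scope):
                lb = min(lb + table[tuple(assignment[x] for x in scope)], top)
        return lb

    def bt(assignment):
        nonlocal best
        if len(assignment) == len(variables):
            best = min(best, lower_bound(assignment))            # new upper bound
            return
        xj = next(x for x in variables if x not in assignment)   # variable heuristic
        for a in domains[xj]:                                    # value heuristic
            assignment[xj] = a
            if lower_bound(assignment) < best:                   # prune when LB >= UB
                bt(assignment)
            del assignment[xj]

    bt({})
    return best

# Usage on a toy instance: x1 != x2 (hard, cost 6 = top), prefer "b" on both.
doms = {"x1": ["b", "g"], "x2": ["b", "g"]}
funcs = [(["x1", "x2"], {("b", "b"): 6, ("b", "g"): 0, ("g", "b"): 0, ("g", "g"): 6}),
         (["x1"], {("b",): 0, ("g",): 1}),
         (["x2"], {("b",): 0, ("g",): 1})]
print(bb(list(doms), doms, funcs, top=6))   # -> 1
```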
The three queens

Improving the LB

Current node (X,D,C):

  • Unassigned constraints
  • Assigned constraints
    • arity 0: f∅
    • arity one: ignored
  • lbFC = f∅ ⊕ (⊕xi∊X mina∊Di fi(a))
  • Gives a stronger LB (a small sketch follows)
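A small illustrative snippet of this bound (assumed representation: a dict of accumulated unary costs per unassigned variable):

```python
# lbFC = f0 (+) sum over future variables of min_a fi(a)   (illustrative).
def lb_fc(f0, unary, domains, unassigned, top):
    """unary: dict variable -> dict value -> accumulated unary cost."""
    lb = f0
    for xi in unassigned:
        lb += min(unary.get(xi, {}).get(a, 0) for a in domains[xi])
    return min(lb, top)

print(lb_fc(f0=1, unary={"x2": {"b": 2, "g": 1}},
            domains={"x2": ["b", "g"], "x3": ["b", "g"]},
            unassigned=["x2", "x3"], top=6))   # 1 + 1 + 0 = 2
```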
Three queens
  • Unary assigned constraints:
  • pruning
  • guidance (value ordering)
  • value deletion

Still improving the LB
  • Assigned constraints (fi, f∅)
  • Original constraints
  • Solve this subproblem: its optimal cost ≥ lbFC
  • Gives a stronger LB
  • Can be solved beforehand
Russian Doll Search
  • static variable ordering
  • solves increasingly large subproblems
  • uses an improved version of the previous LB recursively
  • May speedup search by several orders of magnitude

(Figure: RDS solves a nested sequence of subproblems over x1…x5.)

LightRDS: a nice CP implementation for unary costs
Other approaches
  • Taking into account unassigned constraints:
    • PFC-DAC (Wallace et al. 1994)
    • PFC-MRDAC (Larrosa et al. 1999…)
    • Integration into the « SaH » schema (Régin et al. 2001)


Local search

Nothing really specific

Local search

Based on perturbation of solutions in a local neighborhood

  • Simulated annealing
  • Tabu search
  • Variable neighborhood search
  • Greedy rand. adapt. search (GRASP)
  • Evolutionary computation (GA)
  • Ant colony optimization…
  • See: Blum & Roli, ACM comp. surveys, 35(3), 2003

Boosting Systematic Search with Local Search
  • Do local search prior to systematic search
  • Use the best cost found as the initial T
    • If it is optimal, we just prove optimality
    • In all cases, we may improve pruning

(Figure: local search is run on (X,D,C) under a time limit; the sub-optimal solution found initializes the upper bound of the systematic search.)
Boosting Systematic Search with Local Search
  • Ex: Frequency assignment problem
    • Instance: CELAR6-sub4
      • #var: 22 , #val: 44 , Optimum: 3230
    • Solver: toolbar with default options
      • UB initialized to 100000 → 3 hours
      • UB initialized to 3230 → 1 hour
        • Optimized local search can find the optimum in a few minutes
  • Other combination: use systematic search in large neighborhoods

Complete inference

What is it

Variable (bucket) elimination

Graph structural parameters

Inference…
  • Automated production of implied constraints (cost function implication)
  • Complete…
    • Is able to produce the strongest implied constraint on the empty scope
      • Gives the optimal cost
      • Classic case: detects infeasibility

Variable elimination (aka bucket elimination)
  • Solves the problem by a sequence of problem transformations
  • Each step yields a problem with one less variable and the same optimum.
  • When all variables have been eliminated, the problem is solved
  • Optimal solutions of the original problem can be recomputed

Variable/bucket elimination
  • Select a variable xi
  • Compute the set Ki of constraints that involve this variable
  • Add (⊕ Ki)[-xi] to the problem
  • Remove the variable xi and Ki
  • Time: O(exp(degi+1))
  • Space: O(exp(degi))

A minimal elimination step is sketched below.
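A minimal sketch of one elimination step (illustrative Python, same table representation as the earlier sketches): combine every cost function mentioning xi and project xi out with min.

```python
# Illustrative bucket-elimination step for a weighted CN (table cost functions).
from itertools import product

def eliminate(var, cost_functions, domains, top):
    """Replace all cost functions mentioning `var` by their combination with
    `var` projected out (minimum over the values of `var`)."""
    bucket = [cf for cf in cost_functions if var in cf[0]]
    rest = [cf for cf in cost_functions if var not in cf[0]]
    scope = sorted({x for s, _ in bucket for x in s if x != var})
    new_table = {}
    for values in product(*(domains[x] for x in scope)):
        best = top
        for a in domains[var]:                  # min over the eliminated variable
            ctx = dict(zip(scope, values))
            ctx[var] = a
            cost = 0
            for s, table in bucket:
                cost = min(cost + table[tuple(ctx[x] for x in s)], top)
            best = min(best, cost)
        new_table[values] = best
    return rest + [(scope, new_table)]

# Usage: eliminate x2; the result is a unary cost function on x1.
doms = {"x1": ["b", "g"], "x2": ["b", "g"]}
funcs = [(["x1", "x2"], {("b", "b"): 6, ("b", "g"): 0, ("g", "b"): 0, ("g", "g"): 6}),
         (["x2"], {("b",): 0, ("g",): 1})]
print(eliminate("x2", funcs, doms, top=6))
```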

Three queens…. combine

Three queens… eliminate, add

Three queens

Elimination order influence
  • Order G,D,F,B,C,A
  • Order G,B,C,D,F,A

Induced width
  • For G=(V,E) and a given elimination (vertex) ordering, the largest degree encountered is the induced width of the ordered graph
  • Minimizing the induced width (also called dimension) is NP-hard.
  • Heuristics, better criteria, hypergraphs
History / terminology
  • Non-serial dynamic programming (Bertelé & Brioschi, 1972) for cost functions (VE, block)
  • Acyclic DB (Beeri et al. 1983)
  • Bayesian nets (Pearl 1988, Lauritzen and Spiegelhalter 1988)
  • Constraint nets (Dechter and Pearl 1988)
  • Semirings (Shenoy and Shafer 1990)
  • C-semirings (Bistarelli et al. 1995)…
Real example (CELAR SCEN06)


Incomplete inference

Mini buckets

Local consistency (idempotent ⊕ or not…)
Incomplete inference
  • Tries to trade completeness for space/time
    • Produces only specific classes of implied constraints
    • Often in order to produce a lb on the optimum
    • And usually in polynomial time/space
    • Keep these constraints outside: mini-buckets
    • Add them to the network: local consistency
      • directly (idempotent: implied = redundant)
      • try to accommodate non-idempotency
Mini buckets (Dechter 97)
  • Eliminate x vs. « mini-eliminate » x
    • Repeat: produces a lower bound (and an upper bound)
    • Complexity/strength controlled by the # of variables in each mini bucket.
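An illustrative sketch of the splitting step (names and the bound z on the number of variables per mini-bucket are assumptions): the bucket of x is partitioned greedily, and each group can then be eliminated separately with the earlier elimination sketch, which only yields a lower bound.

```python
# Illustrative mini-bucket split: partition the cost functions mentioning `var`
# into groups whose union of scopes has at most z variables.
def mini_buckets(var, cost_functions, z):
    buckets = []
    for scope, table in cost_functions:
        if var not in scope:
            continue
        for group in buckets:
            union = set(scope).union(*(s for s, _ in group))
            if len(union) <= z:
                group.append((scope, table))
                break
        else:
            buckets.append([(scope, table)])
    return buckets

# Usage: three binary constraints on x; with z = 3 they cannot all fit together.
funcs = [(["x", "a"], {}), (["x", "b"], {}), (["x", "c"], {})]
print([[s for s, _ in g] for g in mini_buckets("x", funcs, z=3)])
# -> [[['x', 'a'], ['x', 'b']], [['x', 'c']]]
```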

Local consistency
  • Very useful in classical CSP
    • Node consistency
    • Arc consistency…
  • Produces an equivalent more explicit problem:
    • Same optimum (consistency): gives a lb (may detect infeasibility)
    • Compositional: can be synergistically combined with other techniques « transparently »
Classical arc consistency (binary CSP)
  • for any xi and cij
    • (cij⋈cj)[xi] brings no new information on xi

(Figure: joining cij with cj and projecting onto xi: values of xi without any support become forbidden (T).)
Arc consistency and soft constraints
  • for any xi and fij
    • (fij ⊕ fj)[xi] brings no new information on xi

(Figure: the same operation with soft constraints, here with costs in [0, 1].)

Adding the projection is always equivalence-preserving iff ⊕ is idempotent
Idempotent soft CN (Bistarelli et al. 97)
  • The previous operational extension works on any idempotent semiring CN
    • Repeated application of the local enforcing process until quiescence
    • Terminates and yields an equivalent problem
    • Extends to non binary constraints, to higher level of consistency (k-consistency)
    • Total order: the only idempotent ⊕ is max
Non idempotent: weighted CN
  • for any xi and fij
    • (fij ⊕ fj)[xi] brings no new information on xi

(Figure: in a weighted CN, simply adding the projection (fij ⊕ fj)[xi] as a new unary constraint counts some costs twice.)

EQUIVALENCE LOST
Extraction of implied constraints
  • Combination+Extraction: equivalence preserving transformation
  • Requires the existence of a pseudo-difference ⊖ such that (a ⊖ b) ⊕ b = a: fair valued CN.

fij fj

fij fj [xi]

v

v

2

1

0

0

1

0

0

w

w

1

i

j

First international summer school on constraint processing

fair and unfair
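An illustrative sketch of such an equivalence-preserving move in the weighted case, in a simplified form that projects fij alone onto the unary fi (the operation used by the soft arc consistencies below); the slides' figure projects the combination fij ⊕ fj, which can be obtained by first extending fj into fij.

```python
# Illustrative equivalence-preserving projection for a weighted CN: move cost
# from a binary function f_ij onto the unary f_i and subtract it from f_ij,
# so that the combined cost of every complete assignment is unchanged.
def project_binary_to_unary(fij, fi, dom_i, dom_j, top):
    """fij: dict (a, b) -> cost; fi: dict a -> cost (both modified in place)."""
    for a in dom_i:
        alpha = min(fij[(a, b)] for b in dom_j)   # cost that can be moved to (i, a)
        if alpha == 0:
            continue
        fi[a] = min(fi[a] + alpha, top)           # add it to the unary constraint
        for b in dom_j:
            if fij[(a, b)] < top:                 # fair subtraction: T - alpha = T
                fij[(a, b)] -= alpha
    return fij, fi

# Usage: value v of x_i receives cost 1, and every row of f_ij now contains a 0.
fij = {("v", "v"): 2, ("v", "w"): 1, ("w", "v"): 0, ("w", "w"): 1}
fi = {"v": 0, "w": 0}
print(project_binary_to_unary(fij, fi, ["v", "w"], ["v", "w"], top=6))
```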
Fair and unfair
  • Classic CN: a⊖b = or (max)
  • Fuzzy CN: a⊖b = max≼
  • Weighted CN: a ⊖ b = a − b if a ≠ T, else T
  • Bayes nets: a ⊖ b = a / b
  • An unfair structure: (ℕ ∪ {ω, T}, ≽)
    • n: n years of prison
    • ω: life imprisonment
    • T: death penalty
    • m ⊕ n = m + n
    • ω ⊕ n = ω
    • ω ⊕ ω = T

The set of differences of ω and ω is ℕ: there is no maximum difference
(K,Y) equivalence preserving inference
  • For a set K of constraints and a scope Y
    • Replace K by ⊕(K)
    • Add (⊕K)[Y] to the CN (implied by K)
    • Extract (subtract) (⊕K)[Y] from ⊕(K)
  • Yields an equivalent network
  • All the implicit information on Y in K is made explicit
  • Repeat for a class of (K,Y) until a fixpoint is reached
Node Consistency (NC*): (fi, ∅) EPI
  • For any variable xi
    • ∀a, f∅ ⊕ fi(a) < T
    • ∃a, fi(a) = 0
  • Complexity: O(nd)

(Figure: enforcing NC* on a small weighted CN with T = 4: unary costs are projected onto f∅ and values a with f∅ ⊕ fi(a) ≥ T are deleted.)
Full AC (FAC*): ({fij,fj},xi) EPI
  • NC*
  • For all fij
    • ∀a ∃b, fij(a,b) ⊕ fj(b) = 0 (b is a full support)

(Figure: on a two-variable example with T = 4, requiring full supports keeps shifting cost back and forth between the unary constraints.)

That’s our starting point!

No termination !!!
Arc Consistency (AC*): ({fij},xi) EPI
  • NC*
  • For all fij
    • ∀a ∃b, fij(a,b) = 0 (b is a support)
  • Complexity: O(n²d³)

(Figure: enforcing AC* on the example with T = 4: binary costs are projected onto unary costs, then onto f∅.)
Confluence is lost

Finding an AC closure that maximizes the lb is an NP-hard problem (Cooper & Schiex 2004)

Directional AC (DAC*): ({fij,fj},xi)i<j EPI
  • NC*
  • For all fij (i < j)
    • ∀a ∃b, fij(a,b) ⊕ fj(b) = 0 (b is a full support)
  • Complexity: O(ed²)

(Figure: enforcing DAC* with variable order x < y < z on the example with T = 4.)
Full DAC (FDAC*): DAC + AC
  • NC*
  • For all fij (i < j)
    • ∀a ∃b, fij(a,b) ⊕ fj(b) = 0 (full support)
  • For all fij (i > j)
    • ∀a ∃b, fij(a,b) = 0 (support)
  • Complexity: O(end³)

(Figure: enforcing FDAC* with variable order x < y < z on the example with T = 4.)
Implementation example for AC

Data structures

  • A queue Q of variables whose domain has been pruned
  • S(i,a,j): current support of value a ∊ Di on fij (AC)
  • S(i): current support of f∅ on xi (NC)

An illustrative fixpoint loop is sketched below.
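For concreteness, an illustrative fixpoint loop in Python (it does not implement the AC-2001 support data structures S(i,a,j)/S(i) above, and assumes domains never wipe out): it alternates binary-to-unary projections, unary-to-f∅ projections, and NC* value deletions until nothing changes.

```python
# Illustrative AC*/NC* fixpoint for a binary weighted CN (no AC-2001 supports).
def enforce_ac_star(domains, fi, fij, f0, top):
    """domains: var -> list of values; fi: var -> {value: cost};
    fij: (i, j) -> {(a, b): cost}; returns the transformed network and f0."""
    changed = True
    while changed:
        changed = False
        # 1. Project binary costs onto unary costs, in both directions.
        for (i, j), table in fij.items():
            for a in domains[i]:
                alpha = min(table[(a, b)] for b in domains[j])
                if alpha:
                    fi[i][a] = min(fi[i][a] + alpha, top)
                    for b in domains[j]:
                        if table[(a, b)] < top:
                            table[(a, b)] -= alpha
                    changed = True
            for b in domains[j]:
                alpha = min(table[(a, b)] for a in domains[i])
                if alpha:
                    fi[j][b] = min(fi[j][b] + alpha, top)
                    for a in domains[i]:
                        if table[(a, b)] < top:
                            table[(a, b)] -= alpha
                    changed = True
        # 2. Project unary costs onto f0, then delete NC*-inconsistent values.
        for i in domains:
            alpha = min(fi[i][a] for a in domains[i])
            if alpha:
                f0 = min(f0 + alpha, top)
                for a in domains[i]:
                    if fi[i][a] < top:
                        fi[i][a] -= alpha
                changed = True
            pruned = [a for a in domains[i] if min(f0 + fi[i][a], top) >= top]
            if pruned:
                domains[i] = [a for a in domains[i] if a not in pruned]
                changed = True
    return domains, fi, fij, f0

# Usage on a two-variable example with top = 4: the lower bound f0 reaches 2.
doms = {"x": ["v", "w"], "y": ["v", "w"]}
unary = {"x": {"v": 0, "w": 1}, "y": {"v": 2, "w": 0}}
binary = {("x", "y"): {("v", "v"): 1, ("v", "w"): 2, ("w", "v"): 0, ("w", "w"): 2}}
print(enforce_ac_star(doms, unary, binary, f0=0, top=4))
```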
AC 2001 based version

Support for Node consistency

Support for Arc consistency

Value deletion (cost back propagation)

The fixpoint loop

Space complexity
  • Space can be reduced to O(ed)

O(erd) for arity r (not only tables)

Hierarchy

Special case: classic CSP (Top = 1) gives NC, AC, DAC

Hierarchy, from weaker to stronger: NC* O(nd) < DAC* O(ed²), AC* O(n²d³) < FDAC* O(end³) < EDAC* (IJCAI 2005)
Other results
  • AC defined for arbitrary fair structures and arbitrary arities (Schiex 2000, Cooper & Schiex 2004)
  • k-consistency: from all constraints with a scope included in a set of n variables to a subset of n-1 variables (Cooper 2005)
  • Cyclic consistency (Cooper 2004), Existential AC (de Givry et al. 2005): not easily described as EPI.


Global soft constraints

Using SaH schema

Compare with Soft AC

Global soft constraints
  • Existing global soft constraints are «Soft as Hard» (Petit et al. 2000)
  • Three questions
    • What is the semantics of the global constraint?
    • What level of consistency is enforced?
    • And how?
The all-diff (Petit et al 2001, van Hoeve 2004)
  • Semantics ?
    • a) number of variables whose value needs to be changed to satisfy the constraint (max n).
    • b) primal graph violations: the number of pairs of variables with the same value (max n(n-1)/2).
  • Level of consistency: hyper-arc (GAC)
  • Algorithm: (minimum cost) maximum flow algorithms on the value graph (with costs).
Other soft global constraints
  • Soft global cardinality (van Hoeve et al. 2004)
  • Soft regular (van Hoeve et al. 2004)
  • Other possible semantics for soft constraints (some with NP-hard consistency checks) (Beldiceanu & Petit 2004)

GAC on SaH vs. Soft AC ?

(Figure: a three-variable example x1, x2, x3 with binary costs of 1, on which soft AC derives f∅ = 1; the SaH encoding introduces cost variables x12, x23 ∊ {0, 1} and xc = x12 + x23 ∊ {0, 1, 2}.)
So…
  • Is it possible…
    • That soft AC is always at least as good as SaH GAC ? (my conjecture is yes)
    • To offer cost variables and nicely integrate soft CN (local consistency, unary constraints for heuristics…) (my conjecture is yes)
    • To improve current global soft constraints handling so that they reach (or exceed) soft AC, DAC, FDAC, EDAC…


Hybrid search (search+inference)

+ variable elimination

+ mini buckets

+ local consistency

Boosting search with variable elimination: BB-VE(k)
  • At each node
    • Select an unassigned variable xi
    • If degi ≤ k then eliminate xi
    • Else branch on the values of xi
  • Properties
    • BB-VE(-1) is BB
    • BB-VE(w*) is VE
    • BB-VE(1) is similar to cycle-cutset

A self-contained sketch follows.
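A self-contained illustrative sketch (reusing the table representation of the earlier sketches; not the authors' implementation): the degree test decides between exact elimination and branching, and the branch side prunes with the constant cost functions accumulated so far.

```python
# Illustrative, self-contained BB-VE(k) on table cost functions (weighted case).
from itertools import product

def bounded_sum(costs, top):
    return min(sum(costs), top)

def eliminate(var, funcs, domains, top):
    """Exact elimination of `var` (same idea as the bucket-elimination sketch)."""
    bucket = [f for f in funcs if var in f[0]]
    rest = [f for f in funcs if var not in f[0]]
    scope = sorted({x for s, _ in bucket for x in s if x != var})
    table = {}
    for values in product(*(domains[x] for x in scope)):
        best = top
        for a in domains[var]:
            ctx = dict(zip(scope, values)); ctx[var] = a
            best = min(best, bounded_sum(
                [t[tuple(ctx[x] for x in s)] for s, t in bucket], top))
        table[values] = best
    return rest + [(scope, table)]

def assign(var, val, funcs):
    """Condition every cost function on var = val."""
    out = []
    for scope, table in funcs:
        if var not in scope:
            out.append((scope, table)); continue
        i = scope.index(var)
        out.append((scope[:i] + scope[i + 1:],
                    {t[:i] + t[i + 1:]: c for t, c in table.items() if t[i] == val}))
    return out

def bb_ve(variables, domains, funcs, k, top):
    """At each node: eliminate the variable if its degree is <= k, else branch."""
    if not variables:
        return bounded_sum([t[()] for _, t in funcs], top)  # only constants remain
    xi = variables[0]
    degree = len({x for s, _ in funcs if xi in s for x in s}) - 1
    if degree <= k:
        return bb_ve(variables[1:], domains, eliminate(xi, funcs, domains, top), k, top)
    best = top
    for a in domains[xi]:
        reduced = assign(xi, a, funcs)
        lb = bounded_sum([t[()] for s, t in reduced if not s], top)  # constants = f0
        if lb < best:
            best = min(best, bb_ve(variables[1:], domains, reduced, k, top))
    return best

# Usage: a 3-variable path coloring (hard differences, prefer "b"); optimum 1.
doms = {"x1": ["b", "g"], "x2": ["b", "g"], "x3": ["b", "g"]}
neq = {("b", "b"): 6, ("b", "g"): 0, ("g", "b"): 0, ("g", "g"): 6}
funcs = [(["x1", "x2"], dict(neq)), (["x2", "x3"], dict(neq))]
funcs += [([v], {("b",): 0, ("g",): 1}) for v in doms]
print(bb_ve(list(doms), doms, funcs, k=1, top=6))   # -> 1
```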
Example BB-VE(2)

Example BB-VE(2)

Boosting search with variable elimination
  • Ex: still-life (academic problem)
    • Instance: n=14
      • #var:196 , #val:2
    • Ilog Solver → 5 days
    • Variable Elimination → 1 day
    • BB-VE(18) → 2 seconds

Other combination: Backtrack with Tree Decomposition (BTD, Jégou and Terrioux 2003).

Use incomplete inference

Lower bound inside BB

Using mini buckets (Kask & Dechter 99)
  • Assume a DFS tree search using a static variable order x1…xn
  • Compute mini-buckets beforehand (eliminating xn…x1)
    • mini-eliminating xi generates a family of cost functions fxij = (⊕ Kxij)[-xi]
  • When (x1…xp) is assigned, any fxij
    • whose scope is included in (x1…xp)
    • such that i > p (implied by not-completely-assigned constraints)

has a known value that is not redundant with the lb
Boosting search with LC

BT(X,D,C)
  if (X = ∅) then Top := f∅
  else
    xj := selectVar(X)
    for all a ∊ Dj do
      for all fS ∊ C s.t. xj ∊ S: fS := fS[xj = a]
      if (LC) then BT(X − {xj}, D − {Dj}, C)

(Figure: maintaining local consistency during search, from weakest to strongest: BT, MNC, MAC/MDAC, MFDAC, MEDAC.)

Experiments

Evaluating bounds/algorithms

Why is WCSP so hard ? (informally)

Experimenting with soft CN
  • A good lb is a compromise between its strength and its computational cost
    • academic problems (from n-queens to graph coloring)
    • real problems (frequency allocation, pedigree analysis…)
    • random problems (Max CSP, Max SAT, weighted Max SAT)

Phase transitions ?

Classic CSP

Optimization vs. satisfaction

Why is it so hard ?

Problem P(α): is there an assignment of cost lower than α?

  • Proof of inconsistency (simplest)
  • Proof of optimality (hardest): harder than finding an optimum
Random Max CSP

(Figure: CPU time as a function of the number of variables on random Max-CSP instances.)
Random Max-2SAT

(Figure: results on random Max-2SAT; PBS, OPBDP and CPLEX use a natural SaH formulation.)
Frequency assignment
  • n communication links
  • set of available frequencies
  • close links must use sufficiently different frequencies (|xi-xj| > dij)
    • Minimize weighted sum of violated constraints
    • CELAR/GRAPH instances: still open problems

Boosting Systematic Search with Local consistency

Frequency assignment problem CELAR6-sub4

#var: 22 , #val: 44 , #constraints: 477

  • toolbar: MNC* ≈ 1 year, MFDAC* ≈ 1 hour

Unweighted:
  • PFC-MRDAC → 154.5 s
  • MFDAC* (toolbar) → 36.2 s
  • MEDAC* (toolbar) → 4.4 s
  • AOMFDAC (REES) → 2574 s (SVO)
Uncapacitated warehouse location
  • We consider n stores and m possible warehouse locations.
  • Each warehouse has a maintenance cost ce, and a supply cost cem for each store.
  • Each store must be supplied by exactly one warehouse.

Which warehouses should be opened?

NP-hard problem.

Model
  • n integer variables (stores), domain size m
  • m Boolean variables for the candidate warehouses
  • m + n unary cost functions (costs)
  • n·m hard binary constraints relating stores and warehouses
Results


Complexity & Polynomial classes

Tree = induced width 1

Idempotent ⊕ or not…
Idempotent VCSP (⊕ = max)
  • A new operation on soft constraints:
    • The α-cut of a soft constraint f is the hard constraint that forbids t iff f(t) > α
  • Can be applied to a whole soft CN

(Figure: Cut(f, 0.6) on an example constraint.)
Solving a min/max CN by slicing
  • Let α be the minimum level such that the α-cut is consistent.
  • The set of solutions of the (classical) α-cut = the set of all optimal solutions of the min/max CN
  • A possibilistic/fuzzy CN can be solved with O(log(#different costs)) CSP resolutions (binary search on α).
  • Can be used to lift polynomial classes

A sketch of the binary search follows.
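An illustrative sketch of the binary search, assuming a black-box CSP oracle (here a brute-force one for the tiny example); all names are hypothetical.

```python
# Illustrative binary search on the alpha-cuts of a min-max (fuzzy/possibilistic) CN.
from itertools import product

def alpha_cut(soft, alpha):
    """Hard constraints forbidding every tuple of cost > alpha."""
    return [(scope, {t for t, c in table.items() if c <= alpha})
            for scope, table in soft]

def solve_min_max(soft, solve_csp):
    costs = sorted({c for _, table in soft for c in table.values()})
    lo, hi = 0, len(costs) - 1
    while lo < hi:                       # O(log #different costs) CSP calls
        mid = (lo + hi) // 2
        if solve_csp(alpha_cut(soft, costs[mid])):
            hi = mid                     # consistent: try a smaller alpha
        else:
            lo = mid + 1
    return costs[lo]                     # assumes the top-level cut is consistent

def brute_force_csp(hard):
    """Toy CSP oracle for the two-variable example below."""
    doms = {"x": ["v", "w"], "y": ["v", "w"]}
    return any(all(tuple(dict(zip(doms, vals))[v] for v in scope) in allowed
                   for scope, allowed in hard)
               for vals in product(*doms.values()))

soft = [(["x", "y"], {("v", "v"): 0.8, ("v", "w"): 0.3, ("w", "v"): 0.5, ("w", "w"): 0.9}),
        (["x"], {("v",): 0.2, ("w",): 0.7})]
print(solve_min_max(soft, brute_force_csp))   # optimal max-cost: 0.3
```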
Polynomial classes: idempotent VCSP (min-max CN)
  • Can use α-cuts to lift CSP classes
    • Sufficient condition: the polynomial class is « conserved » by α-cuts
  • Simple TCSPs are TCSPs where each constraint uses a single interval: xi − xj ∊ [aij, bij]
  • Fuzzy STCN: any slice of a cost function is an interval (semi-convex functions)
Polynomial classes: non-idempotent (weighted CN)
  • MaxSAT is MAXSNP-complete (no PTAS)
  • Weighted MaxSAT is FP^NP-complete
  • MaxSAT is FP^NP[O(log n)]-complete: the weights make the difference!
  • MaxSAT tractable languages are fully characterized (Creignou 2001)
  • The Max-CSP language with feq(x,y) = ((x = y) ? 0 : 1) is NP-hard.
Generalized intervals…
  • Assume domains are ordered
  • fc[a,b](x,y) = ((x ∊ [a,b]) ? c : 0) is a generalized interval function: tractable
  • If u ≤ x and v ≤ y imply f(u,v) + f(x,y) ≤ f(u,y) + f(x,v), then f is a submodular function: it decomposes into generalized intervals and is tractable in O(n³d³)
  • Examples: ax + by + c, sqrt(x² + y²), |x − y|^r (r ≥ 1)…

Resources

Solvers

Benchmarks

Resources (solvers, benchmarks…)
  • Con'Flex: fuzzy CSP system with integer, symbolic and numerical constraints (www.inra.fr/bia/T/conflex).
  • clp(FD,S): semi-ring CLP (pauillac.inria.fr/.georget/clp_fds/clp_fds.html).
  • LVCSP: Common-Lisp library for Valued CSP with an emphasis on strictly monotonic operators (ftp.cert.fr/pub/lemaitre/LVCSP).
  • Choco: was a claire library for CSP. Existing layers above Choco implement some WCSP algorithms (choco.sourceforge.net).
  • REES: a C++ library by Radu Matuescu (R. Dechter lab.)
  • Incop: incomplete algorithms (go with the winners)

Toolbar and the soft constraints wiki

Open source library

  • Accessible from the Soft wiki site:

http://carlit.toulouse.inra.fr/cgi-bin/awki.cgi/SoftCSP

  • Implements MNC,MAC,MDAC,MFDAC,MEDAC
  • Reads MaxCSP/SAT (weighted or not) and ERGO bayes nets file format
  • Together with thousands of benchmarks in standardized format
  • Source forge at

http://mulcyber.toulouse.inra.fr/projects/toolbar/

Conclusion
  • A large part of the classic CN body of knowledge has been extended and adapted to soft CN, and efficient solving tools exist.
  • Much remains to be done:
    • Extension: to other problems than optimization (counting, quantification…)
    • Techniques: symmetries, learning, knowledge compilation…
    • Algorithmic: still better lb, other local consistencies or dominance. Exploiting problem structure.
    • Implementation: better integration with classic CN solver (Choco, Solver)
    • Applications: problem modelling, solving, heuristic guidance, partial solving (see Tools (de Givry et al., 2004)).
