Evolving Local Search Heuristics for SAT Using Genetic Programming

Alex Fukunaga

Computer Science Department

University of California, Los Angeles

Outline
  • Satisfiability testing
  • Local search for SAT
  • CLASS: A GP system for discovering SAT local search heuristics
  • Empirical evaluation of CLASS
Propositional Satisfiability (SAT)
  • Given:
    • Set of boolean variables
    • A well-formed formula over those variables
  • Problem: Is there some assignment of truth values (true, false) to the variables such that the formula evaluates to true?
  • Examples:

1) (a or b or ~c) and (~a or c) and (~a or ~b or c)

Satisfiable (let a=true, b=false, c=true)

2) (a or b or c) and (~a or ~b) and (~b or ~c) and (~a or ~c)

Unsatisfiable.
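As a concrete illustration of the definition, here is a minimal brute-force check of example 1 in Python (the clause representation and function names are ours, purely for illustration):

from itertools import product

# Example 1 in CNF: a clause is a list of (variable, required_value) literals.
formula = [[("a", True), ("b", True), ("c", False)],   # (a or b or ~c)
           [("a", False), ("c", True)],                 # (~a or c)
           [("a", False), ("b", False), ("c", True)]]   # (~a or ~b or c)

def satisfies(assignment, formula):
    # A formula is satisfied iff every clause contains a literal whose
    # required value matches the assignment.
    return all(any(assignment[v] == val for v, val in clause) for clause in formula)

# Enumerate all 2^3 truth assignments and print the satisfying ones.
for values in product([True, False], repeat=3):
    assignment = dict(zip("abc", values))
    if satisfies(assignment, formula):
        print(assignment)   # e.g. {'a': True, 'b': False, 'c': True}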

SAT
  • NP-complete
    • First problem to be proven NP-complete (Cook, 1971)
  • Numerous applications
    • Circuit verification
    • Scheduling
    • AI Planning
    • Theorem proving
  • Very active research area
    • Recent applications in VLSI CAD
    • Annual SAT conference
Local Search for SAT
  • Local search is very successful at solving hard, satisfiable SAT instances.
  • Many local search algorithm variants for SAT have been proposed since the introduction of GSAT [Selman, Mitchell, Levesque 1992].
  • Generic SAT local search framework:

T := randomly generated truth assignment
For j := 1 to cutoff
    If T satisfies formula then return T
    V := choose a variable using some variable selection heuristic
    T := T with the value of V flipped
Return failure (no satisfying assignment found).
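A runnable sketch of this generic framework, assuming the same list-of-clauses representation as in the brute-force example above; select_variable stands in for any of the variable selection heuristics discussed on the following slides:

import random

def local_search(formula, variables, select_variable, cutoff=100000):
    # Generic SAT local search: start from a random assignment and flip one
    # variable per iteration until the formula is satisfied or cutoff is hit.
    T = {v: random.choice([True, False]) for v in variables}
    for _ in range(cutoff):
        unsat = [c for c in formula
                 if not any(T[v] == val for v, val in c)]
        if not unsat:
            return T                      # satisfying assignment found
        v = select_variable(T, formula, unsat)
        T[v] = not T[v]                   # flip the chosen variable
    return None                           # failure within the flip cutoff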

Definitions
  • Positive/Negative/Net Gain
    • Given a candidate variable assignment T for CNF formula F, let B0 be the total # of clauses in F that are unsatisfied under T. Let T’ be the assignment obtained from T by flipping variable V, and let B1 be the total # of clauses that would be unsatisfied under T’.
    • The net gain of V is B0-B1. The negative gain of V is the # of clauses which are satisfied in T but unsatisfied in T’. The positive gain of V is the # of clauses which are unsatisfied in T but satisfied in T’.
  • Variable Age
    • The age of a variable is the # of flips since it was last flipped.
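These definitions map directly to code. A hedged sketch (recomputing satisfaction from scratch for clarity; real solvers maintain gains incrementally, which is why a single flip stays cheap):

def gains(T, formula, v):
    # Return (positive_gain, negative_gain, net_gain) for flipping variable v.
    T2 = dict(T)
    T2[v] = not T2[v]
    sat_before = [any(T[u] == val for u, val in c) for c in formula]
    sat_after  = [any(T2[u] == val for u, val in c) for c in formula]
    pos = sum(1 for b, a in zip(sat_before, sat_after) if not b and a)  # clauses fixed
    neg = sum(1 for b, a in zip(sat_before, sat_after) if b and not a)  # clauses broken
    return pos, neg, pos - neg   # net gain = positive gain - negative gain

# Variable age can be tracked with a map last_flipped[v] updated by the search
# loop; age(v) = current flip count - last_flipped[v].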
SAT local search heuristics: GSAT Family
  • GSAT [Selman, Mitchell, Levesque 1992] – Select variable with highest net gain. Break ties randomly.
    • “make greedy moves”
  • HSAT [Gent and Walsh, 1993] – Same as GSAT, but break ties in favor of the maximum-age (least recently flipped) variable.
    • “make greedy moves, break ties by age”
  • GWSAT [Selman, Kautz 1993] – With probability p, select a random variable in a randomly chosen unsatisfied (broken) clause; otherwise same as GSAT.
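For illustration only (this is not the CLASS or reference code), GSAT and GWSAT selection can be sketched on top of the gains helper above; the noise parameter p=0.5 is an arbitrary placeholder:

import random

def gsat_select(T, formula, unsat):
    # GSAT: flip the variable with the highest net gain; break ties randomly.
    scores = {v: gains(T, formula, v)[2] for v in T}
    best = max(scores.values())
    return random.choice([v for v, s in scores.items() if s == best])

def gwsat_select(T, formula, unsat, p=0.5):
    # GWSAT: with probability p make a random-walk move (random variable from a
    # random broken clause); otherwise make a GSAT move.
    if random.random() < p:
        return random.choice([v for v, _ in random.choice(unsat)])
    return gsat_select(T, formula, unsat)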
SAT Local Search: Walksat
  • Walksat [Selman, Kautz, Cohen 1994] – Pick a random broken clause BC. If any variable in BC has negative gain of 0, randomly select one such variable. Otherwise, with probability p, select a random variable from BC to flip, and with probability (1-p), select the variable in BC with minimal negative gain (break ties randomly).
  • Beating a correct, optimized implementation of Walksat is difficult.
  • Many variants of “walksat” appear in literature
    • Many of them are incorrect interpretations/ implementations of [Selman, Kautz, Cohen] Walksat.
    • Most of them perform significantly worse.
    • None of them perform better.
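A sketch of the Walksat (SKC) selection rule in the same style, again built on the gains helper; this is our reading of the rule above, not the reference C implementation:

import random

def walksat_select(T, formula, unsat, p=0.5):
    # Walksat (SKC): pick a random broken clause; if some variable there breaks
    # no clauses (negative gain 0), flip one such "freebie"; otherwise flip a
    # random clause variable with probability p, or the minimal-negative-gain
    # variable (ties broken randomly) with probability 1-p.
    clause = random.choice(unsat)
    neg = {v: gains(T, formula, v)[1] for v, _ in clause}
    freebies = [v for v, n in neg.items() if n == 0]
    if freebies:
        return random.choice(freebies)
    if random.random() < p:
        return random.choice(list(neg))
    best = min(neg.values())
    return random.choice([v for v, n in neg.items() if n == best])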
SAT Local Search: Novelty Family
  • Novelty [McAllester, Selman, Kautz 1997] – Pick a random unsatisfied clause BC. Select the variable v in BC with maximal net gain, unless v has the minimal age in BC. In the latter case, select v with probability (1-p); otherwise, flip the variable v2 with the 2nd-highest net gain.
  • Novelty+ [Hoos and Stutzle 2000] – Same as Novelty, but after BC is selected, with probability pw, select a random variable in BC; otherwise continue with Novelty.
  • R-Novelty [McAllester, Selman, Kautz 1997] and R-Novelty+ [Hoos and Stutzle 2000] – Similar to Novelty/Novelty+, but more complex.
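A sketch of Novelty selection under the same assumptions; the last_flipped map (flip counter of each variable's most recent flip) is our addition for tracking age:

import random

def novelty_select(T, formula, unsat, last_flipped, p=0.5):
    # Novelty: in a random broken clause, take the best-net-gain variable unless
    # it was flipped most recently among the clause's variables; in that case
    # take it with probability 1-p and the second-best variable with probability p.
    clause = random.choice(unsat)
    ranked = sorted({v for v, _ in clause},
                    key=lambda v: gains(T, formula, v)[2], reverse=True)
    youngest = max(ranked, key=lambda v: last_flipped.get(v, -1))
    best = ranked[0]
    second = ranked[1] if len(ranked) > 1 else best
    if best != youngest or random.random() < 1 - p:
        return best
    return second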
Structure of the standard SAT heuristics
  • Score variables with respect to gain metric
    • Walksat uses negative gain,
    • GSAT and Novelty use net gain.
  • Restricted set of candidate variables
    • GSAT – any var in formula;
    • Walksat/Novelty – pick variable from a single random unsatisfied clause
  • Ranking of variables with respect to scoring metric.
    • Greedy choices considered by all heuristics. Novelty variants also consider 2nd best variable.
  • Variable Age – prevent cycles and force exploration
    • Novelty variants, HSAT, Walksat+tabu
  • Conditional branching: some Boolean condition (a random number or a function of the above primitives) is evaluated as the basis for a branch in the decision process
  • Heuristics have a compact, non-obvious combinatorial structure
Some observations on composite variable selection heuristics
  • Difficult to determine a priori how effective any given heuristic is.
  • Many heuristics look similar to Walksat, but performance varies significantly (cf. [McAllester, Selman, Kautz 1997]).
  • Empirical evaluation necessary to evaluate complex heuristics. Previous efforts to discover new heuristics involved significant experimentation.
  • Humans are skilled at identifying primitives, but find it difficult/time-consuming to combine the primitives into effective composite heuristics.
    • Let’s automate the grunt-work!
Previous Applications of Evolutionary Computing to SAT
  • Evolve candidate solutions (bit strings representing variable assignments).
    • Problems:
      • Ineffective search
      • Slow – evaluation of a candidate is O(# variables)
        • vs. local search: O(1) generation + evaluation of a new candidate variable assignment.
  • Relatively successful approaches
    • Hybrid EC + Local Search
    • EC that is similar to local search
  • See recent survey by [Gottlieb, Marchiori, Rossi, 2002]
Standard Genetic/Local Search Hybrids

Genetic algorithm generates candidate solutions, which are then improved by local search; the local search is guided by a fixed, human-designed search control heuristic (Walksat, GSAT).

[Diagram: problem instance → genetic algorithm → solution → local search (Walksat/GSAT control heuristic) → improved solution]

Similar: evolutionary algorithms with a very small (< 10) population, relying mostly on a single-bit mutation operator.

CLASS (Composite heuristic Learning Algorithm for SAT local Search)
  • Language for expressing variable selection heuristics
  • Meta-level search algorithm to explore space of possible selection heuristics expressible in the language.
CLASS
  • GP evolves local search control heuristics that perform well on training instances.

[Diagram, off-line learning: training problem instances are fed to genetic programming; each candidate search control heuristic drives local search on the training instances, and the resulting solution score is used as its fitness.]

Evolved heuristics are then applied to new problem instances.

[Diagram, application: evolved search control heuristic + problem instance → local search → solution]

CLASS Implementations of Standard SAT Heuristics
  • GSAT with Random Walk (GWSAT)

(If (rand 0.5)
    RandomVarBC0
    VarBestNetGainWFF)

  • Walksat

(IfVarCond == NegativeGain 0
    VarBestNegativeGainBC0
    (If (rand 0.5)
        VarBestNegativeGainBC0
        RandomVarBC0))

  • Novelty

(IfNotMinAge BC0
    VarBestNetGainBC0
    (If (rand 0.5)
        VarBestNetGainBC0
        VarSecondBestNetGainBC0))

CLASS Learning Algorithm (GP variant)
  • Initialize(population)
  • For I=1 to MaxIterations
    • Select parent1 and parent2 from the population
    • Children = Compose(parent1, parent2)
    • Evaluate(Children)
    • Insert(Children, Population)
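A minimal sketch of this steady-state loop; random_heuristic, compose, and evaluate (score on the training instances) are stand-ins for CLASS internals not reproduced here, and parent selection and insertion are simplified to one child per iteration with replace-worst insertion:

import random

def class_gp(random_heuristic, compose, evaluate, pop_size=1000, children=4500):
    # Steady-state GP loop: repeatedly compose two parents into a child, score
    # the child on the training instances, and insert it back into the
    # population if it beats the current worst (higher score = better).
    population = [random_heuristic() for _ in range(pop_size)]
    scores = [evaluate(h) for h in population]
    for _ in range(children):
        p1, p2 = random.sample(population, 2)      # parent selection (simplified)
        child = compose(p1, p2)
        s = evaluate(child)
        worst = min(range(pop_size), key=scores.__getitem__)
        if s > scores[worst]:
            population[worst], scores[worst] = child, s
    best = max(range(pop_size), key=scores.__getitem__)
    return population[best]                        # best heuristic found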
Composition Operator
  • Given two heuristics H1 and H2, combine the two into a new heuristic that chooses between H1 and H2 using schema:

If Condition H1 else H2

Example:

H1 = (OlderVar VarBestNetGainBC0 VarBestNetGainBC1)

H2 = VarBestNegativeGainBC0

Compose(H1, H2, (If (rand 0.5))) =

(If (rand 0.5)
    (OlderVar VarBestNetGainBC0 VarBestNetGainBC1)
    VarBestNegativeGainBC0)

  • Similar to crossover at 0-depth.
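Treating heuristics as nested s-expressions (here, Python tuples), composition is a single tree operation; a sketch under that representation assumption:

def compose(h1, h2, condition=("rand", 0.5)):
    # Combine two heuristic expressions under a conditional root: (If condition h1 h2).
    return ("If", condition, h1, h2)

h1 = ("OlderVar", "VarBestNetGainBC0", "VarBestNetGainBC1")
h2 = "VarBestNegativeGainBC0"
print(compose(h1, h2))
# -> ('If', ('rand', 0.5),
#     ('OlderVar', 'VarBestNetGainBC0', 'VarBestNetGainBC1'),
#     'VarBestNegativeGainBC0')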
Why Use the Composition Operator?
  • Humans seem to have done this successfully (deriving GWSAT from GSAT, Novelty+ from Novelty, etc.)
  • Probabilistic Composition – special case of composition where Condition is a randomization function:

(If (rand < p) then H1 else H2)

  • Theorem: Probabilistic composition preserves PAC property
      • PAC (Probabilistically Approximately Complete) – probabilistically guaranteed to find a solution to a satisfiable instance [Hoos 1998].
Empirical Evaluation
  • Evolve heuristics using hard, random 3-SAT instances (250 variables, 1043 clauses) as training set.
    • Population size = 1000, Generate 4500 children (< 1 day)
  • Compare evolved heuristics with standard heuristics on benchmark problem instances.
    • Walksat-v43 : Highly optimized, frequently updated C implementation of Walksat maintained by Henry Kautz
    • Novelty+ : also part of Walksat-v43 code
      • Tuned control parameter settings for best possible performance on our benchmarks, best results shown here.
      • Update: Kautz’s Walksat code placed 2nd in recent (annual) SAT-2004 competition in nonsystematic algorithm category.
    • RFEA2+, ASAP are state of the art evolutionary algorithms
      • Flip-efficient, but slow in runtime (clause weighting → slow flips); similar to algorithms such as SDF, DLM, ESG, which at best run as fast as Novelty.
  • Note: CLASS is currently implemented in Common Lisp (CMUCL)
    • 2-4x slower than C on this type of code, even when optimized (CMUCL doesn’t handle 1-D array accesses and char arrays as well as C).
Comparison of CLASS-evolved heuristics vs. previous evolutionary algorithms and standard heuristics

300,000 flips per run (10 runs/instance, 50 instances/suite)

SR = % of runs that found a solution (higher=better)

AFS = Average # of flips (iterations) to find solution (smaller=better)

Time = average runtime of successful runs

CLASS-evolved heuristics vs. standard heuristics (longer runs)

100,000,000 flips per run (1 run per instance, 50 instances per suite)

SR = % of runs that found a solution (higher=better)

AFS = Average # of flips (iterations) to find solution (smaller=better)

Time = average runtime of successful runs

Evolved Heuristics Generalize Well
  • Using a training set of 100-250 variable phase transition instances results in robust heuristics that perform well on:
    • Problems from the same class as training set
    • Bigger problems
    • Standard benchmarks from completely different problem classes (AI Planning, circuit verification, graph coloring).
    • See AAAI 2002 paper.
Analysis of Heuristics Learned by CLASS
  • What are the learned heuristics doing?
  • Why do learned heuristics generalize?
  • Analyze heuristics using some intuitive metrics proposed by [Schuurmans and Southey, 2000].
    • Depth: Greediness/Aggressiveness of algorithm
    • Mobility: How fast the algorithm moves between attractors
    • Coverage: How well the algorithm samples the total search space
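For reference, depth and mobility have simple operational readings; the sketch below shows how one might log them from a search trace. The window size and trace format are our assumptions, and coverage (which needs more machinery) is omitted:

def depth(unsat_counts):
    # Depth: mean number of unsatisfied clauses over a run
    # (smaller means the search stays close to satisfying assignments).
    return sum(unsat_counts) / len(unsat_counts)

def mobility(assignments, window=32):
    # Mobility: mean Hamming distance between assignments `window` flips apart.
    dists = [sum(a[v] != b[v] for v in a)
             for a, b in zip(assignments, assignments[window:])]
    return sum(dists) / len(dists) if dists else 0.0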
Analysis of Local Search Behavior
  • Depth, Mobility, and Coverage of SAT heuristics. Mean of 10 runs/instance on 100 hard, satisfiable 100-variable/430-clause instances.
  • SR=success rate. AFS = Average # of Flips to find Solution (in successful runs)
Does the GP Really Work?
  • Is the GP searching the space of heuristics effectively?
  • Compare vs. Generate-and-Test (random sampling).
    • Compare 10 runs each of GP and G&T.
      • 5500 heuristics evaluated per run
        • G&T: Generate and evaluate 5500 heuristics
        • GP: population size=1000, terminate after 4500 children generated and evaluated.
        • Depth limit=6.
      • Mean value of the best individual found in each run:
        • G&T = 1213, GP=130
      • Mean of final populations in GP = 1306
      • Mean of all individuals generated by G&T = 411.
      • GP found better solutions, focusing search on high-quality individuals
        • NOT just “getting lucky” and generating a single better individual than G&T
Summary of Results
  • CLASS-evolved heuristics outperform
    • Highly tuned implementations of standard heuristics (Novelty+, Walksat)
    • Previous evolutionary approaches (RFEA2+, ASAP).
    • More flip-efficient and, more importantly, run faster (total runtime)
      • Room for further improvement (CLASS implemented in Common Lisp, 2-3x slower than C).
  • Heuristics evolved on hard (phase transition) random 3-SAT problems generalize well to other classes of SAT problems (planning, graph coloring, all-interval series) [Fukunaga 2002].
  • Analysis of depth, mobility, and coverage metrics show that evolved heuristics are intuitively “doing the right thing”.
  • GP is effectively searching space of local search heuristics.
Conclusion: CLASS is a Human Competitive system for SAT heuristic discovery.
  • SAT is a widely studied, difficult, NP-complete problem.
  • CLASS uses GP to automate the task of generating good, composite SAT local search heuristics.
  • CLASS generates heuristics that are competitive with state of the art human-designed SAT local search heuristics.
    • Evolved heuristics significantly outperform heuristics that were state of the art and considered significant achievements at the time of their publication (Novelty/+, Walksat, GSAT).
  • Future Work: Entry in SAT-2005 competition.