
Faster Extraction of High-Level Minimal Unsatisfiable Cores


Presentation Transcript


  1. Faster Extraction of High-Level Minimal Unsatisfiable Cores. SAT’11 Conference, Ann Arbor, USA, June 21, 2011. Vadim Ryvchin and Ofer Strichman, Technion, Israel

  2. Agenda • Introduction and motivation • Optimizations • A. Partial Resolution • B. Selective clause minimization • C. Postponed IC-propagation • E. Selective learning of IC-clauses • G. Removal Strategy • Experimental results • Resolution vs. Selector variables

  3. High-Level UC • Given: • A set of interesting constraints (ICs) Φ = { IC1, IC2, …, ICm }, and • The remainder ψ • A subset Φ' ⊆ Φ is a high-level UC (HLUC) if ψ ∪ Φ' is unsatisfiable • An HLUC is minimal (HLMUC) if removing any IC from Φ' makes ψ ∪ Φ' satisfiable
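
A minimal sketch of these definitions, assuming a hypothetical SAT oracle is_sat(clauses) that is not part of the talk; it merely restates the HLUC/HLMUC conditions as checks over ψ and Φ:

    # Minimal sketch of the HLUC / HLMUC definitions. is_sat is an assumed
    # SAT oracle taking a list of clauses and returning True iff satisfiable.
    def is_hluc(remainder, ics, is_sat):
        # remainder: list of clauses; ics: collection of ICs, each a list of clauses
        clauses = list(remainder)
        for ic in ics:
            clauses.extend(ic)
        return not is_sat(clauses)          # HLUC iff the remainder plus the ICs is UNSAT

    def is_hlmuc(remainder, ics, is_sat):
        if not is_hluc(remainder, ics, is_sat):
            return False
        # Minimality: dropping any single IC must make the formula satisfiable.
        return all(not is_hluc(remainder, [o for o in ics if o is not ic], is_sat)
                   for ic in ics)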

  4. Examples • Abstraction-refinement in model checking [MA’03] • Latches in the core define the next abstraction. • Compositional Formal Equivalence Checking (FEC) [CGLNR’10] • Decompose the compared circuits into blocks. • Assume the inputs to the blocks are the same. • Assumptions in the core need to be proved.

  5. Traditional UC Extraction, Stage 1: Translate to Clauses. The remainder (the rest of the formula); an interesting constraint; each small square is a propositional clause, e.g. (a + b’)

  6. Traditional UC Extraction, Stage 2: Extract a Clause-Level UC. The remainder (the rest of the formula); an interesting constraint; colored squares belong to the clause-level UC

  7. Traditional UC Extraction, Stage 3: Map the UC Back to ICs. The remainder (the rest of the formula); an interesting constraint; the UC contains three interesting constraints!

  8. A Mismatch between Mainstream Research and the Needs of Real-World Applications • Real-world applications: reduce the # of interesting constraints in the core • Latches/gates for abstraction refinement • Assumptions for compositional FEC • Vast majority of existing algorithms: reduce the # of clauses in the core • 19/21 papers on UC extraction only consider clause-level UC extraction

  9. Small/Minimal Clause-Level UC ⇏ Small/Minimal High-Level UC • There can be a small clause-level UC while the high-level UC is the largest possible • There can be a large clause-level UC while the high-level UC is empty

  10. Resolution Refutation  [diagram: a resolution DAG over input clauses C1-C9 and derived clauses C10-C23, where C23 = () is the empty clause]  Legend: Input clauses / Derived clauses

  11. Resolution Refutation  [same DAG]  Legend: Input clauses / Derived clauses  Empty clause cone: { C4, C5, C6, C7, C13, C14, C19, C20, C23 }  Unsat core: { C4, C5, C6, C7 }

  12. Resolution Refutation  [same DAG, with the empty-clause cone and the core highlighted]  Legend: Empty Clause Cone / Unsat Core  Empty clause cone: { C4, C5, C6, C7, C13, C14, C19, C20, C23 }  Unsat core: { C4, C5, C6, C7 }
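
The cone and the core above can be read off the proof by a backward traversal from the empty clause. A sketch, assuming the refutation is stored as a mapping parents[c] from each derived clause to the clauses it was resolved from (an illustrative representation, not the tool's actual data structure):

    # Clause-level core extraction from a resolution refutation.
    def unsat_core(parents, empty_clause):
        cone, stack = set(), [empty_clause]
        while stack:                          # walk backward from the empty clause
            c = stack.pop()
            if c in cone:
                continue
            cone.add(c)
            stack.extend(parents.get(c, ()))
        # The core is the set of input clauses inside the empty-clause cone,
        # e.g. {C4, C5, C6, C7} in the example above.
        return {c for c in cone if c not in parents}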

  13. Resolution with ICs  [same DAG; the input clauses are now colored by origin]  Legend: IC1 / IC2 / Remainder input clauses; Derived clauses

  14. Resolution with ICs  [same DAG; the derived clauses are also colored by the ICs their derivations depend on]  Legend: IC1 / IC2 / Remainder, for both input and derived clauses

  15. HLUC  [same DAG; the only IC whose clauses reach the empty clause is IC2]  Legend: IC1 / IC2 / Remainder  HLUC: { IC2 }

  16. HLMUC Algorithm [N’10]  ψ - remainder, Φ - ICs. Initialization: Φ' = Φ. Assumption: ψ ∧ Φ is UNSAT. Loop: remove one ICi ∈ Φ' that wasn’t already removed and solve ψ ∧ (Φ' \ {ICi}); if UNSAT, Φ' = HLUC (the ICs participating in the proof); if SAT, restore ICi; terminate when there are no unchecked ICs left.
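
A sketch of the loop behind this flow, assuming a hypothetical incremental call solve(remainder, ics) that returns a status together with, on UNSAT, the set of ICs participating in the resolution proof:

    def hlmuc(remainder, ics, solve):
        # Assumption: remainder together with all ICs is UNSAT.
        phi = set(ics)                      # current candidate Φ'
        necessary = set()                   # ICs whose removal made the formula SAT
        while phi - necessary:              # some IC is still unchecked
            ic = next(iter(phi - necessary))
            status, hluc = solve(remainder, phi - {ic})
            if status == "UNSAT":
                phi = set(hluc)             # shrink Φ' to the ICs used in the proof
            else:
                necessary.add(ic)           # ic is needed; never try to remove it again
        return phi                          # a high-level minimal UC

Necessary ICs are never dropped by the UNSAT branch, since any unsatisfiable subset must contain them, so the loop ends with every remaining IC checked.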

  17. Contribution of this Work • Seven optimizations for single-HLMUC extraction: improved run time and smaller HLMUCs • A comparison between resolution-based and selector-variable-based solvers

  18. A. Partial Resolution • Observations: • IC-clauses are usually 5-15% of the problem clauses • We do not need the whole resolution table • Suggestion: • Keep only the clauses relevant to IC resolutions (see the sketch below) • Result: • The size of the resolution graph is reduced • Very effective on large CNFs
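
A sketch of the bookkeeping this suggests, with illustrative names: a derived clause is recorded in the resolution graph only if its derivation depends on an IC-clause, directly or through an ancestor:

    def record_derived(graph, ic_dependent, derived, parents):
        # graph: clause -> list of parent clauses, kept only for IC-dependent clauses.
        # ic_dependent: clause -> bool; True for IC input clauses and their descendants.
        if any(ic_dependent.get(p, False) for p in parents):
            ic_dependent[derived] = True
            graph[derived] = list(parents)   # needed later for HLUC extraction
        else:
            ic_dependent[derived] = False    # no IC ancestry: not stored, saving memory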

  19. A. Partial Resolution  [resolution DAG over C1-C23, clauses colored by origin]  Legend: IC1 / IC2 / Remainder

  20. A. Partial Resolution  [same DAG; derived clauses with no IC ancestors are marked as not needed]  Legend: IC1 / IC2 / Not Needed

  21. A. Partial Resolution  [pruned DAG: only C1, C7, C8, C10, C15, C16, C17, C20, C21, C22 and C23 = () remain]  Legend: IC1 / IC2

  22. A. Partial Resolution - Summary  [pruned DAG]  Legend: IC1 / IC2  Keep only the needed resolutions

  23. B. Selective clause minimization • Clause minimization is a technique for shrinking conflict clauses • The algorithm is based on traversing the resolution DAG backward from each literal in the learned clause • The problem: • Minimization may turn a non-IC-clause into a shorter IC-clause

  24. B. Selective clause minimization  c1 = (¬v1 ∨ v2)  c2 = (¬v2 ∨ v3)  c3 = (¬v4 ∨ v5)  c4 = (¬v5 ∨ v6)  c5 = (¬v1 ∨ ¬v3 ∨ ¬v4 ∨ ¬v6)

  25. B. Selective clause minimization  c1 = (¬v1 ∨ v2)  c2 = (¬v2 ∨ v3)  c3 = (¬v4 ∨ v5)  c4 = (¬v5 ∨ v6)  c5 = (¬v1 ∨ ¬v3 ∨ ¬v4 ∨ ¬v6)  [implication graph fragment: v1 propagates v2 via c1 and v3 via c2]

  26. B. Selective clause minimization  c1 = (¬v1 ∨ v2)  c2 = (¬v2 ∨ v3)  c3 = (¬v4 ∨ v5)  c4 = (¬v5 ∨ v6)  c5 = (¬v1 ∨ ¬v3 ∨ ¬v4 ∨ ¬v6)  [implication graph: v1, v2, v3 and v4, v5, v6 are assigned through c1-c4; c5 yields the conflict with v6]

  27. B. Selective clause minimization  [same clauses and implication graph as the previous slide]

  28. B. Selective clause minimization  c1 = (¬v1 ∨ v2)  c2 = (¬v2 ∨ v3)  c3 = (¬v4 ∨ v5)  c4 = (¬v5 ∨ v6)  c5 = (¬v1 ∨ ¬v3 ∨ ¬v4 ∨ ¬v6)  1-UIP based conflict analysis: c6 = (¬v1 ∨ ¬v3 ∨ ¬v4)  [same implication graph]

  29. B. Selective clause minimization  c1 = (¬v1 ∨ v2)  c2 = (¬v2 ∨ v3)  [minimization step: ¬v3 is resolved away through c2 and c1, leading back to ¬v1]

  30. B. Selective clause minimization  c1 = (¬v1 ∨ v2)  c2 = (¬v2 ∨ v3)  c6 = (¬v1 ∨ ¬v3 ∨ ¬v4) is minimized to c6 = (¬v1 ∨ ¬v4)

  31. B. Selective clause minimization  c1 = (¬v1 ∨ v2)  c2 = (¬v2 ∨ v3)  c3 = (¬v4 ∨ v5)  c4 = (¬v5 ∨ v6)  c5 = (¬v1 ∨ ¬v3 ∨ ¬v4 ∨ ¬v6)  c6 = (¬v1 ∨ ¬v4)  [same implication graph]

  32. B. Selective clause minimization  c1 = (¬v1 ∨ v2)  c2 = (¬v2 ∨ v3) (IC)  c3 = (¬v4 ∨ v5)  c4 = (¬v5 ∨ v6)  c5 = (¬v1 ∨ ¬v3 ∨ ¬v4 ∨ ¬v6)  [same implication graph]

  33. B. Selective clause minimization  c1 = (¬v1 ∨ v2)  c2 = (¬v2 ∨ v3) (IC)  With minimization using c2: c6 = (¬v1 ∨ ¬v4) (IC)  Without minimization: c6 = (¬v1 ∨ ¬v3 ∨ ¬v4) (remainder)

  34. B. Selective clause minimization • Suggested solution: • Disable minimization if it would add a dependency on an IC-clause • i.e., learn c6 = (¬v1 ∨ ¬v3 ∨ ¬v4) instead of c6 = (¬v1 ∨ ¬v4) • Disabling minimization ⇒ fewer derived IC-clauses ⇒ fewer IC-clauses in the UC, and the HLMUC is found faster
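
A rough sketch of the guard, simplified relative to a real CDCL minimizer (polarity and decision-level bookkeeping are omitted); reason(lit) and is_ic(clause) are illustrative helpers, not the paper's code:

    def removable(lit, learnt, reason, is_ic, seen=None):
        # lit may be dropped from the learnt clause only if every clause reached
        # while resolving it away is a remainder clause; an IC antecedent (or a
        # decision) blocks the removal, which is exactly the selective part.
        seen = set() if seen is None else seen
        ante = reason(lit)                   # clause that implied lit; None for decisions
        if ante is None or is_ic(ante):
            return False
        for other in ante:
            if other == lit or other in learnt or other in seen:
                continue
            if not removable(other, learnt, reason, is_ic, seen):
                return False
            seen.add(other)
        return True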

  35. C. Postpone IC-propagations • Change the BCP order  [standard flow: Run BCP → on conflict, Analyze Conflict; when no implications remain, proceed to the Next Operations]

  36. C. Postpone IC-propagations • Change the BCP order  [modified flow: Run BCP over non-IC clauses → on conflict, Analyze Conflict; when no implications remain, Propagate a single IC-clause → on conflict, Analyze Conflict; if an implication was found, return to BCP over non-IC clauses; if there are no implications, proceed to the Next Operations]

  37. C. Postpone IC-propagations • Increases the chance of getting conflicts in the remainder • Decreases the number of derived IC-clauses • Decreases the number of IC-clauses in the UC
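
A sketch of the modified BCP loop, with illustrative helper methods: propagation over remainder clauses is exhausted before any IC-clause is allowed to propagate, and IC-clauses propagate one at a time:

    def postponed_ic_bcp(solver):
        # Returns a conflict clause, or None when propagation finishes quietly.
        while True:
            conflict = solver.propagate_remainder()        # exhaust BCP over non-IC clauses
            if conflict is not None:
                return conflict                            # best case: conflict in the remainder
            conflict, implied = solver.propagate_one_ic()  # let at most one IC-clause fire
            if conflict is not None:
                return conflict
            if not implied:
                return None                                # nothing left to propagate
            # a single IC implication was added: go back and exhaust remainder BCP first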

  38. E. Selective Learning  [implication graph: nodes at decision levels @2, @3 and @5, ending in a conflict node; one edge is an IC-clause implication]  Legend: implication / IC-clause implication

  39. E. Selective Learning  [same implication graph; the IC-clause implication is highlighted]

  40. E. Selective Learning  [same implication graph]  The learnt clause would have to be marked as an “IC-clause”

  41. E. Selective Learning • We refrain from learning IC-clauses • Instead: • do not learn it • learn a (non-asserting) remainder clause • make a decision • How?

  42. E. Selective Learning • How? • Treat the last IC-clause implication as a decision • Perform a new 1-UIP conflict analysis • The learnt clause is a ‘remainder’ clause

  43. E. Selective Learning  [implication graph as on slide 38: the standard 1-UIP analysis would learn an IC-clause]

  44. E. Selective Learning  [implication graph after treating the IC-clause implication as a decision: the nodes it implied move from level @5 to @6]

  45. E. Selective Learning  [same graph; the new 1-UIP analysis is performed with the conflict at level @6]

  46. E. Selective Learning  [same graph; the resulting learnt clause is a remainder clause]
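
A sketch of the re-analysis step described on slide 42, on top of conventional CDCL bookkeeping; the helper names are illustrative:

    def selective_learn(solver, conflict):
        learnt = solver.analyze_1uip(conflict)         # standard 1-UIP analysis
        if not solver.depends_on_ic(learnt):
            return learnt                              # usual case: learn the clause as is
        # Refrain from learning an IC-clause: treat the last IC-clause implication
        # on the trail as if it had been a decision (its level is bumped, e.g. @5 to @6)
        # and redo the 1-UIP analysis from the same conflict.
        solver.reinterpret_last_ic_implication_as_decision()
        return solver.analyze_1uip(conflict)           # a (possibly non-asserting) remainder clause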

  47. G. Removal Strategy  Recall: in each iteration one ICi ∈ Φ' that wasn’t already removed is chosen, and ψ ∧ (Φ' \ {ICi}) is solved; if UNSAT, Φ' = HLUC; if SAT, ICi is restored; the loop ends when there are no unchecked ICs left.

  48. G. Removal Strategy • What is the effect of the removal order? • Which IC should we remove first?

  49. G. Removal Strategy • Criterion: the number of clauses each IC contributes to the clause-level UC • Option 1: choose the IC with the fewest clauses in the UC • if UNSAT (the IC is not necessary), convergence is faster • if UNSAT (not necessary), further removals are more likely • Option 2: choose the IC with the most clauses in the UC • if SAT (the IC is necessary), its clauses are added to the ‘remainder’ quickly
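
A small sketch of the two orderings; core_clauses[ic], the number of ic's clauses in the current clause-level core, is assumed bookkeeping, not the tool's API:

    def pick_ic(candidates, core_clauses, strategy="fewest"):
        # "fewest": hope for UNSAT and fast convergence; "most": if SAT, many
        # clauses move to the remainder at once.
        key = lambda ic: core_clauses.get(ic, 0)
        return min(candidates, key=key) if strategy == "fewest" else max(candidates, key=key)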

  50. Experimental Results • Benchmark set: • Industrial problems from Intel • Average #clauses = 2,572,270 • Average #ICs = 3,804 • Average #IC-clauses = 96,568 (6% of #clauses) • Machine: • Intel® Xeon®, 4 GHz, 32 GB of memory
