
BLACK BOX INVARIANT SYNTHESIS

Pranav Garg, University of Illinois at Urbana-Champaign. Joint work with P. Madhusudan (UIUC), Christof Löding (RWTH Aachen), and Daniel Neider (RWTH Aachen); in consultation with Dan Roth (UIUC). A presentation on automated program verification.


Presentation Transcript


  1. BLACK BOX INVARIANT SYNTHESIS Joint work with: P. Madhusudan (UIUC), Christof Löding (RWTH Aachen), Daniel Neider (RWTH Aachen) In consultation with: Dan Roth (UIUC) Pranav Garg, University of Illinois at Urbana-Champaign

  2. Automated program verification • Does a program P meet its safety specification? • Floyd-Hoare style deductive verification requires annotating loops with invariants • To alleviate this annotation burden → invariant synthesis • Example: pre: true; foo(int x) { y := x; while (x != 0) { x := x - 1; y := y - 1; } return y; } Invariant: x = y; post: y = 0
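
  The adequacy of the invariant x = y for this example can be checked mechanically. The following is a minimal sketch, not taken from the slides, that discharges the three proof obligations for foo with the Z3 SMT solver; the choice of Z3 and the encoding of one loop iteration as a relation over (x, y) and (x1, y1) are assumptions made for illustration.

      # Sketch: check that x == y is an adequate loop invariant for foo.
      from z3 import Ints, Implies, And, Not, Solver, unsat

      x, y, x1, y1 = Ints("x y x1 y1")
      inv = x == y            # candidate invariant at the loop head
      inv_next = x1 == y1     # the same invariant on the post-state

      conditions = [
          # Initiation: after y := x, the invariant holds.
          Implies(y == x, inv),
          # Consecution: one iteration (x := x - 1; y := y - 1) preserves it.
          Implies(And(inv, x != 0, x1 == x - 1, y1 == y - 1), inv_next),
          # Safety: on loop exit (x == 0), the postcondition y == 0 holds.
          Implies(And(inv, x == 0), y == 0),
      ]

      for cond in conditions:
          s = Solver()
          s.add(Not(cond))    # cond is valid iff its negation is unsatisfiable
          assert s.check() == unsat
      print("x == y is an adequate loop invariant for foo")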

  3. Invariant synthesis • Reach, the set of reachable states, is post-closed, contains Init, and avoids Bad • Adequate invariant: any post-closed set that contains Init and avoids Bad • Many choices (e.g. the candidate sets I1 and I2 in the diagram); which one to pick? • Learn the “simplest” invariant! [Diagram: the state space with Init, Reach, Bad and candidate invariants I1, I2]
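
  Spelled out, the conditions this slide states in words are the following (a standard formulation; T denotes the program's transition relation, which the slide leaves implicit): a set of states I is an adequate invariant when

      \[
        \mathit{Init} \subseteq I, \qquad
        \forall s, s'.\; \big(s \in I \wedge T(s, s')\big) \Rightarrow s' \in I
        \quad \text{(post-closed)}, \qquad
        I \cap \mathit{Bad} = \emptyset .
      \]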

  4. Syntax-guided synthesis • Synthesize Inv such that it contains Init, is post-closed, and avoids Bad (the adequacy conditions above) • This can be reduced to syntax-guided synthesis (SyGuS)

  5. Black-box learning of invariants [Diagram: a Learner proposes a hypothesis H from concrete data points; a Teacher, with access to the Program and a constraint solver, checks the hypothesis and returns new concrete data points]

  6. Active learning of invariants [Diagram: the Teacher checks the hypothesis H against Init, Bad and the bounded reachable states Reach_k, using the Program and a constraint solver, and returns concrete data points to the Learner]

  7. Active learning of invariants [Diagram: the same Teacher-Learner interaction as on the previous slide]

  8. Active learning of invariants [Diagram: as before, with a pair of states (p, p') returned by the Teacher when the hypothesis H is not inductive]
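
  A minimal sketch of the teacher's side of this loop for the foo example (an illustration, not code from the slides; Z3 is again an assumed constraint solver): the teacher checks a candidate hypothesis H and, if some condition fails, reports a concrete data point, namely a positive example, a negative example, or a pair of states witnessing non-inductiveness.

      from z3 import Ints, Implies, And, Not, Solver, sat

      x, y, x1, y1 = Ints("x y x1 y1")

      def check_hypothesis(H):
          """Teacher: check the adequacy conditions for a candidate invariant
          H(x, y) for foo; on failure, return a concrete data point."""
          def counterexample(formula, vs):
              s = Solver()
              s.add(Not(formula))
              if s.check() == sat:
                  m = s.model()
                  return tuple(m.eval(v, model_completion=True).as_long() for v in vs)
              return None

          # Positive example: a state satisfying Init (here y == x) excluded by H.
          pos = counterexample(Implies(y == x, H(x, y)), (x, y))
          if pos is not None:
              return ("positive", pos)
          # Negative example: a state H must exclude (in H, on exit, post fails).
          neg = counterexample(Implies(And(H(x, y), x == 0), y == 0), (x, y))
          if neg is not None:
              return ("negative", neg)
          # Implication pair (p, p'): H holds at p but not after one iteration.
          impl = counterexample(
              Implies(And(H(x, y), x != 0, x1 == x - 1, y1 == y - 1), H(x1, y1)),
              (x, y, x1, y1))
          if impl is not None:
              return ("implication", (impl[:2], impl[2:]))
          return ("adequate invariant", None)

      print(check_hypothesis(lambda a, b: a >= 0))   # reports a counterexample data point
      print(check_hypothesis(lambda a, b: a == b))   # ('adequate invariant', None)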

  9. ICE: Learning using examples, counter-examples and implications • To witness that H is not inductive, the teacher communicates a pair (p, p’) • - the learner must then ensure that if p is in its new hypothesis, then p’ is as well; which of the two changes to make is the learner’s choice and depends on simplicity, etc. • Robust framework • - Ensures progress and an honest teacher • Strong convergence • - Can the learner eventually learn an invariant irrespective of the teacher’s answers? [Diagram: the same teacher-learner loop, with the Learner now passive]
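
  The constraint that an ICE sample places on a hypothesis can be stated compactly. The sketch below (an illustration with hypothetical data points, not from the slides) checks whether a candidate predicate over program states is consistent with a set of positive examples, negative examples, and implication pairs.

      def ice_consistent(h, positives, negatives, implications):
          """Check whether hypothesis h (a predicate over states) is consistent
          with an ICE sample: positives must be inside h, negatives outside,
          and for every implication (p, q), h(p) must entail h(q)."""
          return (all(h(p) for p in positives)
                  and not any(h(n) for n in negatives)
                  and all(h(q) or not h(p) for p, q in implications))

      # Hypothetical sample for the foo example; states are (x, y) pairs.
      positives = [(3, 3), (0, 0)]          # reachable from Init
      negatives = [(0, 5)]                  # violates the postcondition y == 0
      implications = [((2, 2), (1, 1))]     # one loop iteration

      print(ice_consistent(lambda s: s[0] == s[1], positives, negatives, implications))  # True
      print(ice_consistent(lambda s: s[0] >= 0,    positives, negatives, implications))  # False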

  10. ICE-learning numerical invariants • Learning algorithm is strongly convergent • Implemented as a tool over Boogie (from Microsoft Research)
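
  To see what strong convergence means: a learner that enumerates candidate invariants from a fixed template in a fixed order cannot be made to diverge, whatever data points the teacher returns. The toy sketch below (an illustration only, not the algorithm implemented in the tool) does this for the hypothetical template x - y <= c, reusing ice_consistent from the sketch above.

      def learn_from_template(positives, negatives, implications, max_c=100):
          """Toy strongly convergent learner: enumerate constants c in a fixed
          order and return the first instance of the template x - y <= c that
          is consistent with the current ICE sample."""
          for c in range(-max_c, max_c + 1):
              h = lambda s, c=c: s[0] - s[1] <= c
              if ice_consistent(h, positives, negatives, implications):
                  return c
          return None   # no instance of the template fits the sample

      # Hypothetical ICE sample; states are (x, y) pairs.
      print(learn_from_template([(3, 3)], [(5, 0)], [((2, 2), (1, 1))]))   # -> 0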

  11. ICE-learning numerical invariants

  12. Learning data structure invariants • Learn invariants over linear data structures (arrays and lists) • Unbounded size of the heap • - properties are universally quantified • General form: universally quantified formulas over the data structure (see the example after this slide) • Strongly convergent algorithms for learning quantified invariants • Quantified Data Automata (QDA): a normal form for such invariants • - adapt passive Regular Positive Negative Inference (RPNI) for learning QDAs [Diagram: the list pointed to by head is sorted: head → 5 → 7 → 8 → 9]
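
  For the picture on this slide, one way to write the sortedness property as a universally quantified invariant (a formulation added here, not taken from the slides; y1 and y2 range over list cells, →* is reachability along next pointers, and d(·) reads the data field) is:

      \[
        \forall y_1, y_2.\;
        \big(\mathit{head} \rightarrow^{*} y_1 \;\wedge\; y_1 \rightarrow^{*} y_2\big)
        \;\Rightarrow\; d(y_1) \le d(y_2)
      \]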

  13. Ongoing Work • Applying machine learning algorithms for synthesizing invariants • Applying machine learning algorithms to program synthesis • Synthesizing invariants for separation logic.
