Evidence-Based Verification

Presentation Transcript


  1. Evidence-Based Verification
  Li Tan
  Computer Science Department, Stony Brook
  April 2002

  2. Outline
  • Part I: Evidence-based verification.
    • Motivations.
    • The general framework.
    • Applications.
  • Part II: Evidence-based model checking.
    • Checker-independent evidence for model checking.
    • Extracting the evidence from existing model checkers.
    • Post-model-checking analyses based on the evidence:
      • efficiently certifying model-checking results;
      • constructing a winning strategy for the model-checking game;
      • evaluating the quality of the model-checking process.
    • A prototype on the Concurrency Workbench (CWB-NC).

  3. Verification
  • Automatic verification: whether or not a transition system satisfies a property.
  • Successful applications (at Stony Brook alone): checking communication protocols, mechanical designs, medical devices, anti-lock braking systems, etc.
  • A verification algorithm (checker) works as a decision procedure for the problem.
  • A bare "Yes/No" answer may not satisfy users:
    • Why did my design go wrong?
    • Could my design satisfy the property trivially?
    • Can I trust the verification result?

  4. Problems with Traditional Diagnostic Generation
  Diagnosis is about understanding the result. A diagnostic routine may:
  • perform its own reasoning, or
  • reuse the proof already computed by a checker.
  Problems:
  • The diagnostic routine is tightly geared to the structure of the checker:
    • implementation requires an understanding of the checker;
    • migrating a diagnostic routine onto another checker often requires major changes to both the routine and the checker.
  • A proof used for one diagnostic schema may not be usable for a different schema.
  • There is no additional check on the verification result.

  5. Evidence-Based Verification: let the result carry its own proof
  (Architecture diagram: Checker 1 … Checker n each emit a portable proof of correctness; a verifier checks it, rejecting invalid proofs; Diagnostic Schema 1 … Diagnostic Schema m consume it.)

  6. The General Framework
  • Defining Abstract Proof Structures (APS) as portable evidence:
    • APS encodes the proof structures of different checkers in a standard form;
    • APS carries the evidence to justify the result.
  • Extracting APS from existing checkers.
  • Utilizing APS to perform diagnoses:
    • certifying the verification result;
    • generating diagnostic information;
    • evaluating the quality of the verification process.

  7. Requirements
  • APS can be extracted from existing checkers.
    • The extraction should not affect the complexities of checkers.
  • The consistency of APS should be verified efficiently.
    • The time and space complexities of certifying APS should not exceed the complexities of the checkers producing it.
  • A variety of diagnoses can be performed using APS.
  • APS should be defined for the three major approaches to verification: model checking, equivalence checking, and preorder checking.

  8. Evidence-Based Model Checking: a Sub-framework
  Background of model checking: does T ⊨ f?
  • T is modeled as a Kripke structure T = ⟨S, s_I, →, V⟩:
    • S is the set of states, with starting state s_I ∈ S;
    • → ⊆ S × S is the transition relation;
    • V: A → 2^S is an evaluation for atomic propositions.
  • f is encoded in some temporal logic, e.g. the CTL formula AG(a ⇒ AF b).
  • The model-checking problem can be encoded as a Boolean equation system.
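A minimal Python sketch of this structure, as a concrete reference point for the notation (the class name Kripke, the helper succ, and the three-state example are illustrative assumptions, not part of the talk):

```python
from dataclasses import dataclass

@dataclass
class Kripke:
    """T = <S, s_I, ->, V>: states, initial state, transitions, valuation."""
    states: set       # S
    initial: str      # s_I, a member of S
    trans: set        # -> as a set of (s, t) pairs, a subset of S x S
    valuation: dict   # V: atomic proposition -> set of states where it holds

    def succ(self, s):
        """Successors of state s under ->."""
        return {t for (u, t) in self.trans if u == s}

    def holds(self, prop, s):
        """Does atomic proposition `prop` hold at state s?"""
        return s in self.valuation.get(prop, set())

# A tiny request/grant example: 'a' (request) is always eventually followed
# by 'b' (grant), so this T satisfies the CTL property AG(a => AF b).
T = Kripke(
    states={"s0", "s1", "s2"},
    initial="s0",
    trans={("s0", "s1"), ("s1", "s2"), ("s2", "s0")},
    valuation={"a": {"s1"}, "b": {"s2"}},
)
assert T.succ("s1") == {"s2"}
```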

  9. Fixpoint Equation System: Syntax
  Given a set of variables 𝒳 and a complete lattice ⟨H, ≤⟩, an equation system E is a finite sequence of fixpoint equations σ₁X₁ = f₁, …, σₙXₙ = fₙ, where:
  • σᵢ ∈ {μ, ν} is a (least, respectively greatest) fixpoint operator;
  • fᵢ: H^𝒳 → H is monotonic;
  • θ ∈ H^𝒳 is an environment for E;
  • ⟨H^𝒳, ⊑⟩ is a complete lattice (ordered pointwise);
  • θ[X/h] maps X ∈ 𝒳 to h ∈ H and agrees with θ on all other variables;
  • E^(k) denotes the tail of E starting from the k-th equation.

  10. Equation System: Semantics
  ⟦E⟧: H^𝒳 → H^𝒳 is a function on environments, defined one equation at a time: ⟦ε⟧(θ) = θ, and ⟦(σ₁X₁ = f₁) E'⟧(θ) = ⟦E'⟧(θ[X₁/h*]), where h* is the σ₁-fixpoint of the map h ↦ f₁(⟦E'⟧(θ[X₁/h])). In other words, the first equation is the outermost fixpoint.


  12. Boolean (Fixpoint) Equation System
  Syntax:
  • H = ⟨{0, 1}, ≤⟩ is the Boolean lattice;
  • θ ∈ 2^𝒳 can be viewed as a set (of the variables mapped to 1);
  • E is closed if every X ∈ 𝒳ᵢ, i.e. every variable appearing on a right-hand side, also appears as a left-side variable.
  For a closed E:
  • ⟦E⟧(θ₁) = ⟦E⟧(θ₂) for any θ₁, θ₂ ∈ H^𝒳, so we simply write ⟦E⟧ for ⟦E⟧(θ);
  • ⟦E⟧(X) assigns X a Boolean value.
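To make the nested-fixpoint semantics concrete, here is a deliberately naive Python sketch (exponential-time, purely illustrative; the tuple encoding of equations and the example system are my assumptions, not the talk's implementation). Each equation is (sign, variable, op, rhs), where sign is 'mu' or 'nu' and op is 'or' or 'and' over the right-hand-side variables; the first equation is treated as the outermost fixpoint:

```python
def solve(eqs, env=None):
    """Evaluate a Boolean fixpoint equation system by the recursive
    nested-fixpoint semantics (first equation = outermost).
    Exponential in the worst case -- for illustration only."""
    env = dict(env or {})
    if not eqs:
        return env
    (sign, x, op, rhs), rest = eqs[0], eqs[1:]
    val = 1 if sign == 'nu' else 0          # start at top for nu, bottom for mu
    while True:                             # Kleene iteration to the fixpoint
        inner = solve(rest, {**env, x: val})
        args = [inner[y] for y in rhs]
        new_val = int(any(args)) if op == 'or' else int(all(args))
        if new_val == val:
            break
        val = new_val
    solution = solve(rest, {**env, x: val})
    solution[x] = val
    return solution

# E:  nu X1 = X2 \/ X3 ;  mu X2 = X1 /\ X3 ;  nu X3 = X3   (a closed BES)
E = [('nu', 'X1', 'or',  ['X2', 'X3']),
     ('mu', 'X2', 'and', ['X1', 'X3']),
     ('nu', 'X3', 'or',  ['X3'])]
print(solve(E))   # every variable gets value 1
```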

  13. Model Checking via BES
  • BES E = Kripke structure T + property f.
  • E is closed.
  • A variable X in the BES stands for a pair ⟨s, f'⟩ (state s, subformula f').
  • ⟦E⟧(X) = 1 iff s ⊨_T f'.
  • Many checkers (implicitly) construct BESs:
    • for a μ-calculus checker, BES = T + μ-calculus formula;
    • for an automaton-based checker, BES = parity automaton.
  • E can be constructed on the fly.
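As a hedged illustration of how a Kripke structure plus a property turns into a closed BES, here is a sketch for one simple property, the invariant AG b; richer properties such as AG(a ⇒ AF b) or full μ-calculus need more equation shapes than this toy ∨/∧-over-variables format. Variable X_s plays the role of ⟨s, AG b⟩ and P_s the role of ⟨s, b⟩; the function name and the encoding conventions (empty ∧ is 1, empty ∨ is 0) are mine:

```python
def bes_for_AG(trans, valuation, prop):
    """Encode 'AG prop' over a Kripke structure as a closed BES.
    trans:     dict state -> list of successor states
    valuation: dict state -> set of atomic propositions true there
    Returns equations in the (sign, var, op, rhs) format used above."""
    eqs = []
    for s, succs in trans.items():
        # <s, AG prop> holds iff prop holds at s and AG prop holds at every successor
        eqs.append(('nu', f'X_{s}', 'and', [f'P_{s}'] + [f'X_{t}' for t in succs]))
        # constants: an empty 'and' is 1 (true), an empty 'or' is 0 (false)
        if prop in valuation.get(s, set()):
            eqs.append(('nu', f'P_{s}', 'and', []))
        else:
            eqs.append(('mu', f'P_{s}', 'or', []))
    return eqs

# Example: a three-state cycle where 'b' fails at s1, so <s0, AG b> should get 0.
trans = {'s0': ['s1'], 's1': ['s2'], 's2': ['s0']}
V = {'s0': {'b'}, 's1': set(), 's2': {'b'}}
for eq in bes_for_AG(trans, V, 'b'):
    print(eq)
# The resulting closed system can be fed to a BES evaluator such as the
# naive one sketched after slide 12.
```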

  14. Evaluating Equation System: an Example
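For a flavor of such an evaluation, here is a small worked example of my own showing that the nesting order of equations matters. Consider the two closed BESs E₁ = (μX = Y; νY = X) and E₂ = (νY = X; μX = Y), which contain the same equations but differ in which one is outermost. In E₁ the outer least fixpoint is computed first: the inner system gives Y the value of X, so X = lfp(h ↦ h) = 0, and hence X = Y = 0. In E₂ the outer greatest fixpoint is computed first: the inner system gives X the value of Y, so Y = gfp(h ↦ h) = 1, and hence X = Y = 1. The value of a variable therefore depends on the whole nested system, not just on its own equation.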

  15. Support Set

  16. Support Set (Continued)
  • By (a) and (b), a support set implies a fixpoint solution for E.
  • By (c), a support set respects the definition of least/greatest fixpoints:
    • if r = 1, there is no bad loop in Ξ;
    • if r = 0, there is no good loop in Ξ.
  Theorem 1 [TanCle02]: Let G = ⟨r, X, Ξ⟩ be a support set for E. Then ⟦E⟧(X) = r.

  17. Extracting Support Sets
  The extraction is:
  • practical: support sets can be extracted from a wide range of existing checkers, including the Boolean-graph algorithm [And92], linear alternation-free algorithms [CleSte91], the on-the-fly algorithms LAFP [LRS98] and SLP [TanCle02b] for the full μ-calculus, and automaton-based model checkers ([BhaCle96a] and [KVW00]);
  • efficient: the overhead doesn't exceed the original complexities of these checkers;
  • simple: it only requires recording the dependency relations.

  18. Application I: Certifying Model-Checking Results
  • Checking (a) and (b) can be done in linear time.
  • Checking (c) can be reduced to the even-loop problem (an O(n log n) problem [KKV01]).
  • Model checking itself is an NP ∩ co-NP problem [EmeJutSis93].
  • Hence the cost of certifying a result < the cost of model checking.
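A simplified sketch of the loop check in condition (c), assuming the support set is given as a dict Ξ from each variable to the variables it depends on, plus each variable's depth (its index in E, 0 = outermost/shallowest) and fixpoint sign. It uses a direct reachability test per μ-variable (quadratic) rather than the O(n log n) even-loop algorithm cited on the slide; all names are illustrative:

```python
def has_bad_loop(support, depth, sign):
    """Return True iff the dependency graph induced by `support` contains a
    cycle whose shallowest (smallest-depth) variable is a mu-variable.
    support: dict var -> iterable of vars it depends on (dom assumed closed)
    depth:   dict var -> position of its equation in E (0 = outermost)
    sign:    dict var -> 'mu' or 'nu'
    Dual check (for a claimed r = 0): swap 'mu' and 'nu' to look for good loops."""
    for x in support:
        if sign[x] != 'mu':
            continue
        # restrict the graph to variables at least as deep as x, so any cycle
        # through x found here has x as its shallowest variable
        allowed = {v for v in support if depth[v] >= depth[x]}
        frontier = [y for y in support.get(x, ()) if y in allowed]
        seen = set(frontier)
        while frontier:
            v = frontier.pop()
            if v == x:
                return True          # x lies on such a cycle -> bad loop
            for w in support.get(v, ()):
                if w in allowed and w not in seen:
                    seen.add(w)
                    frontier.append(w)
    return False

# Toy usage: Xi for a claimed result r = 1 must have no bad loop.
Xi    = {'X1': {'X2'}, 'X2': {'X1'}}
depth = {'X1': 0, 'X2': 1}
sign  = {'X1': 'nu', 'X2': 'mu'}
print(has_bad_loop(Xi, depth, sign))   # False: the loop's shallowest variable X1 is nu
```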

  19. Application II: The Model-Checking Game
  • Semantics: decide ⟦E⟧(X₀) for E.
  • Two players: I (asserting ⟦E⟧(X₀) = 0) and II (asserting ⟦E⟧(X₀) = 1).
  • A play is a sequence α = X_{p0} X_{p1} … such that X_{p0} = X₀ and:
    • if (σ_{pi} X_{pi} = ∨𝒳') ∈ E, then II chooses X_{p(i+1)} ∈ 𝒳';
    • if (σ_{pi} X_{pi} = ∧𝒳') ∈ E, then I chooses X_{p(i+1)} ∈ 𝒳'.
  • II wins α iff:
    • it is I's turn but I has no choice (𝒳' = ∅), or
    • the shallowest variable visited infinitely often by α is a ν-variable.

  20. The MC Game as a Diagnostic Routine
  • The MC game is a fair game:
    • ⟦E⟧(X₀) = 1 ⇒ II has a winning strategy;
    • ⟦E⟧(X₀) = 0 ⇒ I has a winning strategy.
  • Two physical players: the computer and the user. When the model-checking result is:
    • Yes ⇒ the computer plays as II while the user plays as I;
    • No ⇒ the computer plays as I while the user plays as II.
  • The user always loses if the MC result is correct and the computer uses the right strategy.

  21. Constructing the Winning Strategy for the Computer
  • Given ⟨r, X₀, Ξ⟩ as a support set for E, the computer keeps the play α = X_{p0} X_{p1} … inside the support set (see the sketch after this slide):
    • if r = 1 and σ_{pi} X_{pi} = ∨𝒳', then the computer (as II) chooses X_{p(i+1)} ∈ (Ξ(X_{pi}) ∩ 𝒳');
    • if r = 0 and σ_{pi} X_{pi} = ∧𝒳', then the computer (as I) chooses X_{p(i+1)} ∈ (Ξ(X_{pi}) ∩ 𝒳').
  • The strategy is feasible: Ξ(X_{pi}) is defined whenever it is the computer's turn.
  • The strategy is a winning strategy for the computer.
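A hedged sketch of such a play, with the computer following the support-set strategy and a deterministic (memoryless) opponent; the encoding of E and Ξ matches the earlier sketches, and everything else (function names, the default opponent, the toy system) is my own illustration:

```python
def play(eqs, r, x0, xi, opponent=lambda choices: sorted(choices)[0]):
    """Play the model-checking game on a closed BES, the computer using the
    support-set strategy.  eqs: list of (sign, var, op, rhs);
    (r, x0, xi): claimed value, start variable, and Xi as dict var -> set of vars
    (assumed to be a valid support set).  The opponent is any memoryless choice
    function.  Returns the winner, 'I' or 'II'."""
    info  = {var: (sign, op, rhs) for (sign, var, op, rhs) in eqs}
    depth = {var: i for i, (_, var, _, _) in enumerate(eqs)}
    trail, current = [], x0
    while current not in trail:                    # memoryless players => periodic play
        trail.append(current)
        sign, op, rhs = info[current]
        mover = 'II' if op == 'or' else 'I'        # II picks at \/, I picks at /\
        if not rhs:                                # the mover is stuck and loses
            return 'II' if mover == 'I' else 'I'
        computers_turn = (r == 1 and mover == 'II') or (r == 0 and mover == 'I')
        if computers_turn:
            # the slide's strategy: any element of Xi(current) /\ rhs will do;
            # we deterministically pick the shallowest one
            current = min(set(xi[current]) & set(rhs), key=depth.get)
        else:
            current = opponent(rhs)
    # periodic play: the winner is decided by the shallowest variable on the loop
    loop = trail[trail.index(current):]
    shallowest = min(loop, key=depth.get)
    return 'II' if info[shallowest][0] == 'nu' else 'I'

# Toy run on  E: nu X1 = X1 /\ X2 ;  nu X2 = X2,  where [E](X1) = 1.
E  = [('nu', 'X1', 'and', ['X1', 'X2']), ('nu', 'X2', 'or', ['X2'])]
Xi = {'X1': {'X1', 'X2'}, 'X2': {'X2'}}
print(play(E, 1, 'X1', Xi))   # 'II' -- the computer, playing as II, wins
```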

  22. Evaluating Equation System: an Example

  23. Application III: Evaluating the Quality of MC
  • A positive result may hide problems:
    • T may satisfy AG(a ⇒ AF b) trivially because a never occurs in T.
  • Is the status of some state (coverage [CKV01]) or subformula (vacuity [KV99]) irrelevant to the result?
  • The coverage problem for support sets: has the support set covered all the states and subformulas?
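A small sketch of the coverage check, assuming BES variables are ⟨state, subformula⟩ pairs encoded as Python tuples and Ξ is a dict as in the earlier sketches; the function name and the toy data are mine:

```python
def coverage_report(xi, all_states, all_subformulas):
    """Which states and subformulas does the support set Xi actually touch?
    xi: dict mapping (state, subformula) pairs to sets of such pairs."""
    touched = set(xi)                      # supported variables
    for deps in xi.values():
        touched |= set(deps)               # plus everything they depend on
    covered_states   = {s for (s, _) in touched}
    covered_formulas = {f for (_, f) in touched}
    return {
        'uncovered_states':   all_states - covered_states,
        'uncovered_formulas': all_subformulas - covered_formulas,
    }

# If 'a' never occurs in T, a proof of AG(a => AF b) never needs to look at
# 'AF b' -- a hint that the property holds vacuously.
Xi = {('s0', 'AG(a => AF b)'): {('s1', 'AG(a => AF b)'), ('s0', 'a => AF b')},
      ('s1', 'AG(a => AF b)'): {('s0', 'AG(a => AF b)'), ('s1', 'a => AF b')},
      ('s0', 'a => AF b'): set(), ('s1', 'a => AF b'): set()}
print(coverage_report(Xi, {'s0', 's1', 's2'},
                      {'AG(a => AF b)', 'a => AF b', 'AF b', 'a', 'b'}))
# -> s2 is never visited, and 'AF b', 'a', 'b' are never examined by the proof
```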

  24. A Prototype on CWB-NC

  25. Conclusion
  Checkers produce abstract proof structures as evidence:
  • APS is independent of the checker.
  • Extracting APS doesn't affect the complexities of checkers.
  • APS justifies the correctness of the result.
  • APS attests to the quality of verification.
  • A wide range of diagnostic information can be built on this evidence.
  • APSs are defined for model checking, equivalence checking, and preorder checking.
