
Evidence-Based Model Checking


Presentation Transcript


  1. Evidence-Based Model Checking. Li Tan, Rance Cleaveland. Presented by Arnab Ray, Computer Science Department, Stony Brook. July 2002.

  2. Outline • Motivations. • Checker-independent evidence for model checking. • Post-model-checking analyses based on the evidence: • Efficiently certifying model-checking results. • Generating diagnostic information. • Evaluating the quality of the model-checking process. • A prototype on the Concurrency Workbench (CWB-NC).

  3. Model Checking • Model checking: deciding whether or not a transition system satisfies a temporal property. • A model checker works as a decision procedure for this problem. • A bare "Yes/No" answer may not satisfy users: • Why does my design go wrong? • Could my design satisfy the property trivially? • Can I trust the verification result?

  4. Problems with Traditional Diagnostic Generation • Diagnosis is about understanding the result. A diagnostic routine may either perform its own reasoning or reuse the proof already computed by a checker. • The diagnostic routine is tightly geared to the structure of the checker: • Implementation requires an understanding of the checker's internals. • Migrating a diagnostic routine to another checker often requires major changes to both the routine and the checker. • A proof used for one diagnostic schema may not be usable for a different schema. • There is no additional checking of the model-checking result.

  5. Evidence-Based Model Checking: let the result carry its own proof. [Diagram: checkers 1..n emit a portable proof of correctness; a verifier rejects invalid proofs; diagnostic schemas 1..m consume the proof.]

  6. The General Framework • Defining an abstract proof structure (APS) as checker-independent evidence: • An APS encodes the proof structures of different checkers in a standard form. • An APS carries the evidence needed to justify the result. • Extracting APSs from existing checkers. • Utilizing APSs to perform diagnoses: • Certifying the verification result. • Generating diagnostic information. • Evaluating the quality of the verification process.

  7. Searching for APS • An APS should be extractable from existing checkers: • The extraction should not affect the complexities of the checkers. • The consistency of an APS should be verifiable efficiently: • The complexity of certifying an APS should not exceed the complexity of the checker producing it. • An APS should be abstract enough to save space, yet rich enough to support a variety of diagnoses.

  8. Introducing APS by a Case Study

  9. Boolean Equation System = System + Temporal Property (E = F + T)

  10. Boolean Equation System = System + Temporal Property (E = F + T)

  11. Equation System: Semantics • [E]: H^X → H^X is a function on environments.

  12. Evidence-Based Verification

  13. Boolean (Fixpoint) Equation System • Syntax: • H = ⟨{0, 1}, ≤⟩ is the Boolean lattice. • An environment θ ∈ 2^X can be viewed as a set of variables. • E is closed if every variable X appearing on a right-hand side also appears as a left-hand-side variable. • For a closed E, [E](θ1) = [E](θ2) for any θ1, θ2 ∈ H^X, so we write [E] for [E](θ). • [E](X) assigns X a Boolean value.
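The syntax above can be made concrete with a small executable sketch. The following is a minimal illustration (mine, not code from the paper) of a closed BES in which every equation carries the same fixpoint sign, solved by Kleene iteration; the Equation class and solve_single_block are assumed names, and nested alternation of μ and ν is deliberately not handled.

```python
# Minimal sketch of a single-sign Boolean equation system (assumed names;
# not the paper's code).  Each equation is X = X1 v ... v Xn ("or") or
# X = X1 ^ ... ^ Xn ("and"); all equations share one fixpoint sign.
from dataclasses import dataclass
from typing import List

@dataclass
class Equation:
    var: str        # left-hand-side variable X
    op: str         # 'or' or 'and'
    rhs: List[str]  # right-hand-side variables X'

def solve_single_block(equations, sign):
    """Solve a closed, single-sign BES by Kleene iteration.

    sign == 'mu': start from all 0 (bottom) and iterate upward.
    sign == 'nu': start from all 1 (top) and iterate downward.
    """
    val = {eq.var: (0 if sign == 'mu' else 1) for eq in equations}
    changed = True
    while changed:
        changed = False
        for eq in equations:
            # Empty disjunction is 0, empty conjunction is 1.
            if eq.op == 'or':
                new = max((val[x] for x in eq.rhs), default=0)
            else:
                new = min((val[x] for x in eq.rhs), default=1)
            if new != val[eq.var]:
                val[eq.var], changed = new, True
    return val

# Example: nu X = X ^ Y, nu Y = X v Y  ==>  greatest fixpoint X = Y = 1.
E = [Equation('X', 'and', ['X', 'Y']), Equation('Y', 'or', ['X', 'Y'])]
print(solve_single_block(E, 'nu'))   # {'X': 1, 'Y': 1}
```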

  14. Model Checking via BES • BES E = Kripke structure T + property F. • E is closed. • A variable X in the BES stands for a pair ⟨s, φ'⟩ of a state and a subformula. • [E](X) = 1 iff s ⊨_T φ'. • Many checkers (implicitly) construct BESs: • For a μ-calculus checker, BES = T + μ-calculus formula. • For an automaton-based checker, the BES corresponds to a parity automaton. • E can be constructed on the fly.
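To illustrate the pairing of states with subformulas, here is a hedged sketch (my own construction, not the paper's) that reduces "does each state satisfy AG p" to a closed, all-ν BES over a tiny hard-coded Kripke structure; the dictionary encoding, the state names s0–s2, and the tuple format are assumptions.

```python
# Hedged sketch: build the BES for AG p over a toy Kripke structure
# (state names and encoding are assumptions, not from the paper).
kripke = {                      # state -> (does p hold here?, successors)
    's0': (True,  ['s1']),
    's1': (True,  ['s2', 's0']),
    's2': (False, ['s2']),
}

equations = []                  # each entry: (sign, var, op, rhs)
for s, (p_holds, succs) in kripke.items():
    if p_holds:
        # nu X_s = X_t1 ^ X_t2 ^ ... : AG p must keep holding on every edge.
        equations.append(('nu', f'X_{s}', 'and', [f'X_{t}' for t in succs]))
    else:
        # p fails at s, so X_s is false (the empty disjunction).
        equations.append(('nu', f'X_{s}', 'or', []))

for eq in equations:
    print(eq)
# Here [E](X_s0) = 0, i.e. s0 does not satisfy AG p, because s2 is
# reachable from s0 and violates p.
```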

  15. Evaluating Equation System: an Example

  16. Support Set

  17. Support Set (continued) • By (a) and (b), a support set implies a fixpoint solution for E. • By (c), a support set respects the definition of least and greatest fixpoints: • If r = 1, there is no bad loop in the support set's dependency graph. • If r = 0, there is no good loop in the support set's dependency graph. • Theorem 1 [TanCle02]: Let G = ⟨r, X, χ⟩ be a support set for E; then [E](X) = r.

  18. Extracting Support Sets The extraction is: • practical. Support sets can be extracted from a wide range of existing checkers: the Boolean-graph algorithm [And92], linear alternation-free algorithms [CleSte91], the on-the-fly algorithms LAFP [LRS98] and SLP [TanCle02b] for the full μ-calculus, and automaton-based model checkers ([BhaCle96a] and [KVW00]). • efficient. The overhead does not exceed the original complexities of these checkers. • simple. It only requires recording dependency relations.
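The sketch below is one hedged reading of that idea (not the authors' implementation): after a BES has been solved, walk from the target variable and record, for each variable, the right-hand-side variables that justify its value, yielding ⟨r, X0, χ⟩. The function extract_support_set and the dictionary encoding are assumed names.

```python
# Hedged sketch of recording dependency relations once a BES is solved
# (assumed names; not the paper's code).  `equations` maps var -> (op, rhs),
# `solution` maps var -> 0/1, and the result is a support set <r, x0, chi>.
def extract_support_set(equations, solution, x0):
    r = solution[x0]
    chi, todo = {}, [x0]
    while todo:
        y = todo.pop()
        if y in chi:
            continue
        op, rhs = equations[y]
        if (r == 1 and op == 'or') or (r == 0 and op == 'and'):
            # A single witness suffices: a true disjunct (r = 1) or a false
            # conjunct (r = 0).  NOTE: in general the witness must also be
            # chosen consistently with the fixpoint iteration order, or the
            # loop condition (c) of support sets can be violated; this
            # sketch glosses over that subtlety.
            chi[y] = [next(z for z in rhs if solution[z] == r)]
        else:
            # All right-hand-side variables are needed as evidence.
            chi[y] = list(rhs)
        todo.extend(chi[y])
    return r, x0, chi

# Example, matching the AG p sketch above (hypothetical data):
eqs = {'X_s0': ('and', ['X_s1']),
       'X_s1': ('and', ['X_s2', 'X_s0']),
       'X_s2': ('or', [])}
sol = {'X_s0': 0, 'X_s1': 0, 'X_s2': 0}
print(extract_support_set(eqs, sol, 'X_s0'))
# (0, 'X_s0', {'X_s0': ['X_s1'], 'X_s1': ['X_s2'], 'X_s2': []})
```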

  19. Application I: Certifying Model-Checking Results • Checking (a) and (b) can be done in linear time. • Checking (c) can be reduced to the even-loop problem (an O(n log n) problem [KKV01]). • Model checking is an NP ∩ co-NP problem [EmeJutSis93]. • The cost of certifying results < the cost of model checking.
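As a companion to the support-set sketches above, here is a hedged certifier (my reading of conditions (a)–(c), not the authors' algorithm): it checks that the claimed variable is justified, that χ is locally consistent with each equation, and that no loop in χ has an offending fixpoint sign as its shallowest variable. Equation order stands in for "shallowness", and the loop check is a naive quadratic search rather than the O(n log n) even-loop reduction cited above.

```python
# Hedged certifier sketch for a support set <r, x0, chi> (my reading of
# conditions (a)-(c); not the authors' code).  `bes` is a list of
# (sign, var, op, rhs) tuples; list order plays the role of "shallowness"
# (earlier = shallower).
def certify(bes, r, x0, chi):
    eq = {v: (sign, op, rhs) for sign, v, op, rhs in bes}
    depth = {v: i for i, (_, v, _, _) in enumerate(bes)}

    # (a) the claimed variable is justified; (b) chi is locally consistent.
    if x0 not in chi:
        return False
    for y, support in chi.items():
        sign, op, rhs = eq[y]
        if any(z not in chi for z in support):
            return False                 # evidence must be self-contained
        picks_one = (op == 'or') if r == 1 else (op == 'and')
        if picks_one:
            if len(support) != 1 or support[0] not in rhs:
                return False             # exactly one witness from the rhs
        elif set(support) != set(rhs):
            return False                 # the whole rhs is needed

    # (c) no bad loop: for r = 1 no mu-loop, for r = 0 no nu-loop, where a
    # bad loop is a cycle in chi whose shallowest variable has that sign.
    bad_sign = 'mu' if r == 1 else 'nu'
    for y in chi:
        if eq[y][0] != bad_sign:
            continue
        allowed = {v for v in chi if depth[v] >= depth[y]}
        stack, seen = list(chi[y]), set()
        while stack:                     # can we get back to y without
            z = stack.pop()              # passing a shallower variable?
            if z == y:
                return False             # bad loop found
            if z in seen or z not in allowed:
                continue
            seen.add(z)
            stack.extend(chi[z])
    return True

# Example: certify the support set extracted in the previous sketch.
bes = [('nu', 'X_s0', 'and', ['X_s1']),
       ('nu', 'X_s1', 'and', ['X_s2', 'X_s0']),
       ('nu', 'X_s2', 'or', [])]
chi = {'X_s0': ['X_s1'], 'X_s1': ['X_s2'], 'X_s2': []}
print(certify(bes, 0, 'X_s0', chi))      # True
```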

  20. Application II: Model-Checking Game • Semantics: decide [E](X_0) for E. • Two players: I (asserting [E](X_0) = 0) and II (asserting [E](X_0) = 1). • A play is a sequence α = X_{p0} X_{p1} … such that X_{p0} = X_0 and: • if (σ_{pi} X_{pi} = ∨X') ∈ E, then II chooses X_{pi+1} ∈ X'; • if (σ_{pi} X_{pi} = ∧X') ∈ E, then I chooses X_{pi+1} ∈ X'. • II wins α iff: • it is I's turn but I has no choice (X' = ∅), or • the shallowest variable visited infinitely often by α is a ν-variable.
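A hedged simulation of the game loop follows (my simplification, not the paper's): disjunctions are resolved by player II, conjunctions by player I, and a play that revisits a variable is decided by the sign of the shallowest variable on the loop that formed, which approximates the "visited infinitely often" condition. The function names and the tuple encoding of E are assumptions.

```python
# Hedged sketch of the model-checking game (assumed names and encoding;
# the repeated-variable cutoff is a simplification of the real winning
# condition on infinite plays).
def play(bes, x0, choose_I, choose_II):
    """bes: list of (sign, var, op, rhs); choose_*: fn(var, options) -> var."""
    eq = {v: (sign, op, rhs) for sign, v, op, rhs in bes}
    depth = {v: i for i, (_, v, _, _) in enumerate(bes)}
    trail, cur = [], x0
    while True:
        if cur in trail:                       # a loop has formed
            loop = trail[trail.index(cur):]
            shallowest = min(loop, key=depth.get)
            return 'II' if eq[shallowest][0] == 'nu' else 'I'
        trail.append(cur)
        sign, op, rhs = eq[cur]
        if not rhs:                            # the chooser is stuck
            return 'I' if op == 'or' else 'II'
        chooser = choose_II if op == 'or' else choose_I
        cur = chooser(cur, rhs)

# Example: nu X = X ^ Y, nu Y = X v Y; II should win from X since [E](X) = 1.
bes = [('nu', 'X', 'and', ['X', 'Y']), ('nu', 'Y', 'or', ['X', 'Y'])]
print(play(bes, 'X', choose_I=lambda v, opts: opts[0],
                     choose_II=lambda v, opts: opts[0]))   # 'II'
```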

  21. MC Game as a Diagnostic Routine • The MC game is a fair game: • [E](X_0) = 1 ⇒ II has a winning strategy. • [E](X_0) = 0 ⇒ I has a winning strategy. • Two physical players: computer and user. • When the model-checking result is: • Yes ⇒ the computer plays as II and the user as I. • No ⇒ the computer plays as I and the user as II. • The user always loses if the MC result is correct and the computer uses the right strategy.

  22. Constructing a Winning Strategy for the Computer • Given ⟨r, X_0, χ⟩ as a support set for E, the computer keeps the play α = X_{p0} X_{p1} … inside the support set: • If r = 1 and σ_{pi} X_{pi} = ∨X', then the computer (as II) chooses X_{pi+1} ∈ (χ(X_{pi}) ∩ X'). • If r = 0 and σ_{pi} X_{pi} = ∧X', then the computer (as I) chooses X_{pi+1} ∈ (χ(X_{pi}) ∩ X'). • The strategy is feasible: χ(X_{pi}) is defined whenever it is the computer's turn. • The strategy is a winning strategy for the computer.
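The strategy itself can be read straight off the support set. Below is a hedged sketch (assumed names strategy_from_support and chi, written to plug into the play function sketched earlier) in which the computer always moves to a variable in χ(current) ∩ X', keeping the play inside the support set.

```python
# Hedged sketch: the computer's winning strategy derived from a support set
# <r, x0, chi> (names assumed; not the authors' code).
def strategy_from_support(chi):
    def choose(current, options):
        # chi[current] is defined whenever it is the computer's turn,
        # because the play so far has stayed inside the support set.
        allowed = [v for v in options if v in chi[current]]
        return allowed[0]
    return choose

# If the result was "yes" (r = 1) the computer plays as II, e.g. (using the
# hypothetical play() sketched earlier):
#   play(bes, x0, choose_I=user_choice, choose_II=strategy_from_support(chi))
```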

  23. Evaluating Equation System: an Example

  24. Application III: Evaluating the Quality of MC • A positive result may hide a problem: • T may satisfy AG(a ⇒ AF b) trivially because a never occurs in T. • Is the status of some state (Minicoverage [CKV01]) or some subformula (vacuity [KV99]) irrelevant to the result? • The coverage problem for support sets: has the support set covered all the states and properties?
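A hedged coverage sketch follows (my own illustration, not the metrics of [CKV01] or [KV99]): it simply lists the BES variables, i.e. ⟨state, subformula⟩ pairs, that the support set never had to consult, which can flag vacuous or trivially satisfied properties. The function uncovered and the tuple encoding are assumed names.

```python
# Hedged coverage sketch (assumed names; not the paper's analysis): BES
# variables absent from chi never influenced the verdict, which may signal
# that the property was satisfied vacuously.
def uncovered(bes, chi):
    all_vars = {v for _, v, _, _ in bes}
    return sorted(all_vars - set(chi))

# Example with the AG p data above (hypothetical): an empty list means the
# support set consulted every <state, subformula> variable.
bes = [('nu', 'X_s0', 'and', ['X_s1']),
       ('nu', 'X_s1', 'and', ['X_s2', 'X_s0']),
       ('nu', 'X_s2', 'or', [])]
chi = {'X_s0': ['X_s1'], 'X_s1': ['X_s2'], 'X_s2': []}
print(uncovered(bes, chi))   # []
```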

  25. Future Work I: A Client-Server Model for Verification • Server: checkers. • There are many formulations for the input. • Support sets help standardize the output. • Client: user interface, diagnostic generation, and evidence verifier. [Diagram: designs (systems and properties) flow from the client to the checkers; abstract proof structures flow back.]

  26. Future Work II: Proof-Carrying Code • Mobile code [Nec97] carries its own proof attesting to its safety. • Currently, compilers are modified to produce the proof for a predefined set of safety rules. • Integrate support-set-ready model checkers with compilers. • Certifying compilers then enjoy the richness of temporal logics.

  27. A Prototype on CWB-NC

  28. Conclusion Checkers produce abstract proof structures as evidence. • An APS is independent of the checker. • Extracting an APS does not affect the complexities of checkers. • An APS justifies the correctness of the result. • An APS attests to the quality of the verification. • A wide range of diagnostic information can be built on this evidence. • APSs are defined for model checking, equivalence checking, and preorder checking.
