
Presentation Transcript


1. Word-Level Modeling and Verification of Systems Using Selective Term-Level Abstraction
Randal E. Bryant, Carnegie Mellon University
Sanjit A. Seshia, U.C. Berkeley
SRC '07

2. Task ID: 1355.001
• Technical Thrust: Verification
• Task Leader: Randal E. Bryant
• PIs: R. E. Bryant (CMU), S. A. Seshia (UC Berkeley)
• Student: Bryan Brady (UC Berkeley, exp. grad. 8/2010)
• Industrial Liaisons: Steven M. German (IBM), Zurab Khasidashvili (Intel), Andreas Kuehlmann (Cadence), Hillel Miller (Freescale), Carl-Johan Seger (Intel), M. Alper Sen (Freescale), Eli Singerman (Intel), Jin Yang (Intel), Hai Vo-Ba (AMD)
• ITRS Grand Challenge: 2003.12 -- Scaling of Maximum Quality Design Implementation Productivity

3. Modeling Data in Formal Verification
• Term level: symbolic or integer data; uninterpreted functions & predicates
• Bit-vector level: fixed-width words of bits with specific encodings; standard arithmetic and logical operators
• Bit level: individual bits; Boolean operations
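For concreteness (an illustrative example, not on the slide), a single datapath step such as s = a + b on 32-bit words would be written at each level roughly as:

  Term level:        s = f(a, b), with f an uninterpreted function symbol
  Bit-vector level:  s = a + b, addition modulo 2^32 on 32-bit words
  Bit level:         each output bit s_i is a Boolean function of the input bits a_0 … a_31, b_0 … b_31 and carry bits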

4. Term-Level Modeling
[Diagram: a word of bits x0, x1, x2, …, xn-1 is collapsed into a single term x; an ALU block is replaced by an uninterpreted function f]
• View data as symbolic "terms"
  • Arbitrary integer values
  • Can store in memories & registers
• Abstract functional units as "black boxes"
  • Uninterpreted functions
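The only property the verifier retains about such a black box is functional consistency, the standard fact about uninterpreted functions: equal inputs yield equal outputs,

  x1 = x2 ∧ y1 = y2  ⟹  f(x1, y1) = f(x2, y2),

which is often all a pipeline-versus-ISA correspondence proof needs from an ALU.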

5. Formal Verification Tools
• Term-level verifiers (e.g., UCLID): able to scale to much more complex systems
• Bit-level tools (model checkers, equivalence checkers, …): capacity limited by too many state bits & details of bit manipulations

6. Creating Models
• Term level: UCLID HDL (nonstandard; difficult to reconcile with actual design)
• Bit-vector level: register-transfer level, e.g., Verilog
• Bit level: gate level, obtained from RTL by bit blasting

7. UCLID Capabilities
• Term-level models
• Bit-vector models
[Diagram: UCLID spans the term and bit-vector levels; the project directions add semiautomatic abstraction between them]

8. Project Directions
• Bit-vector decision procedures
  • Enables UCLID to model at the bit-vector level
  • Direct path from RTL to verifier
  • Of interest to the larger community: hardware, software, and microcode verification; hardware & software testing
• Term-level abstraction
  • Semiautomatic ways to generate a term-level model from RTL
• Combined effect
  • Verify using mixed term and bit-vector models
  • Range of trade-offs between modeling detail and verifier capacity

9. Bit-Vector Decision Procedure Example
• Do these functions produce identical results?

int abs(int x) {
    int mask = x >> 31;
    return (x ^ mask) + ~mask + 1;
}

int test_abs(int x) {
    return (x < 0) ? -x : x;
}

• Strategy
  • Represent and reason about bit-level program behavior
  • Specific to machine word size, integer representations, and operations
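Why the answer is yes (a quick argument, not spelled out on the slide): with 32-bit two's-complement arithmetic, the arithmetic shift gives mask = 0 when x ≥ 0 and mask = −1 (all ones) when x < 0, so

  (x ^ mask) + ~mask + 1 = x + (−1) + 1 = x     when mask = 0
  (x ^ mask) + ~mask + 1 = ~x + 0 + 1 = −x      when mask = −1

which matches test_abs on every input (both wrap the same way at INT_MIN). A bit-vector decision procedure establishes this for all 2^32 inputs at once.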

10. BV Decision Procedures: Some History
• B.C. (Before Chaff)
  • String operations (concatenate, field extraction)
  • Linear arithmetic with bounds checking
  • Modular arithmetic
• SAT-based "bit blasting"
  • Generate a Boolean circuit based on the bit-level behavior of the operations
  • Convert to conjunctive normal form (CNF) and check with the best available SAT checker
  • Handles arbitrary operations; effective in many applications
  • CBMC [Clarke, Kroening, Lerda, TACAS '04]
  • Microsoft Cogent + SLAM [Cook, Kroening, Sharygina, CAV '05]
  • CVC-Lite [Dill, Barrett, Ganesh], Yices [de Moura et al.], STP
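As a sketch of what bit blasting produces (a generic illustration, not from the slide): a 32-bit equation s = a + b becomes a ripple-carry circuit with one Boolean variable per bit,

  s_i = a_i ⊕ b_i ⊕ c_i,   c_{i+1} = (a_i ∧ b_i) ∨ (c_i ∧ (a_i ⊕ b_i)),   c_0 = 0,

and each gate is then translated to CNF clauses (Tseitin encoding); for example, a gate t = a_i ∧ b_i contributes the clauses (¬t ∨ a_i), (¬t ∨ b_i), (t ∨ ¬a_i ∨ ¬b_i).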

11. Challenge for BV-DPs
• Is there a better way than bit blasting?
• Requirements
  • Provide the same functionality as bit blasting
  • Find abstractions based on word-level structure
  • Improve on the performance of bit blasting
• A new approach [Bryant, Kroening, Ouaknine, Seshia, Strichman, Brady, TACAS '07]
  • Use bit blasting as the core technique
  • Apply it to simplified versions of the formula
  • Successive approximations until the formula is solved or shown unsatisfiable

12. Approximating Formula
• Original formula φ
• Overapproximation φ+: more solutions; if φ+ is unsatisfiable, then so is φ
• Underapproximation φ−: fewer solutions; a satisfying solution of φ− also satisfies φ
• Example approximation techniques
  • Underapproximating: restrict word-level variables to smaller ranges of values
  • Overapproximating: replace a subformula with a Boolean variable
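In logical terms (my paraphrase of the slide's picture): the two approximations bracket the original formula,

  φ− ⟹ φ ⟹ φ+,

so any satisfying assignment of φ− is a genuine solution of φ, and an UNSAT result for φ+ proves φ itself unsatisfiable.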

13. Starting Iterations
• Initial underapproximation φ1−
  • (Greatly) restrict the ranges of word-level variables
  • Intuition: a satisfiable formula often has a small-domain solution

14. First Half of Iteration
• SAT result for φ1−
  • Satisfiable: have found a solution for φ; done
  • Unsatisfiable: use the UNSAT proof to generate overapproximation φ1+ (replace irrelevant predicates with Boolean variables)

15. Second Half of Iteration
• SAT result for φ1+
  • Unsatisfiable: have shown φ unsatisfiable; done
  • Satisfiable: the solution indicates variable ranges that must be expanded; generate refined underapproximation φ2−

16. Iterative Behavior
• Underapproximations φ1−, φ2−, …, φk−
  • Successively more precise abstractions of φ
  • Allow wider variable ranges
• Overapproximations φ1+, φ2+, …, φk+
  • No predictable relation; the UNSAT proof is not unique

17. Overall Effect
• Soundness
  • Only terminate with a solution (SAT) on an underapproximation
  • Only terminate as UNSAT on an overapproximation
• Completeness
  • Successive underapproximations approach φ
  • Finite variable ranges guarantee termination
  • In the worst case, φk− = φ
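The loop on slides 13-17 can be sketched in a few lines of C. This is a minimal, self-contained illustration only: the names (abs_bits, abs_spec, sat_underapprox), the widening schedule, and the brute-force "solver" are stand-ins of mine, the query is the abs example from slide 9, and the overapproximation/UNSAT-proof step of the real procedure is not implemented here.

#include <stdio.h>
#include <stdint.h>

/* The two functions from slide 9; assumes two's complement and arithmetic >>. */
static int32_t abs_bits(int32_t x) { int32_t m = x >> 31; return (x ^ m) + ~m + 1; }
static int32_t abs_spec(int32_t x) { return (x < 0) ? -x : x; }

/* Underapproximation: restrict x to w-bit sign-extended values and ask whether
 * the formula "abs_bits(x) != abs_spec(x)" is satisfiable on that range.
 * A real implementation would bit-blast the restricted formula and call a
 * SAT solver; here we simply enumerate the restricted domain. */
static int sat_underapprox(int w, int32_t *cex) {
    for (int64_t v = -(1LL << (w - 1)); v < (1LL << (w - 1)); v++) {
        int32_t x = (int32_t)v;
        if (abs_bits(x) != abs_spec(x)) { *cex = x; return 1; }
    }
    return 0;
}

int main(void) {
    int32_t cex;
    for (int w = 4; w <= 16; w *= 2) {   /* successively wider underapproximations */
        if (sat_underapprox(w, &cex)) {  /* SAT: genuine counterexample to equivalence */
            printf("counterexample at width %d: x = %d\n", w, (int)cex);
            return 0;
        }
        /* UNSAT: the real procedure would now build an overapproximation from the
         * UNSAT proof; this sketch just widens the variable range instead. */
        printf("width %d: restricted formula UNSAT, widening\n", w);
    }
    printf("no counterexample up to width 16; a full proof needs the overapproximation step\n");
    return 0;
}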

18. Results: UCLID BV vs. Bit Blasting
• UCLID always better than bit blasting
• Generally better than other available procedures
• SAT time is the dominating factor
[Results measured on a 2.8 GHz Xeon with 2 GB RAM]

19. Future Work in BV DPs
• Lots of refinement & tuning
  • Selecting under- and over-approximations
  • Iterating within an under- or over-approximation
  • Reusing portions of bit-blasted formulas
  • Taking advantage of incremental SAT
• Additional abstractions
  • View term-level modeling as an overapproximation technique
  • Apply functional abstraction automatically

20. Bit-Vector Level / Term Level Experimental Comparison
• What is the performance advantage of term-level modeling?
• Experiment: multiple microprocessor designs, each at varying levels of detail, ranging from complete bit-vector modeling to complete term-level modeling

21. Experiment: Y86 Processors
• Y86: 5-stage pipeline, single-threaded, in-order execution, simplified x86
• R. E. Bryant and D. R. O'Hallaron, Computer Systems: A Programmer's Perspective, Prentice Hall, 2002

22. Y86 Experiments
• Processor variations
  • Handle data hazards with different stalling and forwarding schemes
  • Different branch prediction schemes
  • Creates a variety of flushing schedules & modeling details
• Models
  • Everything term level
  • Bit-vector data, uninterpreted functions
  • Bit-vector data, partially interpreted functions (illustrated below)
  • Bit-vector data, "fully" interpreted functions (still representing memory and register file as mutable functions)
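As an illustration of "partially interpreted" (my example; the slide does not define it further): only the operations the property actually exercises keep their RTL semantics, and everything else stays uninterpreted, e.g.

  alu(op, a, b) = a ⊕ b          if op = XOR
  alu(op, a, b) = f(op, a, b)    otherwise, with f uninterpreted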

  23. Y86 Relative Verification Times

24. Observations
• A detailed bit-vector model comes at a high cost
  • Biggest problem was modeling the ALU XOR operation
  • Would get much worse for a more complex microprocessor (e.g., if all details of instruction decoding were modeled)
• Using abstraction in the "right" places can greatly reduce verification time

25. Semiautomatic Abstraction
• Generate a mixed bit-vector / term model from Verilog
  • User annotates Verilog with type qualifiers
    • Variables: term, bit vector
    • Operations: uninterpreted, interpreted
  • Verifier generates a hybrid model using type inference
• Working assumption: designers have good intuition about where abstraction can be applied
• Over time, will try to automate as much as possible
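For flavor, one plausible shape of such type-inference rules (purely illustrative; these are not the project's actual rules): an operator applied to term-qualified operands is abstracted to an uninterpreted function, while the same operator on bit-vector-qualified operands keeps its RTL meaning,

  a : term,   b : term     ⟹  a + b is modeled as f_add(a, b) : term
  a : bv[n],  b : bv[n]    ⟹  a + b is modeled as n-bit modular addition : bv[n]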

26. Progress on Abstraction
• Requirements
  • Type qualifiers: syntax, usage
  • Type-inference rules
  • Type-inference algorithms
• General principles formulated

27. Conclusions
• Hybrid bit-vector / term modeling capability
  • Can use as much or as little abstraction as required
  • Clear path from RTL to verification
• Bit-vector decision procedures
  • Iterative approach + SAT solvers provide a powerful framework
  • Multiple possible abstraction techniques
  • Opportunity for parallel processing
• Term-level abstractions
  • User provides minimal hints on where abstractions should be applied
