Formal Verification: Experiences and Future Prospects David L. Dill Computer Systems Laboratory Stanford University dill@cs.stanford.edu
De Millo, Lipton, and Perlis • Proofs of programs are too boring for the social process of mathematics to work. • Verification has high computational complexity. • Verification will lead to overconfidence. • Many programs are not even specifiable. POPL `99
Outline • The verification problem in hardware. • Formal verification methods for hardware • Lessons learned • Formal verification for software • Conclusions POPL `99
The verification problem in hardware POPL `99
Terminology "Verification”: general bug-finding techniques, usually simulation. "Formal Verification”: methods equivalent to 100% testing, often based on logic and/or automata. POPL `99
The problem Chip designs are both large and complex • > 2 Million gates. • Many different types of modules (caches, ALUs, FIFOs, state machines) • Complex control (interacting state machines, stall conditions, exceptions). • Size is growing exponentially. POPL `99
Verification problems Verification teams growing much faster than design teams • 30% - 70% are verification engineers, not designers. Late-stage bugs are a major problem • Expensive to fix (redo lots of design work) • Inefficient to find using current techniques • Stretch out time-to-market POPL `99
Current Design Practices Methodology varies greatly among companies. I’ll focus on high-performance ASICs (application-specific integrated circuits): • Large number of designs, with big problems • Automatic synthesis from HDL (hardware description languages) • Most verification work not done by designers. • Basic method is HDL simulation. POPL `99
Current Design Practices • Engineers write "reactive testbenches" in HDL. • Input generation • Manual (verification engineers think of test cases) • Pseudo-random • Mixed (some random parameters) • These methods cannot get enough “coverage” to find all the bugs. POPL `99
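For concreteness, a sketch of the pseudo-random style of input generation, written in Python rather than an HDL; the adder "design under test" and the reference model below are hypothetical stand-ins, not anything from a real testbench.

    # A minimal sketch of pseudo-random testing against a reference model.
    # dut() and ref() are hypothetical stand-ins for the design under test and
    # a golden model; real reactive testbenches are written in an HDL.

    import random

    def ref(a, b, carry_in):                       # golden 8-bit adder model
        total = a + b + carry_in
        return total & 0xFF, total >> 8            # (sum, carry_out)

    def dut(a, b, carry_in):                       # "design under test" stand-in
        return ref(a, b, carry_in)                 # pretend this is the RTL model

    random.seed(0)                                 # reproducible stimulus
    for _ in range(10_000):
        a, b, c = random.randrange(256), random.randrange(256), random.randrange(2)
        assert dut(a, b, c) == ref(a, b, c), (a, b, c)   # mismatch => a bug
    print("no mismatches found (which is not the same as no bugs)")

Running many random vectors like this is cheap, but as the slide says, it still cannot reach every corner case, which is the coverage gap formal methods aim at.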
Typical verification experience [Figure: bugs found per week plotted over project weeks, with phases labeled functional testing, purgatory, and tapeout] POPL `99
Advanced Verification One option for the future is to do the same thing faster. • Faster simulators • Hardware emulators (e.g. FPGAs) Many people doubt that these can keep up. The hope is that formal verification will provide revolutionary advances. POPL `99
Equivalence checking Checks for mismatches between two gate-level circuits, or between an HDL description and a gate-level netlist. "Formal", because it checks all input values (reduces to the satisfiability problem, SAT). Acceptance: Widely used ("It's a done deal.") Limitation: Doesn't catch design errors at the HDL level. (Analogy: like checking C vs. assembly language.) POPL `99
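An illustrative sketch of the idea in Python: form a "miter" that compares corresponding outputs of the two descriptions and show that no input can make them differ. The spec and gates functions below are hypothetical stand-ins; production tools hand the question to a SAT solver or BDDs rather than enumerating, but exhaustive enumeration makes the "checks all input values" point concrete.

    # A minimal sketch of combinational equivalence checking: XOR corresponding
    # outputs of the two circuits (a "miter") and show no input makes it 1.

    from itertools import product

    def spec(a, b, c):          # HDL-level intent (hypothetical example)
        return (a and b) or c

    def gates(a, b, c):         # gate-level netlist (hypothetical example)
        t = not (a and b)       # NAND
        return not (t and not c)

    def equivalent(f, g, n_inputs):
        for inputs in product([False, True], repeat=n_inputs):
            if f(*inputs) != g(*inputs):    # miter output is 1: a mismatch
                return False, inputs        # counterexample input vector
        return True, None

    print(equivalent(spec, gates, 3))       # (True, None): the netlist matches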
Protocol Verification Focus on hardware-related protocols. Hardware for multiprocessor interconnects and communications applications is protocol-intensive. Tools like Murphi and SPIN have been used on several products. POPL `99
Protocol Verification (cont’d) Example: Murphi caught a serious bug in UltraSparc I cache coherence. Acceptance: limited Limitation: No way to connect to lower-level HDL implementation POPL `99
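A minimal Python sketch of the explicit-state approach behind tools like Murphi: enumerate every reachable state of a small protocol model and check an invariant in each one. The two-process mutual-exclusion protocol below is a deliberately naive, hypothetical example, so the search finds a violating state.

    # Explicit-state checking of a toy protocol: BFS over reachable states,
    # checking an invariant in each. The protocol is intentionally broken.

    from collections import deque

    IDLE, TRYING, CRITICAL = 'idle', 'trying', 'critical'

    def successors(state):
        # state = (pc of process 0, pc of process 1)
        for i in (0, 1):
            pcs = list(state)
            if pcs[i] == IDLE:
                pcs[i] = TRYING
            elif pcs[i] == TRYING:       # naive rule: enter without checking peer
                pcs[i] = CRITICAL
            else:
                pcs[i] = IDLE
            yield tuple(pcs)

    def check(init, invariant):
        seen, queue = {init}, deque([init])
        while queue:
            s = queue.popleft()
            if not invariant(s):
                return s                 # counterexample state
            for t in successors(s):
                if t not in seen:
                    seen.add(t)
                    queue.append(t)
        return None

    mutex = lambda s: not (s[0] == CRITICAL and s[1] == CRITICAL)
    print(check((IDLE, IDLE), mutex))    # prints ('critical', 'critical')

Real tools also reconstruct the path to the bad state as an error trace, which is what makes the counterexample useful for debugging.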
Symbolic Model Checking States are encoded as bitvectors. Use BDDs (binary decision diagrams) to represent state spaces symbolically. [Figure: BDD for an example formula over x1, x2, x3, with terminal nodes 0 and 1] POPL `99
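A minimal Python sketch of reduced, ordered BDDs: nodes are hash-consed so equal subgraphs are shared, and a generic apply operation combines two BDDs under a fixed variable order x1 < x2 < x3. There is no memoization, so it only suits tiny examples; the formula built at the end, (x1 AND x2) OR x3, is simply an illustrative formula over x1, x2, x3 in the spirit of the figure.

    # A tiny ROBDD sketch: reduction rule + hash-consing + recursive apply.

    class BDD:
        def __init__(self):
            self.table = {}                      # (var, low, high) -> node id
            self.nodes = {0: None, 1: None}      # terminal nodes 0 and 1
            self.next_id = 2

        def mk(self, var, low, high):
            if low == high:                      # redundant test: skip the node
                return low
            key = (var, low, high)
            if key not in self.table:            # share structurally equal nodes
                self.table[key] = self.next_id
                self.nodes[self.next_id] = key
                self.next_id += 1
            return self.table[key]

        def var(self, v):
            return self.mk(v, 0, 1)

        def apply(self, op, u, v):
            """Combine two BDDs with a boolean operator, e.g. lambda a, b: a and b."""
            if u in (0, 1) and v in (0, 1):
                return int(op(bool(u), bool(v)))
            uvar = self.nodes[u][0] if u > 1 else float('inf')
            vvar = self.nodes[v][0] if v > 1 else float('inf')
            top = min(uvar, vvar)                # branch on the earliest variable
            u0, u1 = (self.nodes[u][1], self.nodes[u][2]) if uvar == top else (u, u)
            v0, v1 = (self.nodes[v][1], self.nodes[v][2]) if vvar == top else (v, v)
            return self.mk(top, self.apply(op, u0, v0), self.apply(op, u1, v1))

    b = BDD()
    x1, x2, x3 = b.var(1), b.var(2), b.var(3)
    AND = lambda a, c: a and c
    OR = lambda a, c: a or c
    f = b.apply(OR, b.apply(AND, x1, x2), x3)    # BDD for (x1 AND x2) OR x3
    print(f, b.nodes[f])                         # root node and its (var, low, high)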
Symbolic breadth-first search Each layer of breadth-first search is represented by a BDD POPL `99
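A sketch of the fixed-point loop behind symbolic breadth-first search. In real model checkers each layer is a BDD over the state bits; here plain Python sets of state tuples stand in for the BDDs, but the layer-by-layer loop is the same. The 3-bit counter transition relation is a hypothetical example.

    # Symbolic reachability as a fixed point: keep adding the image of the
    # frontier until no new states appear.

    def symbolic_reach(init, image):
        """init: set of states; image(S): set of successors of states in S."""
        reached = frozenset(init)
        frontier = reached                      # the current BFS layer
        while frontier:
            new = image(frontier) - reached     # states seen for the first time
            reached = reached | new
            frontier = new                      # next layer; empty => fixed point
        return reached

    # Example: a 3-bit counter that wraps around (hypothetical transition relation).
    def image(states):
        return frozenset(((s + 1) % 8,) for (s,) in states)

    print(sorted(symbolic_reach({(0,)}, image)))   # all 8 states are reachable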
Symbolic Model Checking Computes fixed points using BDDs. Examples: SMV (McMillan, CMU & Cadence), FormalCheck (Lucent/Cadence), RuleBase (IBM), Siemens Applied to • Manually abstracted state machines • HDL descriptions Specifications made easier by using canned templates (instead of CTL). POPL `99
Symbolic Model Checking Acceptance: • There have been major successes on some industrial projects. • Use on particular projects in huge companies (e.g. IBM, Intel) • Commercially supported products. • But <1% use overall. POPL `99
Symbolic Model Checking (cont’d) Limitations: • State explosion limits it to small submodules of hardware ... but the submodule interfaces are not specified • Changing the design may cause unpredictable blowup. POPL `99
Theorem Proving Acceptance has been minimal • But some very impressive industrial successes exist, e.g. the AMD K7 floating-point verification by Russinoff • Also, combined theorem proving and symbolic simulation at Intel (instruction decoder). Limitations: • Requires a lot of human interaction. • Good decision procedures & methodology would help. POPL `99
Lessons learned POPL `99
Lessons Using formal verification is an economic decision. Costs: • Requires expensive, skilled labor. • May delay time-to-market • Users must need formal verification Look where the bugs are: • Interacting state machines • Memory systems (uni/multi-processor) • Floating point POPL `99
Lessons Bug hunting is valuable • Easier: • doesn't require full verification • liberal abstractions work (e.g. downscaling) • may find error before looking at all states • Value is more evident • Designs believed to be bug-free by default. • Cost of bugs is approximately quantifiable (= value of verification) POPL `99
Lessons But bug-hunting is not everything. Proving absence of bugs • In a particular component • Of a particular type is a unique capability of formal verification "When are we done simulating?" POPL `99
Future of Hardware verification There are several new ideas on the horizon • Use of decidable theories • Presburger arithmetic • Cooperating decision procedures • “Semi-formal” methods: hybridize formal and informal techniques for better coverage of larger designs. POPL `99
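A hedged sketch of what "use of decidable theories" looks like in practice, using the Z3 SMT solver's Python bindings as a modern stand-in (Z3 is assumed to be installed; it is not one of the tools named above) to decide Presburger-style constraints, i.e. linear arithmetic over the integers.

    # Deciding linear integer (Presburger-style) constraints with an SMT solver.

    from z3 import Ints, Solver, sat

    x, y = Ints('x y')

    s = Solver()
    s.add(2 * x == 2 * y + 1)        # an even number cannot equal an odd number
    print(s.check())                 # unsat: no integer solution exists

    s = Solver()
    s.add(3 * x + 5 * y == 1, x >= 0)
    if s.check() == sat:             # decidable theory: the solver always answers
        print(s.model())             # e.g. x = 2, y = -1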
Design practices Current design practices impose barriers to pervasive use of formal verification: • Lack of high-level specifications • Suboptimal semantics of current HDLs • Poor interface specifications • Separate design and verification staff. … but sudden change cannot be forced POPL `99
Design practices will evolve [Figure: Design Practices, Verification Tools, and Verification Methodology influencing one another] POPL `99
Software POPL `99
Can we use successes in hardware as a springboard for successes in software? POPL `99
Hardware is not software • Hardware is easier • Resource costs bound complexity (chip area). • Less dynamic (no heap, dynamically created threads). • Payoff is higher (bugs are more costly) • Design-implement-debug loop is months, not minutes • Chip fabrication is expensive. • … but more design effort goes into software than hardware (lines of code, programmer hours, $$$) POPL `99
Near-term opportunities • Security (Cryptographic protocols) • Model checking (Lowe, Clarke, Mitchell, Wing) • Theorem proving (Paulson) • Very important (e.g. e-commerce) • Protocols are reasonably small • Distributed algorithms (Fault tolerance, Synchronization, Agreement) • People are willing to prove them manually. • … but they make mistakes • Computer assistance for case analysis, debugging POPL `99
Near-term opportunities • High-level specifications (Statecharts, UML, RSML, SCR, Z) • Smaller than implementations. • “Most bugs are specification errors” (?) • Bugs can be serious, conceptual problems. • Model checking (NRL, Atlee, UWash) • Satisfiability (Jackson) • “Semantic checking” (Tablewise, NRL) POPL `99
Near-term opportunities • Embedded software is (sometimes) more like hardware than software • Expensive/difficult to upgrade • Resource limited (memory, power, etc). • Sometimes safety-critical or mission-critical POPL `99
Program checking • Program analysis • Check for common run-time errors (e.g. ESC) • Type declaration paradigm for verification conditions • Blend of static analysis, formal verification • Software testing • Automatic testing (Verisoft) • Use of decision procedures for path coverage analysis, test synthesis POPL `99
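A hedged sketch of the ESC style of checking for run-time errors: encode a possible failure as a verification condition and ask a decision procedure whether it is satisfiable, i.e. whether the error can actually occur. Z3's Python bindings are used here purely for illustration (ESC itself used the Simplify prover), and the code fragment being checked is hypothetical.

    # ESC-style checking: can an array access go out of bounds under the
    # assumed precondition? Satisfiable means "yes, here is a witness".

    from z3 import Ints, Solver, And, sat

    i, n = Ints('i n')

    # Code under scrutiny (hypothetical):  assume 0 <= i < n;  access a[i + 1]
    vc_fails = And(0 <= i, i < n,            # the assumed precondition holds...
                   i + 1 >= n)               # ...and the access is out of bounds

    s = Solver()
    s.add(vc_fails)
    if s.check() == sat:
        print("possible out-of-bounds access:", s.model())   # e.g. i = n - 1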
Program checking • Abstraction + model checking • Reduce program size by slicing, data type abstraction, etc. • Use model checking or other analysis on reduced result • Especially useful for concurrency problems (?) POPL `99
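One way to make the data-type-abstraction bullet concrete: map a concrete integer onto the abstract signs NEG, ZERO, POS, so a model checker explores three abstract values instead of every concrete integer, at the cost of over-approximation. The sketch below is a generic textbook-style example in Python, not the scheme of any particular tool.

    # Data-type abstraction: abstract sign domain with an over-approximating add.

    NEG, ZERO, POS = 'neg', 'zero', 'pos'

    def abs_val(x):
        return NEG if x < 0 else ZERO if x == 0 else POS

    def abs_add(a, b):
        """Abstract addition: returns the set of possible abstract results."""
        if ZERO in (a, b):
            return {a if b == ZERO else b}   # adding zero changes nothing
        if a == b:
            return {a}                       # pos+pos is pos, neg+neg is neg
        return {NEG, ZERO, POS}              # pos+neg could be anything

    print(abs_add(abs_val(7), abs_val(-3)))  # {'neg', 'zero', 'pos'}

Because the abstraction only over-approximates, any property proved of the abstract program holds of the concrete one, while reported violations may be spurious and need checking, which matches the "liberal vs. conservative abstraction" distinction on the next slide.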
Technologies/Approaches • Decision procedures • Presburger • Nelson/Oppen • Set constraints • Abstraction mechanisms • liberal abstractions for bug hunting • conservative abstractions for proofs • Interaction with programming practices • What to check? • Modularity (avoid analyzing “whole programs”) POPL `99
De Millo, Lipton, and Perlis • Proofs of programs are too boring for the social process of mathematics to work. • Don’t rely on social processes for verification • Verification has high computational complexity. • Almost all problems in hardware are NP-complete or worse (e.g. equivalence checking). • Problems are still solved POPL `99
De Millo, Lipton, and Perlis • Verification will lead to overconfidence. • Purported bugs can be independently confirmed • Users learn the limits of their tools • Many programs are not even specifiable. • Partial specification is useful • Types • User assertions • Exceptions • Memory problems (e.g. Purify) POPL `99
Concluding remarks • The time is ripe for research in software verification • Lessons from hardware can be applied (judiciously) • Maximize benefits, minimize costs. • Look at real problems. • Don’t try to force wholesale methodology shift POPL `99
http://verify.stanford.edu POPL `99