
Verification tools at Microsoft

Verification tools at Microsoft. K. Rustan M. Leino, Research in Software Engineering (RiSE), Microsoft Research, Redmond, WA, USA. 15 January 2009, Séminaire Digiteo, Orsay, France.


Presentation Transcript


  1. Verification tools at Microsoft K. Rustan M. Leino, Research in Software Engineering (RiSE), Microsoft Research, Redmond, WA, USA, 15 January 2009, Séminaire Digiteo, Orsay, France

  2. RiSE • Research in Software Engineering • Microsoft Research, Redmond • http://research.microsoft.com/rise • Related groups: PPT (MSR Cambridge) and RSE (MSR India)

  3. Software engineering research • Goal • Better build, maintain, and understand programs • How? • Specifications • Tools, tools, tools • Program semantics • Verification-condition generation, symbolic execution, model checking, abstract interpretation, fuzzing, test generation • Satisfiability Modulo Theories (SMT)

  4. Verified Software Initiative • Hoare, Joshi, Leavens, Misra, Naumann, Shankar, Woodcock, et al. • “We envision a world in which computer programs are always the most reliable component of any system or device that contains them” [Hoare & Misra]

  5. Structure of talk • Spec# demo • Various techniques and RiSE tools • Use/effectiveness of tools at Microsoft

  6. Spec# programming system [Barnett, Fähndrich, Leino, Müller, Schulte, Venter, et al.] • Research prototype • Spec# language • Object-oriented .NET language • Superset of C# 2.0, adding: • more types (e.g., non-null types) • specifications (e.g., pre- and postconditions) • usage rules (methodology) • Checking: • Static type checking • Run-time checking • Static verification (optional)

  7. Spec# demo

  8. Specifications: .NET today StringBuilder.Append Method (Char[], Int32, Int32) Appends the string representation of a specified subarray of Unicode characters to the end of this instance. public StringBuilder Append(char[] value, int startIndex, int charCount); Parameters: value: A character array. startIndex: The starting position in value. charCount: The number of characters to append. Return Value: A reference to this instance after the append operation has occurred. Exceptions
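
For readers unfamiliar with this overload, a short C# usage sketch (not part of the slides; the array contents and values are made up) of what the documentation above describes:

    using System;
    using System.Text;

    class AppendDemo
    {
        static void Main()
        {
            char[] value = { 'a', 'b', 'c', 'd', 'e' };
            var sb = new StringBuilder("x");

            // Append 3 characters of 'value' starting at index 1 ('b', 'c', 'd').
            sb.Append(value, 1, 3);
            Console.WriteLine(sb);                  // prints "xbcd"

            // Arguments outside the documented range fail at run time, e.g. a
            // charCount that runs past the end of the array:
            try { sb.Append(value, 4, 3); }
            catch (ArgumentOutOfRangeException) { Console.WriteLine("rejected"); }
        }
    }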

  9. Specifications in Spec# public StringBuilder Append(char[] value, int startIndex, int charCount) requires value == null ==> startIndex == 0 && charCount == 0; requires 0 <= startIndex; requires 0 <= charCount; requires value == null || startIndex + charCount <= value.Length; ensures result == this;

  10. Specifications with Code Contracts public StringBuilder Append(char[] value, int startIndex, int charCount) { Contract.Requires(value != null || (startIndex == 0 && charCount == 0)); Contract.Requires(0 <= startIndex); Contract.Requires(0 <= charCount); Contract.Requires(value == null || startIndex + charCount <= value.Length); Contract.Ensures(Contract.Result<StringBuilder>() == this); // method implementation ... } Note that the postcondition is declared at the top of the method body, which is not where it should be executed. A rewriter tool moves these checks to the method's exit points.
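
As a rough, hand-written illustration of that note (not the actual ccrewrite output): preconditions are checked on entry, and the Ensures clause becomes a check at the exit point. The helper AppendCore is a hypothetical stand-in for the original method body, and the real rewriter reports violations through its own contract-failure machinery rather than these exception types.

    public StringBuilder Append(char[] value, int startIndex, int charCount)
    {
        // Preconditions, checked where they are declared:
        if (value == null && !(startIndex == 0 && charCount == 0))
            throw new ArgumentException("Precondition failed: value");
        if (startIndex < 0 || charCount < 0)
            throw new ArgumentException("Precondition failed: negative argument");
        if (value != null && startIndex + charCount > value.Length)
            throw new ArgumentException("Precondition failed: range");

        StringBuilder result = AppendCore(value, startIndex, charCount);  // original body

        // Postcondition, moved to the exit point by the rewriter:
        if (!ReferenceEquals(result, this))
            throw new InvalidOperationException("Postcondition failed: result == this");
        return result;
    }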

  11. Code Contracts [Barnett, Fähndrich, Grunkemeyer, et al.] • Declarative contracts • Language independent • Library to ship in .NET 4.0 • Tools to be released via DevLabs • Code Contracts Rewriter (for run-time checking) • Clousot abstract interpreter • Pex automated testing tool

  12. Spec# verifier architecture [pipeline] Spec# → Spec# compiler → MSIL (“bytecode”) → Translator → Boogie (inference engine, V.C. generator) → verification condition → SMT solver → “correct” or list of errors

  13. Boogie – a verification tool bus [Barnett, Jacobs, Leino, Moskal, Rümmer, et al.] • Front ends: Spec#, C with HAVOC specifications, C with vcc specifications, Dafny, Chalice, your language here • Boogie-to-Boogie transformations: inference engines, program transformations, logic optimizers • Back-end provers: Simplify, Z3, SMT Lib, Isabelle/HOL, your prover here

  14. Verification-condition generation • Verification conditions computed by weakest preconditions (wp) • wp( Prog, Q ) yields a formula that describes the pre-states from which Prog correctly establishes Q • Example: wp( if (B) { S } else { T }, Q ) = (B ⇒ wp(S, Q)) ∧ (¬B ⇒ wp(T, Q))

  15. Traditional VC generation • Example program (Prog): p := new C(); if (x < 0) { x := -x; } assert p ≠ null; • wp( Prog, true ) = ((x < 0 ⇒ (p ≠ null)[-x/x]) ∧ (¬(x < 0) ⇒ p ≠ null))[new C()/p] = (x < 0 ⇒ new C() ≠ null) ∧ (¬(x < 0) ⇒ new C() ≠ null)
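
To make the wp rules concrete, here is a toy C# sketch (not from the talk): formulas are plain strings, assignment is handled by naive textual substitution, and running it on the slide's example shows how the expression new C() gets copied into both branches of the verification condition.

    using System;

    // Toy statements: x := e, if (B) S else T, assert C, sequencing, skip.
    abstract class Stmt
    {
        public abstract string Wp(string q);
    }

    class Assign : Stmt
    {
        readonly string x, e;
        public Assign(string x, string e) { this.x = x; this.e = e; }
        // wp(x := e, Q) = Q[e/x]; naive text replacement is enough here
        // because the example uses distinct one-letter variable names.
        public override string Wp(string q) => q.Replace(x, "(" + e + ")");
    }

    class If : Stmt
    {
        readonly string b; readonly Stmt thn, els;
        public If(string b, Stmt thn, Stmt els) { this.b = b; this.thn = thn; this.els = els; }
        // wp(if (B) S else T, Q) = (B ==> wp(S, Q)) && (!B ==> wp(T, Q))
        public override string Wp(string q) =>
            $"(({b}) ==> ({thn.Wp(q)})) && (!({b}) ==> ({els.Wp(q)}))";
    }

    class Assert : Stmt
    {
        readonly string c;
        public Assert(string c) { this.c = c; }
        // wp(assert C, Q) = C && Q
        public override string Wp(string q) => $"({c}) && ({q})";
    }

    class Seq : Stmt
    {
        readonly Stmt a, b;
        public Seq(Stmt a, Stmt b) { this.a = a; this.b = b; }
        public override string Wp(string q) => a.Wp(b.Wp(q));
    }

    class Skip : Stmt
    {
        public override string Wp(string q) => q;
    }

    class WpDemo
    {
        static void Main()
        {
            // p := new C(); if (x < 0) { x := -x; } assert p != null;
            Stmt prog = new Seq(
                new Assign("p", "new C()"),
                new Seq(
                    new If("x < 0", new Assign("x", "-x"), new Skip()),
                    new Assert("p != null")));

            // Note how "new C()" is duplicated into both branches of the formula.
            Console.WriteLine(prog.Wp("true"));
        }
    }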

  16. Improved VC generation [Flanagan, Saxe, Barnett, Leino] • Rewrite Prog into Prog’: assume p0 = new C(); if (x0 < 0) { assume x1 = -x0; assume x2 = x1; } else { assume x2 = x0; } assert p0 ≠ null; • wp( Prog’, true ) = ( p0 = new C() ∧ ((x0 < 0 ∧ x1 = -x0 ∧ x2 = x1) ∨ (¬(x0 < 0) ∧ x2 = x0)) ) ⇒ p0 ≠ null
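
And a companion toy sketch of the passive-form idea (a simplification, not Boogie's actual algorithm): each assignment becomes an assume over a fresh version of the assigned variable, and the two arms of an if are synced to a common final version. The version numbering differs slightly from the slide (the first assignment to p produces p1 rather than p0).

    using System;
    using System.Collections.Generic;
    using System.Linq;

    abstract class Stmt { }
    class Assign : Stmt { public string X, E; public Assign(string x, string e) { X = x; E = e; } }
    class IfStmt : Stmt
    {
        public string B; public List<Stmt> Then, Else;
        public IfStmt(string b, List<Stmt> t, List<Stmt> e) { B = b; Then = t; Else = e; }
    }
    class Assert : Stmt { public string C; public Assert(string c) { C = c; } }

    class Passifier
    {
        readonly string[] vars;               // all program variables
        Dictionary<string, int> version;      // current version of each variable; v0 = initial value

        public Passifier(params string[] vars)
        {
            this.vars = vars;
            version = vars.ToDictionary(v => v, v => 0);
        }

        // Rename each variable in 'e' to its current version (naive text replacement).
        string Rename(string e) => vars.Aggregate(e, (s, v) => s.Replace(v, v + version[v]));

        public List<string> Passify(IEnumerable<Stmt> stmts)
        {
            var output = new List<string>();
            foreach (var s in stmts)
            {
                if (s is Assign a)
                {
                    string rhs = Rename(a.E);          // rename before bumping the version
                    version[a.X]++;
                    output.Add($"assume {a.X}{version[a.X]} = {rhs};");
                }
                else if (s is Assert asrt)
                {
                    output.Add($"assert {Rename(asrt.C)};");
                }
                else if (s is IfStmt i)
                {
                    string cond = Rename(i.B);
                    var saved = new Dictionary<string, int>(version);
                    var thenLines = Passify(i.Then);
                    var thenVer = version;
                    version = new Dictionary<string, int>(saved);
                    var elseLines = Passify(i.Else);
                    var elseVer = version;

                    // Sync both branches to a fresh common version of every
                    // variable that either branch assigned.
                    version = new Dictionary<string, int>(saved);
                    foreach (var v in vars.Where(v => thenVer[v] != saved[v] || elseVer[v] != saved[v]))
                    {
                        int merged = Math.Max(thenVer[v], elseVer[v]) + 1;
                        thenLines.Add($"assume {v}{merged} = {v}{thenVer[v]};");
                        elseLines.Add($"assume {v}{merged} = {v}{elseVer[v]};");
                        version[v] = merged;
                    }
                    output.Add($"if ({cond}) {{ {string.Join(" ", thenLines)} }} " +
                               $"else {{ {string.Join(" ", elseLines)} }}");
                }
            }
            return output;
        }
    }

    class PassifyDemo
    {
        static void Main()
        {
            // p := new C(); if (x < 0) { x := -x; } assert p != null;
            var prog = new List<Stmt>
            {
                new Assign("p", "new C()"),
                new IfStmt("x < 0", new List<Stmt> { new Assign("x", "-x") }, new List<Stmt>()),
                new Assert("p != null"),
            };
            foreach (var line in new Passifier("p", "x").Passify(prog))
                Console.WriteLine(line);
        }
    }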

  17. Problem with improved schemes • Works well when the if branches modify variables that the downstream assertion does not depend on • But when encoding the heap as one variable, almost every branch modifies that variable • … room for new solutions

  18. Multi-object invariants [Barnett, Fähndrich, Leino, Müller, et al.] • Demo: Chunker.dict

  19. Multi-object invariants [object diagram: two Chunker objects (n: 84 and n: 20), each with invariant dict.Count ≤ n and a rep field dict; a Classroom object with invariant studentGrades.Count ≤ 20 and a field studentGrades; and a shared Dictionary object with Count: 21, reached through an owner reference]

  20. Other heap methodologies • Spec#/Boogie methodology • Dynamic frames • Implicit dynamic frames • Separation logic • … room for improved encodings and methodologies

  21. Clousot [Fähndrich, Logozzo] • Abstract interpreter for .NET • Verifies Code Contracts at compile time • Some key technology: • Heap-aware abstraction • Iterative application of numerical domains: • Pentagons • Subpolyhedra • others
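
As a small illustration of the kind of code such a checker targets (a hypothetical example, using the System.Diagnostics.Contracts API shown earlier), the array access below should follow from interval and ordering information alone, with no run-time check needed:

    using System.Diagnostics.Contracts;

    static class Summing
    {
        // The preconditions give a static checker an interval for 'from' and
        // 'to' and an ordering between them.
        public static int SumRange(int[] a, int from, int to)
        {
            Contract.Requires(a != null);
            Contract.Requires(0 <= from);
            Contract.Requires(from <= to);
            Contract.Requires(to <= a.Length);

            int sum = 0;
            for (int i = from; i < to; i++)
                sum += a[i];        // in bounds: 0 <= from <= i < to <= a.Length
            return sum;
        }
    }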

  22. Pentagons • Some common abstract domains: • Intervals: x ∈ [A, B] • Octagons: ±x ± y ≤ K • Polyhedra: Σᵢ aᵢ·xᵢ ≤ K • Observation: • Checking array accesses involves constraints like 0 ≤ x < a.Length • These can be represented by intervals plus variable orderings y ≤ x • Pentagon picture source: Robert Webb's Great Stella software, http://www.software3d.com/Stella.html
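
A toy sketch of the pentagon idea, just to make the shape of the domain concrete (nothing like Clousot's actual implementation): an interval per variable plus a set of symbolic ordering facts, queried to justify an array access.

    using System;
    using System.Collections.Generic;

    // Toy pentagon element: a numeric interval [Lo, Hi] per variable, plus a
    // set of symbolic facts "x < y" learned from guards.
    class Pentagon
    {
        readonly Dictionary<string, (long Lo, long Hi)> intervals =
            new Dictionary<string, (long Lo, long Hi)>();
        readonly HashSet<(string, string)> less = new HashSet<(string, string)>();

        public void SetInterval(string x, long lo, long hi) => intervals[x] = (lo, hi);
        public void AddLess(string x, string y) => less.Add((x, y));   // records x < y

        // Is "0 <= x && x < len" provable from the abstract state?
        public bool ArrayAccessSafe(string x, string len)
        {
            if (!intervals.TryGetValue(x, out var ix)) return false;
            bool lowerOk = ix.Lo >= 0;
            // Upper bound: either the symbolic fact x < len is known, or the
            // interval of x ends strictly below the interval of len.
            bool upperOk = less.Contains((x, len)) ||
                           (intervals.TryGetValue(len, out var il) && ix.Hi < il.Lo);
            return lowerOk && upperOk;
        }
    }

    class PentagonDemo
    {
        static void Main()
        {
            // Abstract state inside "if (0 <= x && x < a.Length) { ... a[x] ... }":
            var s = new Pentagon();
            s.SetInterval("x", 0, long.MaxValue);   // from the guard 0 <= x
            s.AddLess("x", "a.Length");             // from the guard x < a.Length
            Console.WriteLine(s.ArrayAccessSafe("x", "a.Length"));   // True
        }
    }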

  23. Symbolic-powered testing • Sage [Godefroid, Levin, et al.] • White-box fuzzing for C programs • Pex [de Halleux, Tillman, et al.] • Automatic white-box testing for .NET • [diagram: a feedback loop in which a seed input produces a new generation of symbolically derived inputs]
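
A highly simplified toy of the generational-search idea behind white-box fuzzing (illustrative only; real tools such as Sage and Pex derive new inputs with symbolic execution and an SMT solver rather than the brute-force search used here):

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Run the "program under test" on a seed, record which way each branch
    // went, then negate each branch in turn (keeping the earlier ones fixed)
    // and look for an input that drives execution down the other side.
    class ToyWhiteboxFuzzer
    {
        // Program under test: three branch predicates over one int input.
        static readonly Func<int, bool>[] Branches =
        {
            x => x > 10,
            x => x % 2 == 0,
            x => x * 3 == 42,
        };

        static void Main()
        {
            var worklist = new Queue<int>();
            var seen = new HashSet<int> { 0 };
            worklist.Enqueue(0);                    // the seed input

            while (worklist.Count > 0)
            {
                int input = worklist.Dequeue();
                bool[] path = Branches.Select(b => b(input)).ToArray();
                Console.WriteLine($"input {input}: path [{string.Join(", ", path)}]");

                // New generation: flip each branch of this path in turn.
                for (int k = 0; k < Branches.Length; k++)
                {
                    int? c = FindInput(path, k);
                    if (c is int v && seen.Add(v))
                        worklist.Enqueue(v);
                }
            }
        }

        // Find an input that agrees with 'path' on branches 0..k-1 and takes
        // the opposite side of branch k. Brute force stands in for a solver.
        static int? FindInput(bool[] path, int k)
        {
            for (int x = -100; x <= 100; x++)
            {
                bool ok = true;
                for (int j = 0; j < k && ok; j++) ok = Branches[j](x) == path[j];
                if (ok && Branches[k](x) != path[k]) return x;
            }
            return null;
        }
    }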

  24. Z3 [Bjørner, de Moura] • Satisfiability Modulo Theories (SMT) solver • 9 first places and 6 second places at SMT-COMP’08 • Used in all tools mentioned, except Clousot

  25. Effectiveness of tools • Static Driver Verifier (SDV) • Applied regularly to all Microsoft device drivers for the supported device models • ~300 bugs found • Available to third parties in the Windows DDK • Sage • Applied regularly • 100s of people doing various kinds of fuzzing • HAVOC • Has been applied to 100s of KLOC • ~40 bugs found (resource leaks, lock usage, use-after-free) • vcc • Being applied to the Microsoft Hypervisor • …

  26. Conclusions • Machine-processable specifications are being used increasingly • Tools are useful and necessary • Provide useful checking • Both validate and drive research • SMT solving is a key technology • Trend: user input is moving toward program text • Many research challenges • http://research.microsoft.com/rise
