Formal Methods of Systems SpecificationLogical Specification of Hard- and Software

Prof. Dr. Holger Schlingloff

Institut für Informatik der Humboldt Universität

and

Fraunhofer Institut für Rechnerarchitektur und Softwaretechnik

Specification Based Testing
  • Last week: assertion languages
    • Anna for Ada
    • OCL for UML
    • Java Modeling Language (JML) for Java
    • Spec# for C#
    • PSL for VHDL
    • ACSL for C (see Frama-C)
  • No huge success (yet)
    • verification burden upon the programmer
    • increases time & cost of programming
  • Use of specifications for testing?
    • Specifications are written by testers for testing
    • Implementation (IUT) is a black box - only executable
Systems Development Process
  • Test spec: "Spec# model program"
  • Test model: "FSM model exploration"
  • Test case: "scenario"

[Diagram: development artefacts: System Spec, Test Spec, Test Model, System Model, System Impl., Test Cases]
Test Specification
  • Spec# model program serves as
    • executable specification
      • for simulation/animation of intended behaviour
    • test generator
      • test models are obtained from model program by abstraction
    • test oracle
      • assertion of safety properties, pre / postconditions
  • Extra effort to derive!
    • but it pays off
Example: Calculator
  • [Action] means an interface to the user or the SUT (system under test)
    • "Unit of behaviour", may change state (information of a system, value of state variables)
  • requires as a declarative contract
    • invariants, quantifiers, …
    • full Spec#-language available!
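The transcript does not reproduce the Spec# source shown on this slide. As a rough illustration of the pattern ([Action] methods whose requires-clauses double as a test oracle), here is a hypothetical calculator model program sketched in Python; all names and the concrete behaviour are invented:

```python
class CalculatorModel:
    """Hypothetical model program: state is the display value plus a pending operand."""

    def __init__(self):
        self.display = 0
        self.pending = None  # operand waiting for '='

    # [Action]-style method: always enabled, changes state
    def digit(self, d):
        assert 0 <= d <= 9, "requires 0 <= d <= 9"  # precondition as oracle
        self.display = self.display * 10 + d

    def plus(self):
        self.pending = self.display
        self.display = 0

    # enabled only when an addition is pending (requires-clause)
    def equals(self):
        assert self.pending is not None, "requires pending operand"
        self.display = self.pending + self.display
        self.pending = None

m = CalculatorModel()
m.digit(1); m.digit(2)   # display = 12
m.plus(); m.digit(3)
m.equals()               # display = 15
```

The assertions are the point: when the same action sequence is replayed against an implementation, a violated precondition or a diverging display value signals a conformance failure.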
Spec Explorer Tool


  • Development of Spec# model programs
    • "literate programming" editor, debugger
  • Validation of Spec# model programs
    • simulation (Main execution)
    • exploration (FSM generation)
    • visualization
    • safety (reachability), liveness analysis
    • static analyses (BoogiePL)
  • Test case generation
    • scenarios from explored FSMs
    • complete coverage of spec or stochastic testing
  • Test execution
    • offline test case generation for conformance testing
    • on-the-fly testing with spec as test oracle
Spec Explorer Artefacts

[Diagram relating the Spec Explorer artefacts; only its labels survive in the transcript: 1) Model ("provides expected results for"); 3) Graph views ("visualized by"); 4) Test suite (possible runs as finite state machine, "are run by" Spec Explorer); 5) User; 6) Implementation under test (API driver); 7) Log of test run. Label 2) is not legible.]
…\Spec Explorer\doc\SpecExplorerReference.doc

Test Spec vs. System Spec
  • System spec: used to derive the implementation
    • transformational development
    • correctness of derivation steps
    • assertion checks can be switched on or off
  • Test spec: aimed at testing and validating the SUT
    • investigate properties of model programs wrt SUT
    • generate and execute test suites
    • assertions as test oracle
  • Different intentions, different levels of abstraction!
Modeling: Abstraction
  • Minimum code needed to generate scenarios of interest; no need to be comprehensive
  • Adequate level of abstraction (state variables, actions in SUT to test):
    • Global point of view: every agent can see every other agent's state
    • All state information (files, messages, …) in model variables
    • Each action (at chosen level of abstraction) is coded as a method in the model (need not correspond 1:1 to methods in IUT)
    • Model program has a single thread, interleaving actions can represent concurrency
    • Actions are atomic, no interleaving within action bodies
    • Multiple assignments within an action can represent parallelism
  • Model introduces new state space, reconcile with SUT
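The last two bullets can be illustrated in Python with a hypothetical transfer action: one multiple assignment updates two state variables simultaneously, and because the action body runs atomically, no other action can observe an intermediate state.

```python
# One atomic model action updating two state variables "in parallel".
# The tuple assignment models simultaneous update: both right-hand sides
# are evaluated before either variable changes.
def transfer(balances, src, dst, amount):
    a, b = balances[src], balances[dst]
    balances[src], balances[dst] = a - amount, b + amount  # multiple assignment

balances = {"a": 10, "b": 0}
transfer(balances, "a", "b", 4)
# balances == {"a": 6, "b": 4}
```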
Modeling: Coding
  • To code each action
    • When is it enabled? (requires ...)
      • multiple actions enabled in the same state model nondeterminism
      • allows for interleaving concurrency
    • What (if anything) does it return? (return ...)
    • What is next state? Is it different? (assignments, ... = ... )
  • Distinguish top level [Action] methods from helper methods
  • Possibly write Main method(s) to simulate scenario(s)
Examples
  • Stack
  • Counting Problem (Prisoner's Dilemma)
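The Stack example can be sketched as a model program in Python (an illustrative reconstruction, not the original Spec# code). Following the coding questions above, each action has an enabledness guard playing the role of a requires-clause, a next-state effect, and possibly a return value:

```python
# Python sketch of the Stack example as a model program; names are illustrative.
class StackModel:
    def __init__(self, capacity=3):
        self.items = []          # model state variable
        self.capacity = capacity

    def push_enabled(self):
        return len(self.items) < self.capacity   # requires: stack not full

    def push(self, v):
        assert self.push_enabled()
        self.items.append(v)                     # next-state assignment

    def pop_enabled(self):
        return len(self.items) > 0               # requires: stack not empty

    def pop(self):
        assert self.pop_enabled()
        return self.items.pop()                  # return value checked by the oracle

# Main-style scenario simulating one run
s = StackModel()
s.push(1); s.push(2)
assert s.pop() == 2
```

With a finite capacity and a finite domain of pushed values, exploration of this model terminates with a small FSM.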
Test design
  • Generating test suites from Spec# model
  • Rationale: find equivalence classes of behaviour; no need for exhaustive testing
  • Different fault types call for different kinds of tests
    • Wrong logic/wrong expression
      • Complete but minimal coverage over small domains
    • Problem scaling up data structures (like hash table resize, editor buffer gap)
      • Vary a few properties over large ranges
    • Unreliable infrastructure, hidden state leaks out
      • Long test cases, revisit the same (model) states
  • Exploration generates a finite state machine (FSM) from the model program for
    • validation (visualization, check safety and liveness), and
    • offline test case generation
  • Exploration executes the model program in a special environment, building the FSM as it goes.
    • each invocation (method call including args) is a transition in the FSM
    • execute all enabled invocations from a state (backtracking, in effect)
    • execute each method with all combinations of arguments from given finite domain (can simulate internal nondeterminism with additional arguments).
  • Generated FSM is an underapproximation of the model program
    • can be nondeterministic
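The exploration procedure described above can be sketched as a worklist algorithm over a toy two-action model (a simplified illustration with invented names, not the Spec Explorer implementation):

```python
from collections import deque

# Toy model: a counter 0..2; each action is a (enabledness test, next-state) pair.
ACTIONS = {
    "inc": (lambda s: s < 2, lambda s: s + 1),
    "reset": (lambda s: s > 0, lambda s: 0),
}

def explore(initial):
    """Build the FSM: execute every enabled invocation from every reachable state."""
    states, transitions = {initial}, []
    frontier = deque([initial])
    while frontier:
        s = frontier.popleft()
        for name, (enabled, step) in ACTIONS.items():
            if enabled(s):                     # backtracking, in effect:
                t = step(s)                    # compute the end state (state is immutable here)
                transitions.append((s, name, t))
                if t not in states:            # new end state goes on the frontier
                    states.add(t)
                    frontier.append(t)
    return states, transitions

states, transitions = explore(0)
# states == {0, 1, 2}; each transition is one invocation of an enabled action
```

Real model programs have structured, mutable state, so the explorer must snapshot and restore it between invocations; with argument domains, each method is additionally invoked for every argument combination.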
Exploration algorithm
  • Exploration treats model program state as first-class.
    • Spec# compiler generates code with storage-management hooks
    • explorer creates a set of hyperstates as an approximation of sets of states
    • executes the actions of the given spec on concrete states of that spec, building up the hyperstates
    • the end state of a new transition is added to the frontier if the transition is relevant (an improvement towards the goal)
  • Abstraction vs. Exploration
    • abstraction (e.g., hiding of variables) yields an over-approximation (more transitions than "really" occur)
    • model exploration yields an under-approximation
Test case generation
  • Offline test case generation: traverse FSM generated by exploration
    • Different traversal algorithms achieve different coverage
      • Postman tour gives minimal transition coverage (not path coverage)
    • Identify "accepting states" where test run may terminate
    • Identify "cleanup actions" that make progress toward accepting state
    • Tool ensures each test case reaches accepting state (via cleanup actions)
  • Tool can store the test suite internally for a subsequent conformance test, or write it out as a C# program
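Offline generation from an explored FSM can be illustrated by a simplified traversal (not the tool's postman tour; the FSM, accepting states, and cleanup actions below are toy values): cover every transition, then append cleanup actions until an accepting state is reached.

```python
# Simplified offline test generation over an explored FSM.
FSM = {  # state -> list of (action, next_state)
    0: [("inc", 1)],
    1: [("inc", 2), ("reset", 0)],
    2: [("reset", 0)],
}
ACCEPTING = {0}       # states where a test run may terminate
CLEANUP = {"reset"}   # actions that make progress toward an accepting state

def generate_tests():
    tests = []
    for s, outgoing in FSM.items():
        for action, t in outgoing:
            # (computing a prefix that reaches s from the initial state is omitted)
            seq, state = [action], t
            while state not in ACCEPTING:      # extend with cleanup actions
                action, state = next((a, n) for a, n in FSM[state] if a in CLEANUP)
                seq.append(action)
            tests.append((s, seq))
    return tests

tests = generate_tests()   # one test per transition, each ending in an accepting state
```

A postman-tour traversal would instead cover all transitions in a minimal number of steps rather than one test per transition.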
Conformance testing
  • Tool can act as test harness for conformance testing
  • Tool can reference and execute IUT (binary, DLL)
  • Model and IUT can be at different levels of abstraction, must reconcile model state space with IUT state space
    • Write wrapper or test driver around IUT
    • Wrapper can translate IUT values to model values
    • [Probe] actions can return (translated) IUT state variables
  • Action bindings, type bindings defined in configuration
  • Object bindings made dynamically
  • Lockstep execution of model and IUT; check:
    • actions are enabled
    • correct return values
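Lockstep execution can be sketched as follows (a minimal illustration with invented names, assuming a trivial counter model and IUT wrapper): at each step the harness checks that the chosen action is enabled in the model, then compares return values, using the model as the test oracle.

```python
# Lockstep conformance checking (sketch): run the model and an IUT wrapper in
# parallel, checking enabledness and return values at every step.
class CounterModel:
    def __init__(self): self.x = 0
    def inc_enabled(self): return self.x < 2   # requires: x < 2
    def inc(self): self.x += 1; return self.x

class CounterIUT:          # stands in for the real implementation / API driver
    def __init__(self): self._v = 0
    def inc(self): self._v += 1; return self._v

def run_lockstep(steps):
    model, iut = CounterModel(), CounterIUT()
    for _ in range(steps):
        if not model.inc_enabled():          # action must be enabled in the model
            return "conformance failure: action not enabled"
        expected, actual = model.inc(), iut.inc()
        if expected != actual:               # model is the test oracle
            return f"conformance failure: expected {expected}, got {actual}"
    return "pass"
```

In practice the wrapper also translates IUT values into model values before the comparison, as described above.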