
Model-Based Testing with Labelled Transition Systems



Presentation Transcript


  1. Model-Based Testing with Labelled Transition Systems

  2. Testing
  Testing: checking or measuring some quality characteristics of an executing object, by performing experiments in a controlled way, w.r.t. a specification.
  (Figure: a tester exercising the IUT against a specification.)

  3. Types of Testing
  • Level of detail: unit - module - integration - system
  • Accessibility: white box - black box
  • Characteristics: functionality, reliability, usability, efficiency, maintainability, portability

  4. Automated Model-Based Testing
  • model → test generation tool → TTCN test cases → test execution tool (against the IUT) → pass / fail
  • correctness link: IUT confto model ⇔ IUT passes tests

  5. Towards Model-Based Testing
  • Increase in complexity, and quest for higher-quality software:
    - testing effort grows exponentially with complexity
    - testing cannot keep pace with development
  • More abstraction:
    - less detail
    - model-based development
  • Checking quality:
    - practice: testing - ad hoc, too late, expensive, takes a lot of time
    - research: formal verification - proofs, model checking, . . . with disappointing practical impact

  6. Towards Model-Based Testing
  • Model-based testing has the potential to combine:
    - practice - testing
    - theory - formal methods
  • Model-Based Testing: testing with respect to a (formal) model / specification: state model, pre/post, CSP, Promela, UML, Spec#, . . .
  • Promises better, faster, cheaper testing:
    - algorithmic generation of tests and test oracles: tools
    - formal and unambiguous basis for testing
    - measuring the completeness of tests
    - maintenance of tests through model modification

  7. A Model-Based Development Process
  (Figure: informal requirements in the informal world are formalized into a specification and design, linked by validation; the model lives in the world of models, where formal verification applies; code and its realization live in the real world, where testing applies.)

  8. Formal Verification
  • m is a model of implementation i; a model checker decides m modcheck s for specification s
  • sound: m modcheck s implies m sat s
  • complete: m sat s implies m modcheck s
  • the formal world (m, s) is linked to the real world (implementation i) by the assumption that m is a valid model of i

  9. Formal Testing
  • from specification s, test generation produces a test suite Ts (formal world)
  • test execution runs Ts against the implementation: IUT passes Ts or IUT fails Ts (real world)
  • correctness of test generation: IUT confto s ⇔ IUT passes Ts ?

  10. Approaches to Model-Based Testing
  Several modelling paradigms:
  • Finite State Machine
  • Pre/post-conditions
  • Labelled Transition Systems
  • Programs as Functions
  • Abstract Data Type testing
  • . . .
  Here: Labelled Transition Systems

  11. Model-Based Testing for LTS
  Involves:
  • model / specification
  • implementation IUT + models of IUTs
  • correctness (confto)
  • tests
  • test generation
  • test execution (pass / fail)
  • test result analysis

  12. initial states0 S states transitionsT  S  (L{})  S actions !coffee !alarm ?coin ?button ?button Models of Specifications:Labelled Transition Systems Labelled Transition System  S, L, T, s0

  13. Example Models
  (Figure: example vending-machine transition systems over inputs ?kwart, ?dub and outputs !coffee, !tea, !choc, illustrating (input-enabled) transition systems.)

  14. Correctness: Implementation Relation ioco
  i ioco s  =def  ∀σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)
  Intuition: i ioco-conforms to s, iff
  • if i produces output x after trace σ, then s can produce x after σ
  • if i cannot produce any output after trace σ, then s cannot produce any output after σ (quiescence)

  15. pp =  !x LU{} . p!x Straces ( s ) = {   (L{})* | s } pafter = { p’ | p p’ } out ( P) = { !xLU | p!x, pP } { | pp, pP } CorrectnessImplementation Relationioco iiocos =defStraces (s) : out (iafter )  out (safter)

  16. Implementation Relation ioco
  (Figure: example implementations over inputs ?kwart, ?dub and outputs !coffee, !tea, !choc, some related by ioco to the specification s and some not.)

  17. Test Cases
  Model of a test case = transition system with:
  • 'quiescence' label θ
  • tree-structured
  • finite, deterministic
  • final states pass and fail
  • from each state ≠ pass, fail:
    - either one input !a
    - or all outputs ?x and θ
  (Note the mirrored view: the tester's stimuli !dub, !kwart are the IUT's inputs, and its observations ?coffee, ?tea are the IUT's outputs.)

  18. ioco Test Generation Algorithm
  To generate a test case from transition-system specification s0, compute T(S), with S a set of states, and initially S = s0 after ε. For T(S), apply one of the following recursively, non-deterministically:
  1. end the test case: pass
  2. supply an input !a: continue with T(S after ?a)
  3. observe the outputs:
     - allowed outputs ?x (i.e. !x ∈ out(S)), or θ if δ ∈ out(S): continue with T(S after !x), resp. T(S after δ)
     - forbidden outputs ?y (i.e. !y ∉ out(S)), or θ if δ ∉ out(S): fail
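The three choices above can be rendered as a bounded, randomized generator. A self-contained Python sketch, assuming a tau-free specification given as a dict from state to a set of (label, successor) pairs (`?` inputs, `!` outputs, `'delta'` quiescence); the test case is encoded as a nested dict, and all helper names are mine.

```python
import random

DELTA = 'delta'

def quiescent(trans, s):
    """No output enabled in s (spec assumed tau-free)."""
    return all(label.startswith('?') for (label, _) in trans.get(s, ()))

def step(trans, states, a):
    """State set after one suspension-trace step a."""
    if a == DELTA:
        return {s for s in states if quiescent(trans, s)}
    return {t for s in states for (label, t) in trans.get(s, ()) if label == a}

def out(trans, states):
    """Enabled outputs, plus delta if some state is quiescent."""
    o = {label for s in states for (label, _) in trans.get(s, ()) if label.startswith('!')}
    if any(quiescent(trans, s) for s in states):
        o.add(DELTA)
    return o

def gen_test(trans, all_outputs, S, depth, rng=random):
    """One test case as a nested dict; leaves are 'pass'/'fail'."""
    if depth == 0 or not S:
        return 'pass'                                  # choice 1: end test case
    inputs = sorted({l for s in S for (l, _) in trans.get(s, ()) if l.startswith('?')})
    if inputs and rng.random() < 0.5:                  # choice 2: supply an input
        a = rng.choice(inputs)
        return {a: gen_test(trans, all_outputs, step(trans, S, a), depth - 1, rng)}
    allowed = out(trans, S)                            # choice 3: observe outputs
    return {x: (gen_test(trans, all_outputs, step(trans, S, x), depth - 1, rng)
                if x in allowed else 'fail')           # forbidden output -> fail
            for x in sorted(all_outputs | {DELTA})}

spec = {'s0': {('?coin', 's1')}, 's1': {('!coffee', 's0')}}
t = gen_test(spec, {'!coffee', '!tea'}, {'s0'}, depth=3, rng=random.Random(1))
print(t)
```

Every observation node covers the whole output alphabet plus θ, exactly as the algorithm demands: allowed observations continue the test, forbidden ones are fail leaves.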

  19. Test Generation Example
  (Figure: specification s (?dub followed by !coffee) and a generated test: supply !dub, then ?coffee leads to pass, while ?tea or unexpected θ leads to fail.)

  20. Test Execution Example
  (Figure: test t composed with implementation i; the two runs over dub·coffee and dub·δ both end in pass.)
  Two test runs: i passes t
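A test run is just a walk through the test tree driven by the implementation. A minimal Python sketch, assuming a deterministic, tau-free implementation as a dict of (label, successor) pairs and the nested-dict test encoding from before; for simplicity the test keeps the IUT's view of labels (`?` for stimuli it receives), rather than the mirrored tester view of the slides. All names here are illustrative.

```python
DELTA = 'delta'

def impl_output(trans, s):
    """The implementation's next output in state s, or DELTA if quiescent."""
    outs = [label for (label, _) in trans.get(s, ()) if label.startswith('!')]
    return outs[0] if outs else DELTA

def run_test(test, trans, s):
    """Walk a test tree against a deterministic, tau-free implementation."""
    while not isinstance(test, str):
        keys = list(test)
        if len(keys) == 1 and keys[0].startswith('?'):    # test supplies an input
            a = keys[0]
            nxt = {t for (label, t) in trans.get(s, ()) if label == a}
            s = nxt.pop() if nxt else s                   # input-enabled: else ignored
            test = test[a]
        else:                                             # test observes an output
            x = impl_output(trans, s)
            if x != DELTA:
                s = {t for (label, t) in trans.get(s, ()) if label == x}.pop()
            test = test.get(x, 'fail')
    return test

# A correct implementation of the coin/coffee machine ...
good = {'s0': {('?coin', 's1')}, 's1': {('!coffee', 's0')}}
# ... and a faulty one that serves tea instead.
bad = {'s0': {('?coin', 's1')}, 's1': {('!tea', 's0')}}

# Hand-written test: pay, then expect coffee (tea or silence fail).
test = {'?coin': {'!coffee': 'pass', '!tea': 'fail', DELTA: 'fail'}}
print(run_test(test, good, 's0'))   # pass
print(run_test(test, bad, 's0'))    # fail
```

The verdict is the leaf reached: the correct machine reaches pass, the tea-serving one is caught at the observation node.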

  21. Test Result Analysis: Completeness of ioco Test Generation
  For every test t generated with the algorithm we have:
  • Soundness: t will never fail with a correct implementation: i ioco s implies i passes t
  • Exhaustiveness: each incorrect implementation can be detected with a generated test t: i not-ioco s implies ∃t: i fails t

  22. s  LTS TsTTS IUT IMPS iIUTIOTS exec : TESTS  IMPS (OBS) passes : IOTS TTS {pass,fail} pass / fail Formal Testing with Transition Systems Test hypothesis : IUTIMP . iIUT IOTS . tTTS . IUT passes t  iIUT passes t gen : LTS(TTS) = ioco Proof soundness and exhaustiveness: iIOTS . ( tgen(s) . i passes t ) iiocos

  23. Variations on a Theme
  • i ioco s  ⇔  ∀σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)
  • i ≤ior s  ⇔  ∀σ ∈ (L ∪ {δ})*: out(i after σ) ⊆ out(s after σ)
  • i ioconf s  ⇔  ∀σ ∈ traces(s): out(i after σ) ⊆ out(s after σ)
  • i iocoF s  ⇔  ∀σ ∈ F: out(i after σ) ⊆ out(s after σ)
  • i mioco s: multi-channel ioco
  • i uioco s: universal ioco
  • i wioco s: non-input-enabled ioco
  • i sioco s: symbolic ioco
  • i (r)tioco s: (real) timed tioco (Aalborg, Twente, Grenoble, Bordeaux, . . .)
  • i iocor s: refinement ioco
  • i hioco s: hybrid ioco
  • . . .

  24. Extensions: Testing Transition Systems with Data, Time, and Hybrid Behaviour
  (Figure: test cases extended with data parameters and guards, e.g. n: int with guards [n ≥ 35] and [n ≥ 50]; with action refinement of abstract actions into ?coin1 / ?coin2 / ?coin3 and button presses; and with time/hybrid annotations such as clocks c and flow conditions dVc/dt = 2, dVt/dt = 3.)

  25. Component-Based Testing
  (Figure: two components with their specifications, composed in parallel.)
  Question: do i1 ioco s1 and i2 ioco s2 imply i1 || i2 ioco s1 || s2 ?

  26. Compositional Testing / Component-Based Testing
  If s1, s2 are input-enabled (s1, s2 ∈ IOTS), then ioco is preserved by parallel composition:
  i1 ioco s1 and i2 ioco s2 imply i1 || i2 ioco s1 || s2

  27. Variations on a Theme: uioco
  i ioco s  ⇔  ∀σ ∈ Straces(s): out(i after σ) ⊆ out(s0 after σ)
  (Figure: a specification with states s0, s1, s2 over inputs ?a, ?b and outputs !x, !y, !z.)
  • out(s0 after ?b) = ∅, but ?b ∉ Straces(s): under-specification, anything is allowed after ?b
  • out(s0 after ?a·?a) = { !x } and ?a·?a ∈ Straces(s), but from s2, ?a·?a is under-specified: is anything allowed after ?a·?a ?

  28. Variations on a Theme: uioco
  Utraces(s) = { σ ∈ Straces(s) | ∀σ1·?a·σ2 = σ, ∀s': s ⇒σ1 s' implies s' ⇒?a }
  i uioco s  ⇔  ∀σ ∈ Utraces(s): out(i after σ) ⊆ out(s0 after σ)
  Now s is under-specified in s2 for ?a: anything is allowed. ioco ⊆ uioco

  29. ?a LI LILU ?a   ?a ?b ?b !x !y !z Variations on a Theme: uioco iuiocos  Utraces(s) : out ( iafter ) out ( s0after) s0 ?b  s1 s2 ?a  Alternatively, via chaos process for under-specified inputs

  30. Testing Components
  (Figure: an IUT component interacting with other components through method invocations: method calls and method returns, both for the methods it offers and for the methods it invokes.)

  31. Testing Components
  The tester interacts with the IUT component through method calls and returns; specification s ∈ LTS(LI, LU) with
  • LI = calls of offered methods ∪ returns of used methods
  • LU = returns of offered methods ∪ calls of used methods

  32. Input-enabledness:s of IUT, ?a LI : ? methodreturn methodcall IUTcomponent ?a s tester methodcall methodreturn Testing Components No ! ?

  33. saftermusta? =  s’ ( s s’  s’ a? ) CorrectnessImplementation Relationwioco iuiocos =defUtraces (s) : out (iafter )  out (safter) iwiocos =defUtraces (s) : out (iafter )  out (safter) and in (iafter )  in (safter) in (safter ) = { a?  LI | safter musta? }

  34. s  LTS TsTTS IUT IMPS iIUTIOTS exec : TESTS  IMPS (OBS) passes : IOTS TTS {pass,fail} pass / fail Formal Testing with Transition Systems Test hypothesis : IUTIMP . iIUT IOTS . tTTS . IUT passes t  iIUT passes t gen : LTS(TTS) = ioco Proof soundness and exhaustiveness: iIOTS . ( tgen(s) . i passes t ) iiocos

  35. Test Assumption
  IUT behaves as an IOTS (input-enabled LTS):
  • inputs a? ∈ LI, outputs x! ∈ LU, quiescence δ
  • sequencing of inputs, outputs, and δ
  • input-enabledness: ∀s of IUT, ∀?a ∈ LI: s –?a→

  36. Comparing Transition Systems: Testing Equivalences
  • Suppose an environment interacts with the systems S1 and S2:
    - the environment tests the system as a black box, by observing and actively controlling it
    - the environment acts as a tester
  • Two systems are equivalent if they pass the same tests.

  37. Formal Testing: Test Assumption
  Test assumption: ∀IUT. ∃iIUT ∈ MOD. ∀t ∈ TEST. IUT passes t ⇔ iIUT passes t
  (Figure: the physical IUT and its model iIUT are indistinguishable by any test t.)

  38. Completeness of Formal Testing
  IUT passes Ts  ⇔?  IUT confto s
  • IUT passes Ts  =def  ∀t ∈ Ts. IUT passes t   (definition)
  • ⇔  ∀t ∈ Ts. iIUT passes t   (test hypothesis: ∀t ∈ TEST. IUT passes t ⇔ iIUT passes t)
  • ⇔  iIUT imp s   (proof obligation: ∀i ∈ MOD. ( ∀t ∈ Ts. i passes t ) ⇔ i imp s)
  • ⇔  IUT confto s   (definition: IUT confto s ⇔ iIUT imp s)

  39. Test Assumption
  (Figure: specification s and implementations over ?dub, !coffee, !choc, !tea, related by ioco / not-ioco.)
  More tests may be needed, each starting in the initial state; meta-assumption: reliable restart.

  40. Alternative Test Assumption
  (Figure: two implementations over ?dub, !choc, !tea, one ioco-conforming and one not.)
  An "Abramsky" test can distinguish them:
  1. do ?dub
  2. make a core dump
  3. make many copies of the core dump
  4. continue the test with each copy

  41. Alternative Test Assumption
  (Figure: two implementations over ?dub, ?kwart, !choc, !tea.)
  With the test ?dub.?kwart.undo you can distinguish them.

  42. Concluding
  • Testing can be formal, too (M.-C. Gaudel, TACAS'95)
  • Testing shall be formal, too
  • A test generation algorithm is not just another algorithm:
    - proof of soundness and exhaustiveness
    - definition of test assumption and implementation relation
  • For labelled transition systems:
    - ioco for expressing conformance between implementation and specification
    - a sound and exhaustive test generation algorithm
    - tools generating and executing tests: TGV, TestGen, Agedis, TorX, . . .

  43. Perspectives
  Model-based formal testing can improve the testing process:
  • the model is a precise and unambiguous basis for testing
  • design errors are found during validation of the model
  • longer, cheaper, more flexible, and provably correct tests
  • easier test maintenance and regression testing
  • automatic test generation and execution
  • full automation: test generation + execution + analysis
  • the extra effort of modelling is compensated by better tests

  44. Thank You
