
Model-Based Testing and Test-Based Modelling

Quasimodo. Model-Based Testing and Test-Based Modelling. Jan Tretmans, Embedded Systems Institute, Eindhoven, NL, and Radboud University, Nijmegen, NL. Overview: Model-Based Testing; Model-Based Testing with Labelled Transition Systems; Model-Based Testing: A Wireless Sensor Network Node.


Presentation Transcript


  1. Quasimodo. Model-Based Testing and Test-Based Modelling. Jan Tretmans, Embedded Systems Institute, Eindhoven, NL, and Radboud University, Nijmegen, NL

  2. Overview • Model-Based Testing • Model-Based Testing with Labelled Transition Systems • Model-Based Testing: A Wireless Sensor Network Node • Test-Based Modelling

  3. Software Testing

  4. (Software) Testing: checking or measuring some quality characteristics of an executing object by performing experiments in a controlled way w.r.t. a specification. Diagram: the tester exercises the SUT (System Under Test) against the specification.

  5. Sorts of Testing. Phases: unit, module, integration, system. Aspects: functionality, reliability, usability, efficiency, maintainability, portability, accessibility. Approach: white box vs. black box.

  6. Paradox of Software Testing. Testing is: • important • much practiced: 30-50% of project effort • expensive, time critical • not constructive (but sadistic?). But also: • ad-hoc, manual, error-prone • hardly any theory / research • no attention in curricula • not cool: "if you're a bad programmer you might be a tester". Attitude is changing: • more awareness • more professional.

  7. Testing Challenges. Trends in Software Development • Increasing complexity: more functions, more interactions, more options and parameters • Increasing size: building new systems from scratch is not possible anymore; integration of legacy, outsourced, and off-the-shelf components • Blurring boundaries between systems: more, and more complex, interactions between systems; systems dynamically depend on other systems; systems of systems • Blurring boundaries in time: requirements analysis, specification, implementation, testing, installation, and maintenance overlap; more different versions and configurations • What is a failure?

  8. Models

  9. Formal Models (Klaas Smit). Diagram: a labelled transition system with inputs ?coin, ?button and outputs !coffee, !alarm.

  10. Model-Based Testing

  11. Developments in Testing 1 • Manual testing. Diagram: the tester interacts directly with the SUT (System Under Test); verdict pass / fail.

  12. Developments in Testing 2 • Manual testing • Scripted testing. Diagram: test cases (e.g. in TTCN) are executed against the SUT; verdict pass / fail.

  13. Developments in Testing 3 • Manual testing • Scripted testing • High-level scripted testing. Diagram: tests in a high-level test notation are executed against the SUT; verdict pass / fail.

  14. Developments in Testing 4 • Manual testing • Scripted testing • High-level scripted testing • Model-based testing. Diagram: model-based test generation derives test cases (e.g. in TTCN) from a system model; test execution runs them against the SUT; verdict pass / fail.

  15. Model-Based . . . . . Verification, Validation, Testing, . . . . .

  16. Validation, Verification, and Testing. Diagram: ideas and wishes are related to abstract (mathematical) models by validation; properties are related to the model by verification; concrete realizations (the SUT) are related to the model by testing.

  17. Verification and Testing. Model-based verification: formal manipulation, proving properties, performed on the model (the formal world). Model-based testing: experimentation, showing errors, performed on the concrete system (the concrete world). Verification is only as good as the validity of the model on which it is based. Testing can only show the presence of errors, not their absence.

  18. Code Generation from a Model. A model is more (less) than code generation: • views • abstraction • testing of aspects • verification and validation of aspects

  19. Model-Based Testing with Labelled Transition Systems

  20. Model-Based Testing. Diagram: model-based test generation derives test cases (e.g. in TTCN) from the system model; test execution runs them against the SUT; verdict pass / fail.

  21. MBT with Labelled Transition Systems. Diagram: ioco test generation derives a set of LTS tests from the LTS model; LTS test execution runs them against an SUT behaving as an input-enabled LTS; verdict pass / fail. Conformance relation: input/output conformance (ioco).

  22. Models: Labelled Transition Systems. Labelled Transition System: ⟨S, LI, LU, T, s0⟩ with states S, input actions LI, output actions LU, transitions T, and initial state s0. Notation: ? = input, ! = output. Example actions: ?coin, ?button, !coffee, !alarm.
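The tuple above can be written down directly as a transition table. A minimal Python sketch, assuming the '?'/'!' prefix convention for inputs and outputs; the state names s0..s3 and the exact wiring of the coffee machine are illustrative assumptions, not read off the slide's diagram.

```python
# Hypothetical encoding of a coffee machine as an LTS <S, LI, LU, T, s0>.
# State names are assumptions; the action labels come from the slide.
# Convention: labels starting with '?' are inputs (LI), '!' outputs (LU).
coffee_machine = {
    "s0": [("?coin", "s1"), ("?button", "s2")],
    "s1": [("?button", "s3")],
    "s2": [("!alarm", "s0")],    # button without a coin: alarm
    "s3": [("!coffee", "s0")],   # coin, then button: coffee
}
INITIAL = "s0"

def inputs(lts):
    """The input alphabet LI, derived from the '?' prefix convention."""
    return {l for ts in lts.values() for l, _ in ts if l.startswith("?")}

def outputs(lts):
    """The output alphabet LU, derived from the '!' prefix convention."""
    return {l for ts in lts.values() for l, _ in ts if l.startswith("!")}
```

The same dict-of-edge-lists shape is reused in the later sketches, so that `after` and `out` can be defined uniformly over it.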

  23. Models: Generation of Test Cases. Diagram: from the specification model a test case is derived that supplies the inputs (!coin, !button), observes the outputs (?coffee, ?alarm), and ends in the verdicts pass and fail.

  24. Models: Generation of Test Cases. Diagram: a second test case derived from the same specification model, supplying the inputs in a different order (!button, !coin) and again ending in the verdicts pass and fail.

  25. pp =  !x LU {} . p!x Conformance: ioco i ioco s =def  Straces (s) : out (i after )  out (s after ) Straces ( s ) = {   ( L  {} )* | s } pafter = { p’ | pp’ } out ( P) = { !xLU | p!x,pP } {  | pp, pP }

  26. Conformance: ioco. i ioco s =def ∀ σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ). Intuition: i ioco-conforms to s iff • if i produces output x after trace σ, then s can produce x after σ • if i cannot produce any output after trace σ, then s cannot produce any output after σ (quiescence)
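The definition above is small enough to execute. A hypothetical Python sketch of `after`, `out` with quiescence δ, and a bounded ioco check over a given set of suspension traces; the two one-coin machines at the end are made-up examples, not from the slides.

```python
# Executable sketch of the ioco ingredients: `after`, `out` with
# quiescence δ, and a bounded ioco check over given suspension traces.
DELTA = "δ"

def _quiescent(lts, state):
    """A state is quiescent iff it enables no output action."""
    return not any(l.startswith("!") for l, _ in lts.get(state, []))

def step(lts, states, label):
    """One-step successor set; δ loops on quiescent states."""
    if label == DELTA:
        return {s for s in states if _quiescent(lts, s)}
    return {t for s in states for l, t in lts.get(s, []) if l == label}

def after(lts, trace, initial="s0"):
    states = {initial}
    for label in trace:
        states = step(lts, states, label)
    return states

def out(lts, states):
    """out(P): enabled outputs, plus δ for every quiescent state."""
    o = {l for s in states for l, _ in lts.get(s, []) if l.startswith("!")}
    return o | {DELTA for s in states if _quiescent(lts, s)}

def ioco(impl, spec, traces):
    """out(i after σ) ⊆ out(s after σ) for each σ in `traces` that is a
    suspension trace of the spec (other traces hold vacuously)."""
    for sigma in traces:
        s_states = after(spec, sigma)
        if not s_states:
            continue
        if not out(impl, after(impl, sigma)) <= out(spec, s_states):
            return False
    return True

# Made-up machines: spec says ?coin then !coffee; impl_bad serves !tea.
spec     = {"s0": [("?coin", "s1")], "s1": [("!coffee", "s0")]}
impl_ok  = {"s0": [("?coin", "s1")], "s1": [("!coffee", "s0")]}
impl_bad = {"s0": [("?coin", "s1")], "s1": [("!tea", "s0")]}
TRACES = [(), ("?coin",), ("?coin", "!coffee"), ("?coin", DELTA)]
```

With these machines, `ioco(impl_ok, spec, TRACES)` holds, while `impl_bad` is caught on the trace `("?coin",)`, where it offers `!tea` but the spec only allows `!coffee`.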

  27. Example: ioco. Diagram: a specification model over inputs ?dime, ?quart and outputs !coffee, !tea, !choc, together with several candidate implementations; some are related to it by ioco, some are not.

  28. Example: ioco. Specification model: on input ?x with x ≥ 0, output !y with |y·y − x| < ε; on input ?x with x < 0, output !error. Several SUT models (e.g. one replying !(−x)) are compared against it. LTS and ioco allow: • non-determinism • under-specification • the specification of properties rather than construction

  29. Example: ioco is not symmetric. i ioco s =def ∀ σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ). For the two ?dub machines s and i on the slide:
out(i after ?dub·?dub) = out(s after ?dub·?dub) = { !tea, !coffee }
out(i after ?dub·δ·?dub) = { !coffee }
out(s after ?dub·δ·?dub) = { !tea, !coffee }
so i ioco s, but s does not ioco-conform to i.
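The asymmetry on this slide can be checked mechanically. A hypothetical reconstruction: the two ?dub machines below are rebuilt by hand so that they reproduce the four out-sets computed above; their state names and exact branching are assumptions.

```python
# Reconstruction of the slide-29 example: ioco is not symmetric.
DELTA = "δ"

def _quiescent(lts, state):
    return not any(l.startswith("!") for l, _ in lts.get(state, []))

def after(lts, trace, initial="s0"):
    states = {initial}
    for label in trace:
        if label == DELTA:                      # δ keeps quiescent states
            states = {s for s in states if _quiescent(lts, s)}
        else:
            states = {t for s in states for l, t in lts.get(s, []) if l == label}
    return states

def out(lts, states):
    o = {l for s in states for l, _ in lts.get(s, []) if l.startswith("!")}
    return o | {DELTA for s in states if _quiescent(lts, s)}

# impl i: after one ?dub it may serve !tea at once, or quiescently wait
# for a second ?dub and then serve only !coffee.
i = {"s0": [("?dub", "a1"), ("?dub", "a2")],
     "a1": [("!tea", "t1"), ("?dub", "b1")],
     "a2": [("?dub", "b2")],
     "b1": [("!tea", "t2")],
     "b2": [("!coffee", "t3")]}

# spec s: like i, but after the quiescent ?dub both drinks stay possible.
s = {"s0": [("?dub", "a1"), ("?dub", "a2")],
     "a1": [("!tea", "t1"), ("?dub", "b1")],
     "a2": [("?dub", "b2")],
     "b1": [("!tea", "t2")],
     "b2": [("!tea", "t2"), ("!coffee", "t3")]}

dd  = ("?dub", "?dub")
ddd = ("?dub", DELTA, "?dub")
```

On `ddd`, the spec allows { !tea, !coffee } but i offers only { !coffee }: fine for i ioco s (subset), fatal for s ioco i (superset), which is exactly the asymmetry the slide illustrates.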

  30. Test Case. test case = labelled transition system • 'quiescence' label θ • tree-structured • finite, deterministic • final states pass and fail • from each state other than pass, fail: either one input !a, or all outputs ?x and θ. Diagram: a test that supplies !dub and !kwart and attaches pass / fail to the observations ?coffee, ?tea and θ.
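The shape described above (finite, deterministic, tree-structured, verdict leaves, one input or all outputs plus θ per node) can be sketched as nested dictionaries. The concrete test for a "?dub then !coffee" specification is a made-up illustration.

```python
# A test case as a tree-structured LTS: leaves are verdicts, inner
# nodes either supply one input or observe all outputs plus θ.
THETA = "θ"   # observation of quiescence in a test case

# Note the swap of directions: the test *sends* !dub (an input of the
# SUT) and *receives* ?coffee (an output of the SUT).
test_case = {
    "!dub": {                   # supply the input
        "?coffee": "pass",      # allowed output
        "?tea": "fail",         # forbidden output
        THETA: "fail",          # quiescence not allowed here
    }
}

def is_verdict(node):
    return node in ("pass", "fail")

def depth(node):
    """Length of the longest branch, counting verdict leaves as 0."""
    if is_verdict(node):
        return 0
    return 1 + max(depth(child) for child in node.values())
```

Determinism and finiteness come for free from the dict-tree encoding; the "one input or all outputs and θ" condition shows up as a node having either a single '!' key or the full set of '?' keys plus θ.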

  31. Test Generation Algorithm: ioco. Algorithm to generate a test case t(S) from a transition-system state set S, with S ≠ ∅ (initially S = s0 after ε). Apply the following steps recursively, non-deterministically:
1. end the test case: verdict pass
2. supply an input !a: continue with t(S after ?a)
3. observe all outputs: each allowed output ?x (or θ), i.e. !x ∈ out(S) (or δ ∈ out(S)), continues with t(S after !x); each forbidden output ?y (or θ), i.e. !y ∉ out(S), ends in fail.
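The three recursive steps can be sketched as a randomized, depth-bounded generator. All helper names and the output alphabet below are assumptions; only the step structure (end / supply input / observe all outputs) follows the algorithm above.

```python
# Randomized, depth-bounded sketch of the ioco test-generation algorithm.
import random

DELTA = "δ"   # quiescence in the model; observed as θ by the test

def _quiescent(lts, state):
    return not any(l.startswith("!") for l, _ in lts.get(state, []))

def enabled_inputs(lts, states):
    return {l for s in states for l, _ in lts.get(s, []) if l.startswith("?")}

def out(lts, states):
    o = {l for s in states for l, _ in lts.get(s, []) if l.startswith("!")}
    return o | {DELTA for s in states if _quiescent(lts, s)}

def step(lts, states, label):
    if label == DELTA:
        return {s for s in states if _quiescent(lts, s)}
    return {t for s in states for l, t in lts.get(s, []) if l == label}

ALL_OUTPUTS = {"!coffee", "!tea"}    # assumed output alphabet LU

def gen_test(spec, states, depth, rng):
    if depth == 0 or not states:
        return "pass"                               # step 1: end the test
    can_input = bool(enabled_inputs(spec, states))
    if can_input and rng.choice(["input", "observe"]) == "input":
        a = rng.choice(sorted(enabled_inputs(spec, states)))
        return {a: gen_test(spec, step(spec, states, a), depth - 1, rng)}
    allowed = out(spec, states)                     # step 3: observe all
    node = {}
    for x in ALL_OUTPUTS | {DELTA}:
        if x in allowed:
            node[x] = gen_test(spec, step(spec, states, x), depth - 1, rng)
        else:
            node[x] = "fail"                        # forbidden output
    return node

def leaves(node):
    if isinstance(node, str):
        return {node}
    return set().union(*(leaves(c) for c in node.values()))

spec = {"s0": [("?dub", "s1")], "s1": [("!coffee", "s0")]}
t = gen_test(spec, {"s0"}, depth=3, rng=random.Random(1))
```

Every generated tree ends only in the verdicts pass and fail, and forbidden outputs (here !tea) always map straight to fail, mirroring steps 1-3 of the slide.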

  32.–38. Example: ioco Test Generation. Diagrams: starting from the specification (?dime · ?dime · !coffee, with quiescence δ), the test case is built step by step: the algorithm supplies !dime, observes the outputs (the forbidden ?coffee / ?tea end in fail at this point), supplies the next !dime, and finally attaches pass where ?coffee is observed and fail to ?tea and to unexpected quiescence θ.

  39. Test Result Analysis: Completeness. For every test t generated with the ioco test generation algorithm we have: • Soundness: t will never fail with a correct implementation; i ioco s implies i passes t • Exhaustiveness: each incorrect implementation can be detected with a generated test; if i does not ioco-conform to s, then ∃ t: i fails t
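Soundness can be illustrated on a toy scale by executing one test case against a correct and a faulty deterministic implementation. The executor, the machines, and the input-refusal convention below are all hypothetical sketches.

```python
# Toy test execution: a sound test passes the correct implementation
# and fails the faulty one.
THETA = "θ"

impl_good = {"s0": [("?dub", "s1")], "s1": [("!coffee", "s0")]}
impl_bad  = {"s0": [("?dub", "s1")], "s1": [("!tea", "s0")]}

# Test for "?dub then !coffee": the test sends !dub, then observes.
test_case = {"!dub": {"?coffee": "pass", "?tea": "fail", THETA: "fail"}}

def impl_output(impl, state):
    """The single enabled output in `state`, or None if quiescent."""
    outs = [(l, t) for l, t in impl.get(state, []) if l.startswith("!")]
    return outs[0] if outs else None

def impl_input(impl, state, label):
    for l, t in impl.get(state, []):
        if l == label:
            return t
    return state      # input refused: stay put (input-enabledness fiction)

def execute(test, impl, state="s0"):
    while not isinstance(test, str):
        if any(k.startswith("!") for k in test):    # test supplies an input
            stim = next(iter(test))
            state = impl_input(impl, state, "?" + stim[1:])
            test = test[stim]
        else:                                       # test observes
            o = impl_output(impl, state)
            if o is None:
                test = test.get(THETA, "fail")      # quiescence observed
            else:
                label, state = o
                test = test.get("?" + label[1:], "fail")
    return test                                     # verdict pass / fail
```

Exhaustiveness is the harder half: it needs the whole (in general infinite) set of generated tests, which is why the slide states it as an existential over all tests rather than as something a single run can show.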

  40. Completeness of MBT with ioco. Diagram: ioco test generation from the LTS model yields a sound and exhaustive set of LTS tests; hence, for an SUT behaving as an input-enabled LTS: SUT ioco model ⟺ SUT passes the tests.

  41. Model-Based Testing: More Theory

  42. Testing Equivalences. S1 ≈ S2 ⟺ ∀ e ∈ E: obs(e, S1) = obs(e, S2): two systems S1 and S2 are equivalent iff every environment e makes the same observations of both.
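The schema above can be made concrete by fixing what an observation is. In this hypothetical sketch an observation is the set of label sequences, up to a bound, that environment and system can perform together; the machines are made-up.

```python
# Observational equivalence w.r.t. a class of environments E:
# S1 ≈ S2 iff every environment observes the same of both.

def traces(lts, initial="s0", bound=3):
    """All label sequences of length <= bound from the initial state."""
    result = {()}
    frontier = {((), initial)}
    for _ in range(bound):
        nxt = set()
        for tr, s in frontier:
            for l, t in lts.get(s, []):
                nxt.add((tr + (l,), t))
        result |= {tr for tr, _ in nxt}
        frontier = nxt
    return result

def obs(env, sys, bound=3):
    """Joint observations: for full synchronization, the traces of the
    parallel composition are exactly the common traces."""
    return traces(env, bound=bound) & traces(sys, bound=bound)

def equivalent(s1, s2, envs, bound=3):
    return all(obs(e, s1, bound) == obs(e, s2, bound) for e in envs)

sys_a = {"s0": [("a", "s1")], "s1": [("b", "s2")]}   # does a then b
sys_b = {"s0": [("a", "s1")], "s1": [("c", "s2")]}   # does a then c
env   = {"s0": [("a", "s1")], "s1": [("b", "s2")]}   # can follow a, b
```

The environment that tries `a` then `b` tells `sys_a` and `sys_b` apart; richer observation notions (refusals, quiescence) give the finer preorders in the genealogy slide below.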

  43. MBT: Test Assumption. Test assumption: ∀ SUT. ∃ mSUT ∈ MODELS. ∀ t ∈ TEST: SUT passes t ⟺ mSUT passes t: every concrete SUT behaves as if it were some model mSUT.

  44. s LTS test tool gen : LTS (TTS) iiocos   sound exhaustive SUT t SUT Soundness and Completeness Test assumption : SUTIMP . mSUTIOTS . tTTS. SUTpassest mSUTpasses t Prove soundness and exhaustiveness: mIOTS . ( tgen(s) . mpasses t )  mioco s SUTcomformstos passfail SUT passesgen(s)

  45. MBT: Completeness. SUT passes Ts ⟺ SUT conforms to s ? Define SUT passes Ts =def ∀ t ∈ Ts: SUT passes t. By the test hypothesis, ∀ t ∈ TEST: SUT passes t ⟺ mSUT passes t, so ∀ t ∈ Ts: SUT passes t ⟺ ∀ t ∈ Ts: mSUT passes t. Prove ∀ m ∈ MOD: ( ∀ t ∈ Ts: m passes t ) ⟺ m imp s; this gives mSUT imp s. Define: SUT conforms to s iff mSUT imp s; then SUT passes Ts ⟺ SUT conforms to s.

  46. Genealogy of ioco. Labelled Transition Systems; IOTS (IOA, IA, IOLTS); Trace Preorder; Testing Equivalences (Preorders); Canonical Tester conf; Quiescent Trace Preorder; Repetitive Quiescent Trace Preorder (Suspension Preorder); Refusal Equivalence (Preorder); ioco

  47. Variations on a Theme
• i ioco s ⟺ ∀ σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)
• i ior s ⟺ ∀ σ ∈ (L ∪ {δ})*: out(i after σ) ⊆ out(s after σ)
• i ioconf s ⟺ ∀ σ ∈ traces(s): out(i after σ) ⊆ out(s after σ)
• i iocoF s ⟺ ∀ σ ∈ F: out(i after σ) ⊆ out(s after σ)
• i uioco s ⟺ ∀ σ ∈ Utraces(s): out(i after σ) ⊆ out(s after σ)
• i mioco s: multi-channel ioco
• i wioco s: non-input-enabled ioco
• i eco e: environmental conformance
• i sioco s: symbolic ioco
• i (r)tioco s: (real) timed tioco (Aalborg, Twente, Grenoble, Bordeaux, .....)
• i rioco s: refinement ioco
• i hioco s: hybrid ioco
• i qioco s: quantified ioco
• i poco s: partially observable game ioco
• i stiocoD s: real time and symbolic data
• . . . . . .

  48. Model-Based Testing: There is Nothing More Practical than a Good Theory • Arguing about validity of test cases and correctness of test generation algorithms • Explicit insight in what has been tested, and what not • Use of complementary validation techniques: model checking, theorem proving, static analysis, runtime verification, . . . • Implementation relations for nondeterministic, concurrent, partially specified, loose specifications • Comparison of MBT approaches and error-detection capabilities

  49. Test Selection in Model-Based Testing

  50. Test Selection • Exhaustiveness is never achieved in practice • Test selection to achieve confidence in the quality of the tested product: • select the best test cases, capable of detecting failures • measure to what extent testing was exhaustive • Optimization problem: best possible testing within cost/time constraints
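Seen as an optimization problem, test selection can be sketched as a greedy value-per-cost heuristic within a budget; the suite, its cost and value numbers, and the heuristic itself are illustrative assumptions.

```python
# Greedy budgeted test selection: pick tests by estimated
# fault-detection value per unit of cost until the budget runs out.

def select(tests, budget):
    """tests: list of (name, cost, value) triples; greedy by value/cost."""
    chosen, spent = [], 0
    for name, cost, value in sorted(tests, key=lambda t: t[2] / t[1],
                                    reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent

# Made-up suite: (name, execution cost, estimated detection value).
suite = [("t_boot", 5, 9.0), ("t_alarm", 2, 6.0),
         ("t_coffee", 1, 2.0), ("t_stress", 8, 7.0)]
picked, cost = select(suite, budget=8)
```

Greedy selection is only a heuristic for the underlying knapsack-style problem, but it captures the slide's point: under cost/time constraints, exhaustiveness is replaced by choosing the tests most likely to reveal failures.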
