
Applications of Automated Model Based Testing with TorX



Presentation Transcript


  1. Applications of Automated Model Based Testing with TorX. Ed Brinksma, Course 2004

  2. TorX Case Studies • Conference Protocol (academic) • EasyLink TV-VCR protocol (Philips) • Cell Broadcast Centre component (CMG) • “Rekeningrijden” Payment Box protocol (Interpay) • V5.1 Access Network protocol (Lucent) • Easy Mail Melder (CMG) • FTP Client (academic) • “Oosterschelde” storm surge barrier control (CMG)

  3. The Conference Protocol Experiment • Academic benchmarking experiment, initiated for test tool evaluation and comparison • Based on really testing different implementations • Simple, yet realistic protocol (chatbox service) • Specifications in LOTOS, Promela, SDL, EFSM • 28 different implementations in C • one of them (assumed-to-be) correct • others manually derived mutants • http://fmt.cs.utwente.nl/ConfCase

  4. The Conference Protocol [diagram: three Conference Protocol Entities (CPEs) offer the Conference Service, with primitives join, leave, send and receive, on top of the UDP Layer]

  5. Conference Protocol Test Architecture [diagram: the TorX tester connects to the CPE under test (IUT) at the upper PCO (UT-PCO = C-SAP, at the U-SAP) and, playing the peer entities A, B and C, at two lower PCOs (LT-PCOs) on the UDP Layer]

  6. The Conference Protocol Experiments • TorX - LOTOS, Promela: on-the-fly ioco testing Axel Belinfante et al., Formal Test Automation: A Simple Experiment, IWTCS 12, Budapest, 1999. • Tau Autolink - SDL: semi-automatic batch testing • TGV - LOTOS: automatic batch testing with test purposes Lydie Du Bousquet et al., Formal Test Automation: The Conference Protocol with TGV/TorX, TestCom 2000, Ottawa. • PHACT/Conformance KIT - EFSM: automatic batch testing Lex Heerink et al., Formal Test Automation: The Conference Protocol with PHACT, TestCom 2000, Ottawa.

  7. Conference Protocol Results
                          fail  pass  “core dump”  passing mutants
  TorX, LOTOS              25     3       0        000 444 666
  TorX, Promela            25     3       0        000 444 666
  PHACT, EFSM              21     6       1        000 444 666 289 293 398
  TGV, LOTOS (random)      25     3       0        000 444 666
  TGV, LOTOS (purposes)    24     4       0        000 444 666 332

  8. Conference Protocol Analysis • Mutants 444 and 666 react to PDUs from non-existent partners: • no explicit reaction is specified for such PDUs, so they are ioco-correct, and TorX does not test such behaviour • So, for LOTOS/Promela with TGV/TorX: all ioco-erroneous implementations detected • EFSM: • two “additional-state” errors not detected • one implicit-transition error not detected
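The on-the-fly idea behind these results can be illustrated with a toy sketch (the protocol fragment, state names and the `Iut` class are illustrative stand-ins, not the real TorX code): the tester walks the specification while testing, at each step either stimulating an allowed input or observing an output and checking it against the spec; a mutant that drops messages is caught as soon as quiescence appears where an output was required.

```python
import random

# Toy on-the-fly ioco tester. Each spec state lists the inputs the tester
# may stimulate (with successor states) and the outputs the spec allows;
# "delta" denotes quiescence. All names are illustrative.
SPEC = {
    "idle":    {"inputs": {"join": "joined"},  "outputs": {"delta"}},
    "joined":  {"inputs": {"send": "pending"}, "outputs": {"delta"}},
    "pending": {"inputs": {},                  "outputs": {"receive"}},
}
NEXT_ON_OUTPUT = {("pending", "receive"): "joined"}

class Iut:
    """A correct implementation of the toy protocol."""
    def __init__(self):
        self.queue = []
    def stimulate(self, inp):
        if inp == "send":
            self.queue.append("receive")
    def observe(self):
        return self.queue.pop(0) if self.queue else "delta"

def run_test(iut, steps=200, seed=1):
    """Walk the spec on the fly: stimulate or observe, check every output."""
    random.seed(seed)
    state = "idle"
    for _ in range(steps):
        inputs = SPEC[state]["inputs"]
        if inputs and random.random() < 0.5:      # stimulate an input
            inp = random.choice(sorted(inputs))
            iut.stimulate(inp)
            state = inputs[inp]
        else:                                     # observe an output
            out = iut.observe()
            if out not in SPEC[state]["outputs"]:
                return "fail"                     # not allowed by the spec
            state = NEXT_ON_OUTPUT.get((state, out), state)
    return "pass"

class Mutant(Iut):
    def stimulate(self, inp):
        pass   # drops the send, so the required "receive" never appears

print(run_test(Iut()))     # correct IUT: pass
print(run_test(Mutant()))  # quiescence where output was required: fail
```

Note that, as on the slide, a mutant that merely reacts to unspecified inputs would still pass: the spec above says nothing about such behaviour, so ioco imposes no requirement on it.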

  9. Conference Protocol Analysis • TorX statistics • all errors found after 2 - 498 test events • maximum length of tests: > 500,000 test events • EFSM statistics • 82 test cases with “partitioned tour method” (= UIO) • length per test case: < 16 test events • TGV with manual test purposes • ~ 20 test cases of various length • TGV with random test purposes • ~ 200 test cases of 200 test events

  10. EasyLink Case Study [diagram: TV and VCR connected via EasyLink; their communication is the object of testing] • protocol between TV and VCR • simple, but realistic • features: • preset download • WYSIWYR (what you see is what you record) • EPG download • ...

  11. EasyLink Test Architecture • MBB (= Magic Black Box) allows the PC to monitor the communication between TV and VCR, and allows the PC to send messages that mimic the TV or the VCR • TorX distributed over PC and workstation • manual interaction with the TV via the remote control (RC) [diagram: MBB wired between TV and VCR, driven by the PC and the workstation]

  12. Testing Preset Download Feature • What? • check whether TV correctly implements preset download, based on Promela specification • How? • let PC play the role of VCR and initiate preset download • receive settings from TV • WHILE (TRUE) { let PC initiate preset download; let PC non-deterministically stop preset download; check for consistency in presets } • feature interaction: shuffle presets on TV using RC • all under control of PC

  13. EasyLink Experiences Results: • test environment influences what can be tested: testing power is limited by the functionality of the MBB • initially, the state of the TV is unknown: the tester must be prepared for all possible states • some “hacks” needed in specification and tool architecture in order to decrease the state space • automatic specification-based testing is feasible • tool architecture also suitable to cope with user interaction • some (non-fatal) non-conformances detected

  14. CMG - CBC Component Test • Test one component of Cell Broadcast Centre • LOTOS (process algebra) specification of 28 pp. • Using existing test execution environment • Based on automatic generation of “adapter” based on IDL • Comparison (simple): existing test vs. TorX • code coverage: 82 % vs. 83 % • detected mutants (of 10): 5 vs. 7 • Conclusion: • TorX is at least as good as conventional testing (with potential to do better) • LOTOS is not nice (= terrible) for specifying such systems

  15. Interpay “Rekeningrijden” Highway Tolling System

  16. “Rekeningrijden” Characteristics: • Simple protocol • Parallelism: • many cars at the same time • Encryption • Real-time issues • System passed traditional testing phase

  17. ‘’Rekeningrijden’’ :Phases for Automated Testing • IUT study • informal and formal specification • Available tools study • semantics and openness • Test environment • test architecture, test implementation, SUT specification • testing of test environment • Test execution • test campaigns, execution, analysis

  18. “Rekeningrijden” Highway Tolling System [diagram: the Payment Box (PB) in the Road Side Equipment communicates with each car’s Onboard Unit over a wireless link, and connects onward over UDP/IP]

  19. “Rekeningrijden”: Test Architecture I [diagram: TorX, driven by the PB spec, connects directly to the Payment Box at the PCO]

  20. “Rekeningrijden”: Test Architecture II [diagram: the SUT is the Payment Box embedded in a UDP/IP test context; the spec is PB + UDP/IP, and TorX connects at the PCO through the UDP/IP implementation access point (IAP)]

  21. “Rekeningrijden”: Test Architecture III [diagram: an Onboard Unit simulator (ObuSim) is added to the test context; the spec is PB + ObuSim + TCP/IP + UDP/IP, and TorX reaches the SUT through TCP/IP and UDP/IP IAPs]

  22. “Rekeningrijden”: Test Campaigns • Introduction and use of Test Campaigns: • Management of test tool configurations • Management of IUT configurations • Steering of test derivation • Scheduling of test runs • Archiving of results
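The slides name the campaign ingredients but give no concrete format, so the following is a purely illustrative stand-in: a campaign crosses tool configurations with IUT configurations, schedules the runs, and archives each verdict.

```python
import itertools

# Hypothetical test-campaign driver: every (tool config, IUT config) pair
# is scheduled for a number of runs, and each verdict is archived together
# with the configuration that produced it.
def run_campaign(tool_configs, iut_configs, runs_per_combo, execute):
    archive = []
    for tool, iut in itertools.product(tool_configs, iut_configs):
        for run in range(runs_per_combo):
            verdict = execute(tool, iut, seed=run)   # one scheduled test run
            archive.append({"tool": tool, "iut": iut,
                            "run": run, "verdict": verdict})
    return archive

# usage: a trivial executor that always passes (stand-in for a TorX run)
log = run_campaign(["torx-lotos"], ["pb-v1", "pb-v2"], 2,
                   lambda tool, iut, seed: "pass")
print(len(log))  # 4 archived results
```

The archive is what makes long experiments repeatable: every result is traceable to the exact tool and IUT configuration and run seed that produced it.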

  23. “Rekeningrijden”: Issues • Parallelism: • very easy • Encryption: • not all events can be synthesized: leads to reduced testing power • Real-time: • how to cope with real-time constraints? • efficient computation for on-the-fly testing? • lack of theory: quiescence vs. time-out

  24. “Rekeningrijden” Problem: Quiescence in ioco vs. time-out [diagram: two message sequence charts between TorX and the PB; in the ioco view the tester observes quiescence after waiting time tq, while the real PB signals explicit Timeout messages; workaround: Spec := Spec + Tick, i.e. model the time-out as an explicit Tick action in the specification]
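The Spec := Spec + Tick workaround can be sketched as follows (names and the timeout value are illustrative, not the actual TorX adapter code): the adapter maps "no output within the budget tq" to an explicit Tick observation, and the specification is extended to allow Tick wherever it allowed quiescence.

```python
import queue

TQ = 0.05  # timeout budget tq in seconds (an assumed value)

def observe(channel, deadline=TQ):
    """Return the next IUT output, or Tick if nothing arrives within tq."""
    try:
        return channel.get(timeout=deadline)
    except queue.Empty:
        return "Tick"

def extend_with_tick(allowed_outputs):
    """Spec + Tick: replace quiescence (delta) by the concrete Tick action."""
    return {("Tick" if o == "delta" else o) for o in allowed_outputs}

ch = queue.Queue()
print(observe(ch))       # nothing within tq -> 'Tick'
ch.put("DebitAck")
print(observe(ch))       # a real output passes through unchanged
```

This sidesteps the missing theory rather than solving it: Tick is an ordinary observable action, so standard ioco machinery applies, at the price of hard-wiring the time bound tq into the test architecture.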

  25. “Rekeningrijden” Problem: Action Refinement [diagram: the abstract action Input01 is refined into the concrete messages Input0 and Input1; message sequence charts between TorX and the PB show how interleaved refined inputs lead to Full, Unexpected and Error situations; workaround: Spec := Refine + Buffer]
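A minimal sketch of the Spec := Refine + Buffer idea (all names are illustrative): an abstract action in the model corresponds to a sequence of concrete interface messages, and a buffer reassembles the concrete messages so the refined behaviour can still be matched against the model.

```python
# Refinement table: one abstract action expands to two concrete messages.
REFINEMENT = {"Input01": ["Input0", "Input1"]}

def refine(trace):
    """Expand abstract actions into their concrete message sequences."""
    out = []
    for action in trace:
        out.extend(REFINEMENT.get(action, [action]))
    return out

class Buffer:
    """Reassemble concrete messages back into abstract actions."""
    def __init__(self):
        self.pending = []
    def push(self, msg):
        self.pending.append(msg)
        for abstract, parts in REFINEMENT.items():
            if self.pending[-len(parts):] == parts:
                del self.pending[-len(parts):]
                return abstract          # a complete abstract action
        return None                      # still waiting for more parts

buf = Buffer()
print([buf.push(m) for m in refine(["Input01"])])  # [None, 'Input01']
```

With many cars in parallel the concrete messages interleave, which is exactly where the Full, Unexpected and Error situations on the slide arise: the buffer must cope with partial sequences from several sources at once.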

  26. “Rekeningrijden”: Issues • Modelling language: LOTOS → Promela • Spec for testing ≠ spec for validation • Development of specification is an iterative process • Development of test environment is laborious • Parameters are fixed in the model • Preprocessing: M4/CPP • Promela problem: guarded inputs • Test Campaigns for bookkeeping and control of experiments • Probabilities incorporated

  27. “Rekeningrijden”: Results • Test results: • 1 error during validation (design error) • 1 error during testing (coding error) • Automated testing: • beneficial: high volume and reliability • many and long tests executed (> 50,000 test events) • very flexible: adaptation and many configurations • Step ahead in formal testing of realistic systems
