
ISPRAS Experience in Model Based Testing


Presentation Transcript


  1. ISPRAS Experience in Model Based Testing. Alexander K. Petrenko, Institute for System Programming of the Russian Academy of Sciences (ISPRAS), http://www.ispras.ru. Intel Academic Forum, Budapest, September 2002

  2. ISPRAS Experience in Industrial Model Based Testing

  3. Why Model Based Testing? • Exhaustive testing that covers all implementation paths is impossible. • Even exhaustive implementation-based (“white box”) testing does not guarantee correct functionality. • White-box testing lengthens development, because test development can start only after the implementation is complete. Still, we want to test systematically. Formal models provide • a basis for systematic testing: from the models we derive test coverage metrics, input stimuli, and correctness criteria for the results; • test development ahead of the implementation schedule.

  4. Model Checking vs. Model Based Testing

  5. Synonyms • Models • (Formal) Specifications. We consider behavioral/functional models. A model provides a simplified, abstract view of the target software/hardware. Processing the models automatically requires their formal description (specification).

  6. Model Based Testing Approach: (1) generate exhaustive test suites for a model of the implementation; (2) translate the test suites to the implementation level; (3) apply the tests to the implementation under test; (4) optionally, interpret the testing results in terms of the model.
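
A minimal sketch of these four steps on a toy example (isqrt_impl and model_ok are invented names, not from the slides): the model supplies the correctness criterion, inputs are enumerated as model-level stimuli, and the verdict is reported at the model level.

    #include <stdio.h>
    #include <math.h>

    /* Hypothetical implementation under test: integer square root. */
    static unsigned isqrt_impl(unsigned n) {
        return (unsigned)sqrt((double)n);
    }

    /* Model: r is a correct integer square root of n
       iff r*r <= n < (r+1)*(r+1). */
    static int model_ok(unsigned n, unsigned r) {
        return (unsigned long long)r * r <= n &&
               n < (unsigned long long)(r + 1) * (r + 1);
    }

    int main(void) {
        unsigned failures = 0;
        /* Steps 1-2: generate model-level stimuli and translate them
           (trivially, here) into implementation-level calls. */
        for (unsigned n = 0; n < 100000; n++) {
            unsigned r = isqrt_impl(n);   /* Step 3: apply the test. */
            if (!model_ok(n, r)) {        /* Step 4: interpret the result. */
                printf("FAIL: isqrt_impl(%u) = %u\n", n, r);
                failures++;
            }
        }
        printf("%u failures\n", failures);
        return failures != 0;
    }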

  7. Related Work • IBM Research Laboratory (Haifa, Israel) • Microsoft Research (Redmond, US)

  8. Examples of Model Based Testing Applications • IBM Research Laboratory (Haifa, Israel) • Store Data Unit of a digital signal processor • APIs of file systems, telephony and Internet protocols, etc. • Microsoft Research (Redmond, US) • Universal PnP interface • ISPRAS (Moscow, Russia) • Kernel of an operating system (Nortel Networks) • IPv6 protocol (Microsoft) • Compiler optimization units (Intel) • Massive parallel compiler testing (RFBR, Russia)

  9. Origin of ISPRAS Methods • 1987-1994: test suite for the compiler of the real-time programming language for the “Buran” space shuttle • 1994-1996: ISPRAS – Nortel Networks contract on functional test suite development for a switch operating system kernel • A few hundred bugs found in the OS kernel, which had been in use for 10 years • KVEST technology: about 600K lines of Nortel code tested by 2000

  10. ISPRAS Model Based Testing: Two Approaches • UniTesK: testing of Application Program Interfaces (API) based on software contracts • Lama: compiler testing based on LAnguage Models Application

  11. UniTesK: Testing of Application Program Interfaces (API)

  12. What is an API? (Diagram contrasting the user interface with the Application Program Interface, API.)

  13. Functional Testing. The UniTesK method deals with functional testing. To automate testing, we provide a formal representation of the requirements: Requirements → Formal Specifications → Tests.

  14. UniTesK Process. Phases and their techniques: • interface specification: pre- and post-conditions, invariants • test scenario description: implicit Finite State Machines (FSM), data iterators • test execution and test result analysis: test coverage metrics based on the specification structure

  15. Decomposition of Testing Tasks. (Diagram: test sequence construction and test oracles, both connected to the system under test.) • The entire test is a test sequence intended to achieve the specified coverage. • From the specification we can generate oracles and define test coverage metrics.

  16. Test Suite Architecture. (Diagram: the manually written test scenario and specification; from the scenario a scenario driver is generated, and from the specification the data model, test oracle, and test coverage tracker are derived automatically; the pre-built test engine runs the test, and a generated Java/C/C++/C# mediator connects it all to the system under test. Legend: manual, pre-built, generated, automatic derivation.)

  17. Test Oracle. Specification of method f: integer f (a : float) post { post_f (a, f_result) }. The test oracle for method f calls f_result = f(x) and then evaluates post_f(x, f_result): if it holds, verdict = true; otherwise, verdict = false.
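
A minimal C sketch of this oracle scheme, assuming an invented method f and postcondition post_f (UniTesK derives such wrappers from the specification; this hand-written version only illustrates the idea):

    #include <stdio.h>
    #include <stdbool.h>
    #include <math.h>

    /* Implementation under test: a rounding function (buggy for negatives). */
    static int f(float a) { return (int)(a + 0.5f); }

    /* Postcondition from the specification of f:
       the result must lie within 0.5 of the argument. */
    static bool post_f(float a, int f_result) {
        return fabsf((float)f_result - a) <= 0.5f;
    }

    /* Test oracle for f: call f, evaluate the postcondition, emit a verdict. */
    static bool oracle_f(float x) {
        int f_result = f(x);
        bool verdict = post_f(x, f_result);
        printf("f(%g) = %d -> %s\n", x, f_result, verdict ? "PASS" : "FAIL");
        return verdict;
    }

    int main(void) {
        oracle_f(1.2f);
        oracle_f(-3.7f);  /* truncation toward zero gives -3, 0.7 away: FAIL */
        return 0;
    }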

  18. Test Coverage Metrics Based on Specification Structure. Specification: post { if ( a || b || c || d && e ) { branch “OK”; ... } else { branch “Bad parameters”; ... } }. Partition (derivation of branches and logical terms): BRANCH “OK” • a -- op1 • !a && b -- op2 • !a && !b && c -- op3 • ... BRANCH “Bad parameters” • !a && !b && !c && !d • !a && !b && !c && d && !e
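
A sketch of how such a partition can drive a coverage tracker (the predicates a..d and the op numbering follow the slide; the C encoding is illustrative):

    #include <stdio.h>
    #include <stdbool.h>

    /* Coverage elements derived from the postcondition structure:
       op1: a;  op2: !a && b;  op3: !a && !b && c      (branch "OK")
       op4: !a && !b && !c && !d                       (branch "Bad parameters") */
    enum { OP1, OP2, OP3, OP4, N_OPS };
    static bool hit[N_OPS];

    /* Classify one vector of logical-term values into its partition class. */
    static void track(bool a, bool b, bool c, bool d) {
        if (a)        hit[OP1] = true;
        else if (b)   hit[OP2] = true;
        else if (c)   hit[OP3] = true;
        else if (!d)  hit[OP4] = true;  /* finer classes involving e omitted */
    }

    int main(void) {
        /* Test inputs expressed directly as term values. */
        track(true,  false, false, false);
        track(false, true,  false, false);
        track(false, false, false, false);

        for (int i = 0; i < N_OPS; i++)
            printf("op%d: %s\n", i + 1, hit[i] ? "covered" : "NOT covered");
        return 0;  /* op3 is reported NOT covered: the metric exposes the gap */
    }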

  19. Test Sequence Generation. (Diagram: an FSM with states S1-S4 and transitions labeled op1-op3.) We use an FSM to generate test sequences that traverse all equivalence classes defined by the partition analysis. But writing a full FSM description is labor-consuming, tedious work.
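
A small sketch of sequence generation by FSM traversal (the four states and three operations are invented to echo the diagram): depth-first traversal fires every enabled transition once, and the printed path is the test sequence. A real test engine must also be able to return the system to an earlier state (e.g., by reset) before firing the next operation; that machinery is omitted here.

    #include <stdio.h>

    #define N_STATES 4
    #define N_OPS    3

    /* next[s][op] = successor state, or -1 if op is not enabled in s. */
    static const int next[N_STATES][N_OPS] = {
        /* S1 */ {  1,  1,  2 },
        /* S2 */ { -1,  0,  3 },
        /* S3 */ { -1, -1,  3 },
        /* S4 */ { -1, -1,  2 },
    };
    static int visited[N_STATES][N_OPS];

    /* Depth-first traversal firing every enabled transition exactly once. */
    static void traverse(int s, int depth) {
        for (int op = 0; op < N_OPS; op++) {
            if (next[s][op] >= 0 && !visited[s][op]) {
                visited[s][op] = 1;
                printf("%*sS%d --op%d--> S%d\n", depth * 2, "",
                       s + 1, op + 1, next[s][op] + 1);
                traverse(next[s][op], depth + 1);
            }
        }
    }

    int main(void) {
        traverse(0, 0);  /* start in S1 */
        return 0;
    }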

  20. FSM Construction: Statics. (Diagram: equivalence classes of states SC1-SC4 with transitions op1-op3.) First step of FSM construction: state and transition partitioning based on the pre- and post-condition structure (FSM factorization), plus test input iterators. Partition (branches and logical terms): BRANCH “OK” • a -- op1 • !a && b -- op2 • !a && !b && c -- op3 • ... BRANCH “Bad parameters” • !a && !b && !c && !d -- opi • !a && !b && !c && d && !e -- opi+1

  21. FSM Construction: Dynamics. (Diagram: the factorized FSM over state classes SC1-SC4, with further transitions added at run time.) Second step of FSM construction: the remaining transitions are discovered as a result of test execution.

  22. Model Based Testing: Problems of Deployment • 1994-1996: ISPRAS – Nortel Networks contract on functional test suite development for a switch operating system kernel • A few hundred bugs found in the OS kernel, which had been in use for 10 years • KVEST technology: about 600K lines of Nortel code tested by 2000. Yet KVEST was deployed only in Nortel's regression testing process. Why? Only a few formal techniques are used in real-life practice. Why?

  23. Problems of Model Based Testing Deployment

  24. UniTesK Tools and Applications • CTesK, a C testing tool (alpha version): Microsoft IPv6 implementation • J@T, a Java testing tool (beta version): partially tested by itself; API of the parallel debugger of the mpC IDE (mpC is a parallel extension of C); POSIX/Win32 file I/O subsystem • VDM++TesK: free. Further steps: C#TesK and C++TesK, and (conceivably) VHDLTesK

  25. Lama: Compiler Testing Based on LAnguage Models Application

  26. Model Based Testing of Compiler Optimization Units: a pilot project under contract with Intel. Goal: automate optimization unit test generation • improve test coverage of the units • automate the test oracle problem

  27. Lama Approach. Lama stands for compiler testing based on LAnguage Models Application. Lama process steps, given a programming language (PL): • invent a model (simplified) language (ML) of the PL • generate a set of test “programs” in the ML • map the ML test “programs” into PL test programs • run the compiler (or a compiler unit) on each PL test program and analyze the correctness of the compiler's results (steps 2 and 3 are sketched below).
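
A sketch of steps 2 and 3 for the common-subexpression example that follows (all names invented): an iterator enumerates tiny model-language "programs", here just a choice of comparison operator around a fixed shared subexpression, and a mapper prints each one as a compilable C test program.

    #include <stdio.h>

    /* Step 2: the model language is (SUB + (SUB <op> SUB)), where SUB is a
       fixed common subexpression and <op> ranges over comparison operators. */
    static const char *ops[] = { ">", "<", ">=", "<=", "==", "!=" };
    #define SUB "(('c' - 'a'))"

    /* Step 3: map one model "program" (an operator choice) to a C program. */
    static void emit_test(FILE *out, int test_id, const char *op) {
        fprintf(out, "/* test %d */\n", test_id);
        fprintf(out, "#include <stdio.h>\n");
        fprintf(out, "int main(void) {\n");
        fprintf(out, "    int r = %s + (%s %s %s);\n", SUB, SUB, op, SUB);
        fprintf(out, "    printf(\"%%d\\n\", r);\n");
        fprintf(out, "    return 0;\n");
        fprintf(out, "}\n\n");
    }

    int main(void) {
        /* The iterator walks the (tiny) model-language space. */
        for (int i = 0; i < (int)(sizeof ops / sizeof ops[0]); i++)
            emit_test(stdout, i, ops[i]);
        return 0;
    }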

  28. Process of Optimization Unit Testing. (Diagram.) Step 1: model language design, driven by the optimization background and the programming language (PL) specification, yields the model language building blocks. Step 2: iterator development yields test “programs” in the ML. Step 3: mapper development yields test programs in the PL. Step 4: test execution and test result analysis yield fault and test coverage reports.

  29. An Example: Common Subexpression Elimination Optimization. (Diagram, step 1: the model language building blocks are an IF instruction (if condition then block else block) containing common subexpressions, and a basic block consisting of a label, a sequence of instructions, and a transition to a label.)

  30. Step 3: Result of Translation into C. if ( (('c' - 'a') + (('c' - 'a') > ('c' - 'a'))) ) { (('c' - 'a') + (('c' - 'a') < ('c' - 'a'))); } else { (('c' - 'a') + (('c' - 'a') >= ('c' - 'a'))); }
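
Step 4 can then use a simple differential oracle. The sketch below assumes a POSIX system with a cc driver on the PATH (none of this is prescribed by the slides): compile the same generated test with optimization off and on, run both binaries, and treat the unoptimized output as the expected result for the optimized one.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Run a shell command and capture the first line of its output. */
    static int run(const char *cmd, char *buf, size_t n) {
        FILE *p = popen(cmd, "r");
        if (!p) return -1;
        if (!fgets(buf, (int)n, p)) buf[0] = '\0';
        buf[strcspn(buf, "\n")] = '\0';  /* strip the trailing newline */
        return pclose(p);
    }

    int main(int argc, char **argv) {
        const char *src = argc > 1 ? argv[1] : "test0.c";
        char cmd[256], out_o0[128], out_o2[128];

        /* Compile and run the same test with and without optimization. */
        snprintf(cmd, sizeof cmd, "cc -O0 -o t_o0 %s && ./t_o0", src);
        run(cmd, out_o0, sizeof out_o0);
        snprintf(cmd, sizeof cmd, "cc -O2 -o t_o2 %s && ./t_o2", src);
        run(cmd, out_o2, sizeof out_o2);

        /* The unoptimized run is the oracle for the optimized one. */
        if (strcmp(out_o0, out_o2) != 0) {
            printf("MISCOMPILE in %s: -O0 -> %s, -O2 -> %s\n",
                   src, out_o0, out_o2);
            return 1;
        }
        printf("%s: outputs agree (%s)\n", src, out_o0);
        return 0;
    }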

  31. Conclusion

  32. Conclusion on UniTesK & Lama • Both UniTesK and Lama follow the model-based testing approach. • Basic idea: test complex software by exhaustively covering relatively simple models. • Area of applicability: any software or hardware component with well-defined interfaces or functional properties.

  33. References • A. K. Petrenko, I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin. UniTesK Test Suite Architecture // Proceedings of the FME 2002 conference, Copenhagen, Denmark, LNCS 2391, 2002, pp. 77-88. • A. Petrenko. Specification Based Testing: Towards Practice // Ershov conference proceedings, LNCS 2244, 2001. • A. K. Petrenko, I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin. Experiences in Using Testing Tools and Technology in Real-Life Applications // Proceedings of SETT'01, Pune, India, 2001. • I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin. Using Finite State Machines in Program Testing // Programming and Computer Software, Vol. 26, No. 2, 2000, pp. 61-73 (English version). • I. Bourdonov, A. Kossatchev, A. Petrenko, D. Galter. KVEST: Automated Generation of Test Suites from Formal Specifications // Proceedings of the World Congress on Formal Methods, Toulouse, France, LNCS 1708, 1999, pp. 608-621. • I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin, A. V. Maximov. Testing Programs Modeled by Nondeterministic Finite State Machine (www.ispras.ru/~RedVerst/, white papers).
