
Multi-Paradigm Models as Source for Automatic Test Construction


Presentation Transcript


  1. Multi-Paradigm Models as Source for Automatic Test Construction. Victor Kuliamin, ISP RAS, Moscow

  2. Why Multiple Models? [Diagram: requirements (functionality, reliability, efficiency, usability) on one side, testing on the other, with a question mark for the link between them.]

  3. Modeling Techniques
  • Operational: can be executed by a virtual machine. Notations and languages: (C)(E)FSM, LTS, PN, CSP, ASM; SDL, LOTOS, Lustre, VDM, Murphi, Simulink.
  • Contract: pre- and postconditions, data integrity constraints. Notations and languages: Z, B, ADL, JML, Eiffel, VDM, RSL, Larch-C++.
  • History-based: constraints on possible traces. Notations: TL, MSC.
  • Algebraic: equivalence between different execution histories. Notations: Larch, ML, OBJ.
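To make the "operational" category concrete: an operational model is one whose transition function can be executed directly, step by step, by a virtual machine. The following minimal Java sketch uses a hypothetical two-state connection machine and its actions (not taken from the talk) to show such a directly executable model.

    // A hypothetical operational model: an FSM whose transition function is
    // executed directly, which is what "can be executed by virtual machine" means.
    import java.util.List;

    public class ConnectionFsm {
        enum State { CLOSED, OPEN }
        enum Action { CONNECT, SEND, CLOSE }

        // The model itself: a total transition function over states and actions.
        static State step(State s, Action a) {
            switch (s) {
                case CLOSED: return a == Action.CONNECT ? State.OPEN : State.CLOSED;
                case OPEN:   return a == Action.CLOSE ? State.CLOSED : State.OPEN;
                default:     throw new IllegalStateException();
            }
        }

        public static void main(String[] args) {
            State s = State.CLOSED;
            // "Executing" the model: replay a trace of actions and observe the states.
            for (Action a : List.of(Action.CONNECT, Action.SEND, Action.CLOSE)) {
                s = step(s, a);
                System.out.println(a + " -> " + s);
            }
        }
    }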

  4. Tasks of Testing (with respect to the software under test):
  • Construct a single test input
  • Gather responses (test results)
  • Organize a bundle of test inputs
  • Transform test inputs and responses
  • Evaluate correctness
  • Evaluate testing quality
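These tasks can be placed in a small test harness skeleton. The sketch below uses hypothetical names (Sut, HarnessSketch) and a trivial doubling function as a stand-in for the software under test; it only illustrates where each task would live, not how UniTesK organizes them.

    import java.util.List;

    public class HarnessSketch {
        interface Sut { int call(int input); }            // stand-in for the software under test

        public static void main(String[] args) {
            Sut sut = x -> x * 2;
            List<Integer> inputs = List.of(0, 1, -1);     // organize a bundle of test inputs
            boolean allPassed = true;
            for (int input : inputs) {                    // construct a single test input
                int response = sut.call(input);           // gather responses (test results)
                boolean ok = response == input * 2;       // evaluate correctness (oracle)
                allPassed &= ok;
                System.out.println(input + " -> " + response + (ok ? " ok" : " FAIL"));
            }
            // Evaluating testing quality (coverage) would produce a separate report here.
            System.out.println(allPassed ? "all passed" : "failures detected");
        }
    }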

  5. Modeling Techniques Comparison. [Table rating the Operational, Contract, History-based, and Algebraic techniques against: behavior evaluation, closeness to requirements, high-level coverage, low-level coverage, scalability, concurrency, test sequence construction, and single input construction.]

  6. Comparison Results
  • There is no single best technique
  • No single technique is good for everything
  • Maybe a mix of different approaches can meet more needs?

  7. UniTesK Technology. A model-based testing technology developed in 2000-2002 at ISP RAS.

  8. UniTesK Solutions • Contract specifications of behavior • FSM and LTS testing models

  9. Contract Specifications (capture functional requirements): preconditions and postconditions of interface operations and asynchronous reactions, plus data integrity constraints.
  • Close to requirements
  • Suitable for oracle generation
  • Provide low-level coverage criteria
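As an illustration of how such a contract can serve as an oracle, here is a minimal Java sketch for an Add operation of a set of integers (the running example used later in the demo). The class and interface names are invented, plain Java checks stand in for the UniTesK specification language, and the assumed return convention (Add reports whether the element was new) is chosen only for illustration.

    import java.util.HashSet;
    import java.util.Set;

    public class IntSetContract {
        interface IntSetUnderTest {                 // hypothetical interface of the implementation
            boolean add(int x);
            boolean contains(int x);
            int size();
        }

        private final Set<Integer> model = new HashSet<>();   // reference model state

        // Data integrity constraint: the implementation's size agrees with the model.
        boolean invariant(IntSetUnderTest sut) {
            return sut.size() == model.size();
        }

        // Oracle for Add: precondition, call, postcondition, invariant.
        boolean checkAdd(IntSetUnderTest sut, int x) {
            // Precondition: Add is defined for any int, so there is nothing to check.
            boolean wasMember = model.contains(x);
            boolean returned = sut.add(x);              // call the implementation
            model.add(x);                               // update the reference model in step
            boolean post = returned == !wasMember       // Add reports whether x was new
                    && sut.contains(x);                 // and x is now a member
            return post && invariant(sut);              // postcondition and data integrity
        }
    }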

  10. FSM and LTS Testing Models (built from the contract specifications and coverage requirements): define states and admissible input actions; more abstract than the original specifications.
  • Guarantee some low-level coverage
  • Suitable for test sequence construction
  • Provide high-level coverage criteria
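The following sketch shows, with invented names, what "define states and admissible input actions" might look like for the set-of-integers example: a state abstraction function maps concrete specification states to a few FSM states, and each FSM state lists the actions worth applying in it.

    import java.util.List;
    import java.util.Set;

    public class TestingModelSketch {
        enum FsmState { EMPTY, NON_EMPTY }
        enum Action { ADD, REMOVE }

        // State abstraction: many concrete specification states map to one FSM state.
        static FsmState abstractState(Set<Integer> modelState) {
            return modelState.isEmpty() ? FsmState.EMPTY : FsmState.NON_EMPTY;
        }

        // Admissible input actions in a given FSM state.
        static List<Action> admissibleActions(FsmState s) {
            return s == FsmState.EMPTY ? List.of(Action.ADD)
                                       : List.of(Action.ADD, Action.REMOVE);
        }
    }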

  11. Relation between Models. [Diagram relating operation parameters and the operation domain to coverage goals and FSM states.]

  12. Whole Picture I. [Diagram connecting the coverage model, the testing model, and the model of behavior with the software under test, the test oracle, and test sequence construction.]
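Test sequence construction can be pictured as a walk over the testing model's FSM until the chosen coverage goal is met. The sketch below uses a hypothetical two-state machine given as an explicit table and a greedy walk that stops once every transition has been fired; the traversal algorithms actually used by the technology are not shown here.

    import java.util.*;

    public class SequenceConstructionSketch {
        public static void main(String[] args) {
            // transitions.get(state).get(action) -> next state
            Map<String, Map<String, String>> transitions = Map.of(
                    "EMPTY",     Map.of("add", "NON_EMPTY"),
                    "NON_EMPTY", Map.of("add", "NON_EMPTY", "remove", "EMPTY"));
            int total = transitions.values().stream().mapToInt(Map::size).sum();
            Set<String> covered = new HashSet<>();
            List<String> sequence = new ArrayList<>();
            String state = "EMPTY";
            // Greedy walk: prefer a not-yet-covered transition from the current state,
            // otherwise take any admissible action to keep moving.
            while (covered.size() < total) {
                String chosen = null;
                for (String action : transitions.get(state).keySet()) {
                    if (!covered.contains(state + "/" + action)) { chosen = action; break; }
                }
                if (chosen == null) chosen = transitions.get(state).keySet().iterator().next();
                covered.add(state + "/" + chosen);
                sequence.add(chosen + " @ " + state);
                state = transitions.get(state).get(chosen);
            }
            System.out.println(sequence);   // e.g. [add @ EMPTY, add @ NON_EMPTY, remove @ NON_EMPTY]
        }
    }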

  13. Whole Picture II. [Detailed diagram with the following elements: operations and scenario methods with pre- and postconditions, events with pre- and postconditions, a data model with invariants, and state calculation, spread across the model of behavior, the testing model, the coverage model, and the software under test.]

  14. Tool Demo

  15. Set of Integers – Scenario I. [Diagram: concrete states of the behavior model (particular sets of integers) are mapped onto a small number of states of the FSM model.]
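One plausible reading of this slide (an assumption, since only the state labels survive in the transcript) is that the FSM state summarizes the set's contents, for example by its size capped at a small bound, so that many concrete sets share one FSM state. A minimal sketch:

    import java.util.Set;

    public class ScenarioOneSketch {
        static final int CAP = 3;   // assumed bound, chosen only for illustration

        // FSM state = size of the behavior-model state, capped.
        static int fsmState(Set<Integer> behaviorState) {
            return Math.min(behaviorState.size(), CAP);
        }

        public static void main(String[] args) {
            System.out.println(fsmState(Set.of()));          // 0
            System.out.println(fsmState(Set.of(7)));         // 1
            System.out.println(fsmState(Set.of(1, 3, 5)));   // 3: {1,3,5} and {0,2,7} coincide here
        }
    }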

  16. Mapping Abstract Call to Specific. [Diagram: in the current FSM state, an abstract call is instantiated with concrete parameter values.]
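A sketch of what this mapping could look like in code (parameter values and names are illustrative, not those of the actual demo): in the current state, the scenario iterates over a small set of parameter values and turns the abstract action Add into specific calls.

    import java.util.List;
    import java.util.TreeSet;

    public class AbstractToConcreteSketch {
        public static void main(String[] args) {
            TreeSet<Integer> currentState = new TreeSet<>(List.of(1, 3));    // concrete current state
            List<Integer> parameterIterations = List.of(0, 3, Integer.MAX_VALUE);
            for (int p : parameterIterations) {
                // The abstract action "Add" becomes the specific call Add(p);
                // the TreeSet stands in for the implementation under test here.
                boolean added = currentState.add(p);
                System.out.println("Add(" + p + ") -> " + added);
            }
        }
    }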

  17. Set of Integers – Scenario II. [Diagram: in this scenario the states of the FSM model coincide with the states of the behavior model.]

  18. Failure. In the state { -2147483648, 2147483647 }, the call Add( -715827883 ) returns false.
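The slide reports only the failure, not the defect behind it. One classic bug that exactly these boundary values expose is comparing ints by subtraction, which overflows when the arguments are far apart; the sketch below is an assumed illustration of that effect, not the implementation that was actually tested.

    public class OverflowComparatorSketch {
        // Broken comparator: a - b overflows when the arguments are far apart.
        static int compare(int a, int b) {
            return a - b;
        }

        public static void main(String[] args) {
            int x = -715827883;
            int max = 2147483647;   // Integer.MAX_VALUE, already in the set
            // Correct answer: x < max, so a negative result is expected.
            System.out.println(compare(x, max));   // prints 1431655766: overflow claims x > max
        }
    }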

  19. References
  • V. Kuliamin, A. Petrenko, N. Pakoulin, I. Bourdonov, A. Kossatchev. Integration of Functional and Timed Testing of Real-Time and Concurrent Systems. Proc. of PSI 2003, LNCS, Springer-Verlag, 2003.
  • V. Kuliamin, A. Petrenko, I. Bourdonov, A. Kossatchev. UniTesK Test Suite Architecture. Proc. of FME 2002, LNCS 2391, Springer-Verlag, 2002, pp. 77-88.
  • A. K. Petrenko, I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin. Experiences in Using Testing Tools and Technology in Real-Life Applications. Proc. of SETT'01, Pune, India, 2001.
  • I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin. Using Finite State Machines in Program Testing. Programmirovanije, No. 2, 2000 (in Russian); English version: Programming and Computer Software, Vol. 26, No. 2, 2000, pp. 61-73.
  • I. Bourdonov, A. Kossatchev, A. Petrenko, D. Galter. KVEST: Automated Generation of Test Suites from Formal Specifications. Proc. of the World Congress on Formal Methods, Toulouse, France, LNCS 1708, 1999, pp. 608-621.
  • http://www.ispras.ru/groups/rv/rv.html

  20. Contact. Victor V. Kuliamin. E-mail: kuliamin@ispras.ru. Address: 109004, B. Kommunisticheskaya, 25, Moscow, Russia. Web: http://www.ispras.ru/groups/rv/rv.html. Phone: 007-095-9125317. Fax: 007-095-9121524.

  21. Thank you!
