

  1. Automated Generation of Test Suites from Formal Specifications
Alexander K. Petrenko
Institute for System Programming of Russian Academy of Sciences (ISP RAS), Moscow
February, 2000

  2. Ideal Testing Process
Why formal specification?
• Forward engineering (from design specifications): the specifications are the source of test oracles, test coverage criteria, input partitioning, and tests.
• Reverse engineering: the same artifacts must be recovered from existing code.
What kind of specifications?
• Pre- and post-conditions and invariants, for oracles and partitioning
• Algebraic specifications, for test sequences

  3. Ideal Testing Process (continued)
In reverse engineering the oracles, criteria, partition, and tests are derived from post-specifications, i.e. specifications written after the code.
What kind of specifications?
• Pre- and post-conditions and invariants, for oracles and partitioning
• Algebraic specifications, for test sequences

  4. KVEST project history
• Started under a contract with Nortel Networks in 1994, to develop a system that automatically generates test suites for regression testing from formal specifications reverse-engineered from the existing code
• A collaboration between Nortel Networks and ISP RAS. ISP RAS background:
  • Soviet Space Mission Control Center OS and networks
  • Soviet space shuttle “Buran” OS and real-time programming language
  • formal specification of the real-time programming language

  5. What is KVEST?
• KVEST: Kernel Verification and Specification Technology
• Area of application: specification, test generation and test execution for APIs such as an OS kernel interface
• Specification language: RAISE/RSL (VDM family)
• Specification style: state-oriented, implicit (pre- and post-conditions, subtype restrictions)
• Target language: a programming language like C/C++
• Size of application: over 600 Klines
• Size of specifications: over 100 Klines
• Size of test suites: over 2 Mlines
• Results: over a hundred errors have been detected in several projects

  6. Position
• Constraint specification
• Semi-automated test production
• Fully automated test execution and test result analysis
• Oriented toward use in industrial software development processes

  7. Test system architecture
Research and design problems:
• Mapping between the specification and programming languages
• Integration of generated and manual components; re-use of manual components
• Test sequence and test case generation

  8. Verification processes
• Reverse engineering: (post-)specification, testing based on the specification
• Forward engineering: specification design, development, test production
• Co-verification: specification design, simultaneous development and test production

  9. Reverse engineering: technology stream
• Phase 1. Interface definition: the software contract contents (Interface A1, Interface A2, ...) are extracted from the documentation and the source code.
• Phase 2. Specification: yields the actual documentation.
• Phase 3. Test suite production: yields test drivers and test cases.
• Phase 4. Test execution and analysis: test plans drive execution; the output is detected-error and test coverage reports.

  10. Key features of KVEST test suites
• Phase 1: A minimal and orthogonal API (Application Programming Interface) is determined.
• Phase 2: A formal specification in the RAISE Specification Language is developed for the API.
• Phase 3: Sets of test suites (test cases and test sequences) are generated automatically in the target language.
• Phase 4: The generated test suites are executed automatically. A pass/fail verdict is assigned to every test case execution, and an error summary is provided at the end of the run. The user can specify the completeness of the test coverage and the form of tracing.

  11. An example of specification in RAISE

    DAY_OF_WEEK : INT >< INT -~-> RC >< WEEKDAY
    DAY_OF_WEEK( tday, tyear ) as ( post_rc, post_Answer )
    post
      if tyear <= 0 \/ tday <= 0 \/ tday > 366 \/
         tday = 366 /\ ~a_IS_LEAP( tyear )
      then
        BRANCH( bad_param, "Bad parameters" );
        post_Answer = 0 /\ post_rc = NOK
      else
        BRANCH( ok, "OK" );
        post_Answer = ( a_DAYS_AFTER_INITIAL_YEAR( tyear, tday ) +
                        a_INITIAL_DAY_OF_WEEK ) \ a_DAYS_IN_WEEK /\
        post_rc = OK
      end
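
Read operationally, the post-condition is exactly the oracle: the generated basic driver re-evaluates it against the recorded outputs. A minimal Java sketch of such an oracle (hand-written for illustration, not KVEST output; the helper bodies, the return-code encoding, and INITIAL_DAY_OF_WEEK are assumptions):

    // Sketch of an oracle for DAY_OF_WEEK: it re-checks the post-condition
    // on the recorded outputs. The a_* auxiliaries of the specification are
    // mirrored by helpers whose bodies are illustrative assumptions.
    final class DayOfWeekOracle {
        static final int OK = 0, NOK = 1;           // return-code encoding is assumed
        static final int DAYS_IN_WEEK = 7;
        static final int INITIAL_DAY_OF_WEEK = 1;   // assumed epoch weekday

        static boolean isLeap(int year) {
            return year % 4 == 0 && (year % 100 != 0 || year % 400 == 0);
        }

        // Mirrors a_DAYS_AFTER_INITIAL_YEAR; body is an illustrative assumption.
        static int daysAfterInitialYear(int year, int day) {
            int days = 0;
            for (int y = 1; y < year; y++) days += isLeap(y) ? 366 : 365;
            return days + day - 1;
        }

        // Re-evaluates the post-condition on recorded outputs: true = "pass".
        static boolean check(int tday, int tyear, int rc, int answer) {
            boolean badParam = tyear <= 0 || tday <= 0 || tday > 366
                    || (tday == 366 && !isLeap(tyear));
            if (badParam)
                return answer == 0 && rc == NOK;    // BRANCH "Bad parameters"
            int expected = (daysAfterInitialYear(tyear, tday) + INITIAL_DAY_OF_WEEK)
                    % DAYS_IN_WEEK;                 // RSL '\' is integer modulo
            return answer == expected && rc == OK;  // BRANCH "OK"
        }
    }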

  12. Partition based on the specification

Specification:

    post
      if a \/ b \/ c \/ d /\ e
      then BRANCH( bad_param, "Bad parameters" )
      else BRANCH( ok, "OK" )
      end

Partition (branches and Full Disjunctive Normal Form, FDNF):
• BRANCH "Bad parameters"
  a/\b/\c/\d/\e
  ~a/\b/\c/\d/\e
  ...
• BRANCH "OK"
  ~a/\~b/\~c/\~d/\e
  …
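
Each FDNF disjunct fixes the truth value of every elementary condition, so the partition can be enumerated mechanically. A small illustrative sketch (not KVEST code) that lists all 32 disjuncts of this guard and the branch each one exercises:

    // Enumerate the FDNF partition of the guard  a \/ b \/ c \/ d /\ e
    // (in RSL, /\ binds tighter than \/, hence d && e below).
    public class FdnfPartition {
        public static void main(String[] args) {
            for (int m = 0; m < 32; m++) {                 // 2^5 truth assignments
                boolean a = (m & 1) != 0, b = (m & 2) != 0, c = (m & 4) != 0,
                        d = (m & 8) != 0, e = (m & 16) != 0;
                boolean guard = a || b || c || (d && e);
                String branch = guard ? "Bad parameters" : "OK";
                System.out.printf("%b/\\%b/\\%b/\\%b/\\%b -> BRANCH \"%s\"%n",
                                  a, b, c, d, e, branch);
            }
        }
    }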

  13. Test execution scheme
Specifications are fed to the test suite generators under UNIX. The resulting test harness supplies test case parameters to the test drivers, which run the program behavior model alongside the SUT on the target platform; comparison of the two yields the verdict and trace.

  14. Test execution management
• UNIX workstation: Navigator (test suite generation, repository browser, test plan run) and the repository.
• Target platform: the test suite (script driver, basic drivers, MDC) runs on the test bed (process control, communication, basic data conversion).
• MDC = Manually Developed Components

  15. KVEST Test Drivers
Hierarchy of test drivers:
• Basic test drivers: test a single procedure by receiving input, calling the procedure, recording the output, and assigning a verdict
• Script drivers: generate sets of input parameters, call basic drivers, evaluate the results of test sequences, monitor test coverage
• Test plans: define the order of script driver calls with given test options and check their execution
KVEST uses a set of script driver skeletons to generate script drivers. Test drivers are compiled from RAISE into the target language.
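
As a sketch of the lowest layer of this hierarchy, a basic driver for the DAY_OF_WEEK example might look as follows (illustrative Java; dayOfWeekUnderTest is a hypothetical stand-in for the real SUT call, and DayOfWeekOracle is the oracle sketch from slide 11):

    // Illustrative basic test driver: feed inputs to the procedure under
    // test, record the outputs, and let the oracle assign a verdict.
    final class DayOfWeekBasicDriver {
        record Result(int rc, int answer) {}     // recorded output of one call

        // Stand-in so the sketch runs; a real basic driver calls the SUT
        // through the test bed.
        static Result dayOfWeekUnderTest(int tday, int tyear) {
            boolean bad = tyear <= 0 || tday <= 0 || tday > 366
                    || (tday == 366 && !DayOfWeekOracle.isLeap(tyear));
            if (bad) return new Result(DayOfWeekOracle.NOK, 0);
            int answer = (DayOfWeekOracle.daysAfterInitialYear(tyear, tday)
                    + DayOfWeekOracle.INITIAL_DAY_OF_WEEK) % DayOfWeekOracle.DAYS_IN_WEEK;
            return new Result(DayOfWeekOracle.OK, answer);
        }

        // Receive input, call the procedure, record output, assign a verdict.
        static boolean runTestCase(int tday, int tyear) {
            Result r = dayOfWeekUnderTest(tday, tyear);
            boolean pass = DayOfWeekOracle.check(tday, tyear, r.rc(), r.answer());
            System.out.printf("day=%d year=%d -> rc=%d answer=%d : %s%n",
                    tday, tyear, r.rc(), r.answer(), pass ? "PASS" : "FAIL");
            return pass;
        }

        public static void main(String[] args) {
            runTestCase(60, 2000);   // 29 February 2000: leap-year "OK" branch
            runTestCase(366, 1999);  // non-leap year: "Bad parameters" branch
        }
    }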

  16. Test generation scheme
From RAISE specifications and script driver skeletons, the UNIX tools (basic driver generator, script driver generator, test case generator, and the RAISE-to-target-language compiler) produce basic drivers, script drivers, and test case parameters, which together make up the test suites for the target platform.

  17. Test generation scheme, details
The same scheme as slide 16, with the Manually Developed Components shown explicitly as generator inputs: script driver skeletons, state observers, data converters, iterators, and filters.

  18. Test sequence generation based on an implicit Finite State Machine (FSM)
• Partition based on pre- and post-conditions
• Implicit FSM definition
(Diagram: states S1-S4 connected by transitions op1, op2, op3.)

  19. Test sequence generation based on implicit FSM (continued)
Each disjunct of the partition becomes a transition of the FSM (states S1-S4 as in the diagram of slide 18):
• BRANCH "Bad parameters"
  a/\b/\c/\d/\e -- op1
  ~a/\b/\c/\d/\e -- op2
  ...
• BRANCH "OK"
  ~a/\~b/\~c/\~d/\e -- opi
  …
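
A minimal sketch of the traversal idea: the engine walks the machine, preferring transitions not yet covered, until every transition has been exercised. The S1..S4 / op1..op3 machine below is an explicit, illustrative encoding of the diagram; in KVEST the machine itself stays implicit:

    import java.util.*;

    // Sketch: cover every transition of a small FSM by a greedy walk.
    public class FsmTraversal {
        record Transition(String from, String op, String to) {}

        public static void main(String[] args) {
            List<Transition> fsm = List.of(          // illustrative machine
                new Transition("S1", "op1", "S2"),
                new Transition("S1", "op2", "S1"),
                new Transition("S2", "op2", "S3"),
                new Transition("S2", "op3", "S4"),
                new Transition("S3", "op3", "S1"),
                new Transition("S4", "op3", "S3"));
            Map<String, List<Transition>> out = new HashMap<>();
            for (Transition t : fsm)
                out.computeIfAbsent(t.from(), k -> new ArrayList<>()).add(t);

            Set<Transition> covered = new HashSet<>();
            String state = "S1";
            // Prefer an uncovered outgoing transition, otherwise take the first
            // edge; sufficient for this small, strongly connected example
            // (a real engine uses a proper traversal algorithm).
            while (covered.size() < fsm.size()) {
                List<Transition> options = out.get(state);
                Transition next = options.stream()
                        .filter(t -> !covered.contains(t))
                        .findFirst().orElse(options.get(0));
                covered.add(next);
                System.out.println(state + " --" + next.op() + "--> " + next.to());
                state = next.to();
            }
        }
    }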

  20. Conclusions from the KVEST experience
• Code inspection during formal specification can detect up to 1/3 of the errors.
• Code inspection cannot replace testing: up to 2/3 of the errors are detected during and after testing.
• Testing is necessary to develop correct specifications: up to 1/3 of the errors were caused by lack of knowledge of pre-conditions and of some details of the called procedures’ behavior.

  21. What part of the testware is generated automatically?

    Kind of source for generation       Share of sources   Source : result size   Kind of generated tests
    Specification                       50%                1 : 5                  Basic drivers
    Data converters, iterators and      50%                1 : 10                 Script drivers
    state observers (MDC)

  22. Solved and unsolved problems in test automation

    Phase                              Automated or simple                   Not automated and not simple
    Phase 1. Interface definition      For well-designed software            For legacy software
    Phase 2. Specification             For single operations
    Phase 3. Test suite production     Test oracles, partition, filters      Test sequence design for operation groups
    Phase 4. Test execution analysis   Test plans, execution and analysis,   Test result understanding
                                       browsing, reporting

  23. Specification based testing: problems and prospects
Problems
• Lack of correspondence between specification languages and programming languages
• Users resist learning any specification language and any additional SDE
• Methodology of test sequence generation
• Testing methodologies for specific software areas
Prospects
• Use an OO programming language specification extension and a standard SDE instead of a specific SDE
• FSM extraction from implicit specifications, FSM factorization
• Research on distributed software specification and testing

  24. Part II. KVEST revision

  25. Specification notation revision. UniTesK: Universal TEsting and Specification toolKit
Formal methods deployment problems
• lack of users with a theoretical background
• lack of tools
• non-conventional languages and paradigms
UniTesK solutions
• the first step is possible without “any theory”
• extension of C++ and Java
• integration with a standard software development environment
Related works: ADL/ADL2; Eiffel, Larch, iContract

  26. UniTesK: test generation scheme
Specifications in a Java or C++ extension, together with use cases, are processed by the tools (test oracle generator, OO test suite generator, test sequence fabric, path builder engines, iterators, FSMs) into test oracles and test suites in the target language for the target platform.

  27. A standard Software Development Environment
• UML based design environment
• Specification and verification tools for the standard notation
• Integration of constraint verification tools into the software development environment

  28. Part III. Test generation inside

  29. Requirements. Test coverage criteria
• All branches
• All disjuncts (all accessible disjuncts)

Specification:

    post
      if a \/ b \/ c \/ d /\ e
      then BRANCH( bad_param, "Bad parameters" )
      else BRANCH( ok, "OK" )
      end

Partition (branches and FDNF):
• BRANCH "Bad parameters"
  a/\b/\c/\d/\e
  ~a/\b/\c/\d/\e
  ...
• BRANCH "OK"
  ~a/\~b/\~c/\~d/\e
  …

  30. Test sequence kinds. Kinds 1, 2, and 3
Such procedures can be tested separately, because no other target procedure is needed to generate input parameters or analyze the outcome.
• Kind 1: the input is data that can be represented in literal (text) form and produced without accounting for any interdependencies between the values of different parameters.
• Kind 2: no interdependencies exist between the input items (values of input parameters); the input does not have to be in literal form.
• Kind 3: some interdependencies exist, but separate testing is still possible.
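
For instance, DAY_OF_WEEK is a kind-1 procedure: its test case parameters are bare pairs of integers that can be stored as literal text. An illustrative sketch (the one-entry-per-line "tday tyear" format is an assumption), reusing the basic driver sketch from slide 15:

    import java.util.List;

    // Kind-1 test data can live as plain literals; the script driver only
    // needs to iterate over them and hand each pair to the basic driver.
    public class Kind1Iterator {
        public static void main(String[] args) {
            List<String> literalCases = List.of(
                "1 1", "59 1999", "60 2000", "366 2000",    // "OK" branch inputs
                "366 1999", "0 2000", "367 2000", "5 -1");  // "Bad parameters" inputs
            for (String line : literalCases) {
                String[] f = line.trim().split("\\s+");
                DayOfWeekBasicDriver.runTestCase(
                    Integer.parseInt(f[0]), Integer.parseInt(f[1]));
            }
        }
    }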

  31. Kinds 1, 2, and 3. What is automated?

  32. Test sequence kinds. Kinds 4 and 5
• Kinds 4 and 5: the operations cannot be tested separately, because some inputs can be produced only by calling another operation from the group, and/or some outcomes can be analyzed only by calling other procedures.

  33. Requirements for kinds 4 and 5
• The same requirements: all branches / all disjuncts
• An additional problem: how to traverse all the states?

  34. FSM use for API testing
• Traditional FSM approach (explicit FSM definition):
  • define all states
  • for each state, define all transitions (operation, input parameters, outcome, next state)
• ISP RAS approach (implicit FSM definition):
  • the state is defined by a type definition
  • for each state, operations and inputs are defined by pre-conditions; outcomes and next states are defined by post-conditions
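
A minimal sketch of the implicit style for a bounded stack: no state list is written down; the engine is given only a state type, pre-conditions, and state transformers (standing in for post-conditions), and discovers the reachable states by firing every operation whose pre-condition holds (illustrative Java, hypothetical example):

    import java.util.*;

    // Implicit FSM sketch for a stack of capacity 2: states are discovered
    // from the state type plus pre-conditions, never enumerated by hand.
    public class ImplicitFsmDemo {
        record State(List<Integer> items) {}     // the state type

        static boolean prePush(State s) { return s.items().size() < 2; }  // pre
        static boolean prePop(State s)  { return !s.items().isEmpty(); }  // pre

        static State push(State s, int v) {      // post: size grows by 1
            List<Integer> n = new ArrayList<>(s.items()); n.add(v);
            return new State(n);
        }
        static State pop(State s) {              // post: size shrinks by 1
            List<Integer> n = new ArrayList<>(s.items()); n.remove(n.size() - 1);
            return new State(n);
        }

        public static void main(String[] args) {
            Deque<State> work = new ArrayDeque<>(List.of(new State(List.of())));
            Set<State> seen = new HashSet<>(work);
            while (!work.isEmpty()) {            // explore reachable states
                State s = work.pop();
                List<State> next = new ArrayList<>();
                if (prePush(s)) next.add(push(s, 0));   // fire ops whose pre holds
                if (prePop(s))  next.add(pop(s));
                for (State t : next)
                    if (seen.add(t)) { work.push(t); System.out.println("new state: " + t); }
            }
            System.out.println("reachable states: " + seen.size());
        }
    }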

  35. Advanced FSM use
• FSM factorization
• Optimization of exhaustive FSM traversal
• Use-case based test sequence generation
• Test scenario modularization
• Friendly interface for test sequence generation and debugging

  36. References
• Igor Bourdonov, Alexander Kossatchev, Alexander Petrenko, and Dmitri Galter. KVEST: Automated Generation of Test Suites from Formal Specifications. Proceedings of the World Congress on Formal Methods, Toulouse, France, LNCS 1708, 1999, pp. 608-621.
• Igor Burdonov, Alexander Kosachev, Victor Kuliamin. FSM Using for Software Testing. Programming and Computer Software, Moscow - New York, No. 2, 2000.

  37. Contacts
Alexander Petrenko
Institute for System Programming of Russian Academy of Sciences (ISP RAS), Moscow, Russia
petrenko@ispras.ru
phone: +7 (095) 912-5317 ext 4404
fax: +7 (095) 912-1524
http://www.ispras.ru/~RedVerst/index.html
