
Data Mutation Testing



  1. Data Mutation Testing -- A Method for Automated Generation of Structurally Complex Test Cases. Hong Zhu, Dept. of Computing and Electronics, Oxford Brookes Univ., Oxford, OX33 1HX, UK. Email: hzhu@brookes.ac.uk Data Mutation Software Testing

  2. Outline • Motivation • Overview of existing work on software test case generation • The challenges to software testing • The Data Mutation Testing Method • Basic ideas • Process • Measurements • A case study • Subject software under test • The mutation operators • Experiment process • Main results • Perspectives and future work • Potential applications • Integration with other black-box testing methods Data Mutation Software Testing

  3. Motivation • Test case generation • Needs to meet multiple goals • Reality: to represent the real operation of the system • Coverage: functions, program code, input/output data space, and their combinations • Efficiency: no overkill, easy to execute, etc. • Effectiveness: capable of detecting faults, which implies that it is easy to check the correctness of the program's output • External usefulness: helps with debugging, reliability estimation, etc. • Huge impact on test effectiveness and efficiency • One of the most labour-intensive tasks in practice Data Mutation Software Testing

  4. Existing Work • Program-based test case generation • Static: analysis of code without execution, e.g. symbolic execution • Path oriented: Howden, W. E. (1975, 1977, 1978); Ramamoorthy, C., Ho, S. and Chen, W. (1976); King, J. (1975); Clarke, L. (1976); Xie, T., Marinov, D. and Notkin, D. (2004); Zhang, J. (2004); Xu, Z. and Zhang, J. (2006) • Goal oriented: DeMillo, R. A., Guindi, D. S., McCracken, W. M., Offutt, A. J. and King, K. N. (1988); Pargas, R. P., Harrold, M. J. and Peck, R. R. (1999); Gupta, N., Mathur, A. P. and Soffa, M. L. (2000) • Dynamic: through execution of the program: Korel, B. (1990); Beydeda, S. and Gruhn, V. (2003) • Hybrid: combination of dynamic execution with symbolic execution, e.g. concolic techniques: Godefroid, P., Klarlund, N. and Sen, K. (2005) • Techniques: constraint solvers; heuristic search, e.g. genetic algorithms: McMinn, P. and Holcombe, M. (2003); survey: McMinn, P. (2004) Data Mutation Software Testing

  5. Specification-based test case generation • Derived from either formal or semi-formal specifications of the required functions and/or the designs • Formal specification-based: • First-order logic, Z specifications and logic programs: Tai, K.-C. (1993); Stocks, P. A. and Carrington, D. A. (1993); Ammann, P. and Offutt, J. (1994); Denney, R. (1991) • Algebraic specifications: Bouge, L., Choquet, N., Fribourg, L. and Gaudel, M.-C. (1986); Doong, R. K. and Frankl, P. G. (1994); Chen, H. Y., Tse, T. H. and Chen, T. Y. (2001); Zhu, H. (2007) • Finite state machines: Fujiwara, S., et al. (1991); Lee, D. and Yannakakis, M. (1996); Hierons, R. M. (2001); Zhu, H., Jin, L. and Diaper, D. (1999) • Petri nets: Morasca, S. and Pezze, M. (eds) (1990); Zhu, H. and He, X. (2002) • Model-based: derived from semi-formal graphic models • SSADM models: Zhu, H., Jin, L. and Diaper, D. (1999, 2001) • UML models: Offutt, J. and Abdurazik, A. (2000); Tahat, L. H., et al. (2001); Hartman, A. and Nagin, K. (2004); Li, S., Wang, J. and Qi, Z.-C. (2004) • Techniques: constraint solving; theorem provers; model-checkers Data Mutation Software Testing

  6. Random testing • Through random sampling over the input domain, based on probabilistic models of the operation of the software under test • Profile-based: sampling at random over an existing operational profile • Stochastic model based: using a probabilistic model of software usage • Markov chains: Avritzer, A. and Larson, B. (1993); Avritzer, A. and Weyuker, E. J. (1994); Whittaker, J. A. and Poore, J. H. (1993); Guen, H. L., Marie, R. and Thelin, T. (2004); Prowell, S. J. (2005) • Stochastic automata networks: Farina, A. G., Fernandes, P. and Oliveira, F. M. (2002, 2004) • Bayesian networks: Fine, S. and Ziv, A. (2003) • Adaptive random testing: an even spread of random test cases (Chen, T. Y., Leung, H. and Mak, I. K. (2004)) • Variants: Mirror, Restricted, and Probabilistic ART Data Mutation Software Testing

  7. Domain-specific techniques • Database applications: Zhang, J., Xu, C. and Cheung, S. C. (2001) • Spreadsheets: Fisher, M., Cao, M., Rothermel, G., Cook, C. and Burnett, M. (2002); Erwig, M., Abraham, R., Cooperstein, I. and Kollmansberger, S. (2005) • XML Schema: Lee, S. C. and Offutt, J. (2001); Li, J. B. and Miller, J. (2005) • Compilers: see Boujarwah, A. S. and Saleh, K. (1997) for a survey. Data Mutation Software Testing

  8. The Challenge How to generate adequate test cases of high reality for programs that process structurally complex inputs? • Structural complexity: • A large number of elements • A large number of possible explicitly represented relationships between the elements • A large number of constraints imposed on the relationships • The meaning of the data depends not only on the values of the elements but also on the relationships, and thus so does their processing • Reality: • Likely to be, or close to, a correct real input in the operation of the system • Likely to be, or close to, an erroneous input that a user might enter during operation • Examples: CAD, word processors, web browsers, spreadsheets, PowerPoint, software modelling tools, language processors, theorem provers, model-checkers, speech recognition, handwriting recognition, search engines, … Data Mutation Software Testing

  9. Basic Ideas of Data Mutation Testing • Preparing the seeds, i.e. a small set of test cases that • contain various types of elements and relationships between them • are close to real input data • are easy to check for correctness • Generating mutant test cases by modifying the seeds slightly • Preserve the validity of the input • Change one place at a time unless the constraints require otherwise (second- or higher-order mutants may also be used) • Make as many different mutants as possible • Executing the software under test on both the seeds and their mutants • What to observe: • the program's correctness on both seeds and mutants • the differences between the program's behaviours on a seed and on its mutants • Using metrics and measurements to assess whether • the seeds are sufficient • the mutations are effective and/or sufficient • Feeding back to steps 1 and 2 if necessary, or improving the observation (a sketch of this loop is given below). Data Mutation Software Testing
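The steps above can be summarised as a simple loop. Below is a minimal Python sketch of that loop; it assumes only that the program under test is callable on a test case and that each mutation operator maps a seed to a single mutant. All names are illustrative, not part of a published tool.

```python
# Minimal sketch of the data mutation testing loop described on this slide.
# Assumptions: `program` is callable on a test case, and each operator in
# `operators` maps a seed test case to one mutant test case.

def data_mutation_test(program, seeds, operators):
    """Run the program on every seed and on every mutant of each seed, and
    classify each mutant as 'dead' (observed behaviour differs from the seed)
    or 'alive' (observed behaviour is the same as the seed)."""
    results = []
    for seed in seeds:
        seed_output = program(seed)          # execute on the seed
        for name, mutate in operators.items():
            mutant = mutate(seed)            # generate a mutant of the seed
            mutant_output = program(mutant)  # execute on the mutant
            status = 'dead' if mutant_output != seed_output else 'alive'
            results.append({'seed': seed, 'operator': name,
                            'mutant': mutant, 'status': status})
    return results                           # feed into the measurements
```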

  10. Illustrative Example: Triangle Classification • Input: x, y, z: natural numbers (the lengths of the sides) • Output: {equilateral, isosceles, scalene, non-triangle} (the type of triangle) • Seeds:

ID   Input              Expected output
t1   (x=5, y=5, z=5)    Equilateral
t2   (x=5, y=5, z=7)    Isosceles
t3   (x=5, y=7, z=9)    Scalene
t4   (x=3, y=5, z=9)    Non-triangle

Data Mutation Software Testing

  11. Mutation operators • IVP: Increase the value of a parameter by 1; • DVP: Decrease the value of a parameter by 1; • SPL: Set the value of a parameter to a very large number, say 1000000; • SPZ: Set the value of a parameter to 0; • SPN: Set the value of a parameter to a negative number, say -2; • WXY: Swap the values of parameters x and y; • WXZ: Swap the values of parameters x and z; • WYZ: Swap the values of parameters y and z; • RPL: Rotate the values of the parameters to the left; • RPR: Rotate the values of the parameters to the right. Data Mutation Software Testing
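A direct Python rendering of the ten operators is sketched below, representing a test case as a dict with keys 'x', 'y', 'z'. The constants (1000000 and -2) follow the values suggested on the slide; the per-parameter naming (e.g. IVP_x) is only an illustrative convention.

```python
# The ten mutation operators above, for a test case {'x': ..., 'y': ..., 'z': ...}.
# Value operators are instantiated once per parameter, giving 5*3 + 5 = 20
# operators per seed, which matches the mutant count on the next slide.

def make_operators():
    def set_param(t, p, v):
        m = dict(t); m[p] = v; return m

    def permute(t, order):                  # order maps new key -> old key
        return {k: t[v] for k, v in order.items()}

    ops = {}
    for p in ('x', 'y', 'z'):
        ops['IVP_' + p] = lambda t, p=p: set_param(t, p, t[p] + 1)   # increase by 1
        ops['DVP_' + p] = lambda t, p=p: set_param(t, p, t[p] - 1)   # decrease by 1
        ops['SPL_' + p] = lambda t, p=p: set_param(t, p, 1000000)    # very large value
        ops['SPZ_' + p] = lambda t, p=p: set_param(t, p, 0)          # zero
        ops['SPN_' + p] = lambda t, p=p: set_param(t, p, -2)         # negative value
    ops['WXY'] = lambda t: permute(t, {'x': 'y', 'y': 'x', 'z': 'z'})  # swap x and y
    ops['WXZ'] = lambda t: permute(t, {'x': 'z', 'y': 'y', 'z': 'x'})  # swap x and z
    ops['WYZ'] = lambda t: permute(t, {'x': 'x', 'y': 'z', 'z': 'y'})  # swap y and z
    ops['RPL'] = lambda t: permute(t, {'x': 'y', 'y': 'z', 'z': 'x'})  # rotate left
    ops['RPR'] = lambda t: permute(t, {'x': 'z', 'y': 'x', 'z': 'y'})  # rotate right
    return ops

# Applying all 20 operators to the 4 seeds t1..t4 gives (5*3 + 5) * 4 = 80 mutants.
```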

  12. Generation of mutant test cases • For example, applying the mutation operator IVP to test case t1 on parameter x gives the test case t5: IVP(t1, x) = t5 = Input: (x=6, y=5, z=5). • Total number of mutants: (5 value operators × 3 parameters + 5 swap/rotate operators) × 4 seeds = (5*3 + 5)*4 = 80 • Covering all sorts of combinations of data elements • Systematically produced from the four seeds Data Mutation Software Testing

  13. Execution of the program and classification of mutants • A mutant is classified as dead if the execution of the software under test on the mutant differs from the execution on its seed test case; otherwise, the mutant is classified as alive. • For example, for a correctly implemented Triangle Classification program TrC, the execution on the mutant test case t5 will output isosceles while the execution on its seed t1 will output equilateral: TrC(t5) ≠ TrC(t1) ⇒ t5 is dead. • It depends on how you observe the behaviour! Data Mutation Software Testing
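To make the dead/alive classification concrete, here is an illustrative triangle classifier, a stand-in for TrC rather than the implementation from the slides, used to confirm that t5 is a dead mutant of t1 when only the classification output is observed.

```python
# Illustrative triangle classifier used as a stand-in for TrC.
def classify(t):
    x, y, z = t['x'], t['y'], t['z']
    if min(x, y, z) <= 0 or x + y <= z or y + z <= x or x + z <= y:
        return 'non-triangle'
    if x == y == z:
        return 'equilateral'
    if x == y or y == z or x == z:
        return 'isosceles'
    return 'scalene'

t1 = {'x': 5, 'y': 5, 'z': 5}
t5 = {'x': 6, 'y': 5, 'z': 5}               # IVP applied to parameter x of t1
assert classify(t1) == 'equilateral'
assert classify(t5) == 'isosceles'          # TrC(t5) != TrC(t1), so t5 is dead
```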

  14. Analyse test effectiveness • Reasons why a mutant can remain alive: • The mutant is equivalent to the original with respect to the functionality or property of the software under test, e.g. RPL(t1) = t1. • The observation of the behaviour and output of the software under test is not sufficient to detect the difference, e.g. RPL(t2) = t6 = Input: (x=5, y=7, z=5). • The software is incorrectly designed and/or implemented so that it is unable to differentiate the mutant from the original (same output, but different execution paths for a correct program). Data Mutation Software Testing

  15. Measurements of Data Mutation • Equivalent mutant score: EMS = (number of equivalent mutants) / (total number of mutants). A high EMS indicates that the mutation operators have not been well designed to achieve variety in the test cases. • Live mutant score: LMS = (number of live mutants) / (total number of mutants). A high LMS indicates that the observation of the behaviour and output of the software under test is insufficient. • Typed live mutant score: LMS_F = (number of live mutants of operator type F) / (number of mutants of operator type F), where F is a type of mutation operator. A high LMS_F reveals that the program is not sensitive to that type of mutation, probably because of a fault in the design or implementation. Data Mutation Software Testing
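A sketch of computing these scores from the classification results is given below; the record format (operator type, equivalent flag, alive flag) is an assumption made for illustration.

```python
# Computing EMS, LMS and the typed LMS_F from mutant classification records.
# Each record is assumed to be a tuple (operator_type, is_equivalent, is_alive).

def mutation_scores(records):
    total = len(records)
    ems = sum(1 for _, eq, _ in records if eq) / total        # equivalent mutant score
    lms = sum(1 for _, _, alive in records if alive) / total  # live mutant score
    lms_by_type = {}                                          # typed score LMS_F
    for f in {op for op, _, _ in records}:
        of_type = [r for r in records if r[0] == f]
        lms_by_type[f] = sum(1 for _, _, alive in of_type if alive) / len(of_type)
    return ems, lms, lms_by_type
```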

  16. Process of Data Mutation Testing Data Mutation Software Testing

  17. Analysis of Program Correctness • Can data mutation testing help with the analysis of program correctness? • Consider the examples in Triangle Classification: • Applying IVP or DVP to test case t1, we can expect the output to be isosceles. • For the RPL, RPR, WXY, WXZ, and WYZ mutation operators, we can expect the program to output the same classification on a seed and on its mutant test cases. • If the software's behaviour on a mutant is not as expected, an error in the software under test has been detected. Data Mutation Software Testing

  18. Case Study • The subject • CAMLE: Caste-centric Agent-oriented Modelling Language and Environment • An automated modelling tool for an agent-oriented methodology • Developed at NUDT, China • Potential threats to the validity of the case study • The subject was developed by the tester • The developer is not a professional software developer • Validation of the case study against the potential threats • The test method is black-box testing, so knowledge of the code and program structure should not affect the outcomes. • The subject was developed before the case study, and no change was made to it during the course of the case study. • In software testing practice, systems are often tested by their developers. • The developer is a capable master's degree student with training at least equivalent to that of an average programmer. • The correctness of the program's output can be judged objectively. Data Mutation Software Testing

  19. Complexity of the Input Data • Input: models in the CAMLE language • Multiple views: • a caste diagram that describes the static structure of a multi-agent system, • a set of collaboration diagrams that describe how agents collaborate with each other, • a set of scenario diagrams that describe typical scenarios, i.e. situations in the operation of the system, and • a set of behaviour diagrams that define the behaviour rules of the agents in the context of various scenarios. • Well-formedness constraints • Each diagram has a number of different types of nodes and arcs, etc. • Each diagram and the whole model must satisfy a set of well-formedness conditions to be considered a valid input (e.g. the types of nodes and arcs must match each other). Data Mutation Software Testing
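To illustrate what a well-formedness condition of this kind looks like in code, the sketch below checks one hypothetical rule, that every edge connects nodes whose types are allowed for that edge type, over a simple dict-based diagram representation. Both the representation and the rule are assumptions made for illustration, not the CAMLE implementation.

```python
# A hypothetical well-formedness check over a simple diagram representation:
# diagram = {'nodes': {node_id: node_type}, 'edges': [(source_id, target_id, edge_type)]}
# allowed = {edge_type: set of (source_node_type, target_node_type) pairs}

def well_formed(diagram, allowed):
    """Return the list of edges whose end-node types do not match the edge type."""
    violations = []
    for src, tgt, edge_type in diagram['edges']:
        src_type = diagram['nodes'].get(src)
        tgt_type = diagram['nodes'].get(tgt)
        if (src_type, tgt_type) not in allowed.get(edge_type, set()):
            violations.append((src, tgt, edge_type))
    return violations

# Example: an 'interaction' edge is only allowed between two agent nodes.
diagram = {'nodes': {'a': 'agent', 'b': 'agent', 'c': 'caste'},
           'edges': [('a', 'b', 'interaction'), ('a', 'c', 'interaction')]}
allowed = {'interaction': {('agent', 'agent')}}
print(well_formed(diagram, allowed))        # -> [('a', 'c', 'interaction')]
```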

  20. The Function to Be Tested • Consistency checker • Consistency constraints are formally defined in first-order logic • Potential threat to the validity: • The program is not representative • Validation of the case study: • The program's input is structurally complex • The program is non-trivial

Table 1. Summary of CAMLE's Consistency Constraints

                 Horizontal Consistency      Vertical Consistency
                 Local         Global        Intra-model
Intra-diagram    10            -             -
Inter-diagram    8             8             -
Inter-model      4             1             4

Data Mutation Software Testing

  21. Types of Data Mutation Operators

No.  Operator type        Description
1    Add diagram          Add a collaboration, behaviour, or scenario diagram
2    Delete diagram       Delete an existing diagram
3    Rename diagram       Change the title of an existing diagram
4    Add node             Add a node of some type to a diagram
5    Add node with edge   Add a node and link it to an existing node
6    Add edge             Add an edge of some type to a diagram
7    Replicate node       Replicate an existing node in a diagram
8    Delete node          Delete an existing node in a diagram
9    Rename node          Rename an existing node in a diagram
10   Change node type     Replace an existing node with a new node of another type
11   Add sub diagram      Generate a sub-collaboration diagram for an existing node
12   Delete env node      Delete an existing env node in a sub-collaboration diagram

Data Mutation Software Testing
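As an illustration of how such model-level operators could be implemented, the sketch below implements operator 8 (Delete node) on the dict-based diagram representation used in the earlier well-formedness sketch; this representation is an assumption, not the data structures of the CAMLE tool.

```python
# A sketch of operator 8 (Delete node): delete an existing node from a diagram,
# producing one mutant diagram per node. Removing the attached edges keeps each
# mutant syntactically valid, so the consistency checker is exercised rather
# than the parser.
import copy

def delete_node_mutants(diagram):
    mutants = []
    for node_id in diagram['nodes']:
        m = copy.deepcopy(diagram)
        del m['nodes'][node_id]
        m['edges'] = [e for e in m['edges'] if node_id not in (e[0], e[1])]
        mutants.append((node_id, m))
    return mutants
```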

  22. Data Mutation Software Testing

  23. The Seed Test Cases • Models developed in previous case studies of the agent-oriented software development methodology: • The evolutionary multi-agent Internet information retrieval system Amalthaea (originally developed at the MIT Media Lab); • An online auction web service; • The agent-oriented model of the United Nations Security Council (UNSC), covering its organisational structure and the working procedure for passing resolutions. • All seeds passed the consistency check before the case study started • No change was made to these seeds in this case study Data Mutation Software Testing

  24. The Seed Test Cases and Their Mutants

                                   Amalthaea  Auction  UNSC  Total
Caste Diagram          #Diagrams           1        1     1      3
                       #Nodes              9        7     3     19
                       #Edges              7        6     4     17
Collaboration Diagram  #Diagrams           3        5     4     12
                       #Nodes             15       14     8     37
                       #Edges             26       17     6     49
Behaviour Diagram      #Diagrams           8        6     2     16
                       #Nodes            112      115    43    270
                       #Edges             67       75    28    170
Scenario Diagram       #Diagrams           2        1     0      3
                       #Nodes             22        4     0     26
                       #Edges             10        1     0     11
Total                  #Diagrams          14       13     7     34
                       #Nodes            158      140    54    352
                       #Edges            110       99    38    257
Number of Mutants                       3466     3260  1082   7808

Data Mutation Software Testing

  25. The Results: Fault Detecting Ability

Fault Type                                #Inserted  Detected     Detected by mutants
                                          Faults     by seeds     Inserted     Indigenous
Domain: Missing path                             12    5 (42%)    12 (100%)    2
Domain: Path selection                           17    8 (47%)    17 (100%)    2
Computation: Incorrect variable                  24   14 (58%)    21 (88%)     0
Computation: Omission of statements              31   13 (42%)    31 (100%)    0
Computation: Incorrect expression                15    9 (60%)    14 (93%)     1
Computation: Transposition of statements         19   12 (63%)    19 (100%)    0
Total                                           118   61 (52%)   114 (97%)     5

Data Mutation Software Testing

  26. Detecting Design Errors • In the case study, we found that a large number of mutants remain alive.

Table. The numbers of alive and dead mutants

Seed        #Mutants  #Dead  #Alive  %Dead
Amalthaea       3466    697    2769  20.11%
Auction         3260    422    2838  12.94%
UNSC            1082    167     915  15.43%
Total           7808   1286    6522  16.47%

• Review: three possible reasons: • improper design of the data mutation operators, • insufficient observation of the behaviour and output, • defects in the software under test. Data Mutation Software Testing

  27. Statistics on the Amalthaea test suite

Operator type              #Total  #Dead  #Live  %Dead
Add diagram                     3      2      1    67%
Delete diagram                  9      2      7    22%
Rename diagram                  9      9      0   100%
Add node                       88     14     74    16%
Combine node                   61     39     22    64%
Add edge                     1378    173   1205    13%
Replicate node                130      0    130     0%
Delete node                   147     37    110    25%
Rename node                   123     77     46    63%
Change node type               61     24     37    39%
Add sub diagram                 8      8      0   100%
Delete environment node         4      4      0   100%
Rename environment node         4      4      0   100%
Delete annotation on node      39      0     39     0%
Replicate edge                 22      0     22     0%
…                               …      …      …     …

• Some typed mutation scores are very low • The design of the consistency checker has errors! In particular, the consistency constraints are weak. Data Mutation Software Testing

  28. Results: Detecting Design Errors • Hypothesis • The design of the tool is weak in detecting certain types of inconsistency or incompleteness • Validation of the hypothesis • Strengthening the well-formedness constraints • Strengthening the consistency constraints: 3 constraints modified • Introducing new completeness constraints: 13 new constraints introduced • Testing again using the same seeds and the same mutation operators • A significant change in the statistics is observed.

Table. The statistics of alive and dead mutants after modification

Seed        #Mutants  #Dead  #Alive  %Dead
Amalthaea       3065   2692     373  87.83%
Auction         3095   2579     516  83.33%
UNSC             992    821     171  82.76%
Total           7152   6092    1060  85.18%

Data Mutation Software Testing

  29. Test Adequacy • Our experiments show that high test adequacy can be achieved through data mutation. • Coverage of the input data space • Measured by the coverage of various kinds of mutants • Coverage of the program structure • Measured by code coverage (the branches covered) • Coverage of the functions in the requirements • Measured by the consistency constraints used in checking • Two factors determine the test adequacy: • the seeds • the mutation operators Data Mutation Software Testing

  30. Coverage of scenario diagram variants

Mutation operator type  Amalthaea  Auction  UNSC  Total
1                               1        1     1      3
2                               2        1     0      3
3                               2        1     0      3
4                              14        7     0     21
5                              14        3     0     17
6                              24        0     0     24
7                              20        4     0     24
8                              20        4     0     24
9                              20        4     0     24
10                              8        2     0     10
14                              8        2     0     10
16                             10        1     0     11
17                             40        0     0     40

Data Mutation Software Testing

  31. Coverage of Program Structure and Functions The test data achieved 100% coverage of the functions of the consistency checker and 100% of the branches in the code. Data Mutation Software Testing

  32. Test Cost Table. Summary of the test cost spent in the case study The seeds were readily available from previous case studies of the tool. Data Mutation Software Testing

  33. Analysis of the Program's Correctness • The experiment took the black-box approach • The output on a test case consists of: • whether the input (a model) is consistent and complete • the error message(s) and/or warning message(s), if any • The expected output on a mutant is specified, for example:

Mutant No. 1, Operator/Location: Add a new collaboration diagram / Top of model
Expected output:
  Violated Constraint  Message ID  #Messages (Message Content)
  1                    E003        5 (Agent nodes in the main collaboration diagram)
  2                    E004        6 (Caste nodes in the main collaboration diagram)
  5                    E016        14 (Interaction edges in the main collaboration diagram)

Data Mutation Software Testing

  34. Experiments • The experiments: • mutants are selected at random • the program's correctness on each mutant is checked manually • the time needed to check the correctness of the program on each test case is measured • Two experiments were conducted • Experiment 1 • 1 mutant selected at random from each set of mutants generated by one type of mutation operator (24 mutants in total) • Detected 2 faults in the checker and 1 fault in other parts of the tool • Experiment 2 • 22 live mutants from the Amalthaea suite selected at random • Detected 2 faults in other parts of the tool Data Mutation Software Testing

  35. The Experiment Data

Type of Mutant  Aliveness  #Mutants  #Detected Faults
Equivalent      Alive            34                 2
Equivalent      Dead              0                 0
Non-equivalent  Alive             1                 1
Non-equivalent  Dead             11                 2

• Results: • Checking correctness on dead mutants: 3 minutes per mutant • Checking correctness on live mutants: 1 minute per mutant Data Mutation Software Testing

  36. Related Work • Mutation testing • The program or specification is modified • Used as a criterion to measure test adequacy • Data mutation testing adopts the idea of mutation operators, but applies them to test cases in order to generate test cases, rather than to measure adequacy • Meek and Siu (1989) • Randomised error seeding into programs to test compilers • Adaptive Random Testing (Chen, et al. 2003, 2004) • Random test cases spread as far apart as possible • Not yet applied to structurally complex input spaces • Data perturbation testing (Offutt, 2001) • Tests XML messages for web services • An application-specific technique, applicable to XML files • Metamorphic testing (Chen, Tse, et al. 2003) • A test oracle automation technique; focuses on the metamorphic relations rather than on generating test cases • Could be integrated with the data mutation method Data Mutation Software Testing

  37. Future Work • More case studies with potential applications • Security control software: Role-Based Access Control • Input: role model and user assignments: <Roles, Resources, Permissions: Roles × Resources, Constraints ⊆ Roles × Resources × Permissions>, UserAssignments: Users → P(Roles) • Virus detection • Input: files infected by a virus • Viruses are programs in assembly/binary code format • One virus may have many variants obtained by equivalent transformations of the code • Spreadsheet processing software and spreadsheet applications • Input: spreadsheets <data cells, program cells> Data Mutation Software Testing

  38. Perspectives and Future Work • Integration of the data mutation, metamorphic and algebraic testing methods • Let P be the program under test • Data mutation testing generates test cases using a set of data mutation operators φ1, …, φn • Metamorphic testing uses a set of metamorphic relations to check output correctness • We can use each φi to define a metamorphic relation, i.e. a relation that must hold between P(t) and P(φi(t)) for every test case t Data Mutation Software Testing

  39. Example • Consider the Triangle Classification program P • The following is a metamorphic relation: P(t) = equilateral ⇒ P(IVP(t)) = isosceles • For each of the data mutation operators f = WXY, WXZ, WYZ, RPL, or RPR, the following is a metamorphic relation: P(f(t)) = P(t) • We observed in the case study that data mutation operators are very helpful for finding metamorphic relations. Data Mutation Software Testing
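A sketch of how these two relations could be checked automatically on any triangle classification implementation P is given below; the permutation operators are written out inline so the sketch is self-contained, and the illustrative classify function from the earlier sketch can serve as P.

```python
# Checking the two metamorphic relations on this slide for a classifier P(t),
# where t is a dict {'x': ..., 'y': ..., 'z': ...}. Returns the violated relations.

def check_metamorphic_relations(P, t):
    failures = []
    # Relation 1: P(t) = equilateral  =>  P(IVP(t)) = isosceles
    if P(t) == 'equilateral':
        t_ivp = dict(t)
        t_ivp['x'] += 1                      # IVP applied to parameter x
        if P(t_ivp) != 'isosceles':
            failures.append(('IVP', t, t_ivp))
    # Relation 2: P(f(t)) = P(t) for f in {WXY, WXZ, WYZ, RPL, RPR}
    permutations = {
        'WXY': lambda t: {'x': t['y'], 'y': t['x'], 'z': t['z']},
        'WXZ': lambda t: {'x': t['z'], 'y': t['y'], 'z': t['x']},
        'WYZ': lambda t: {'x': t['x'], 'y': t['z'], 'z': t['y']},
        'RPL': lambda t: {'x': t['y'], 'y': t['z'], 'z': t['x']},
        'RPR': lambda t: {'x': t['z'], 'y': t['x'], 'z': t['y']},
    }
    for name, f in permutations.items():
        if P(f(t)) != P(t):
            failures.append((name, t, f(t)))
    return failures                          # an empty list means no relation violated
```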

  40. Integration with Algebraic Testing • In algebraic software testing, axioms are written in the form T1=T'1 ∧ T2=T'2 ∧ … ∧ Tn=T'n ⇒ T=T', where the Ti, T'i, T and T' are terms constructed from variables and the functions/procedures/methods of the program under test. • Integrate data mutation testing, metamorphic testing and algebraic testing by developing: • a black-box software testing specification language • an automated tool to check metamorphic relations • using observation contexts to check whether a relation holds • allowing user-defined data mutation operators to be invoked • allowing metamorphic relations to be specified Data Mutation Software Testing
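The sketch below shows how a conditional axiom of this form might be checked on concrete test data; the Stack class and the axiom pop(push(s, x)) = s are illustrative examples only, not taken from the CASCAT tool.

```python
# Checking a conditional axiom T1=T'1 and ... and Tn=T'n => T=T' on concrete data.
# Premises and the conclusion are given as pairs of zero-argument callables so
# the terms are only evaluated when the check runs.

class Stack:
    def __init__(self, items=()):
        self.items = list(items)
    def push(self, x):
        return Stack(self.items + [x])
    def pop(self):
        return Stack(self.items[:-1])
    def __eq__(self, other):
        return self.items == other.items     # observation: compare the contents

def check_axiom(premises, conclusion):
    """The axiom holds on this data point if some premise fails, or if all
    premises hold and the conclusion holds as well."""
    if all(lhs() == rhs() for lhs, rhs in premises):
        lhs, rhs = conclusion
        return lhs() == rhs()
    return True

# Example: the unconditional axiom pop(push(s, x)) = s, checked on one data point.
s, x = Stack([1, 2]), 3
assert check_axiom([], (lambda: s.push(x).pop(), lambda: s))
```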

  41. Screen Snapshot of Algebraic Testing Tool CASCAT Data Mutation Software Testing

  42. References • Shan, L. and Zhu, H., Generating Structurally Complex Test Cases by Data Mutation: A Case Study of Testing an Automated Modelling Tool, Special Issue on Automation of Software Test, The Computer Journal (in press). • Shan, L. and Zhu, H., Testing Software Modelling Tools Using Data Mutation, Proc. of AST'06, ACM Press, 2006, pp. 43-49. • Zhu, H. and Shan, L., Caste-Centric Modelling of Multi-Agent Systems: The CAMLE Modelling Language and Automated Tools, in Beydeda, S. and Gruhn, V. (eds), Model-driven Software Development, Research and Practice in Software Engineering, Vol. II, Springer, 2005, pp. 57-89. • Kong, L., Zhu, H. and Zhou, B., Automated Testing of EJB Components Based on Algebraic Specifications, Proc. of TEST'07, IEEE CS Press, 2007. Data Mutation Software Testing
