

  1. A REVIEW OF COMPONENT INTERACTION APPROACHES FROM THE TESTING PERSPECTIVE
  SV04: THIRD WORKSHOP “SYSTEMS TESTING & VALIDATION”
  Angelina Espinoza
  Juan Garbajosa
  http://syst.eui.up.es

  2. Contents
  • Introduction and motivation
  • Some approaches:
    • The Jorgensen and Erickson view
    • The Jin and Offutt view
    • The Liu and Dasiewicz view
    • The Williams and Probert view
  • Conclusions

  3. Introduction and motivation
  • Complex systems can be made of heterogeneous components: hardware, software, or both.
  • Component interactions, especially unexpected ones, are a source of conflicts.
  • It is difficult to determine which test cases to run to ensure that components will interact properly.
  • System faults often arise from non-evident interactions.
  • This review does not intend to be exhaustive at this point.

  4. Issues
  • Modelling of components in order to better understand the “fact” of interaction
  • Approaches to testing applications in the presence of non-evident application interactions
  • Measurement of test coverage in the presence of non-evident component interactions
  • Observation of component interaction for integration testing
  • Pure monitoring/visualization of component interaction

  5. Introduction (continued)
  [Figure: a system composed of several computers]
  • Moreover, the risk is commonly magnified when, for each element in a system, there are a number of interchangeable components.
  • Hence, one of the main concerns for system reliability and predictability is precisely these component interactions.

  6. Some approaches: the Jorgensen and Erickson view
  • “In most development methodologies the system structure is the goal, not the behavior.”
  • The event-driven nature of OO systems forces a “declarative spirit” (as opposed to imperative): no fixed order on testing
    • Event-driven
    • Dynamic binding
    • Composition: which is the set of adjacent objects?
    • Threads
  • So an appropriate construct for the integration level should:
    • Be compatible with composition
    • Avoid inappropriate structure-based goals
    • Support the declarative aspect of object integration
    • Be clearly distinct from unit- and system-level constructs

  7. The Jorgensen and Erickson view (continued)
  First, five distinct levels of OO testing are defined:
  • Method
  • Message quiescence
  • Event quiescence
  • Thread testing
  • Thread interaction
  Definition:
  • A Method/Message path (MM-Path) is a sequence of method executions linked by messages.
  • An MM-Path starts with a method and ends when it reaches a method that does not issue any messages of its own (message quiescence).
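The MM-Path definition above can be illustrated with a minimal sketch. The classes and method names below are hypothetical (not from the paper); the point is only the shape of the construct: a chain of method executions linked by messages that ends at a method issuing no messages of its own.

```python
# Hypothetical classes illustrating one MM-Path:
# ATM.handle_deposit -> Account.deposit -> Logger.write (message quiescence).

class Logger:
    def write(self, text):
        # Issues no further messages: the MM-Path ends here (message quiescence).
        return f"LOG: {text}"

class Account:
    def __init__(self, logger):
        self.logger = logger
        self.balance = 0

    def deposit(self, amount):
        # One link in the MM-Path: deposit() sends a message to the logger.
        self.balance += amount
        return self.logger.write(f"deposit {amount}")

class ATM:
    def __init__(self, account):
        self.account = account

    def handle_deposit(self, amount):
        # The MM-Path starts with this method execution.
        return self.account.deposit(amount)

atm = ATM(Account(Logger()))
result = atm.handle_deposit(50)
```

An integration-level test in this view exercises the whole chain, not each method in isolation, which keeps the test unit distinct from both unit-level and system-level constructs.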

  8. The Jin and Offutt view
  • Coupling:
    • Couplings are exactly where faults found during integration testing occur.
    • Coupling increases the interconnections between two units and the likelihood that a fault in one unit may affect others.
  • Presents an integration testing technique based on couplings between software components. Coupling types (initially 12) include:
    • Parameter coupling: refers to all parameter passing.
    • Shared data coupling: refers to procedures that refer to the same objects.
    • External device coupling: refers to procedures that both access the same external medium.

  9. The Jin and Offutt view (continued)
  • Control flow graph: a directed graph that represents program structure.
  • Coupling-def: a node that contains a definition that can reach a use in another unit on at least one execution path.
  • Coupling-use: a node that contains a use that can be reached by a definition in another unit on at least one execution path.
  Example (definition in unit B, use in unit A):
    void a(int x) { putchar(x); }       /* USE node: x is used here        */
    void b(void)  { int x = 3; a(x); }  /* DEFINITION node: x defined here */
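The slide's two-unit fragment can be restated as a runnable sketch (a Python transcription of the idea, with hypothetical names). The coupling-def and coupling-use nodes are marked in comments; a coupling-based integration test is one that executes at least one path from the definition to the use across the unit boundary.

```python
# Parameter coupling between two units: b() defines x, a() uses it.
output = []

def a(x):
    # Coupling-use node: x, defined in b(), is used here.
    output.append(x)

def b():
    # Coupling-def node: this definition of x reaches a use in a().
    x = 3
    a(x)

# Executing b() exercises the def-use pair across the unit boundary,
# which is what coupling-based integration testing requires to be covered.
b()
```

If a fault were present at either end of the pair (for example, b() passing the wrong value), only a test that traverses this coupling path could reveal it.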

  10. The Liu and Dasiewicz view
  • Views interactions as exchanges of messages between concurrently executing objects (asynchrony is left aside!)
  • Interaction errors: since a “protocol” describes the order in which things are expected to happen, interaction problems can be described as “violations of the correct protocol”.
  • The approach evolved to consider UML, with some add-ons.
  • Component Interaction Testing (CIT) method:
    • Requires a model (e.g. finite state machines) of the component function.
    • Requires a list of problematic sequences of interactions, called test requirements, used to select the test cases that will exercise the interactions.

  11. The Liu and Dasiewicz view (continued)
  PBX call-forward example:
  • Model for “setforward”
  • Test requirements for “setforward”
  Test cases:
  • { startfeaturedial, featuredigit(5), morefeaturedigits, callterminate }
  • { offhook, dialtone, digit(‘#’), idletone, digit(5), onhook }
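The CIT ingredients behind the example can be sketched as follows. The state names and transitions below are illustrative assumptions (the paper's actual "setforward" model is richer, and the event names are simplified versions of those in the test cases); the sketch shows only how a finite-state model turns interaction errors into detectable protocol violations.

```python
# Assumed finite-state model of a "setforward"-like dialogue.
# Any event not allowed in the current state is a protocol violation,
# i.e. an interaction error in the Liu and Dasiewicz sense.

transitions = {
    ("idle", "offhook"): "dialtone",
    ("dialtone", "digit#"): "featuredial",
    ("featuredial", "digit5"): "forwardset",
    ("forwardset", "onhook"): "idle",
}

def run(events, state="idle"):
    """Drive the model; raise on a violation of the correct protocol."""
    for e in events:
        if (state, e) not in transitions:
            raise ValueError(f"protocol violation: {e!r} in state {state!r}")
        state = transitions[(state, e)]
    return state

# A test requirement ("the setforward dialogue must be exercised end to
# end") selects this event sequence as a test case:
final = run(["offhook", "digit#", "digit5", "onhook"])
```

A component that emits the events out of order fails the run immediately, which is exactly the kind of problematic sequence the test requirements are meant to flag.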

  12. The Williams and Probert view
  • The problem a system tester faces is thoroughness of test-configuration coverage versus limited resources of time and expense.
  • An unwanted interaction is usually not caused by the particular values of the entire set of parameters, but by the values of only a (hopefully small) subset of parameters.
  • The aim is to reduce the number of test configurations so the tests can be conducted at a feasible cost in time and money, while still having a good probability of detecting unwanted system interactions.
  A system test scenario:
  • Calling phone: Regular (1), Wireless (2), Public (3)
  • Call type: Local (1), Charged (2), Toll free (3)
  • Switch market: Canada (1), US (2), Mexico (3)
  • Called phone: Regular (1), Wireless (2), Public (3)

  13. The Williams and Probert view (continued)
  • The approach is based on the concept of an interaction element, which consists of a subset of parameters together with specific values assigned to those parameters, similarly to the pairwise covering approach.
  • The objective is that every interaction element be covered by some selected test configuration.
  • Interaction elements are used as test units for system interaction testing, just as other coverage criteria use control-flow branches or definition-use associations.
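For the scenario on the previous slide (four parameters, three values each), pairwise covering makes the cost reduction concrete. The construction below is a standard orthogonal-array sketch, not taken from the paper: 9 configurations cover every pairwise interaction element, versus 3^4 = 81 exhaustive configurations.

```python
# Pairwise covering for 4 parameters x 3 values (e.g. calling phone,
# call type, switch market, called phone, each encoded as 0..2).
# Rows are built from two mutually orthogonal Latin squares over {0,1,2},
# a classic construction that covers every pair of values exactly once.
from itertools import combinations

configs = [(i, j, (i + j) % 3, (i + 2 * j) % 3)
           for i in range(3) for j in range(3)]  # 9 test configurations

def covers_all_pairs(rows, n_params=4, n_values=3):
    """Check that every pairwise interaction element appears in some row."""
    for p, q in combinations(range(n_params), 2):
        seen = {(row[p], row[q]) for row in rows}
        if len(seen) != n_values * n_values:
            return False
    return True

ok = covers_all_pairs(configs)  # 9 tests instead of 81
```

Each pair of values for any two parameters is an interaction element here; the check confirms every one of them is exercised by at least one selected configuration.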

  14. Conclusions
  • Summarizing: existing models focus considerably on structural properties as opposed to behavioural ones.
  • The event-driven nature of software systems is somewhat forgotten; declarative specifications, as opposed to imperative ones, may be required.
  • The synchronous view, as opposed to the asynchronous one, is basically the one considered.
  • Finally, each author proposes test coverage metrics according to their approach. But beyond the efficacy of each metric, it should not be forgotten that most of the approaches analyzed suffer from combinatorial explosion when applied to large systems.

  15. Thank you for your attention!
