
Model Interchange Testing: a Process and a Case Study


Presentation Transcript


  1. Model Interchange Testing: a Process and a Case Study
  Maged Elaasar and Yvan Labiche
  IBM Canada Ltd. and Carleton University, Ottawa, Canada
  ECMFA 2012, Copenhagen, Denmark

  2. Motivation
  [Diagram: a model is exported from modeling tool A and imported into modeling tool B.]
  Issues arise due to:
  • Ambiguities in modeling standards
  • Ambiguities in the interchange standard
  • Lack of verification of tools' interchange capabilities

  3. Outcome
  OMG members (tool vendors and users) formed a Model Interchange Working Group (MIWG).
  Objective: to test and improve model interchange between tools.
  This presentation reports on the activities of the MIWG:
  • a verification testing process
  • a case study: interchange of UML and SysML models

  4. Verification testing process
  • Process: defining and executing model interchange test cases
  • Test case: tests an "area" of a modeling language
  • Large area: e.g., UML Sequence Diagrams
  • Small area: e.g., specific types of Actions in UML Activity Diagrams
  • Execution (sketched below): defining a reference model, exporting it from one tool, and importing the result into another tool
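A minimal sketch, in Python, of how one such test-case execution could be represented. The TestCase fields and the producer/consumer adapter methods (export_xmi, import_xmi) are illustrative assumptions, not the MIWG's actual tooling.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str             # area covered, e.g. "UML Sequence Diagrams"
    reference_image: str  # path to the exemplary reference model image
    reference_xmi: str    # path to the expert-created reference XMI

def run_test_case(test_case, producer, consumer):
    """One execution: the producer re-creates and exports the model,
    then the consumer imports the producer's XMI and re-creates diagrams."""
    exported_xmi = producer.export_xmi(test_case)           # producer step
    imported_diagrams = consumer.import_xmi(exported_xmi)   # consumer step
    return exported_xmi, imported_diagrams
```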

  5. Verification testing process (cont.)
  Four roles, each of which can be played by one or more parties:
  • MIWG: defines test cases
  • Producer: creates and exports the model
  • Consumer: imports and compares the model
  • Implementer: resolves issues
  MIWG defines test cases:
  • To cover areas of the modeling language that are deemed important for tools to be able to interchange
  • Experts create a small exemplary reference model (as an image)
  • Experts create the corresponding XMI file (tool-generated, then edited); this reference XMI serves as the test oracle
  Producer (each tool, for each test case):
  • Manually re-creates the model from the reference model image
  • Exports the model as XMI
  • Exports the diagram(s) as image(s)
  • Compares the exported XMI with the reference XMI (a comparison sketch follows this slide)
  Consumer (each other tool, for each produced XMI):
  • Imports the XMI model
  • Manually re-creates the diagram(s)
  • Exports the diagram(s) as image(s) for comparison purposes
  • Exports the model as XMI for comparison purposes (optional)
  Implementer: resolves issues, e.g.:
  • MIWG: an issue in specifying and creating a test case
  • Revision Task Force: an identified ambiguity in the standard
  • Producer: the tool does not support a modeling feature, or there is a difference between the exported model and the reference model
  • Consumer: the tool does not import a specific feature, or there is a difference between the re-created diagram(s) and the reference model
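The producer's comparison of the exported XMI against the reference XMI can be automated to a first approximation. The Python sketch below only diffs (xmi:type, name) pairs and assumes the XMI 2.1 namespace URI and hypothetical file names; a real model comparison is far more fine-grained.

```python
import xml.etree.ElementTree as ET

XMI_NS = "http://schema.omg.org/spec/XMI/2.1"  # assumed XMI version

def element_signatures(xmi_path):
    """Return the set of (xmi:type, name) pairs found in an XMI file."""
    tree = ET.parse(xmi_path)
    sigs = set()
    for elem in tree.iter():
        xmi_type = elem.get(f"{{{XMI_NS}}}type")
        if xmi_type is not None:
            sigs.add((xmi_type, elem.get("name", "")))
    return sigs

def compare(exported_path, reference_path):
    """Report elements missing from or extra in the exported XMI."""
    exported = element_signatures(exported_path)
    reference = element_signatures(reference_path)
    return {"missing": reference - exported, "extra": exported - reference}

# Hypothetical file names, for illustration only:
# report = compare("testcase1_toolA_export.xmi", "testcase1_reference.xmi")
```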

  6. Issue of scalability
  Assuming N tools and T test case specifications (i.e., reference models):
  • The process involves N exports for each of the T test cases, followed by N-1 imports for each export
  • Linear growth of T·N on export, polynomial growth of T·N·(N-1) on import (see the worked numbers below)
  • Import is partly manual: re-creating the diagram(s)
  Plus: standards evolve, tools evolve, and the test suite can be revised and extended, all of which hinders scalability.
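For concreteness, evaluating the two formulas with the case-study values used later (T = 16 test cases, N = 6 tools) gives the 96 exports and 480 imports reported on slide 9.

```python
def interchange_effort(num_test_cases, num_tools):
    """Number of steps in the initial process: T*N exports, T*N*(N-1) imports."""
    exports = num_test_cases * num_tools
    imports = num_test_cases * num_tools * (num_tools - 1)
    return exports, imports

print(interchange_effort(16, 6))  # -> (96, 480)
```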

  7. Verification testing process (revised)
  The MIWG agreed that the following should be sufficient:
  • validating the exported models by comparing them to the reference models
  • testing the import of the reference models
  XMI file validation (compliance with the standards) and comparison with the reference model are automated (a minimal sketch follows).
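A minimal sketch of what such automated validation might start with: well-formedness and a declared XMI version. The MIWG's automated validation checks compliance with the standards, which goes well beyond this; the namespace handling here is an assumption.

```python
import xml.etree.ElementTree as ET

def basic_xmi_checks(xmi_path):
    """Return a list of problems found by two very coarse checks."""
    issues = []
    try:
        root = ET.parse(xmi_path).getroot()
    except ET.ParseError as err:
        return [f"not well-formed XML: {err}"]
    # Accept any root attribute named 'version' in an XMI namespace,
    # e.g. {http://schema.omg.org/spec/XMI/2.1}version.
    has_version = any(
        attr.endswith("}version") and "XMI" in attr for attr in root.attrib
    )
    if not has_version:
        issues.append("no xmi:version attribute on the root element")
    return issues
```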

  8. Case Study
  • Modeling languages selected: UML and SysML (market pressure, popularity)
  • Test suite: 16 test cases (3/4 for UML), covering 59% of UML metaclasses and 55% of SysML stereotypes
  • Six tools

  9. Case Study Execution
  Total duration: 30 months
  • 1st phase (initial process): 21 months
  • 96 (16x6) exports and 480 (16x6x5) imports
  • Re-exports/re-imports were necessary as standards, test cases and/or tools were being revised
  • 2nd phase (revised process): 9 months
  • 192 (16x2x6) imports

  10. Case Study--Results
  The 1st phase helped uncover major issues in the tools' support of the UML metamodel and the SysML profile, which hindered the successful interchange of models.
  • Showed that tools export extra or non-standard information that is not always expected during import
  • MIWG proposed that tools use the XMI exporter tag to specify their tool name during export, so that import can be customized (see the sketch below)
  • Showed that tools do not offer consistent support for the standards, e.g., default values for the multiplicity of UML typed elements
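A minimal sketch of the import-side customization proposed above: read the exporting tool's name from the XMI Documentation element's exporter/exporterVersion attributes. The namespace URI (XMI 2.1 here), the file name, and the tool name check are assumptions for illustration.

```python
import xml.etree.ElementTree as ET

XMI_NS = "http://schema.omg.org/spec/XMI/2.1"  # adjust to the XMI version in use

def exporting_tool(xmi_path):
    """Return (exporter, exporterVersion) if declared in the XMI file, else None."""
    root = ET.parse(xmi_path).getroot()
    doc = root.find(f".//{{{XMI_NS}}}Documentation")
    if doc is None:
        return None
    return doc.get("exporter"), doc.get("exporterVersion")

# Hypothetical usage:
# tool = exporting_tool("testcase1_toolA_export.xmi")
# if tool and tool[0] == "Tool A":
#     pass  # apply Tool A-specific import adjustments
```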

  11. Case Study--Results (cont.)
  2nd phase: issues/bugs reported by tools for each test case, for their first (dashed line in the chart) and their last (solid line) export; each test case was exported 3-4 times on average due to bug fixes.
  • Overall improvement
  • Some remaining issues, mainly due to ambiguities in the standards

  12. Conclusions
  • The MIWG has defined and validated a rigorous, incremental model interchange testing process
  • The process was used in a case study to assess UML and SysML model interchange between six tools
  • The tools' conformance to the standards increased by 20%
  Ongoing and future work:
  • Extending the test suite to the remaining parts of the UML metamodel and the SysML profile
  • Applying the process to other modeling languages

  13. Questions?
