
Object-Oriented Systems DIF 8901



  1. Object-Oriented Systems DIF 8901 Presentation of two papers: »On the purpose of Object-Oriented Analysis« »Object-Oriented Integration Testing« Sven Ziemer, 5 February 2003

  2. On the purpose of Object-oriented Analysis • The paper discusses the general purpose of analysis and evaluates OOA with respect to this purpose, arguing that OOA often does not deliver what it claims to do. • Outline • Introduction • The purposes of analysis and design • OOA and the purpose of analysis • OOA and the transition to design • Consequences for the NSR project • Positive trends in OOA • Conclusions

  3. Introduction (1) • The background of this work emerged in the NSR (Norsk System Rammeverk) project • Goal: define a common methodology framework in which partner methods can serve as components. • Reason to evaluate OOA methods: • Need for early lifecycle support • Existence of object-oriented approaches and a positive attitude • Existence of both real-world modeling tools and software specification tools • General goal of total lifecycle coverage and smooth transition between phases.

  4. Introduction (2) • Difficult to define exactly what OOA is. • Two general claims of OOA: • OOA fulfills the purpose of analysis • OOA has a smooth transition to design • The authors found these claims to be false. OOA turns out to have several shortcomings and does not present more than a partial solution.

  5. The purpose of analysis and design (1) • Analysis activities concentrate on requirements. • The exact boundary between analysis and design is hard to determine, but the two have different purposes: • Analysis concerns the description of the problem and the user requirements, whereas design concerns the construction of a solution which satisfies the previously recorded requirements.

  6. The purpose of analysis and design (2) • Details of the purpose of analysis: • Knowledge to be captured • Analysis should model aspects of the real world which are relevant to the problem (objectives, application domain knowledge, requirements on the environment and requirements on the computer system). • Activities to be undertaken • Quality assurance (QA): verification and validation • Requirements towards languages and methods • The modelling languages should primarily be suitable for describing real world problems (problem orientation). • Transition to design • The burden of translation from the problem domain to the system domain should be placed on software engineers.

  7. OOA and the purpose of analysis (1) • OOA and the software life-cycle • Many OOA methods do not seem to fit with the traditional meaning of analysis. Many methods assume that the requirements have been established before analysis starts. • For most OOA/OOD approaches, the difference between analysis and design is simply the difference between what and how. • The real difference should be whether the activity addresses user requirements or a solution.

  8. OOA and the purpose of analysis (2) • Problem-orientation • May be closer to preliminary design than analysis in the traditional sense (OOA assumes the requirements as a starting point). • The result of analysis is intended for negotiation with the customer. • An OO representation might be good for some kinds of knowledge, but less suitable for other kinds of knowledge: • Business rules • Dynamics, e.g. processes

  9. OOA and the purpose of analysis (3) • Verification and validation • Most languages and methods for OOA are informal. • Verification is poorly supported. • Useful validation techniques are not supported.

  10. OOA and the transition to design • There are no miracles – the transition to design must transform analysis objects into design objects. • Since the purpose of OOA is to reflect the problem, there is no direct mapping onto some as-yet-unknown information system. • Example: The OOPSLA Conference Registration Problem.

  11. Consequences for the NSR project • To achieve the goal of being problem-oriented, the NSR project performed some work on: • Roles – Developed by TASKON. An object abstraction describing a particular purpose of, or viewpoint on, the object. • Data flow diagrams – Developed by METIS. Adds a process model similar to DFDs to represent work processes.

  12. Positive trends in OOA • OOA has evolved towards putting focus on system dynamics. • Some methods have recognized the difference between analysis and design. • Formal approaches have started to appear.

  13. Object-Oriented Integration Testing • The paper gives an overview of software testing and presents constructs for object-oriented integration testing. • Outline • Integration Testing • Constructs for Object-Oriented Integration Testing • Observations

  14. Integration Testing (1) • Most of the popular notations used in software development portray software structure. • This information is needed by developers, but is only moderately useful to testers. • Software testing is fundamentally concerned with behavior. The object-oriented testing constructs introduced are deliberately behavioral.

  15. Integration Testing (2) • Different structure of software • Traditional software is • Written in an imperative language • Described by a functional decomposition • Developed in a waterfall life cycle • Separated into three levels of testing. • This often does not apply directly to object-oriented software, and represents latent assumptions which must be revisited.

  16. Integration testing (3) • Imperative vs. declarative languages • Imperative languages: • The order of source statements determines the execution order • Lend themselves to a rigorous description as a directed graph (program graph). This helps the tester give a more accurate description of what is being tested. • Declarative languages • Suppress sequentiality. • The event-driven nature of object-oriented systems forces a »declarative« spirit on testing.
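The program-graph idea above can be sketched concretely. The fragment below is a hypothetical five-statement program (invented for illustration, not from the paper): statements become nodes, possible control-flow transfers become directed edges, and the tester can enumerate exactly which execution paths exist.

```python
# A sketch of a program graph for a hypothetical five-statement fragment:
#   1: x = read()   2: if x > 0   3: y = x   4: y = -x   5: print(y)
# Nodes are statements; directed edges are possible control-flow transfers.
program_graph = {
    1: [2],      # sequential flow
    2: [3, 4],   # branch: true -> 3, false -> 4
    3: [5],
    4: [5],
    5: [],       # exit node
}

def paths(graph, node, end, prefix=()):
    """Enumerate all directed paths from node to end."""
    prefix = prefix + (node,)
    if node == end:
        return [prefix]
    return [p for succ in graph[node] for p in paths(graph, succ, end, prefix)]

# The tester can now state precisely what is being tested:
print(paths(program_graph, 1, 5))  # -> [(1, 2, 3, 5), (1, 2, 4, 5)]
```

This is the »rigorous description« the slide refers to: path coverage becomes a well-defined, enumerable goal for imperative code.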

  17. Integration testing (4) • Decomposition vs. composition • Functional decomposition • Has been the mainstay of software development since the 1950s. • Occurs in a waterfall development life cycle • Deep implications for testing • Emphasizes levels of abstraction • Creates questions of integration order • Stresses structure over behavior. • Composition • Occurs in a non-waterfall development life cycle • Does not naturally impose a structural testing order.

  18. Integration testing (5) • Different levels of testing • The waterfall model of software development is sometimes depicted as a »V« in which the development phases are at levels corresponding to the testing levels: • Unit testing: Different definitions of a unit exist (e.g. a single function). The goal is to verify that the unit functions correctly. • Integration testing: Corresponds to preliminary design in the waterfall model. Common strategies for integration testing are pairwise, bottom-up and top-down integration. • System testing: Conducted exclusively in terms of inputs and outputs that are visible at the port boundary of a system. • The preference for structure over behavior as the goal of integration testing will be recognized as yet another shortcoming of the waterfall model.

  19. Integration testing (6) • Testing of object-oriented software • Different levels of testing • Unit testing: Object methods are units • System testing: Thread-based testing
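As a minimal illustration of treating a single object method as the unit under test, the sketch below uses Python's unittest with a hypothetical Counter class (the class, method, and test names are invented for this example, not taken from the paper).

```python
import unittest

# Hypothetical class under test; the unit is the single method increment().
class Counter:
    def __init__(self):
        self.value = 0

    def increment(self, step=1):
        self.value += step
        return self.value

class TestIncrement(unittest.TestCase):
    def test_increment_returns_new_value(self):
        c = Counter()
        self.assertEqual(c.increment(), 1)   # default step
        self.assertEqual(c.increment(3), 4)  # explicit step

# Run the single test case programmatically.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestIncrement)
result = unittest.TextTestRunner().run(suite)
```

The point of the slide is the scoping decision, not the framework: the test exercises one method in isolation, which is what »object methods are units« means in practice.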

  20. Constructs for Object-Oriented Integration Testing (1) • The implications of traditional testing for object-oriented integration testing require an appropriate construct for the integration level. • Five distinct levels of object-oriented testing: • Method (unit testing) • Message quiescence • Event quiescence • Thread testing (system testing) • Thread interaction testing (system testing)

  21. Constructs for Object-Oriented Integration Testing (2) • Method/Message Path • A Method/Message Path (MM-Path) is a sequence of method executions linked by messages. • Atomic System Function • An Atomic System Function (ASF) is an input port event, followed by a set of MM-Paths, and terminated by an output port event.
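A small sketch may make the two definitions concrete. The classes below (Dialog, Registry, Db) are hypothetical, invented for illustration rather than taken from the paper: an input port event (on_submit) triggers a sequence of method executions linked by messages, and the returned confirmation is the output port event that terminates the ASF.

```python
# Hypothetical classes (Dialog, Registry, Db) invented for illustration.
trace = []  # records each method execution in order

class Db:
    def store(self, item):
        trace.append("Db.store")
        return True

class Registry:
    def __init__(self, db):
        self.db = db

    def register(self, name):
        trace.append("Registry.register")
        return self.db.store(name)  # message to Db

class Dialog:
    def __init__(self, registry):
        self.registry = registry

    def on_submit(self, name):
        # Input port event: starts the Atomic System Function.
        trace.append("Dialog.on_submit")
        ok = self.registry.register(name)  # message to Registry
        # Output port event: terminates the ASF.
        return "confirmed" if ok else "rejected"

dialog = Dialog(Registry(Db()))
print(dialog.on_submit("Ada"))  # -> confirmed
# The recorded trace is the MM-Path:
print(trace)  # -> ['Dialog.on_submit', 'Registry.register', 'Db.store']
```

The trace shows why these constructs are behavioral rather than structural: the MM-Path is defined by what actually executes at run time, not by which classes reference which.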

  22. Observations (1) • The new constructs result in a unified view of object-oriented testing, with fairly seamless transitions across the five levels. • Integration is grounded in behavioral rather than structural considerations. • These constructs avoid the problem of structurally possible but behaviorally impossible paths.
