
Analysis: Identifying Parts, Organization, and Storage

This analysis focuses on identifying and understanding the parts of a system, how it is organized, and how it is stored. Key elements include traceability, verification, formal and informal methods, and reuse domains. The analysis also includes validation and user/stakeholder involvement through informal methods, prototyping, and inspections.





Presentation Transcript


  1. Analysis I

  2. Analysis • Identify the parts • How is it organized? • How is it stored? • traceability • Verification • formal • reuse domains • inspections

  3. Analysis • Validation • Close to the users/stakeholders • informal • prototyping (mock up, storyboard)

  4. Analysis [diagram: people carry out verification and validation; both use the identification of the parts; the analysis depends on people, tools, methods and points of view]

  5. Verification vs Validation • Verification: Are we building the thing right? (compared to other products; among models) • Validation: Are we building the right thing? (regarding stakeholders’/users’ desires; between the UofD and a model)

  6. Analysis Loop [diagram: facts gathering from the UofD feeds communication and modelling/identification of the parts, producing a model; while problems remain, communication and modelling are repeated]

  7. Identification of the parts • Depends on how the models are organized and stored • Linked to modelling and elicitation • 90% of the problems are in 10% of the system

  8. Requirements [diagram: facts a–j in the UofD feed the problem definition; the requirements document evolves through versions 1, 2 and 3, from documentation through design, implementation and maintenance]

  9. Validation • Are we building the right product? • We have to compare the UofD with users/stakeholders expectations • Run Scenarios (Reading them in meetings) • Prototype

  10. Validation Strategies • Informal corroboration • storyboards • prototypes

  11. Validating through scenarios • Use as many times as possible • The earlier the better • If possible, validate the candidate scenarios list • Scenario validation goal: elaborate the DEO list (Discrepancies, Errors and Omissions) • Users’ commitment is essential

  12. Main stream • Validate scenarios with users using structured interviews • Strategies: • Read scenarios aloud together with users • Ask Why

  13. Validation Through Scenarios • Gradual confirmation of scenario parts (objective, actors, resources) • Feedback for the LEL • Tag scenarios where doubts arise • Make notes of discrepancies, errors or omissions
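The DEO bookkeeping described in these slides can be sketched as a small data structure; every name here (Kind, DEOItem, the example scenario and notes) is an illustrative assumption, not part of the slides:

```python
from dataclasses import dataclass, field
from enum import Enum

class Kind(Enum):
    DISCREPANCY = "discrepancy"
    ERROR = "error"
    OMISSION = "omission"

@dataclass
class DEOItem:
    scenario: str  # scenario tagged during validation
    kind: Kind
    note: str      # what the user pointed out

@dataclass
class DEOList:
    items: list = field(default_factory=list)

    def add(self, scenario, kind, note):
        self.items.append(DEOItem(scenario, kind, note))

    def by_scenario(self, scenario):
        return [i for i in self.items if i.scenario == scenario]

deo = DEOList()
deo.add("Withdraw cash", Kind.OMISSION, "No exception for insufficient funds")
deo.add("Withdraw cash", Kind.ERROR, "Wrong actor: clerk instead of customer")
print(len(deo.by_scenario("Withdraw cash")))  # 2
```

Tagging each item with its scenario keeps the list traceable back to the meeting where the doubt arose.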

  14. Storyboard [Leffingwell & Widrig] • Elicits reactions such as “Yes, but…” • Passive, active or iterative • Identify the actors, explain what happens to them and describe how it happens • More effective for projects with innovative or unknown content

  15. Storyboard • Pros: • cheap • user friendly, informal and iterative • allows the system interface to be criticized early in the project • easy to create and modify

  16. Types of storyboard • Passive • static screens • business rules • reports • Active • presentation (as in PowerPoint) • animation • simulation • Iterative • demo (free browsing) • iterative presentation

  17. Storyboard [diagram: complexity and cost grow from passive (screens, business rules, reports), through active (presentation, animation, simulation), to iterative (prototype, demo, iterative presentation)]

  18. Prototype • Prototypes are partial implementations that help stakeholders, users and developers better understand system requirements

  19. Prototypes • Also help to elicit reactions such as “Yes, but…” • Help to clarify fuzzy requirements • Requirements that are known but not well defined or not well understood • Help to elicit reactions such as “Now that I can see it working, it comes to me that I also need…” • Tools are available that help to build fast and cheap prototypes

  20. Types of Prototypes [Davis] • Throw-away • It has to work • Use any means to implement the desired result (code quality does not matter) • Once the requirements are elicited, the prototype is discarded • Evolving • Implemented using the same architecture being used in the system • The system may be an evolution of this prototype

  21. Prototype: Vertical vs Horizontal • Horizontal • Implements a large portion of the functionality • Vertical • Implements a few functions • Better quality

  22. Verification • Are we building the product correctly? • Use of models • representations/languages • Use of formalisms • Informal techniques

  23. Use of formalisms • Formal proof of a model • Theorem proving • Detection of discrepancies between the model and the meta-models • Model proving

  24. Informal Techniques • Walk-throughs • Inspections

  25. Walk-through • Ad hoc preparation • Meeting (author(s), evaluator(s), secretary) • Reading • the author reads • evaluators listen • evaluators point out problems (questions) • the secretary writes down problems • List of problems

  26. Inspections [diagram: the inspection protocol combines reading techniques and processes for meetings; preparation and the inspection meeting produce a list of problems]

  27. Inspections • Process • planning • Global Vision • preparation • inspection (meeting) • re-work (corrections) • Follow-up

  28. Inspections • Created in 1972 by Fagan, at IBM, to improve the quality of code • Currently used to check any type of artefact in the software development process • Inspections can detect between 30% and 90% of existing errors • A reading technique applied to an artefact, aiming to detect errors in the artefact according to pre-established criteria

  29. Inspections in Requirements [diagram, after Laitenberger01: process: planning, global view, detection, collection, fix, follow-up; roles: organizer, moderator, inspector, author, secretary; artefacts: the one to be inspected and those used to carry out the inspection; reading techniques: ad hoc, checklists, perspective-based, function points]

  30. Inspections • Help to find errors before we move to the next phase • What information should be checked • How to identify defects in the chosen models • Techniques for reading an ERD • Ad hoc (based on personal experience) • Checklist (list of items to be checked) • Perspective-based reading (good for requirements in natural language) • Function-point based (experimental)

  31. Inspections - steps • Planning: choose participants; schedule the meeting; generate and distribute the material to be used • General view: the author presents the artefacts to be inspected to the participants • Inspection: inspectors evaluate the artefact and document the defects found • Collection: defects are summarized and communicated to the author • Correction: defects are fixed • Follow-up: check the corrections made

  32. Inspections - roles • Organizer: responsible for organizing the whole process • Author: presents a global view of the artefact before the inspection begins • Inspector: analyses the artefacts following a pre-defined reading technique, annotating all the defects found • Secretary: documents the inspection; collects the defects found by the inspectors and consolidates them into one document • Moderator: responsible for conducting the meeting and managing possible conflicts

  33. Inspections in requirements • Based on checklists: • Inspectors use a list with the items to be checked • Each artefact has a specific list (requirements document, use cases, lexicon, scenario, DFD, class diagram ...) • Defects are annotated in the artefact being analysed • After reviewing, a meeting is carried out to communicate the problems found to the developers • Defects that can be found: • Incorrect syntax in the artefacts (definition of a term, measurement units ...) • Inconsistent information among artefacts (e.g. use cases and glossary) • NFRs not made explicit • Actors or resources incomplete or in excess • No pre-conditions (use cases and scenarios) • No exceptions in scenarios
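As a sketch of this checklist-based approach, one might pair each artefact type with its own list of checks and record the failed items as defects; the checklist contents and function names below are assumptions for illustration only:

```python
# Each artefact type has its own checklist; inspectors record
# defects against the artefact being analysed (illustrative sketch).
CHECKLISTS = {
    "use case": [
        "Are pre-conditions stated?",
        "Are exceptions listed?",
        "Are actors consistent with the glossary?",
    ],
    "scenario": [
        "Are all resources identified?",
        "Are exceptions described?",
    ],
}

def inspect(artefact_type, answers):
    """Return the checklist items answered 'no' as defects."""
    checklist = CHECKLISTS[artefact_type]
    return [item for item, ok in zip(checklist, answers) if not ok]

defects = inspect("use case", [True, False, True])
print(defects)  # ['Are exceptions listed?']
```

The failed items would then be reported to the developers in the collection meeting described above.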

  34. DFD • Checklist DFD • The documentation should contain: • date, numbered pages, list of topics, change and version control • Each process represented by a numbered circle • Identifiers should begin with a verb • The maximum number of processes should be 7 ± 2
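Two of these DFD checklist rules (verb-first identifiers, at most 7 + 2 processes) are mechanical enough to automate. A minimal sketch, with an assumed toy verb list standing in for a real lexicon:

```python
MAX_PROCESSES = 7 + 2  # slide rule: at most 7 +/- 2 processes

# small illustrative verb list; a real checker would use a fuller lexicon
VERBS = {"calculate", "validate", "register", "generate", "update"}

def check_dfd(processes):
    """processes: list of (number, identifier) pairs. Returns defect notes."""
    defects = []
    if len(processes) > MAX_PROCESSES:
        defects.append(f"too many processes: {len(processes)} > {MAX_PROCESSES}")
    for number, identifier in processes:
        first_word = identifier.split()[0].lower()
        if first_word not in VERBS:
            defects.append(f"process {number}: identifier should begin with a verb")
    return defects

print(check_dfd([(1, "Validate order"), (2, "Customer data")]))
# ['process 2: identifier should begin with a verb']
```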

  35. OO • Checklist OO: • Are all classes represented using rectangles with 1, 2 or 3 compartments? • Are there two classes with the same name? • Are there classes without defined relationships? • Are the attributes and methods for each class adequate?
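The duplicate-name and no-relationship items of this OO checklist can likewise be checked mechanically. A sketch assuming classes are plain names and relationships are name pairs (both assumptions for illustration):

```python
from collections import Counter

def check_classes(classes, relationships):
    """classes: list of class names; relationships: list of (a, b) pairs.
    Returns defect notes for two of the checklist items above (sketch)."""
    defects = []
    # duplicate class names
    for name, count in Counter(classes).items():
        if count > 1:
            defects.append(f"duplicate class name: {name}")
    # classes without any defined relationship
    related = {c for pair in relationships for c in pair}
    for name in set(classes) - related:
        defects.append(f"class without relationships: {name}")
    return defects

print(check_classes(["Order", "Customer", "Invoice"],
                    [("Order", "Customer")]))
# flags Invoice as having no relationships
```

The remaining items (compartment layout, adequacy of attributes and methods) need human judgement and stay on the inspector's paper checklist.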

  36. N-Fold Inspection • Many teams • Each one carries out an independent inspection process • Compare results • Final Report

  37. [Figure: N-fold inspection. User, moderator, leaders, teams 1 to 3. Each document is revised by n teams, where each team uses the inspection process to find errors]

  38. Parallel is better • Multiple inspection teams find more defects than one single bigger team • The teams tend to find different subsets of defects • The combined results from the different teams therefore tend to add up rather than be redundant
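Combining the defect lists from parallel teams while removing redundancies can be sketched as an order-preserving de-duplicating merge; the defect strings are illustrative assumptions:

```python
def merge_team_reports(reports):
    """Combine defect lists from parallel inspection teams,
    removing duplicates while preserving first-seen order."""
    seen = set()
    combined = []
    for team_defects in reports:
        for defect in team_defects:
            if defect not in seen:
                seen.add(defect)
                combined.append(defect)
    return combined

team1 = ["missing pre-condition in UC-3", "actor mismatch in UC-1"]
team2 = ["actor mismatch in UC-1", "no exception in scenario S-2"]
print(merge_team_reports([team1, team2]))  # three unique defects
```

In practice the same defect is rarely reported word-for-word by two teams, so the moderator would still reconcile near-duplicates by hand.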

  39. Example - HP • Roles: • Inspectors: • Items: errors, omissions, inconsistencies, confusion • Related subjects: items that go beyond the scope of the inspection • Owner of the document • Identifies (agrees with) the items • Corrects the errors • Secretary • Annotates items and subjects • Before and during the meeting

  40. Example - HP • Main moderator • Owner of the inspection process • Collects and communicates statistics • Points out corrections to the inspection process • Focal point for changes to inspection standards and checklists • Moderator • Manages the process • Facilitates the inspection • Statistics • Sends status and subjects to management

  41. HP - Process • Planning • plan the inspection • moderator and owner • Kick-off • quickly update the team about the document and the inspection process • all members involved

  42. HP - Process • Preparation • Identify items and subjects • All members participate • Meeting • Identify items and subjects • Report items and subjects • All members participate • Cause/Prevention • brainstorm causes • recommend solutions • All members participate

  43. HP - Process • Re-work • verify and point out errors or defects • owner and moderator • Follow-up • release the document • owner and moderator. Key lessons in achieving widespread inspection use - Grady & Slack - IEEE Software, July 1994, pp. 46-57

  44. Challenges from inspections • Large requirements document • Informal and incremental revisions during the development of the specification • Each inspector starts from a different point • Divide into many small teams, each inspecting a specific part of the document

  45. Challenges from inspections • Large inspection teams • Difficult to schedule meetings • Parallel conversations • Difficult to reach an agreement • What to do? • Make sure the participants are there to inspect, not to “spy on” the specification or to keep a political status

  46. Challenges from inspections • Large inspection teams • Understand which point of view (client, user, developer) each inspector is using and keep only one per interested party • Establish many small teams and carry out the inspection in parallel; combine the lists and remove redundancies.

  47. Challenges from inspections • Geographical distance between inspectors • videoconference, teleconference, e-mail, web • Difficult to observe body language and expressions • Difficult to moderate • 25% reduction in effectiveness • [Wiegers98] The seven deadly sins of software reviews - Software Development, 6(3), pp. 44-47
