
Software Inspections of Requirements Specifications


Presentation Transcript


  1. Software Inspections of Requirements Specifications Smita Chaganty Brandon Vega

  2. Overview • What are inspections? • Reviews and walkthroughs • Inspection teams • Steps to Formal Inspection • Detection methods • Usage Based Reading (UBR) • UBR vs. CBR (Checklist Based Reading) • Defects in Use Case models • Study & results

  3. Importance of Requirements • Software Requirements Specifications (SRS) are used as input to Planning, Estimating, Design, and Testing • A “contract” between customers and developers • “Use-case Driven” • The quality of requirements documents is important

  4. What are Inspections? - 1 • A means for detecting defects and violations of development standards • Improve quality in software documents • A team of reviewers reads the SRS and identifies as many defects as possible • Defects are sent to the document’s author for repair

  5. What are Inspections? - 2 • Objectives of an inspection: • verify that the software element(s) satisfy their specifications, • verify that the software element(s) conform to applicable standards, • identify deviations from standards and specifications, and • collect software engineering data (such as defect and effort data).

  6. Reviews and Walkthroughs - 1 • REVIEWS • Manual process • Multiple readers • Checks for anomalies and omissions • Representatives of stakeholders should participate in a review

  7. Reviews and Walkthroughs - 2 • Walkthrough • Peer group review • Several people involved • Typically a walkthrough is led by one person (usually the author) • Reviews for consensus and walkthroughs for training

  8. Inspection team consists of… • Moderator – leads inspections, schedules meetings, controls meetings • Author – creates or maintains the product being inspected • Reader – describes the sections of the work product to the team as they proceed • Recorder – classifies and records defects and issues raised • Inspector – attempts to find errors in the product

  9. Formal inspection consists of… • Planning • Overview meeting • Preparation • Inspection meeting • Causal analysis • Rework • Follow-up

  10. Detection Methods - 1 • A set of procedures coupled with the assignment of responsibilities to individual reviewers • Methods: • Ad Hoc • Checklist • Scenario-based • Formal proofs of correctness (e.g., Z)

  11. Detection Methods - 2 • Ad Hoc • Informal, non-systematic detection technique • No explicit assignment of reviewer responsibility • Relies on knowledge and experience of inspector • Checklist • Most popular method • Reuse of “lessons learned” • Defines reviewer responsibilities • Suggests ways for reviewers to identify defects

  12. Detection Methods - 3 • Checklist (continued) • Reviewers answer a list of questions built from knowledge gained in previous inspections • Unlike the ad hoc technique, the checklist gives the reviewer some guidance on what to look for
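A minimal sketch of how checklist-based reading might be recorded is shown below; the question texts, the ChecklistItem structure, and the placeholder judgement step are illustrative assumptions, not part of the studies discussed in these slides.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ChecklistItem:
    """One reusable 'lesson learned', phrased as a question the reviewer answers."""
    question: str
    findings: List[str] = field(default_factory=list)  # defects noted against this item

def judge(section: str, question: str) -> Optional[str]:
    """Placeholder for the reviewer's manual judgement: returns a defect
    description, or None if the section passes. In a real inspection this
    is a human decision, not code."""
    return None

def checklist_review(sections: List[str], checklist: List[ChecklistItem]) -> List[str]:
    """Walk the document and answer every checklist question for every section,
    recording a finding whenever the answer exposes a defect."""
    for section in sections:
        for item in checklist:
            defect = judge(section, item.question)
            if defect:
                item.findings.append(defect)
    return [f for item in checklist for f in item.findings]

# Illustrative checklist entries; a real list grows out of prior inspections.
checklist = [
    ChecklistItem("Is every requirement stated unambiguously?"),
    ChecklistItem("Does every requirement have a verifiable acceptance criterion?"),
]
```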

  13. Checklist Example- 1 • Does the development team understand the Use Cases that this design supports? • Can every member of the development team trace a Use Case Scenario through the designs using a small number of high level components? • Are the bulk of the Class names understandable and recognizable by the domain experts? • Do the Class names reflect the overall responsibility of the Class with respect to the Use Cases and the design? • Does the name of every Message reflect the intended outcome of the Method? (There should be no actions that could not be inferred from the Message name.) • Does the purpose of the Method match the overall responsibility of the Class that it is in? (Should the method be moved to another class, or does it belong in a different place in the inheritance hierarchy?) • Have the Class diagrams been drawn to emphasize the Classes used by a set of Use Cases? • Have Packages been used to group related Classes, and are dependencies between packages noted? • Are all parameters specified for every message on the interaction diagrams, and is there a way for the sending object to know the parameters it is passing?

  14. Checklist Example- 2 • Is flow of control obvious and correct within the interaction diagrams? (Hyperspace leaps are not allowed.) • Do Object names on Interaction Diagrams conform to coding guidelines? • Do Parameter names in Messages conform to coding guidelines? • For cases where there is more than one message to the same object, could this set of messages be replaced with a single message? • Is the design supposed to conform to the Law of Demeter? (An object can only send Messages to itself, its Attributes, and passed parameters; see the sketch after this list.) • Does each Method have a set of Unit Tests defined? • Are all of the Failure Conditions from each Use Case tested for? • Does the implementation of every Method on a Class use at least one attribute or method in that class? (If a Method does not refer to its object, then it is just a utility function attached to the class, so it should be moved elsewhere.) • Overall, is there an even balance of responsibilities between the classes? (No workaholics allowed.) • Do the bulk of the Classes have an even balance of methods and attributes? (Trivial accessor methods do not count.) • For Messages with similar groups of parameters, can a simple data holder class be defined to simplify the message?
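To make the Law of Demeter item concrete, here is a small hypothetical sketch; the Car, Engine, and Driver classes are invented purely for illustration and are not part of the checklist source.

```python
class Engine:
    def start(self) -> None:
        print("engine started")

class Car:
    def __init__(self) -> None:
        self._engine = Engine()

    # Compliant: Car sends messages only to itself and its own attributes.
    def start(self) -> None:
        self._engine.start()

    def engine(self) -> Engine:
        return self._engine

class Driver:
    # Violation: Driver reaches through Car to an object it was never handed,
    # chaining messages across an intermediate ("talking to strangers").
    def drive_violating(self, car: Car) -> None:
        car.engine().start()

    # Compliant: Driver sends a message only to the parameter it received.
    def drive(self, car: Car) -> None:
        car.start()
```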

  15. Checklist Example- 3 • Whenever Inheritance is used, is the phrase "Sub-class is a kind of Super-class" correct for the life of the object? • Does Interface Inheritance make more sense than Implementation Inheritance? • Even where inheritance is not applicable, are consistent names used for similar Methods? • Are all Attributes private? • Is the visibility of Methods appropriate? (If a Method is not invoked from outside the class it should be private.) • Are there any unused Attributes or Methods on a Class? (Unused implies untested which implies code defects.) • Is Run Time Type Information used to switch behavior? (This is only needed for interaction with non-object systems.) • Are the Classes partitioned so that as many Classes as possible are platform independent? • Is the persistence and transaction policy defined? • For containers, is ownership of the contained objects defined? (If the container is deleted, are the contents also deleted?) • Has the design been tested by the development of an Executable Architecture?

  16. Detection Methods – 4 • Ad Hoc and Checklist Methods: • Non-systematic approaches • Reviewers' responsibilities are both general and identical across the team

  17. Detection Methods – 5 • Scenario • Defect-specific procedures • Used to detect particular classes of defects • Each reviewer executes a single scenario, therefore, • Multiple reviewers are needed to achieve broad coverage of the document • Each reviewer has a distinct responsibility
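A small sketch of the coverage argument above, assuming one scenario per defect class; the defect-class names and the assignment policy are assumptions for illustration, not taken from the cited papers.

```python
from typing import Dict, List, Tuple

# Hypothetical defect classes, each targeted by one defect-specific scenario.
DEFECT_CLASSES = ["missing functionality", "ambiguous information",
                  "inconsistent information", "wrong information"]

def assign_scenarios(reviewers: List[str]) -> Tuple[Dict[str, str], List[str]]:
    """Give each reviewer a single defect-specific scenario; broad coverage of
    the document needs at least as many reviewers as there are defect classes."""
    assignment = {r: f"scenario targeting {c}" for r, c in zip(reviewers, DEFECT_CLASSES)}
    uncovered = DEFECT_CLASSES[len(reviewers):]
    return assignment, uncovered

assignment, uncovered = assign_scenarios(["Ana", "Ben", "Cleo"])
print(assignment)  # three classes covered, one scenario per reviewer
print(uncovered)   # ['wrong information'] -> too few reviewers for full coverage
```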

  18. Results of an experiment using different detection methods • Scenario approach increased defect detection rate as compared to Ad Hoc and Checklist • Performance of reviewers: • reviewers using Scenarios found more defects than those who used Ad Hoc or Checklist • Ad Hoc and Checklist techniques were equivalent

  19. Inspections of requirements specifications • Problems with ad hoc and checklist techniques – use scenario-based reading instead • Why? – the checklist is used as a starting point for a more elaborate technique • Scenario-based technique • teaches the inspectors how to read requirements for defect detection • Offers a strategy for decomposing the inspection task

  20. Usage Based Reading (UBR) • Assumes that UC and scenarios have been defined earlier in the development process • Utilizes UC to focus the inspection effort • Steps: • Prioritize the UC in order of importance • Select the UC with the highest priority • Track the UC scenarios through the document under inspection • Ensure the document fulfills the UC goal • Select next use case and repeat from step 3
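A minimal sketch of the UBR loop just described, assuming use cases carry a numeric priority and scenarios are ordered lists of steps; the data structures and the crude containment check standing in for the reviewer's judgement are illustrative only.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class UseCase:
    name: str
    priority: int               # higher value = more important to the users
    scenarios: List[List[str]]  # each scenario is an ordered list of steps

def usage_based_reading(use_cases: List[UseCase], document: str) -> List[Tuple[str, str]]:
    """Inspect `document` by walking use cases in priority order, tracking each
    scenario through the text and noting where the use-case goal is not supported."""
    defects: List[Tuple[str, str]] = []
    # Step 1: prioritize the use cases in order of importance.
    ordered = sorted(use_cases, key=lambda uc: uc.priority, reverse=True)
    # Steps 2-5: take the highest-priority use case, trace its scenarios through
    # the document, record gaps, then move on to the next use case.
    for uc in ordered:
        for scenario in uc.scenarios:
            for step in scenario:
                if step not in document:  # stand-in for the reviewer's judgement
                    defects.append((uc.name, f"document does not support step: {step}"))
    return defects
```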

  21. Usage Based Reading (UBR) vs. Checklist Based Reading (CBR) • Study: compare effectiveness (number of defects found) and efficiency (defects found per unit of time) • 23 CIS graduate students, many with software engineering experience • 11 used UBR, 12 used CBR • Subject document: a taxi fleet management system • Defects were ranked as A (Crucial), B (Important), or C (Unimportant)
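The two measures can be written down directly; a small sketch follows, with placeholder numbers rather than data from the study.

```python
def effectiveness(defects_found: int, defects_total: int) -> float:
    """Share of the known defects that a reviewer detected."""
    return defects_found / defects_total

def efficiency(defects_found: int, hours_spent: float) -> float:
    """Defects found per hour of preparation and inspection time."""
    return defects_found / hours_spent

# Placeholder figures, purely for illustration:
print(effectiveness(9, 30))  # 0.3 -> 30% of the known defects found
print(efficiency(9, 1.5))    # 6.0 -> six defects found per hour
```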

  22. Usage Based Reading (UBR) vs. Checklist Based Reading (CBR) • Results: UBR is significantly more efficient and effective than CBR • UBR finds more faults per time unit for crucial and important faults • UBR finds a larger share of faults • UBR reviewers spent an average of 6.5 minutes less in preparation and 4 minutes less in inspection • UBR reviewers found twice as many crucial faults per hour as CBR reviewers • UBR reviewers identified an average of 21% more faults than CBR reviewers • CBR discovered 63% more unimportant faults • Which means… CBR wastes effort searching for unimportant issues

  23. Defects in Use Case Models • To create an inspection technique for UC models, knowledge of typical defects and their consequences is needed

  24. Defects in Use Case Models • We must consider different stakeholders to find a comprehensive list of defects • Stakeholder: Clients and End Users • Clients and end users want to be sure they get the expected functionality • The correct actors should be identified and described • The correct UC should be identified and should describe how use case goals are reached • The actors should be associated with the correct UC • The flow of events in each UC should be realistic, easy to understand, and described in an appropriate level of detail • Functionality should be shown through use of pre- and post-conditions

  25. Defects in Use Case Models • Stakeholder: Project Manager • Managers need to plan projects • To support the planning: • The UC model should cover all functional requirements • All interactions between the actors and the system that are relevant to the user should be described

  26. Defects in Use Case Models • Stakeholder: Designers • Designers will apply UC models to produce an OO design, therefore: • Terminology should be consistent throughout the UC descriptions • UC descriptions should be described at a suitable level of detail

  27. Defects in Use Case Models • Stakeholder: Testers • Testers apply UC models to test that the functionality is correctly implemented, therefore: • Pre-conditions and post-conditions for the execution of a UC should be testable, and • All terms in UC descriptions should be testable

  28. Defects in Use Case Models • UC can be described in many different formats • The actual format may have an impact on defect detection • Inspection techniques must be tailored to the actual format used

  29. Study 1 • No inspection technique exists that is specific to UC models • Study by Anda & Sjoberg: create a checklist-based inspection technique for UC models • Students were organized into teams of clients and developers • Each team acted as clients for one system and as developers for the other • Client teams created informal textual requirements specifications • Developer teams constructed UC models • Fall 2000: Client teams evaluated UC models and found very few defects despite the presence of many defects • Fall 2001: Client and developer teams evaluated UC models using the checklist method

  30. Study 1 Results • With the checklist almost all teams found defects and suggested corrections • Very few defects were missed • The clients found twice as many defects as the developers • Very few defects were found in common by clients and developers... Why? • Clients and developers differ widely in what they consider a defect in a UC model • The difference in defects found suggests a technique based on different perspectives may be useful • It may also be useful to involve different stakeholders in the inspections

  31. Study 2 • Fall 2001: 45 students received textual requirements for a hospital roster management system • Defects were inserted into the UC by authors • Half used checklist, half used ad hoc • Inspections performed individually

  32. Study 2 Results • The checklist group found slightly more defects regarding the actors in the UC • The ad hoc group found more defects in the other categories • Overall, the difference in detected defects was negligible • The checklist method was found to be more time consuming • Many defects relating to the flow of events were not found by either group, indicating that these defects are difficult to detect • Conclusion: a checklist may not be useful when inspectors have good knowledge of the defects they are expected to find (the students had recently performed similar inspections) and… • Experienced inspectors may be more efficient without a checklist

  33. Study 3 • Porter & Votta 1994 • Hypothesized that systematic approaches such as scenario-based reading would increase the overall effectiveness of an inspection • Remember, scenarios target a specific class of defects • Results: the scenario detection method had the highest defect detection rate, followed by ad hoc and checklist (keep in mind, the checklist is the industry standard) • Conclusion: reviewers are able to focus on a specific class of defects, which facilitates a higher rate of defect detection

  34. Study 4 • Lanubile & Visaggio replicated the previous study • Hypothesis: scenario-based methods would result in more defect detection • Conclusions: the team defect detection rate when using the Scenario technique was not significantly different from those obtained with the Ad Hoc or Checklist techniques • Why?: • Subjects were asked to learn too many new things • Defects in the introductory parts created confusion • Training was unfair • Subjects who had trouble with the Scenario approach used different techniques to execute the task • The time limit was too short

  35. Conclusions… • The quality of the requirements specification is important for the quality of the resulting product. • Different methods can be used for software inspections; the best are those that are systematic and applied by experienced inspectors. • A poorly designed inspection method can lead to poor inspection performance. • UBR is significantly more efficient and effective than CBR. • To create an inspection technique for UC models, knowledge of typical defects and their consequences is needed.

  36. References • Towards an Inspection Technique for Use Case Models, Bente Anda & Dag I. K. Sjoberg, http://portal.acm.org • Software Inspection of Requirements Specification Documents: A Pilot Study, Tereza G. Kirner & Janaina C. Abib, http://portal.acm.org • An Experiment to Assess Different Defect Detection Methods for Software Requirements, A. A. Porter & L. G. Votta, www.acm.org/pubs/citations/proceedings/sotf/257734/p103-porter/ • Prioritized Use Cases as a Vehicle for Software Inspections, Thomas Thelin & Per Runeson, http://computer.org/publication/dlib • Optimizing Software Inspections, Tom Gilb, www.gilb.com/Download/Crosstalk.pdf • Assessing Defect Detection Methods for Software Requirements Inspections Through External Replication, Filippo Lanubile & Giuseppe Visaggio, www2.umassd.edu/SWPI/ISERN/isern-96-1.pdf
