
Team Software Project (TSP) June 19, 2007 High Level Designs, Code Inspections & Measurement


Presentation Transcript


  1. Team Software Project (TSP) June 19, 2007 High Level Designs, Code Inspections & Measurement SE 652- 2007_06_19_Overview_Inspections.ppt

  2. Outline
• High Level Design Phase Review
• Inspection Questions
• Power
• Code Inspection
• Measurement
• System Test Plan Review
• Next Phases (Implementation & Test)

  3. Due Today
• Completed & inspected High Level Design (SDS) & Integration Test Plan
• Completed & inspected Configuration Management Plan
• SRS inspection quality records (LOGD, INS, baseline sheets)
• Completed & inspected Integration Test Plans (incl. LOGD, INS, baseline forms)
• Completed & inspected coding, naming & design standards (incl. LOGD, INS, baseline forms)
• System Test Plans (draft for inspection)
• Updated project schedule & measurement data collected (time, defect & size data)
• Updated project notebook

  4. Software Design Specification (SDS)
• Input: conceptual design, requirements (SRS), design objectives
• Describes: the principal parts, how the parts interact, and how they are put together

  5. Design Standards
• Naming conventions: function, file, variable & parameter names; defines, globals, publics, statics, etc.
• Interface formats
• Variable handling
• Error codes
• System & error messages
• Defect standards: severities, defect types, root-cause bucketing
• LOC counting

  6. Design for Reuse
• Reusable functions should be: self-contained; cleanly isolated; clearly & concisely documented (usage, interfaces, returns, errors)
• Examples of successful reusable components?
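As a hypothetical illustration of these criteria, a reusable function keeps no hidden state and documents its usage, interface, return value, and errors in one place (the function itself is invented for this sketch):

```python
def parse_version(text):
    """Parse a dotted version string like "1.4.2" into a tuple of ints.

    Self-contained: no globals, no hidden state, no side effects.
    Usage:   parse_version("1.4.2")  returns (1, 4, 2)
    Errors:  raises ValueError for malformed input.
    """
    parts = text.strip().split(".")
    if not parts or not all(p.isdigit() for p in parts):
        raise ValueError(f"malformed version string: {text!r}")
    return tuple(int(p) for p in parts)

print(parse_version("1.4.2"))  # → (1, 4, 2)
```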

  7. Design for Testability
• Unit test harnesses
• Simulation testing
• Black-box testing: verifies the program’s external interfaces
• White-box testing: also considers the program’s logical paths & structure; typically requires special tools (e.g. code coverage) and supporting code
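A minimal black-box harness can be sketched in a few lines; the unit under test here (a one-word spell check) and all names are hypothetical. The point is that the cases exercise only inputs and return values, never internals:

```python
# Hypothetical unit under test: spell-check a single word.
def is_word(word, dictionary=frozenset({"cat", "dog", "fish"})):
    return word.lower() in dictionary

def run_black_box_tests():
    """Run (input, expected) cases against the external interface only."""
    cases = [
        ("Cat", True),      # known word, mixed case
        ("dog", True),      # known word, lower case
        ("xyzzy", False),   # unknown word
        ("", False),        # boundary: empty string
    ]
    # Return the cases that fail; an empty list means all passed.
    return [(w, e) for w, e in cases if is_word(w) != e]

print(run_black_box_tests())  # → []
```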

  8. Integration Test Plan
• Objective: verify all system component interfaces
• Activity: review all interfaces (as defined in the SDS) and specify how to test them
• Recommended: inspect the SDS & Integration Test Plan simultaneously; they could also be combined into a single document

  9. System Test Plan
• Areas to cover: installation; start-up; all required functions available & working as specified; diabolical cases (e.g. power failures, corner cases, incorrect handling); performance; usability
• Includes: test cases you plan to run (numbered/named); expected results; ordering of tests & dependencies; supporting materials needed; traceability to requirements

  10. Power! What is it?
• “probability that one actor within a social relationship will be in a position to carry out his own will despite resistance” – Max Weber
• “interpersonal relationship in which one individual (or group) has the ability to cause another individual (or group) to take an action that would not be taken otherwise” – Steers & Black

  11. Types of Power
• Referent/charismatic: based on personal qualities, characteristics, reputation
• Expert: knowledge or expertise relevant to the person; power only in the domain of expertise
• Legitimate: the person has the right to exert power in a specified domain
• Reward: controls rewards a person wants
• Coercive: based on fear; the person can administer punishment

  12. Effectiveness
• Which types of power are most effective?
• Power within the class teams?

  13. Inspections
• Inspection objectives: find defects at the earliest possible point; verify against the specification (e.g. design to requirements); verify against standards; collect element & process data; set a baseline point
• Exit criteria: all detected defects resolved; outstanding non-blocking issues tracked
• Techniques & methods: generic checklists & standards; inspectors prepared in advance; focus on problems, not on resolution; peers only; “mandatory” data collection
• Roles: moderator, reader, recorder, inspector, author

  14. Inspection Logistics
• Identify moderator (for TSPi, use the process manager)
• Inspection briefing: identify inspection roles; set a date/time for the inspection
• Review product
  • Individual reviews
  • Record time spent reviewing
  • Identify defects, but do not log them on the LOGD form (defects are recorded during the inspection on the INS & LOGD forms)
  • Typically allow 3–5 days for an adequate review period
• Inspection meeting
  • Obtain & record preparation data
  • Step through the product one line or section at a time
  • Raise defects or questions
  • Defects recorded by the moderator on the INS form
  • Defects recorded by the producer on the LOGD form (no need to use Change Requests)
  • Peripheral issues & action items recorded in the ITL log

  15. Inspection Logistics (continued)
• Estimate remaining defects: TBD (but, for each defect, record all members who identified it)
• Conclude meeting
  • Agree on a verification method for defects
  • Agree on the disposition (e.g. approved, approved with modification, re-inspect)
• Rework product & verify fixes (e.g. by the moderator)
• Obtain signatures of all inspectors on the baseline sheet (file as a quality record)

  16. Spelling Code Inspection

  17. Measurement Data & Metrics
• Base metrics: # & type of defects found (major, minor); who found each defect; # pages inspected; preparation time (per inspector); inspection time
• Measures:
  Preparation rate = # pages / average preparation time
  Inspection rate = # pages / inspection time
  Inspection defect rate = # major defects / inspection time
  Defect density = # estimated defects / # pages
  Inspection yield = # defects found / # estimated defects (individual & team)
  Phase defect containment (%) (e.g. for the SRS phase) = 100% * # defects removed at the step / (incoming defects + injected defects)
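The measures above follow directly from the base metrics. A minimal sketch in Python (the function name, parameter names, and sample numbers are all hypothetical):

```python
def inspection_measures(pages, prep_times, inspection_time,
                        major_defects, estimated_defects, defects_found):
    """Derive inspection measures from the base metrics.

    pages           -- number of pages inspected
    prep_times      -- per-inspector preparation times (hours)
    inspection_time -- duration of the inspection meeting (hours)
    """
    avg_prep = sum(prep_times) / len(prep_times)
    return {
        "preparation_rate": pages / avg_prep,                 # pages/hour
        "inspection_rate":  pages / inspection_time,          # pages/hour
        "defect_rate":      major_defects / inspection_time,  # majors/hour
        "defect_density":   estimated_defects / pages,
        "yield_pct":        100 * defects_found / estimated_defects,
    }

# Hypothetical inspection: 10 pages, three inspectors, 2-hour meeting.
m = inspection_measures(pages=10, prep_times=[2, 2, 3], inspection_time=2,
                        major_defects=4, estimated_defects=12, defects_found=9)
print(m["inspection_rate"], m["yield_pct"])  # → 5.0 75.0
```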

  18. Inspections: Estimating Defects
• Capture-recapture example:
  • Catch 20 fish in a lake, tag & release them
  • Catch 25 more; 5 are tagged
  • How many fish are in the lake?
  • 5 out of 25 = 20 out of the total population. Total = ?

  19. Capture-ReCapture Formula
• Fishing example: 5 out of 25 = 20 out of Total
• C = # caught on both tries (e.g. 5)
• A = # from the first try (e.g. 20)
• B = # from the second try (e.g. 25)
• So C out of B = A out of Total, i.e. C/B = A/Total, hence Total = A*B/C
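The fishing arithmetic can be checked in a couple of lines; a minimal sketch (the function name is mine):

```python
def estimate_total(first_sample, second_sample, overlap):
    """Capture-recapture estimate: Total = A * B / C.

    first_sample  -- A, # tagged on the first try
    second_sample -- B, # caught on the second try
    overlap       -- C, # on the second try that were already tagged
    """
    if overlap == 0:
        raise ValueError("no overlap between samples; estimate is unbounded")
    return first_sample * second_sample / overlap

# Fishing example from the slide: 20 tagged, 25 caught, 5 tagged among them.
print(estimate_total(20, 25, 5))  # → 100.0
```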

  20. Estimating Defects: Two-Developer Case
• C = # found on both tries = # found by both developers
• A = # from the first try = # found by developer A
• B = # from the second try = # found by developer B
• Total # defects = A*B/C
• Yield = # found / total # defects, expressed as a percentage
  = 100 * (A+B-C) / (A*B/C) = 100 * (A+B-C)*C / (A*B)
• Humphrey’s two-developer example: A found 7, B found 5, 3 defects in common
  Total estimated # defects = (7*5)/3 ≈ 12
  Yield = 9 / 12 = 75%
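Humphrey's two-developer example can be reproduced in a few lines (rounding the total to a whole defect count, as the slide does; the function name is mine):

```python
def two_developer_estimate(a, b, c):
    """Capture-recapture defect estimate for two inspectors.

    a -- # defects found by developer A
    b -- # defects found by developer B
    c -- # defects found by both
    """
    total = round(a * b / c)   # 7*5/3 = 11.67, rounded to 12 on the slide
    found = a + b - c          # unique defects actually found
    return total, 100 * found / total

print(two_developer_estimate(7, 5, 3))  # → (12, 75.0)
```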

  21. Estimating Defects: Three-Developer Example
• Three developers in an inspection identified 10 unique defects (numbered 1 to 10):
  • Harry found defects 1, 2, 3, 4 & 5
  • Chapin found defects 1, 2, 4, 6 & 7
  • Sue found defects 4, 6, 7, 8, 9 & 10
• Estimate the total # of defects in the product prior to inspection, and the total inspection yield.
• Sue found the most defects unique to her (3), with 6 defects identified in total
• Combining Harry’s & Chapin’s defects gives 7 identified, 3 of them in common with Sue’s
• Total product defects = 6 * 7 / 3 = 14
• Yield % = 100 * 10/14 = 71%
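The three-developer case works the same way using sets: the inspector with the most unique finds serves as one capture-recapture "sample", and the other two are pooled into the other. A sketch of the slide's worked example:

```python
# Defect IDs found by each inspector (from the slide).
harry  = {1, 2, 3, 4, 5}
chapin = {1, 2, 4, 6, 7}
sue    = {4, 6, 7, 8, 9, 10}

# Sue found the most defects unique to her, so she is the "second sample";
# Harry and Chapin are pooled into the "first sample".
pooled = harry | chapin                        # 7 defects
common = pooled & sue                          # 3 defects in common

total = len(pooled) * len(sue) // len(common)  # 7 * 6 / 3 = 14
found = len(harry | chapin | sue)              # 10 unique defects found
yield_pct = round(100 * found / total)         # 71
print(total, yield_pct)  # → 14 71
```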

  22. Capture-ReCapture Assumptions & Cautions
• The population is homogeneous
• The population is randomly distributed
• The sample sizes are reasonably large

  23. Backup Slides
