
Team Software Project (TSP) June 12, 2007 Requirements Inspection & High Level Designs

1. Team Software Project (TSP) June 12, 2007 Requirements Inspection & High Level Designs

2. Outline
• Key Discussions from last week (Project Risks)
• Configuration Management
• Schedule & Cross Team Inspections
• Requirements Overview
  • General
  • Specifics to look for in LOC Counter SRS
• Reviews & Inspection Summary
• SRS Inspection Process
• Measurement Data Collection
• <SRS Inspection>
• Design Phase Overview

  3. SRS & Test Plan System Test manager participates in SRS inspection of other team** Reviews for clarity and completeness of requirements Requirements provide basis for test plan Test plan (baseline next week) inspected by other team** 2007_06_12_Rqmts.ppt

4. Milestones
• Requirements (SRS)
  • Initial draft out for review June 10
  • Final draft for inspection June 12
  • Inspection June 12
  • Baselined June 15
• System Test Plan
  • Draft out for review June 17
  • Inspection June 19
  • Baselined June 22
• High Level Design (SDS)
  • Inspected & baselined June 19

5. Configuration Management Process
• Three aspects:
  • Change Tracking
  • Version Control
  • Library
• Objectives:
  • Product content known & available at all times
  • Product configuration is documented & provides a known basis for changes
  • Products labeled & correlated w/ associated requirements, design & product info
  • Product functions traceable from requirements to delivery
  • All product contents properly controlled & protected
  • Proposed changes identified & evaluated for impact prior to go/no-go decisions

6. Configuration Management
• Types of product elements
  • Maintained (e.g. software)
  • Baselined & Maintained (e.g. SRS)
  • Quality Records (e.g. meeting notes, action item list)
• Key functions
  • Latest version of each product element (software, documents, quality records)
  • Copies of prior versions of each product element
  • Who changed from previous version
  • When changed
  • What changed
  • Why changed

7. Configuration Management Plan
• Configuration identification
  • Name of configuration items
  • Owner
  • Storage location
• Configuration control procedure
  • Process for making changes, lock-out procedures (e.g. check-out, check-in procedures)
• Configuration control board (CCB)
  • Reviews all change requests
  • Determines if change is appropriate, well understood & resources available
  • Approvals, commitments
  • ? Defects: holding for CCB vs. urgency of change ?
• Configuration change request form (CCR, aka CR) – see the sketch below
• Baseline process (see page 326)
• Backup procedures & facilities
• Configuration status reports (CSR)
  • Software Change Management status reports @ weekly meeting
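A minimal sketch of the kind of change request (CCR/CR) record the plan calls for, kept as a Python data structure purely for illustration. The ChangeRequest class, its field names, and the disposition values are assumptions, not the team's actual form; only the CR 101 example text comes from slide 19.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative change request record; the real CCR form is defined by the CM plan.
@dataclass
class ChangeRequest:
    cr_id: str                  # e.g. "CR 101"
    config_item: str            # affected configuration item, e.g. "SRS"
    description: str            # what change is requested and why
    submitted: date
    impact: str = ""            # impact assessment reviewed by the CCB
    disposition: str = "New"    # CCB decision, e.g. Approved, Declined, Deferred

cr = ChangeRequest("CR 101", "SRS", "Clarify security requirements", date(2007, 6, 13))
cr.disposition = "Approved"     # set after CCB review
```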

8. Baseline Considerations
• Criteria
  • Defined in Configuration Management Plan
  • Review / inspection completed & stakeholders recommend approval for baseline
  • All major and blocking issues should be resolved
  • CRs tracking any remaining (and unresolved) issues
• Actions
  • Update version # to reflect baselined document (e.g. 1.0)
  • Place under change control
• Project “Baseline” – snapshot of CIs, baselined & current versions

9. Automated Configuration Mgmt
• Lucent: Sablime / SBCS & SCCS
• Rational: DDTS / ClearCase
• Perforce Software: Perforce
• Microsoft: Visual SourceSafe
• MKS

10. Change Workflow
States: New / Proposed, Assigned, Study, No Change, Deferred, Declined, Resolved, Integrated, Delivered, Verified
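The slide is a workflow diagram flattened into a list of states, so the transitions themselves are not recoverable from the text. The sketch below encodes one plausible reading (Study ends in No Change, Deferred, Declined, or Resolved; only Resolved changes flow on to Integrated, Delivered, Verified) as an explicitly assumed state machine, not the team's actual workflow.

```python
from enum import Enum, auto

class CRState(Enum):
    NEW_PROPOSED = auto()
    ASSIGNED = auto()
    STUDY = auto()
    NO_CHANGE = auto()
    DEFERRED = auto()
    DECLINED = auto()
    RESOLVED = auto()
    INTEGRATED = auto()
    DELIVERED = auto()
    VERIFIED = auto()

# Assumed transitions; the branch structure is illustrative only.
TRANSITIONS = {
    CRState.NEW_PROPOSED: {CRState.ASSIGNED},
    CRState.ASSIGNED: {CRState.STUDY},
    CRState.STUDY: {CRState.NO_CHANGE, CRState.DEFERRED,
                    CRState.DECLINED, CRState.RESOLVED},
    CRState.DEFERRED: {CRState.ASSIGNED},      # deferred CRs may be re-assigned later
    CRState.RESOLVED: {CRState.INTEGRATED},
    CRState.INTEGRATED: {CRState.DELIVERED},
    CRState.DELIVERED: {CRState.VERIFIED},
}

def advance(current: CRState, target: CRState) -> CRState:
    """Move a change request to a new state, rejecting illegal transitions."""
    if target not in TRANSITIONS.get(current, set()):
        raise ValueError(f"Illegal transition: {current.name} -> {target.name}")
    return target

state = advance(CRState.NEW_PROPOSED, CRState.ASSIGNED)
```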

11. Requirements Phase Outputs
• Completed & inspected SRS document
• Completed inspection form (INS)
• Time, defect & size data collected
• Configuration Management Plan*
• Updated project notebook
Note: On baselining the SRS, the document should be placed under change control

12. Requirements Drivers
(diagram) Functional Needs Statement, SW Requirements Specification, Development, Test, User Documentation, Customer

13. Software Requirements Specification (SRS)
Objective: Provide information necessary for understanding the proposed product and to explain/justify the need for various product attributes (user code & documentation)
Standards:
• IEEE 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology
• IEEE 830-1998, IEEE Recommended Practice for Software Requirements Specifications
• IEEE 1220-1998, Application and Management of the Systems Engineering Process
• IEEE 1233-1998, Guide for Developing System Requirements Specifications

14. Software Requirements Statements
• Unambiguous: All involved (e.g. customers, developers, testers) interpret the statement in the same way
  • A glossary defining each important term can help
• Correctness: Describes a condition or attribute that is required of the final product & all agree this is the case
  • Also, each requirements statement must be compatible with prior information
• Verifiable: Requirement can be verified prior to delivery to the customer by inspection or test
  • To satisfy, use concrete terms and measurable quantities whenever possible
• Consistency: Assure individual requirements do not conflict with one another
• Completeness: All significant requirements for the product are provided (e.g. input: responses for both valid & invalid data)

15. Software Requirements Types
• Functional: Specific actions that the program needs to perform in order to meet users’ needs
  • Defined or quantified based upon customer expectations
• Quality: Various attributes including reliability, usability, efficiency, maintainability, portability, etc.
• Performance:
• Regulatory:
  • Industry Standards (TL9000)
  • Government/Regulatory (e.g. UL)
• Security:

16. Security Requirements
• Policy: what to secure, against what threats, by what means? Who is authorized?
• Confidentiality: preventing unauthorized reading or knowledge
• Integrity: preventing unauthorized modification or destruction
• Availability: accessible to authorized users
• Privileges: controlling access and actions based on authorizations
• Identification & authentication: challenging users to prove identity (e.g. passwords, codes)
• Correctness: mediation of access to prevent bypassing controls or tampering with system/data
• Audit: log to assist in identifying when a security attack has been attempted

17. Requirements Identification
• Requirements should be numbered or labeled, e.g.:
  • Requirement XX Start
  • Requirement XX End
  • Requirement XX comment
• Include release (e.g. cycle) number as part of the label
• Traceable to functional need statement (see next slide)
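Assuming the "Requirement XX Start / End" labeling convention sketched above, a short check like the one below can flag labels that are opened but never closed in a draft SRS. The regexes and the sample requirement IDs (which fold in a cycle number, per the slide) are illustrative; the actual label format is whatever the team's SRS defines.

```python
import re

# Illustrative consistency check for requirement labels in an SRS text.
START = re.compile(r"Requirement\s+(\S+)\s+Start")
END = re.compile(r"Requirement\s+(\S+)\s+End")

def unbalanced_requirements(srs_text: str) -> set[str]:
    """Return requirement ids whose Start and End markers do not pair up."""
    starts = set(START.findall(srs_text))
    ends = set(END.findall(srs_text))
    return starts ^ ends

sample = """Requirement C1-01 Start
The program shall count logical lines of code in ANSI text files.
Requirement C1-01 End
Requirement C1-02 Start
Blank lines shall not be counted."""
print(unbalanced_requirements(sample))   # {'C1-02'} - missing End marker
```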

18. Requirements Traceability
• Backwards traceability includes explicit information that identifies the higher level requirements that the lower level requirement derives from
• Traceability should cover all phases (e.g. functional need – requirements, requirements – design, design – code, requirements – test)
• Ensures:
  • nothing is left out of the product
  • change impact assessments
• Trace tables:
  • Backwards trace table shows the link from the lower level (e.g. SRS) to the higher level (e.g. Strat form)
    • Part of the lower level document
  • Forwards trace table shows the lower level requirements derived from an upper level requirement
• LOC Project – generate a backwards trace table*
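A minimal sketch of a backwards trace table as a mapping from SRS requirement IDs to the functional-need items they derive from, plus the inversion that yields the forwards table. All IDs below are made up for illustration; the real table lives in the SRS itself.

```python
from collections import defaultdict

# Backwards trace: each SRS requirement points at the higher level item it derives from.
backwards_trace = {
    "SRS-C1-01": ["NEED-1"],   # hypothetical: LOC counting rules
    "SRS-C1-02": ["NEED-1"],
    "SRS-C1-03": ["NEED-2"],   # hypothetical: comparison of two source files
}

def forwards_trace(backwards: dict[str, list[str]]) -> dict[str, list[str]]:
    """Invert a backwards trace table to list the lower level requirements
    derived from each upper level requirement."""
    forwards = defaultdict(list)
    for low, highs in backwards.items():
        for high in highs:
            forwards[high].append(low)
    return dict(forwards)

print(forwards_trace(backwards_trace))
# {'NEED-1': ['SRS-C1-01', 'SRS-C1-02'], 'NEED-2': ['SRS-C1-03']}
```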

19. SRS Document Baseline/Change History
• Tracks all versions and modifications
• Version numbering scheme documented in CM plan
• Change request information tracks to CRs, e.g.:
  • Version 0.1 – Pre-baseline version for review
  • Version 1.0 – Cycle 1 baseline version
  • Version 1.1
    • CR 101 – Clarify security requirements
    • CR 102 – Delete support for VB files
  • Version 2.0 – Cycle 2 baseline version; adds the following features ….
  • Version 2.1 …

20. SRS Characteristics Summary
• Detailed, clearly delineated, concise, unambiguous & testable (quantifiable)
• Changes
  • Defects
  • Clarifications
  • Additions / Enhancements
• Requirements should be numbered or labeled, e.g.:
  • Requirement XX Start
  • Requirement XX End
  • Requirement XX comment
• Traceable to functional need statement
• Inspected & baselined
• Maintained under change control
• Document includes structural elements including:
  • Baseline/change history
  • Approval page
  • Customer documentation specifications

21. LOC Counter Requirements (see also TSPi pp. 112-113)
• Overall description and framework of GUI (if provided)
• Input
  • File formats (ANSI text) & extensions (.c, .cc) supported
  • Limits on file names (e.g. max characters)
  • Additional features (e.g. browsing for input file)
  • Error cases: one or both files empty, non-existent, unable to be opened
• Results of Comparison Algorithm
  • Output if identical lines are moved (e.g. Line A, B, C, D vs. Line A, C, B, D)
  • Treatment of comments (in-line & alone), blank lines, braces (LOC counting) – see the counting sketch below
  • Multi-line statements / comments
• Output
  • Format and location of output (e.g. screen, file, directory)
• Errors
  • All errors including messages (invalid inputs, algorithm errors, etc.)
• Other
  • Product installation & execution
  • User documentation plan
  • Response time
  • Security
  • Scalability (e.g. max file sizes supported)
  • Concurrency
  • HW requirements (e.g. processor, hard drive, display resolution, OS, peripherals such as mouse)
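The SRS must pin down exactly how comments, blank lines, braces, and multi-line constructs are counted. The sketch below counts logical LOC in a C/C++ source file under one possible rule set (blank lines, comment-only lines, and brace-only lines are not counted), purely to illustrate the kind of behavior the requirements need to specify; it is not the rule set the team must adopt, and string literals containing comment markers are deliberately ignored.

```python
import re

def count_loc(path: str) -> int:
    """Count lines of code under an illustrative rule set: blank lines,
    comment-only lines (// and /* ... */), and lines containing only braces
    are not counted. The real counting rules come from the team's SRS."""
    loc = 0
    in_block_comment = False
    with open(path, encoding="ascii", errors="replace") as src:
        for line in src:
            text = line.strip()
            if in_block_comment:
                if "*/" not in text:
                    continue                      # still inside /* ... */
                in_block_comment = False
                text = text.split("*/", 1)[1].strip()
            text = re.sub(r"/\*.*?\*/", "", text)  # block comments closed on this line
            if "/*" in text:                       # block comment runs past this line
                in_block_comment = True
                text = text.split("/*", 1)[0]
            text = text.split("//", 1)[0].strip()  # drop in-line // comments
            if not text or set(text) <= {"{", "}"}:
                continue                           # blank, comment-only, or brace-only
            loc += 1
    return loc

# print(count_loc("program.c"))   # hypothetical input file name
```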

22. Why Do Reviews / Inspections?
• Can identify defects early in the process
  • More efficient (i.e. cheaper) defect removal
• Leverages knowledge of multiple engineers
• Leverages different viewpoints
  • Improves defect detection odds
• Broadens understanding of product being inspected*

23. Inspections vs Reviews
• Inspections
  • Formal, typically requires face-to-face meetings
  • Measurement data collected
  • Disposition of product agreed to
  • Quality records available
• Reviews
  • Informal
  • Can be face-to-face or an email exchange
  • Measurement data and quality records optional
  • Typically used for early product work & small code changes

24. Peer Reviews
• Review objectives:
  • Find defects
  • Improve software element
  • Consider alternatives
  • Possibly, educate reviewers
• Types:
  • Desk check: informal, typically a single peer, effectiveness?
  • Walk-through: informal, several peers, notes taken, data collection optional
    • Variant: test walk-through

25. Inspections
• Inspection objectives
  • Find defects at earliest possible point
  • Verify to specification (e.g. design to requirements)
  • Verify to standards
  • Collect element and process data
  • Set baseline point
• Exit criteria
  • All detected defects resolved
  • Outstanding, non-blocking issues tracked
• Techniques & methods
  • Generic checklists & standards
  • Inspectors prepared in advance
  • Focus on problems, not on resolution
  • Peers only
  • “Mandatory” data collection
• Roles: Moderator, reader, recorder, inspector, author

26. Inspection Logistics
• Identify moderator (for TSPi, use process manager)
• Inspection briefing (identify inspection roles, set date/time for inspection)
• Review product
  • Individual reviews
  • Record time spent reviewing
  • Identify defects, but do not log on LOGD form (defects are recorded during the inspection on INS & LOGD forms)
  • Typically want 3-5 days for an adequate review period
• Inspection meeting
  • Obtain & record preparation data
  • Step through product one line or section at a time
  • Raise defects or questions
  • Defects recorded by moderator on INS form
  • Defects recorded by producer on LOGD form (no need to use Change Requests)
  • Peripheral issues & action items should be recorded in ITL log*

27. Inspection Logistics (continued)
• Estimate remaining defects
  • Method TBD, but for each defect, record all members who identified it (one possible estimate is sketched below)
• Conclude meeting
  • Agree on verification method for defects
  • Agree on disposition (e.g. approved, approved with modification, re-inspect)
• Rework product & verify fixes (e.g. moderator)
• Obtain signatures of all inspectors on baseline sheet (file as quality record)
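The slide leaves the estimation method TBD. Recording which members found each defect is exactly the input a capture-recapture style estimate needs, so the sketch below shows that common two-group calculation as an illustration only, not a method the team has adopted; all inspector sets and numbers are hypothetical.

```python
def estimate_total_defects(found_by: list) -> float:
    """Two-group capture-recapture estimate of total defects in the product.

    found_by: for each inspector, the set of defect ids that inspector found.
    Group 1 is the inspector who found the most defects; group 2 is everyone
    else pooled together. Estimated total = n1 * n2 / overlap.
    """
    best = max(found_by, key=len)
    others = set().union(*(s for s in found_by if s is not best))
    overlap = len(best & others)
    if overlap == 0:
        raise ValueError("No common defects; the estimate is not meaningful")
    return len(best) * len(others) / overlap

# Hypothetical data: defect ids 1-6 found by three inspectors
inspectors = [{1, 2, 3, 4}, {2, 3, 5}, {3, 6}]
total = estimate_total_defects(inspectors)             # 4 * 4 / 2 = 8.0
found = len(set().union(*inspectors))                  # 6 defects actually found
print(f"{total:.1f} estimated in product, {total - found:.1f} estimated remaining")
```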

  28. Measurement Data & Metrics Base Metrics # & Type of Defects found (major, minor) For each defect, who found # of pages inspected, preparation time (per inspector), inspection time Measures Preparation rate = # pages / average preparation time Inspection rate = # pages / inspection time Inspection defect rate = # major defects / inspection time Defect density = # estimated defects / # of pages Inspection yield = # defects / # estimated defects (individual & team) SRS Phase Defect Containment (%) = 100% * # Defects removed @ step / ( Incoming defects + Injected defects) 2007_06_12_Rqmts.ppt
