Team Software Project (TSP)
June 12, 2007
Requirements Inspection & High Level Designs



Outline
  • Key Discussions from last week (Project Risks)
  • Configuration Management
  • Schedule & Cross Team Inspections
  • Requirements Overview
    • General
    • Specifics to look for in LOC Counter SRS
  • Reviews & Inspection Summary
  • SRS Inspection Process
  • Measurement Data Collection
  • <SRS Inspection>
  • Design Phase Overview


SRS & Test Plan
  • System Test manager participates in SRS inspection of other team**
  • Reviews for clarity and completeness of requirements
  • Requirements provide the basis for the test plan
  • Test plan (baseline next week) inspected by other team**

Schedule & Cross Team Inspections
  • Requirements (SRS)
    • Initial draft out for review June 10
    • Final draft for inspection June 12
    • Inspection June 12
    • Baselined June 15
  • System Test Plan
    • Draft out for review June 17
    • Inspection June 19
    • Baselined June 22
  • High Level Design (SDS)
    • Inspected & Baselined June 19


Configuration Management Process
  • Three aspects:
    • Change Tracking
    • Version Control
    • Library
  • Objectives:
    • Product content known & available at all times
    • Product configuration is documented & provides known basis for changes
    • Products labeled & correlated w/ associated requirements, design & product info
    • Product functions traceable from requirements to delivery
    • All product contents properly controlled & protected
    • Proposed changes identified & evaluated for impact prior to go/no go decisions


Configuration Management
  • Types of product elements
    • Maintained (e.g. software)
    • Baselined & Maintained (e.g. SRS)
    • Quality Records (e.g. meeting notes, action item list)
  • Key Functions
    • Latest Version of each product element (software, documents, quality records)
    • Copies of prior versions of each product element
    • Who changed from previous version
    • When changed
    • What changed
    • Why changed
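The key functions above amount to keeping a version history per product element: the latest version, prior versions, and who / when / what / why for each change. A minimal sketch in Python (class and field names are illustrative, not from TSPi):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Revision:
    """One change to a product element: who / when / what / why."""
    version: str      # e.g. "1.1"
    author: str       # who changed it
    changed_on: date  # when it changed
    summary: str      # what changed
    rationale: str    # why it changed (e.g. the CR that drove it)

@dataclass
class ProductElement:
    """A maintained element: software, document, or quality record."""
    name: str
    history: list[Revision] = field(default_factory=list)

    def latest(self) -> Revision:
        """The current version is the last entry in the history."""
        return self.history[-1]

# Example history for an SRS document (dates/CR numbers illustrative).
srs = ProductElement("SRS")
srs.history.append(Revision("1.0", "alice", date(2007, 6, 15),
                            "Cycle 1 baseline", "Baseline per CM plan"))
srs.history.append(Revision("1.1", "bob", date(2007, 6, 20),
                            "Clarify security requirements", "CR 101"))
print(srs.latest().version)  # -> 1.1
```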


Configuration Management Plan
  • Configuration identification
    • Name Configuration items
    • Owner
    • Storage location
  • Configuration control procedure
    • Process for making changes, lock-out procedures (e.g. check-out, check-in procedures)
  • Configuration control board (CCB)
    • Reviews all change requests
    • Determine if change is appropriate, well understood & resources available
    • Approvals, commitments
    • Open question: hold defect fixes for the CCB vs. the urgency of the change?
  • Configuration change request form (CCR, aka CR)
  • Baseline Process (see page 326)
  • Backup procedures & facilities
  • Configuration status reports (CSR)
  • Software Change Management status reports @ weekly meeting


Baseline Considerations


  • Defined in Configuration Management Plan
  • Review / inspection completed & stakeholders recommend approval for baseline
  • All major and blocking issues should be resolved
  • CRs tracking any remaining (and unresolved) issues


  • Update version # to reflect baselined document (e.g. 1.0)
  • Place under change control

Project “Baseline” – snapshot of CIs, baselined & current versions


Automated Configuration Mgmt
  • Lucent: Sablime / SBCS & SCCS
  • Rational: DDTS / ClearCase
  • Perforce Software: Perforce
  • Microsoft: Visual SourceSafe



Change Workflow

[Workflow state diagram; only the initial state, "New / Proposed", is recoverable from the transcript]
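Only the initial "New / Proposed" state survives in the transcript. A typical CR workflow can be sketched as a small state machine; the remaining state names below are common practice, not taken from the slide:

```python
# Hypothetical CR states; only "New/Proposed" appears on the original slide.
TRANSITIONS = {
    "New/Proposed": {"Approved", "Rejected"},
    "Approved": {"Implemented"},
    "Implemented": {"Verified"},
    "Verified": {"Closed"},
    "Rejected": {"Closed"},
    "Closed": set(),
}

def advance(state: str, target: str) -> str:
    """Move a change request to a new state, enforcing allowed transitions."""
    if target not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {target}")
    return target

# Walk one CR through a full approve/implement/verify/close cycle.
state = "New/Proposed"
for nxt in ("Approved", "Implemented", "Verified", "Closed"):
    state = advance(state, nxt)
print(state)  # -> Closed
```

Encoding the transitions as data (rather than scattered if-statements) mirrors how the CCB's review/approve gate sits between proposal and implementation.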

Requirements Phase
  • Completed & inspected SRS document
  • Completed inspection form (INS)
  • Time, defect & size data collected
  • Configuration Management Plan*
  • Updated project notebook

Note: On baselining the SRS, the document should be placed under change control


Requirements Drivers
  • Functional Needs Statement
  • SW Requirements Specification

Software Requirements Specification (SRS)

Provides the information necessary for understanding the proposed product and for explaining/justifying the need for various product attributes (user code & documentation)

References:
  • IEEE 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology
  • IEEE 830-1998, IEEE Recommended Practice for Software Requirements Specifications
  • IEEE 1220-1998, Application and Management of the Systems Engineering Process
  • IEEE 1233-1998, Guide for Developing System Requirements Specifications


Software Requirements Statements
  • Unambiguous:
    • All involved (e.g. customers, developers, testers) interpret the statement in the same way
    • A glossary defining each important term can help
  • Correct:
    • Describes a condition or attribute that is required of the final product, and all agree this is the case
    • Each requirements statement must also be compatible with prior information
  • Verifiable:
    • The requirement can be verified prior to delivery to the customer by inspection or test
    • To satisfy, use concrete terms and measurable quantities whenever possible
  • Consistent:
    • Individual requirements do not conflict with one another
  • Complete:
    • All significant requirements for the product are provided (e.g. input: responses for both valid & invalid data)


Software Requirements Types
  • Functional:
    • Specific actions the program needs to perform in order to meet users' needs
    • Defined or quantified based upon customer expectations
  • Quality:
    • Various attributes including reliability, usability, efficiency, maintainability, portability, etc.
  • Performance:
  • Regulatory:
    • Industry standards (e.g. TL9000)
    • Government/regulatory (e.g. UL)
  • Security:

Security Requirements
  • Policy: what to secure, against what threats, by what means? Who is authorized?
  • Confidentiality: preventing unauthorized reading or knowledge
  • Integrity: preventing unauthorized modification or destruction
  • Availability: accessible to authorized users
  • Privileges: controlling access and actions based on authorizations
  • Identification & authentication: challenging users to prove identity (e.g. passwords, codes)
  • Correctness: mediation of access to prevent bypassing controls or tampering with system/data
  • Audit: logging to assist in identifying when a security attack has been attempted


Requirements Identification
  • Requirements should be numbered or labeled, e.g.:
    • Requirement XX Start
    • Requirement XX End
    • Requirement XX comment
  • Include the release (e.g. cycle) number as part of the label
  • Traceable to the functional need statement (see next slide)
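The Start/End marker scheme above lends itself to mechanical extraction. A sketch in Python, assuming markers of the form "Requirement <label> Start" … "Requirement <label> End" (the sample label below includes a cycle prefix, as the slide suggests, but is otherwise invented):

```python
import re

# Marker style taken from the slide; exact spelling is an assumption.
REQ_BLOCK = re.compile(
    r"Requirement\s+(?P<label>\S+)\s+Start"
    r"\s*(?P<body>.*?)\s*"
    r"Requirement\s+(?P=label)\s+End",
    re.DOTALL,
)

def extract_requirements(text: str) -> dict[str, str]:
    """Return {label: body} for each labeled requirement block in the SRS text."""
    return {m.group("label"): m.group("body") for m in REQ_BLOCK.finditer(text)}

srs_text = """
Requirement C1-R3 Start
The counter shall ignore blank lines.
Requirement C1-R3 End
"""
print(extract_requirements(srs_text))
# -> {'C1-R3': 'The counter shall ignore blank lines.'}
```

The backreference `(?P=label)` ensures a Start marker is only paired with an End marker carrying the same label, catching mismatched or nested markers.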


Requirements Traceability
  • Backwards traceability includes explicit information identifying the higher-level requirements from which a lower-level requirement derives
  • Traceability should cover all phases (e.g. functional need – requirements, requirements – design, design – code, requirements – test)
  • Ensures:
    • nothing is left out of the product
    • change impact assessments are possible
  • Trace tables:
    • Backwards trace table shows the link from a lower level (e.g. SRS) to a higher level (e.g. Strat form); part of the lower-level document
    • Forwards trace table shows the lower-level requirements derived from an upper-level requirement
  • LOC Project – generate a backwards trace table*
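A backwards trace table is essentially a mapping from each lower-level requirement to its parent(s), and the forwards table is its inverse. A sketch (the SRS/FN labels are invented for illustration, not from the actual LOC project documents):

```python
# Backwards trace: each SRS requirement -> the functional-need item(s)
# it derives from. Labels are hypothetical examples.
backwards = {
    "SRS-1": ["FN-1"],
    "SRS-2": ["FN-1", "FN-3"],
    "SRS-3": ["FN-2"],
}

def forwards(back: dict[str, list[str]]) -> dict[str, list[str]]:
    """Invert a backwards trace table to obtain the forwards table."""
    fwd: dict[str, list[str]] = {}
    for low, highs in back.items():
        for high in highs:
            fwd.setdefault(high, []).append(low)
    return fwd

print(forwards(backwards))
# -> {'FN-1': ['SRS-1', 'SRS-2'], 'FN-3': ['SRS-2'], 'FN-2': ['SRS-3']}
```

Because the forwards table is derivable, only the backwards table needs to be maintained by hand in the lower-level document, which is what the LOC project deliverable asks for.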


SRS Document Baseline/Change History
  • Tracks all versions and modifications
  • Version numbering scheme documented in the CM plan
  • Change request information tracks to CRs

Example:
  • Version 0.1 – Pre-baseline version for review
  • Version 1.0 – Cycle 1 baseline version
  • Version 1.1
    • CR 101 – Clarify security requirements
    • CR 102 – Delete support for VB files
  • Version 2.0 – Cycle 2 baseline version
    • Adds the following features ….
  • Version 2.1 …


SRS Characteristics Summary
  • Detailed, clearly delineated, concise, unambiguous & testable (quantifiable)

Additions / Enhancements
  • Requirements should be numbered or labeled, e.g.:
    • Requirement XX Start
    • Requirement XX End
    • Requirement XX comment
  • Traceable to the functional need statement
  • Inspected & baselined
  • Maintained under change control
  • Document includes structural elements including:
    • Baseline/change history
    • Approval page
    • Customer documentation specifications


LOC Counter Requirements
(See also TSPi pp. 112-113)
  • Overall description and framework of GUI (if provided)
  • File formats (ANSI text) & extensions (.c, .cc) supported
  • Limits on file names (e.g. max characters)
  • Additional features (e.g. browsing for input file)
  • Error cases: one or both files empty, non-existent, unable to be opened
  • Results of comparison algorithm
    • Output if identical lines are moved (e.g. Line A, B, C, D vs. Line A, C, B, D)
  • Treatment of comments (in-line & alone), blank lines, braces (LOC counting)
  • Multi-line statements / comments
  • Format and location of output (e.g. screen, file, directory)
  • All errors including messages (invalid inputs, algorithm errors, etc.)
  • Product installation & execution
  • User documentation plan
  • Response time
  • Scalability (e.g. max file sizes supported)
  • HW requirements (e.g. processor, hard drive, display resolution, OS, peripherals such as mouse)
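Several of the counting rules above (blank lines, comments, brace-only lines) come down to a filtering pass over the source. A minimal sketch for C-style files; the counting conventions chosen here are assumptions for illustration, not the course's official counting standard:

```python
def count_loc(lines: list[str]) -> int:
    """Count logical lines of code, skipping blank lines, // and /* */
    comments, and lines holding only a brace. Conventions illustrative."""
    loc = 0
    in_block_comment = False
    for raw in lines:
        line = raw.strip()
        if in_block_comment:
            if "*/" not in line:
                continue  # still inside a multi-line comment
            in_block_comment = False
            line = line.split("*/", 1)[1].strip()
        if line.startswith("/*"):
            if "*/" not in line:
                in_block_comment = True  # comment continues on later lines
                continue
            line = line.split("*/", 1)[1].strip()
        line = line.split("//", 1)[0].strip()  # drop trailing // comment
        if not line or line in "{}":
            continue  # blank or brace-only line: not counted
        loc += 1
    return loc

src = [
    "/* header comment */",
    "int main(void)",
    "{",
    "    int n = 0;  // counter",
    "",
    "    return n;",
    "}",
]
print(count_loc(src))  # -> 3
```

Even this small sketch surfaces the specification questions the slide lists: whether braces count, how in-line comments are handled, and what happens with multi-line constructs.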


Why Do Reviews / Inspections?
  • Can identify defects early in the process
    • More efficient (i.e. cheaper) defect removal
  • Leverages the knowledge of multiple engineers
  • Leverages different viewpoints
    • Improves defect detection odds
  • Broadens understanding of the product being inspected*


Inspections vs Reviews
  • Inspections:
    • Formal, typically requires face-to-face meetings
    • Measurement data collected
    • Disposition of product agreed to
    • Quality records available
  • Reviews:
    • Can be face-to-face or an email exchange
    • Measurement data and quality records optional
    • Typically used for early product work & small code changes

Peer Reviews
  • Review objectives:
    • Find defects
    • Improve the software element
    • Consider alternatives
    • Possibly, educate reviewers
  • Desk check: informal, typically a single peer, effectiveness?
  • Walk-through: informal, several peers, notes taken, data collection optional
    • Variant: test walk-through

Inspections
  • Inspection objectives:
    • Find defects at the earliest possible point
    • Verify to specification (e.g. design to requirements)
    • Verify to standards
    • Collect element and process data
    • Set a baseline point
  • Exit criteria:
    • All detected defects resolved
    • Outstanding, non-blocking issues tracked
  • Techniques & methods:
    • Generic checklists & standards
    • Inspectors prepared in advance
    • Focus on problems, not on resolution
    • Peers only
    • “Mandatory” data collection
  • Roles: moderator, reader, recorder, inspector, author


Inspection Logistics
  • Identify moderator (for TSPi, use the process manager)
  • Inspection briefing (identify inspection roles, set date/time for inspection)
  • Review product
    • Individual reviews
    • Record time spent reviewing
    • Identify defects, but do not log on the LOGD form (defects are recorded during the inspection on the INS & LOGD forms)
    • Typically want 3-5 days for an adequate review period
  • Inspection meeting
    • Obtain & record preparation data
    • Step through the product one line or section at a time
    • Raise defects or questions
    • Defects recorded by the moderator on the INS form
    • Defects recorded by the producer on the LOGD form (no need to use Change Requests)
    • Peripheral issues & action items should be recorded in the ITL log*


Inspection Logistics (continued)
  • Estimate remaining defects
    • TBD (but, for each defect, record all members who identified it)
  • Conclude meeting
    • Agree on the verification method for defects
    • Agree on disposition (e.g. approved, approved with modification, re-inspect)
  • Rework product & verify fixes (e.g. by the moderator)
  • Obtain signatures of all inspectors on the baseline sheet (file as a quality record)


Measurement Data & Metrics
  • Base metrics:
    • # & type of defects found (major, minor)
    • For each defect, who found it
    • # of pages inspected, preparation time (per inspector), inspection time
  • Derived metrics:
    • Preparation rate = # pages / average preparation time
    • Inspection rate = # pages / inspection time
    • Inspection defect rate = # major defects / inspection time
    • Defect density = # estimated defects / # of pages
    • Inspection yield = # defects found / # estimated defects (individual & team)
    • SRS phase defect containment (%) = 100% × # defects removed @ step / (incoming defects + injected defects)
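The derived metrics above follow directly from the base data. A sketch in Python (function and field names are illustrative; times assumed to be in hours):

```python
def inspection_metrics(pages: int, prep_times: list[float],
                       inspection_time: float, major_defects: int,
                       estimated_defects: int) -> dict[str, float]:
    """Derived inspection metrics per the slide's formulas."""
    avg_prep = sum(prep_times) / len(prep_times)  # average over inspectors
    return {
        "preparation_rate": pages / avg_prep,            # pages per hour
        "inspection_rate": pages / inspection_time,      # pages per hour
        "defect_rate": major_defects / inspection_time,  # majors per hour
        "defect_density": estimated_defects / pages,     # defects per page
        "yield_pct": 100.0 * major_defects / estimated_defects,
    }

def defect_containment(removed: int, incoming: int, injected: int) -> float:
    """Phase defect containment (%) = 100 * removed / (incoming + injected)."""
    return 100.0 * removed / (incoming + injected)

# Example with invented numbers: 20 pages, three inspectors.
m = inspection_metrics(pages=20, prep_times=[2.0, 1.5, 2.5],
                       inspection_time=2.0, major_defects=8,
                       estimated_defects=10)
print(m["inspection_rate"], m["yield_pct"])  # -> 10.0 80.0
```

Collecting the base measurements on the INS form during the meeting is what makes these rates computable afterwards.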