


Assessing Software Inspection Processes with STAM*

Ron S. Kenett

KPA Ltd. and Tel Aviv University

*STAM: Software Trouble Assessment Matrix. This presentation is extracted from SOFTWARE PROCESS QUALITY: Management and Control by Kenett and Baker, Marcel Dekker Inc., 1998. It was first published as "Assessing Software Development and Inspection Errors", Quality Progress, pp. 109-112, October 1994, with corrections in the February 1995 issue.



Presentation agenda

  • Software life cycles

  • Inspection processes

  • Measurement programs

  • Assessing software inspection processes




Informal software life cycle

[Diagram: work-product artifacts of the informal life cycle: Marketing Requirements Spec, System Requirements Spec, Software Requirements Spec, System Design Spec, System Integration Spec, Software Test Spec, System Test Spec, Acceptance Test Spec, and Design and Construction artifacts]

Web applications life cycle

Design a little ...

Implement a little ...

Test a little ...


Formal software life cycle

Requirements stage (Program Statement)
  • Definition: gather initial requirements, clarify requirements for understanding (Draft Requirements Specification)
  • Analysis: analyze requirements, categorize to expose incomplete areas, and prioritize by importance (Requirements Specification)

Proposal stage
  • Proposal and Project Planning: develop Proposal and Project Plans to fulfill project requirements (Proposal, Project Plans)

Design stage (Functional Description, Design)

Code stage (Code and Unit Test, Documentation)

Verification stage (Technical Testing, System Testing)

Change Control: at each stage, a "Change to Requirements?" decision triggers an update of all related documents, code, and tests to reflect the change.


Software Life Cycle Phases

  • Requirements Analysis
  • Top Level Design
  • Detailed Design
  • Programming
  • Unit Tests
  • System Tests
  • Acceptance Tests


Presentation agenda

  • Software life cycles

  • Inspection processes

  • Measurement programs

  • Assessing software inspection processes


The software development matrix

[Matrix relating Work Products and Key Activities, covering Work Product Development Practices, Work Product Control Practices, and Inspection Practices]

SEI Capability Maturity Model

Maturity Level | Characteristics | Software Inspection Features
Initial | Depends entirely on individuals | None
Repeatable | Policies, procedures, experience base | Writing-Task Rules, QA Policies, Inspection Procedures
Defined | Defined processes, peer reviews | Defect removal, Entry, Exit
Managed | Quantitative goals for product & process | Optimum rates, quality level at exit & entry, data summary, database
Optimizing | Entire organization focused on continuous process improvement | Defect Prevention Process, Improvements logging, Owners, Process Change Management Team

Based on Paulk et al., “Capability Maturity Model Version 1.1”, IEEE Software, July 1993.



Presentation agenda

  • Software life cycles

  • Inspection processes

  • Measurement programs

  • Assessing software inspection processes



Software Measurement Programs



Measurement Program Implementation



Measurement Program Implementation: Plan/Evaluate Phase

  • 4.5.1 Plan/Evaluate Phase

  • 4.5.1.1 Reasons for implementation

    • Establish a baseline from which to determine trends

    • Quantify how much was delivered in terms the client understands

    • Help in estimating and planning projects

    • Compare the effectiveness and efficiency of current processes, tools, and techniques

    • Identify and proliferate best practices

    • Identify and implement changes that will result in productivity, quality, and cost improvements

    • Establish an ongoing program for continuous improvement

    • Quantitatively prove the success of improvement initiatives

    • Establish better communication with customers

    • Manage budgets for software development more effectively



Measurement Program Implementation: Plan/Evaluate Phase

  • 4.5.1 Plan/Evaluate Phase

  • 4.5.1.2 Questions to help identify goals

    • How fast can we deliver reliable software to our customers? Does it satisfy their requirements?

    • Can we efficiently estimate the development cost and schedule? Are the estimates accurate?

    • What can we do to improve our systems-development life cycle and shorten the cycle time?

    • What is the quality of the software we deliver? Has it improved with the introduction of new tools or techniques?

    • How much are we spending to support existing software? Why does one system cost more than another to support?

    • Which systems should be re-engineered or replaced? When?

    • Should we buy or build new software systems?

    • Are we becoming more effective and efficient at software development? Why? Why not?

    • How can we better leverage our information technology?

    • Has our investment in a particular technology increased our productivity?



Measurement Program Implementation: Plan/Evaluate Phase

  • 4.5.1 Plan/Evaluate Phase

  • 4.5.1.3 Identification of sponsors

  • 4.5.1.4 Identification of roles and responsibilities

    • Who will decide what, how, and when to collect the measurement information?

    • Who will be responsible for collecting the measurement information?

    • How will the data be collected? What standards (internal or external) will be used?

    • At which phases will the data be collected? Where will it be stored?

    • Who will ensure consistency of data reporting and collection?

    • Who will input and maintain the measurement information?

    • Who will report measurement results? When?

    • What will be reported to each level of management?

    • Who will interpret and apply the measurement results?

    • Who is responsible for training?

    • Who will maintain an active interest in the measurement program to ensure full usage of the measurement information?

    • Who will evaluate measurement results and improve the measurement program?

    • Who will ensure adequate funding support?



Measurement Program Implementation: Analysis/Implementation/Improve Phases

  • 4.5.2 Analysis Phase

  • 4.5.2.1 Analysis of audience and identification of target metrics

  • 4.5.2.2 Definition of Software Metrics

  • 4.5.3 Implement/Measure Phase

  • 4.5.3.1 Organizing for Just In Time training and education processes

  • 4.5.3.2 Reporting and publishing results

  • 4.5.4 Improve Phase

  • 4.5.4.1 Managing expectations

  • 4.5.4.2 Managing with metrics


Statistics from formal assessments: “the tip of the iceberg”

Source: SEI. 1994: 261 organizations assessed; 1997: 606 organizations.


Most organizations are moving towards level 2

INITIAL → REPEATABLE

  • Requirements Management

  • Project Planning

  • Project Tracking & Oversight

  • Subcontract Management

  • Quality Assurance

  • Configuration Management


CMM Level 2 Key Process Areas

  • Requirements Management

  • Software Project Planning

  • Software Project Tracking and Oversight

  • Software Subcontract Management

  • Software Quality Assurance

  • Software Configuration Management


Software Development Management Dashboard: “it works only for organizations above level 2”

[Dashboard panels keyed to the level 2 key process areas: PP and PTO (Project Planning, Project Tracking & Oversight), RM (Requirements Management), QA (Quality Assurance), CM (Configuration Management)]



Presentation agenda

  • Software life cycles

  • Inspection processes

  • Measurement programs

  • Assessing software inspection processes


Software Trouble Assessment Matrix

  • When were errors detected?

    Depends on the inspection process efficiency - i.e., how it performs

  • When could errors have been detected?

    Depends on the inspection process effectiveness - i.e., how it was designed

  • When were errors created?

    Depends on the overall performance of the software development process


Software Life Cycle Phases

  • Requirements Analysis
  • Top Level Design
  • Detailed Design
  • Programming
  • Unit Tests
  • System Tests
  • Acceptance Tests

When were errors detected?

[Diagram: error counts (3, 7, 2, 25, 31, 29, 13) annotated on the life cycle phases; the counts are tabulated on the next slide]


When were errors detected?

Life Cycle Phase          Number of Errors
Requirements Analysis            3
Top Level Design                 7
Detailed Design                  2
Programming                     25
Unit Tests                      31
System Tests                    29
Acceptance Tests                13

Cumulative profile = S1


When could errors have been detected?

Life Cycle Phase          Number of Errors
Requirements Analysis            8
Top Level Design                14
Detailed Design                 10
Programming                     39
Unit Tests                       8
System Tests                    26
Acceptance Tests                 5

Cumulative profile = S2


When were errors created?

Life Cycle Phase          Number of Errors
Requirements Analysis           34
Top Level Design                22
Detailed Design                 17
Programming                     27
Unit Tests                       5
System Tests                     5
Acceptance Tests                 0

Cumulative profile = S3



S1, S2, S3 cumulative profiles
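
As an illustrative aid (not part of the original slides), the three cumulative profiles can be reproduced from the per-phase counts in the tables above with a few lines of Python; the variable names are ours:

    # Per-phase error counts taken from the three tables above.
    from itertools import accumulate

    phases = ["Requirements Analysis", "Top Level Design", "Detailed Design",
              "Programming", "Unit Tests", "System Tests", "Acceptance Tests"]
    detected   = [3, 7, 2, 25, 31, 29, 13]   # when errors were detected
    detectable = [8, 14, 10, 39, 8, 26, 5]   # when errors could have been detected
    created    = [34, 22, 17, 27, 5, 5, 0]   # when errors were created

    # Cumulative profiles: running totals across the life cycle phases.
    s1 = list(accumulate(detected))    # [3, 10, 12, 37, 68, 97, 110]
    s2 = list(accumulate(detectable))  # [8, 22, 32, 71, 79, 105, 110]
    s3 = list(accumulate(created))     # [34, 56, 73, 100, 105, 110, 110]

    for phase, a, b, c in zip(phases, s1, s2, s3):
        print(f"{phase:22s}  S1={a:3d}  S2={b:3d}  S3={c:3d}")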


The Software Trouble Assessment Matrix

[Matrix (shown on two slides): “When were errors created?” tabulated against “When were errors detected?”]

Definition of STAM Metrics

  • Negligence ratio: indicates the number of errors that escaped through the inspection process filters. (INSPECTION EFFICIENCY)

  • Evaluation ratio: measures the delay of the inspection process in identifying errors relative to the phase in which they occurred. (INSPECTION EFFECTIVENESS)

  • Prevention ratio: an index of how early errors are detected in the development life cycle relative to the total number of reported errors. (DEVELOPMENT PROCESS EXECUTION)


Computation of STAM Metrics

  • Areas under cumulative profiles:

    • S1 = 337

    • S2 = 427

    • S3 = 588

  • Negligence ratio: 100 x (S2 - S1)/S1 = 26.7%

  • Evaluation ratio: 100 x (S3 - S2)/S2 = 37.7%

  • Prevention ratio: 100 x S1/(7 x total) = 43.7% (7 life cycle phases, 110 reported errors in total)
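
A minimal Python check of these figures, using the same per-phase counts as above (the prevention ratio evaluates to roughly 43.8%, reported on the slide as 43.7%); variable names are illustrative:

    # Recompute the STAM metrics from the per-phase error counts.
    from itertools import accumulate

    detected   = [3, 7, 2, 25, 31, 29, 13]   # when errors were detected
    detectable = [8, 14, 10, 39, 8, 26, 5]   # when errors could have been detected
    created    = [34, 22, 17, 27, 5, 5, 0]   # when errors were created

    def area(counts):
        # Area under a cumulative profile = sum of the running totals over all phases.
        return sum(accumulate(counts))

    S1, S2, S3 = area(detected), area(detectable), area(created)   # 337, 427, 588
    total, n_phases = sum(detected), len(detected)                 # 110 errors, 7 phases

    negligence = 100 * (S2 - S1) / S1            # ~26.7%  (inspection efficiency)
    evaluation = 100 * (S3 - S2) / S2            # ~37.7%  (inspection effectiveness)
    prevention = 100 * S1 / (n_phases * total)   # ~43.8%  (development process execution)

    print(f"S1={S1}, S2={S2}, S3={S3}")
    print(f"Negligence {negligence:.1f}%  Evaluation {evaluation:.1f}%  Prevention {prevention:.1f}%")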


Interpretation of STAM Metrics

1. Errors are detected 27% later than they should have been (i.e., than if the inspection processes had worked perfectly).

2. The design of the inspection processes implies that errors are detected 38% into the phase following their creation.

3. Ideally, all errors would be requirements errors detected in the requirements phase, giving a cumulative profile area of 7 x 110 = 770. In this example only about 44% of that ideal is materialized (the prevention ratio, 337/770), implying significant opportunities for improvement.



Conclusions

  • Inspection processes need to be designed in the context of a software life cycle

  • Inspection processes need to be evaluated using quantitative metrics

  • STAM metrics provide such an evaluation

  • STAM metrics should be integrated into an overall measurement program

Thank you!

