A Sizing Framework for DoD Software Cost Analysis
Raymond Madachy, NPS
Barry Boehm, Brad Clark and Don Reifer, USC
Wilson Rosa, AFCAA
rjmadach@nps.edu, boehm@usc.edu, brad@software-metrics.com, dreifer@earthlink.net, wilson.rosa@pentagon.af.mil

24th International Forum on COCOMO and Systems/Software Cost Modeling
November 2, 2009


Project Overview (Dr. Wilson Rosa)

Data Analysis

Software Sizing


Project Background

Goal is to improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.

Project is led by the Air Force Cost Analysis Agency (AFCAA), working with the service cost agencies and assisted by the University of Southern California and the Naval Postgraduate School

We will publish the AFCAA Software Cost Estimation Metrics Manual to help analysts and decision makers develop accurate software cost estimates quickly and easily for avionics, space, ground, and shipboard platforms.

Stakeholder Communities


  • Research is collaborative across heterogeneous stakeholder communities, which have helped us refine our data definition framework and domain taxonomy and have provided project data.
    • Government agencies
    • Tool Vendors
    • Industry
    • Academia

TruePlanning® by PRICE Systems


Research Objectives

Establish a robust and cost effective software metrics collection process and knowledge base that supports the data needs of the United States Department of Defense (DoD)

Enhance the utility of the collected data to program oversight and management

Support academic and commercial research into improved cost estimation of future DoD software-intensive systems

Software Cost Model Calibration
  • Most program offices and support contractors rely heavily on software cost models
  • These models may not have been calibrated with the most recent DoD data
  • Calibration with recent data (2002-Present) will help increase program office estimating accuracy
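To illustrate the calibration step, the nominal parametric effort model PM = A · KSLOC^B can be refit to recent completed-project data by log-linear least squares. The (size, effort) pairs below are made-up values for illustration only, not actual DoD data:

```python
import math

def calibrate(sizes_ksloc, efforts_pm):
    """Fit PM = A * KSLOC^B by least squares in log-log space.

    Taking logs gives ln(PM) = ln(A) + B*ln(KSLOC), a straight line,
    so B is the ordinary least-squares slope and A = exp(intercept).
    """
    xs = [math.log(s) for s in sizes_ksloc]
    ys = [math.log(e) for e in efforts_pm]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

# Illustrative historical data: sizes in KSLOC, efforts in person-months
a, b = calibrate([10, 25, 60, 120], [35, 110, 320, 720])
print(f"A = {a:.2f}, B = {b:.2f}")
```

A diseconomy-of-scale exponent B > 1, as this toy dataset produces, is typical of calibrated software cost models.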
AFCAA Software Cost Estimation Metrics ManualTable of Contents
  • Chapter 1: Software Estimation Principles
  • Chapter 2: Product Sizing
  • Chapter 3: Product Growth
  • Chapter 4: Effective SLOC
  • Chapter 5: Historical Productivity
  • Chapter 6: Model Calibration
  • Chapter 7: Calibrated SLIM-ESTIMATE
  • Chapter 8: Cost Risk and Uncertainty Metrics
  • Chapter 9: Data Normalization
  • Chapter 10: Software Resource Data Report
  • Chapter 11: Software Maintenance
  • Chapter 12: Lessons Learned
Manual Special Features
  • Augments the NCCA/AFCAA Software Cost Handbook with:
    • Default Equivalent Size Inputs (DM, CM, IM, SU, AA, UNFM)
    • Productivity Benchmarks by Operating Environment, Application Domain, and Software Size
    • Empirical Code, Effort, and Schedule Growth Measures derived from SRDRs
    • Empirically Based Cost Risk and Uncertainty Analysis Metrics
    • Calibrated SLIM-Estimate™ using most recent SRDR data
    • Mapping between COCOMO, SEER, True S cost drivers
    • Empirical Dataset for COCOMO, True S, and SEER Calibration
    • Software Maintenance Parameters
Manual Special Features (Cont.)
  • Guidelines for reconciling inconsistent data
  • Standard Definitions (Application Domain, SLOC, etc.)
  • Guidance on incremental development issues (overlaps, early-increment breakage, integration complexity growth, deleted software, relations to maintenance) and version management (a form of product line development and evolution)
  • Impact of Next Generation Paradigms – Model Driven Architecture, Net-Centricity, Systems of Systems, etc.



DoD Empirical Data
  • Data quality and standardization issues
    • No reporting of Equivalent Size Inputs – CM, DM, IM, SU, AA, UNFM, Type
    • No common SLOC reporting – logical, physical, etc.
    • No standard definitions – Application Domain, Build, Increment, Spiral,…
    • No common effort reporting – analysis, design, code, test, CM, QA,…
    • No common code counting tool
    • Product size only reported in lines of code
    • No reporting of quality measures – defect density, defect containment, etc.
  • Limited empirical research within DoD on other contributors to productivity besides effort and size:
    • Operating Environment, Application Domain, and Product Complexity
    • Personnel Capability
    • Required Reliability
    • Quality – Defect Density, Defect Containment
    • Integrating code from previous deliveries – Builds, Spirals, Increments, etc.
    • Converting to Equivalent SLOC
  • Categories like Modified, Reused, Adopted, Managed, and Used add no value unless they translate into single or unique narrow ranges of DM, CM, and IM parameter values. We have seen no empirical evidence that they do…
Data Collection and Analysis


  • Be sensitive to the application domain
  • Embrace the full life cycle and Incremental Commitment Model
  • Be able to collect data by phase, project, and/or build or increment
  • Items to collect:
    • SLOC reporting – logical, physical, NCSS, etc.
    • Requirements volatility and reuse
    • Modified or adopted code – using DM, CM, IM; SU, UNFM as appropriate
    • Definitions for application types, development phase, lifecycle model, etc.
    • Effort reporting – phase and activity
    • Quality measures – defects, MTBF, etc.

Data Normalization Strategy

  • Interview program offices and developers to obtain additional information not captured in SRDRs:
    • Modification type – auto-generated, re-hosted, translated, modified
    • Source – in-house, third party, prior build, prior spiral, etc.
    • Degree of modification – %DM, %CM, %IM; SU, UNFM as appropriate
    • Requirements volatility – % of ESLOC reworked or deleted due to requirements volatility
    • Method – Model-Driven Architecture, object-oriented, traditional
    • Cost model parameters – True S, SEER, COCOMO
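The normalization step can be sketched as a translation table that maps each reported SLOC type onto a common logical-SLOC baseline. The conversion ratios below are placeholders invented for illustration, not published values; real ratios would come from the manual's conversion guidelines:

```python
# Placeholder conversion ratios to logical SLOC (assumed, illustrative only;
# actual ratios are language- and counter-dependent).
TO_LOGICAL = {
    "logical": 1.00,   # already the baseline
    "physical": 0.75,  # assumed physical-to-logical ratio
    "ncss": 0.95,      # assumed non-comment source statements ratio
}

def normalize(count, sloc_type):
    """Convert a reported SLOC count to the logical-SLOC baseline."""
    return round(count * TO_LOGICAL[sloc_type.lower()])

print(normalize(12000, "physical"))  # → 9000 with the assumed 0.75 ratio
```

Keeping such a table explicit makes non-compliant data sources auditable: each normalized figure records which ratio was applied.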




Size Issues and Definitions
  • An accurate size estimate is the most important input to parametric cost models.
  • Desire consistent size definitions and measurements across different models and programming languages
  • The sizing chapter addresses these:
    • Common size measures defined and interpreted for all the models
    • Guidelines for estimating software size
    • Guidelines to convert size inputs between models so projects can be represented in a consistent manner
  • Using Source Lines of Code (SLOC) as common measure
    • Logical source statements, consisting of data declarations and executables
    • Rules for considering statement type, how produced, origin, build, etc.
    • Providing automated code counting tools adhering to definition
    • Providing conversion guidelines for physical statements
  • Addressing other size units such as requirements, use cases, etc.
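The logical-statement definition above can be illustrated with a deliberately simplified counter for C-like code. This is only a sketch of the idea: it ignores block comments, string literals, and compound statements such as `for`, all of which the full inclusion rules (and real tools such as USC's Unified CodeCount) handle properly:

```python
def logical_sloc(source: str) -> int:
    """Toy logical-SLOC counter: each ';' ends one logical statement.

    Blank lines and '//' comment lines are excluded, per the usual
    rule that comments and whitespace do not count as statements.
    Not a real code counter -- illustration of the definition only.
    """
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("//"):
            continue  # skip blanks and comment-only lines
        count += stripped.count(";")
    return count

sample = "int x = 1;\nx += 2; // inline comment\n\n// pure comment\nreturn x;\n"
print(logical_sloc(sample))  # → 3
```

The point of standardizing on logical statements is exactly what this toy shows: the count is insensitive to line wrapping, blank lines, and commenting style, unlike physical line counts.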
Sizing Framework Elements
  • Core software size type definitions
    • Standardized data collection definitions
      • Measurements will be invariant across cost models and data collection venues
    • Project data normalized to these definitions
      • Translation tables for non-compliant data sources
  • SLOC definition and inclusion rules
  • Equivalent SLOC parameters
  • Cost model Rosetta Stone size translations
  • Other size unit conversions (e.g. function points, use cases, requirements)
Equivalent SLOC – A User Perspective *
  • “Equivalent” – a way of accounting for the work done to produce the software, relative to the code-counted size of the delivered software
  • “Source” lines of code: The number of logical statements prepared by the developer and used to generate the executing code
    • Usual Third Generation Language (C, Java): count logical 3GL statements
    • For Model-driven, Very High Level Language, or Macro-based development: count statements that generate customary 3GL code
    • For maintenance above the 3GL level: count the generator statements
    • For maintenance at the 3GL level: count the generated 3GL statements
  • Two primary effects: Volatility and Reuse
    • Volatility: % of ESLOC reworked or deleted due to requirements volatility
    • Reuse: either with modification (modified) or without modification (adopted)

* Stutzke, Richard D, Estimating Software-Intensive Systems, Upper Saddle River, N.J.: Addison Wesley, 2005

Adapted Software Parameters
  • For adapted software, apply the parameters:
    • DM: % of design modified
    • CM: % of code modified
    • IM: % of integration required compared to integrating new code
    • Normal Reuse Adjustment Factor RAF = 0.4*DM + 0.3*CM + 0.3*IM
  • Reused software has DM = CM = 0.
  • Modified software has CM > 0. Since data indicates that the RAF factor tends to underestimate modification effort due to added software understanding effects, two other factors are used:
    • Software Understandability (SU): How understandable is the software to be modified?
    • Unfamiliarity (UNFM): How unfamiliar with the software to be modified is the person modifying it?
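A minimal sketch of how these parameters combine, assuming the published COCOMO II reuse equations: AAF = 0.4·DM + 0.3·CM + 0.3·IM, adjusted by the SU·UNFM understanding term, with the assessment-and-assimilation increment AA defaulted to zero here:

```python
def equivalent_sloc(adapted_sloc, dm, cm, im, su=0.0, unfm=0.0, aa=0.0):
    """Equivalent size per the COCOMO II reuse model.

    dm, cm, im: percentages 0-100; su: understandability rating points;
    unfm: unfamiliarity multiplier 0.0-1.0; aa: assessment points.
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im  # adaptation adjustment factor
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return adapted_sloc * aam

# Reused software: DM = CM = 0, only integration effort remains
print(round(equivalent_sloc(10000, dm=0, cm=0, im=30)))  # → 900

# Modified software: understanding effects (SU, UNFM) inflate the size
print(round(equivalent_sloc(10000, dm=20, cm=30, im=50, su=30, unfm=0.8)))  # → 4736
```

The two calls show the point made on the slide: with SU = 30 and UNFM = 0.8 the modified component counts as 4,736 ESLOC rather than the 3,200 the bare RAF would give, reflecting the extra understanding effort.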
Equivalent SLOC Rules
  • Equivalent SLOC rules for maintenance
  • Equivalent SLOC rules for development



Concluding Remarks

  • Goal is to publish a manual to help analysts develop quick software estimates using empirical metrics from recent programs
  • Additional information is crucial for improving data quality across the DoD
  • We want your input on productivity domains and data definitions
  • Looking for collaborators
  • Looking for peer reviewers
  • Need more data

  • United States Department of Defense (DoD), “Instruction 5000.2, Operation of the Defense Acquisition System”, December 2008.
  • W. Rosa, B. Clark, R. Madachy, D. Reifer, and B. Boehm, “Software Cost Metrics Manual”, Proceedings of the 42nd Department of Defense Cost Analysis Symposium, February 2009.
  • B. Boehm, “Future Challenges for Systems and Software Cost Estimation”, Proceedings of the 13th Annual Practical Software and Systems Measurement Users’ Group Conference, June 2009.
  • B. Boehm, C. Abts, W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Upper Saddle River, NJ: Prentice-Hall, 2000.
  • R. Stutzke, Estimating Software-Intensive Systems, Upper Saddle River, NJ: Addison Wesley, 2005.
  • R. Madachy and B. Boehm, “Comparative Analysis of COCOMO II, SEER-SEM and True-S Software Cost Models”, Technical Report USC-CSSE-2008-816, University of Southern California Center for Systems and Software Engineering, 2008.