
Software Cost Estimation Metrics Manual

Software Cost Estimation Metrics Manual. 26th International Forum on COCOMO and Systems/Software Cost Modeling. Ray Madachy, Barry Boehm, Brad Clark, Thomas Tan, Wilson Rosa. November 2, 2011.



Presentation Transcript


  1. Software Cost Estimation Metrics Manual 26th International Forum on COCOMO and Systems/Software Cost Modeling Ray Madachy, Barry Boehm, Brad Clark, Thomas Tan, Wilson Rosa November 2, 2011

  2. Project Background The goal is to improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing. The project is led by the Air Force Cost Analysis Agency (AFCAA), working with the service cost agencies and assisted by the University of Southern California and the Naval Postgraduate School. We will publish the AFCAA Software Cost Estimation Metrics Manual to help analysts and decision makers develop accurate, easy, and quick software cost estimates for avionics, space, ground, and shipboard platforms.

  3. Stakeholder Communities • Research is collaborative across heterogeneous stakeholder communities, who have helped us refine our data definition framework and domain taxonomy and have provided project data. • Government agencies • Tool vendors (e.g., SLIM-Estimate™; TruePlanning® by PRICE Systems) • Industry • Academia

  4. Research Objectives • Make collected data useful to oversight and management entities • Provide guidance on how to condition data to address challenges • Segment data into different Application Domains and Operating Environments • Analyze data for simple Cost Estimating Relationships (CERs) and Schedule Estimating Relationships (SERs) within each domain • Develop rules of thumb for missing data. The estimating relationship forms are: • Productivity-based CER: Cost (Effort) = Size / Productivity, where Productivity = Size / PM • Model-based CER: Cost (Effort) = a × Size^b • Model-based SER: Schedule = a × Size^b × Staff^c
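The estimating relationships above can be sketched as simple functions. This is an illustrative sketch only: the coefficients a, b, and c are placeholders (the manual derives per-domain values from data), and the function names are mine, not the manual's.

```python
# Hedged sketch of the CER/SER forms on this slide. Coefficient values
# are illustrative placeholders, not calibrated values from the manual.

def productivity_based_effort(size_ksloc, productivity_ksloc_per_pm):
    """Productivity-based CER: Effort (PM) = Size / Productivity,
    where Productivity = Size / PM observed on past projects."""
    return size_ksloc / productivity_ksloc_per_pm

def model_based_effort(size_ksloc, a, b):
    """Model-based CER: Effort = a * Size^b."""
    return a * size_ksloc ** b

def model_based_schedule(size_ksloc, staff, a, b, c):
    """Model-based SER: Schedule = a * Size^b * Staff^c."""
    return a * size_ksloc ** b * staff ** c
```

For example, 100 KSLOC at a productivity of 2 KSLOC per person-month yields a 50 person-month effort estimate; the model-based forms instead fit a and b (and c) to the data within each domain.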

  5. Data Source • The DoD’s Software Resources Data Report (SRDR) is used to obtain both the estimated and actual characteristics of new software developments or upgrades. • All contractors developing or producing any software development element with a projected software effort greater than $20M (then-year dollars) on major contracts and subcontracts within ACAT I and ACAT IA programs, regardless of contract type, must submit SRDRs. • Reports mandated for • Initial Government Estimate • Initial Developer Estimate (after contract award) • Final Developer Report (actual values)

  6. Data Analysis Challenges • Software Sizing • Inadequate information on modified code (only size provided) • Inadequate information on size change or growth, e.g., deleted code • Size measured inconsistently (total lines, non-comment lines, logical statements) • Development Effort • Inadequate information on average staffing or peak staffing • Inadequate information on personnel experience • Missing effort data for some activities • Development Schedule • Replicated duration for multi-build components • Inadequate information on schedule compression • Missing schedule data for some phases • Quality Measures • None (optional in the SRDR)

  7. Data Conditioning • Segregate data to isolate variations • Productivity Types (revised from Application Domains) • Operating Environments • Sizing issues • Developed conversion factors to translate different line-count measures to logical line counts • Interviews with data contributors recovered modification parameters for 44% of records • Missing effort data: use the effort distribution by Productivity Type to fill in missing data (possibly to be improved using Tom Tan’s approach) • Missing schedule data: use the schedule distribution by Productivity Type to fill in missing data
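One simple reading of the fill-in rule above can be sketched as follows: estimate a record's missing activity effort from the average share that activity takes of total effort among complete records of the same Productivity Type. The record layout and field names here are assumptions for illustration, not the SRDR's schema.

```python
from collections import defaultdict

# Sketch (assumed fields): each record has a Productivity Type 'ptype'
# and a dict 'effort' of person-months per activity.

def activity_shares(complete):
    """Average share of total effort per activity, per Productivity Type,
    computed from records with no missing activities."""
    sums = defaultdict(lambda: defaultdict(float))
    counts = defaultdict(int)
    for rec in complete:
        total = sum(rec["effort"].values())
        for act, pm in rec["effort"].items():
            sums[rec["ptype"]][act] += pm / total
        counts[rec["ptype"]] += 1
    return {pt: {a: s / counts[pt] for a, s in acts.items()}
            for pt, acts in sums.items()}

def fill_missing(rec, shares):
    """Fill one missing activity. If it typically takes share s of the
    total T, the known activities sum to K = T*(1-s), so the missing
    effort is K*s/(1-s)."""
    missing = [a for a in shares[rec["ptype"]] if a not in rec["effort"]]
    if len(missing) != 1:
        return rec  # leave the gap when it cannot be resolved this way
    s = shares[rec["ptype"]][missing[0]]
    known = sum(rec["effort"].values())
    rec["effort"][missing[0]] = known * s / (1 - s)
    return rec
```

The same pattern applies to the schedule rule, substituting phase durations for activity effort.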

  8. Environments • Productivity Types can be influenced by their environment. • Different environments are grouped into 11 categories: • Fixed Ground (FG) • Ground Surface Vehicles: Manned (MGV), Unmanned (UGV) • Sea Systems: Manned (MSV), Unmanned (USV) • Aircraft: Manned (MAV), Unmanned (UAV) • Missile / Ordnance (M/O) • Spacecraft: Manned (MSC), Unmanned (USC)

  9. Productivity Types Different productivities have been observed for different software application types. Software project data is segmented into 14 productivity types to increase the accuracy of estimating cost and schedule: • Sensor Control and Signal Processing (SSP) • Vehicle Control (VC) • Real Time Embedded (RTE) • Vehicle Payload (VP) • Mission Processing (MP) • Command & Control (C&C) • System Software (SS) • Telecommunications (TEL) • Process Control (PC) • Scientific Systems (SCI) • Training (TRN) • Test Software (TST) • Software Tools (TUL) • Business Systems (BIS)

  10. CER/SER Reporting • Mean productivity for each Productivity Type • Mean productivity for each Operating Environment • Productivity for each combination of type and environment
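The three roll-ups above are the same aggregation applied with different grouping keys. A minimal sketch, assuming each record carries a size in SLOC, an effort in person-months, a Productivity Type, and an Operating Environment (field names are mine):

```python
from collections import defaultdict

# Mean productivity (SLOC per person-month) grouped by an arbitrary key.
# Record fields 'size_sloc', 'effort_pm', 'ptype', 'env' are assumed
# for illustration.

def mean_productivity(records, key):
    groups = defaultdict(list)
    for r in records:
        groups[key(r)].append(r["size_sloc"] / r["effort_pm"])
    return {k: sum(v) / len(v) for k, v in groups.items()}

# The three reports, by varying only the grouping key:
# by_type = mean_productivity(data, lambda r: r["ptype"])
# by_env  = mean_productivity(data, lambda r: r["env"])
# by_both = mean_productivity(data, lambda r: (r["ptype"], r["env"]))
```

Grouping by the (type, environment) pair gives the finest cut but the smallest samples per cell, which is why the per-type and per-environment means are also reported.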

  11. Current Status • Completing the Software Cost Estimation Metrics Manual • Initially available as an online wiki • A printable version of the online edition is available • Inviting reviewer comments via the online wiki “Discussion” capability • First version of the manual ready December 31st, 2011
