
Measuring Dollar Savings from Software Process Improvement with COCOMO II


Presentation Transcript


  1. Measuring Dollar Savings from Software Process Improvement with COCOMO II Betsy Clark Software Metrics Inc. October 25, 2001 Acknowledgment: This presentation describes work being done by TeraQuest Metrics

  2. Outline • Background • Measuring the Impact of Software Process Improvement (SPI) • Some Initial Results

  3. Customer Background • Large financial institution • Actively involved in software process improvement (SPI) • Software-CMM • System Test • Began summer of 2000 at CMM Level 1 • Incrementally adding Key Process Areas • Two pilot organizations • Planning Level 2 assessment end of this year

  4. Background (continued) • Strong emphasis on measuring impact of SPI, especially hard dollar savings • CIO: “If process improvement saves us money, I should be able to go down the street to my competitor’s bank and get a loan to fund our process improvement initiative.”

  5. Outline • Background • Measuring the Impact of Software Process Improvement (SPI) • Some Initial Results • Conclusions

  6. “Maturity levels are meaningless if they cannot be explained in terms of business objectives” John D. Vu, Boeing (a CMM Level 5 organization)

  7. Business Objectives • Reduce the cost of software activities • Reduce delivery time • Improve product quality • Increase customer satisfaction • customers are internal to the bank (e.g., wholesale and retail mortgage, investment division)

  8. Measurement Objectives • Measure impact of SPI in terms of these business objectives • Impacts of SPI will be measured by comparing a set of baseline projects to pilot projects

  9. Measuring Hard Savings • CFO’s initial understanding: • “If we have savings from SPI, we can reduce the IT budget in the future.” • First point of discussion: the need to measure work load • Led to the concept of unit savings, holding the IT organization accountable for those savings • Brought the IT manager into the discussion: • “But events occur outside of my control that can affect unit costs. For example, I can lose my top staff.”

  10. Measuring Hard Savings • The IT manager was talking about variability due to factors outside of SPI. • That variability is addressed by parametric cost models. • Approach: measure COCOMO II cost drivers for baseline projects and for SPI projects. Use them to adjust unit costs. • Back out all influences on unit costs except SPI

  11. Measuring Hard Savings (cont) • Savings due to SPI • Difference in adjusted unit costs between baseline and SPI projects
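
A minimal Python sketch of this adjustment, under stated assumptions: the effort-multiplier products and unit costs below are hypothetical illustrations, not the bank's data. Each project's unit cost is divided by the product of its COCOMO II effort multipliers to back out non-SPI influences, and the baseline and SPI medians are then compared.

    from statistics import median

    # Hypothetical projects: (unit cost in $ per function point, product of
    # that project's COCOMO II effort multipliers). An EM product > 1.0 means
    # the project was harder than nominal for reasons unrelated to SPI
    # (complexity, staff turnover, etc.).
    baseline = [(950, 1.20), (700, 0.85), (1100, 1.30)]   # pre-SPI projects
    spi_pilot = [(800, 1.15), (620, 0.90), (880, 1.10)]   # SPI pilot projects

    def adjusted_unit_costs(projects):
        # Dividing out the EM product normalizes each unit cost to nominal
        # conditions, backing out influences other than SPI.
        return [cost / em for cost, em in projects]

    baseline_med = median(adjusted_unit_costs(baseline))
    spi_med = median(adjusted_unit_costs(spi_pilot))

    print(f"Adjusted baseline unit cost: ${baseline_med:,.0f}/FP")
    print(f"Adjusted SPI unit cost:      ${spi_med:,.0f}/FP")
    print(f"Unit savings attributed to SPI: ${baseline_med - spi_med:,.0f}/FP")

With these made-up figures the adjusted medians differ by roughly $128 per function point; the real calculation would use the measured cost drivers of each baseline and pilot project.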

  12. Setting Expectations • SPI is a staged, long-term initiative • implemented on pilot projects first, then on a wider scale • Initially, we will estimate savings based on pilot results • few data points, wide variation • As SPI is implemented on a wider scale, we will have more data points and clearer trends • Moving from CMM Level 1 to Level 2 lays the foundation for unit cost savings • a few studies do show cost savings from Level 1 to 2 • the major effect is in better estimation and planning • reduction in rework due to stable requirements

  13. Measures 1) estimation accuracy: effort 2) estimation accuracy: schedule 3) productivity 4) unit costs 5) project delivery rate (cycle time) 6) system test effectiveness 7) delivered defect density 8) customer satisfaction 9) requirements volatility

  14. Approach • Attempted to “mine” existing data sources (e.g., time tracking, financial, problem reporting systems) • not successful, sporadic and inconsistently used • Selected a representative set of completed projects from the two pilot organizations • Goal was 10-15 projects per pilot organization • 13 projects from one • 11 from the other • Constructed a survey, met with project managers to collect data • Followed-up with each manager to verify data

  15. Measures 1) estimation accuracy: effort 2) estimation accuracy: schedule 3) productivity 4) unit costs 5) project delivery rate (cycle time) 6) system test effectiveness 7) delivered defect density 8) customer satisfaction 9) requirements volatility

  16. Estimation Accuracy - Effort • Calculation: (Actual labor hours - estimated) / estimated • [Chart: percent difference between actual and estimated labor hours (overruns above 0, underruns below) plotted against planned labor hours]

  17. Estimation Accuracy - Schedule • Calculation: (Actual calendar months - estimated) / estimated • [Chart: percent difference between actual and estimated duration (overruns above 0, underruns below) plotted against planned duration]

  18. Measures of Interest • Median: very stable across organizations • Standard deviation: captures project-to-project spread • Goals with SPI: • median should approach zero • standard deviation should be smaller
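
As an illustration in Python with made-up (actual, estimated) pairs, the deviation, its median, and its standard deviation can be computed directly; the same calculation applies to schedule accuracy with calendar months in place of labor hours.

    from statistics import median, stdev

    # Hypothetical (actual, estimated) labor hours for four projects
    effort = [(1200, 1000), (950, 1000), (1800, 1200), (600, 700)]

    # (Actual - estimated) / estimated: positive = overrun, negative = underrun
    deviations = [(act - est) / est for act, est in effort]

    print(f"median deviation: {median(deviations):+.1%}")  # goal: approach 0 with SPI
    print(f"std deviation:    {stdev(deviations):.1%}")    # goal: shrink with SPI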

  19. Measures 1) estimation accuracy: effort 2) estimation accuracy: schedule 3) productivity 4) unit costs 5) project delivery rate (cycle time) 6) system test effectiveness 7) delivered defect density 8) customer satisfaction 9) requirements volatility

  20. Productivity and Unit Costs • High variability • Median is stable across divisions

  21. Initial Results • Used COCOMO II parameters to adjust size • Led to a reduction in the standard deviation • Helped explain: • why lower productivity projects had difficulty • why higher productivity projects had an easier time • Projects with very high productivity seemed to do everything right • capable staff, low turnover, managing requirements… • these are good things that should improve with SPI • don’t want to penalize the organization for improvement in these other (non-SPI) areas • management controllables vs. non-controllables

  22. Measures • 1) estimation accuracy: effort • 2) estimation accuracy: schedule • 3) productivity • 4) unit costs • 5) project delivery rate (cycle time) • 6) system test effectiveness • 7) delivered defect density • 8) customer satisfaction • 9) requirements volatility

  23. Project Delivery Rate • Calculation: • Function points / calendar months • Goal: Increasing
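
For example, with hypothetical figures:

    # Hypothetical project: 240 function points delivered in 8 calendar months
    function_points = 240
    calendar_months = 8
    delivery_rate = function_points / calendar_months
    print(f"Project delivery rate: {delivery_rate:.1f} FP/month")  # 30.0 FP/month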

  24. Project Delivery Rate • [Chart: function points per calendar month plotted against project size in function points]

  25. Measures 1) estimation accuracy: effort 2) estimation accuracy: schedule 3) productivity 4) unit costs 5) project delivery rate (cycle time) 6) system test effectiveness 7) delivered defect density 8) customer satisfaction 9) requirements volatility

  26. System Test Effectiveness • Calculation: • (Defects Found in System Test / Total Defects) • where • Total Defects = (Defects Found in System Test + Delivered Defects found in first 30 days) • Example: • Defects found in System Test = 45 • Defects found in first 30 days of operations = 5 • Test Effectiveness = 90% • Goal: 100% • Result: Wide variation in effectiveness
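
The worked example above, computed directly:

    def test_effectiveness(system_test_defects, first_30_day_defects):
        # Fraction of all known defects caught before delivery
        total_defects = system_test_defects + first_30_day_defects
        return system_test_defects / total_defects

    # 45 defects found in system test, 5 in the first 30 days of operations
    print(f"{test_effectiveness(45, 5):.0%}")  # 90%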

  27. Measures 1) estimation accuracy: effort 2) estimation accuracy: schedule 3) productivity 4) unit costs 5) project delivery rate (cycle time) 6) system test effectiveness 7) delivered defect density 8) customer satisfaction 9) requirements volatility

  28. Delivered Defect Density • Calculation: • Defects found in first 30 days of operations / function points • Goal: 0
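
Illustrated with hypothetical numbers:

    # Hypothetical: 6 defects in the first 30 days of operations on a
    # 240-function-point system (the goal is 0)
    delivered_defects = 6
    function_points = 240
    print(f"{delivered_defects / function_points:.3f} defects/FP")  # 0.025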

  29. Delivered Defect Density • [Chart: defects per function point plotted against project size in function points, with COTS and custom projects shown separately; goal line at 0]

  30. (Very Preliminary) Finding of Interest • In contrast to custom development, defect density for COTS projects appears unrelated to size

  31. Measures 1) estimation accuracy: effort 2) estimation accuracy: schedule 3) productivity 4) unit costs 5) project delivery rate (cycle time) 6) system test effectiveness 7) delivered defect density 8) customer satisfaction 9) requirements volatility

  32. Customer Satisfaction, Requirements Volatility • Data do not exist • Strategy was altered to request the manager’s estimate

  33. Message to Executive Level • Measurement • can be a powerful foundation for understanding and managing IT • is a cultural change and not a scoreboard • will improve as process maturity improves

  34. Response from Executive Level (CIO and direct reports) • Intense interest in the measures and in benchmarking • Basis for excellent discussions about need for visibility into • requirements management • quality • customer satisfaction • Collection of the nine measures has been made part of executive compensation • Moving forward to put supporting processes, tools and training in place

  35. To be continued...
