
Reducing Estimation Uncertainty with Continuous Assessment: Tracking the “ Cone of Uncertainty ”



Presentation Transcript


  1. Reducing Estimation Uncertainty with Continuous Assessment: Tracking the “Cone of Uncertainty” Pongtip Aroonvatanaporn, Chatchai Sinthop, Barry Boehm {aroonvat, sinthop, boehm}@usc.edu November 2, 2010

  2. Outline • Introduction and Motivation • Framework Model • Experiment • Results • Conclusion and future work © USC-CSSE

  3. The Cone of Uncertainty • Also applies to project estimation accuracy • [Figure: cone of uncertainty narrowing over time; wider for inexperienced teams, narrower for experienced teams]

  4. Definition • Inexperience • Inexperienced in general • Experienced, but in a new domain • Anything that is new with little knowledge or experience

  5. The Problem • Experienced teams can produce better estimates • Use “yesterday’s weather” • Past projects of comparable size • Past data on the team’s productivity • Knowledge of accumulated problems and solutions • Inexperienced teams do not have this luxury • No tools or data exist to monitor a project’s progression within the cone of uncertainty

  6. The Problem • Imprecise project scoping • Overestimate vs. underestimate • Manual assessments are tedious • Complex and discouraging • Project estimation not revisited • Insufficient data to perform predictions • Project’s uncertainties not adjusted • Limitations in software cost estimation • Models cannot fully compensate for lack of knowledge and understanding

  7. The Goal • Develop a framework to address the issues mentioned above • Help unprecedented projects track project progression • Reduce the uncertainties in estimation • Achieve eventual convergence of estimated and actual effort • Must be quick and easy to use

  8. Benefits • Improved project planning and management • Resources and goals • Improved product quality control • Actual project progress tracking • Better understanding of project status • Actual progress reports

  9. Outline • Introduction and Motivation • Framework Model • Experiment • Results • Conclusion and future work

  10. Estimation Model • Integration of the Unified Code Count (UCC) tool and the COCOMO II estimation model • Size adjusted with REVL (Requirements Evolution and Volatility)
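The COCOMO II effort equation behind this model can be sketched as follows. This is a minimal sketch using the published COCOMO II.2000 calibration constants (A = 2.94, B = 0.91); the scale-factor and effort-multiplier values in the example are illustrative placeholders, not values from the experiment.

```python
def cocomo_effort(ksloc, revl_pct, scale_factors, effort_multipliers):
    """Estimated effort in person-months (COCOMO II.2000 Post-Architecture form).

    ksloc              -- equivalent size in thousands of SLOC
    revl_pct           -- Requirements Evolution and Volatility, in percent
    scale_factors      -- the five COCOMO II scale factor ratings (SF_i)
    effort_multipliers -- COCOMO II cost driver ratings (EM_j)
    """
    A, B = 2.94, 0.91
    size = ksloc * (1 + revl_pct / 100.0)   # REVL inflates the effective size
    E = B + 0.01 * sum(scale_factors)       # scale exponent
    em_product = 1.0
    for em in effort_multipliers:           # product of all effort multipliers
        em_product *= em
    return A * (size ** E) * em_product

# Illustrative inputs: 10 KSLOC, 20% REVL, placeholder scale factors,
# all 17 cost drivers at nominal (1.0).
pm = cocomo_effort(10, 20, [3.0] * 5, [1.0] * 17)
```

Raising REVL increases the effective size and therefore the effort estimate, which is how requirements volatility feeds back into the adjusted estimates shown later.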

  11. Outline • Introduction and Motivation • Framework Model • Experiment • Results • Conclusion and future work

  12. Experiment Setup • Performed simulation on 2 projects from the USC software engineering course • Project similarities • Real-client • SAIV: 24 weeks • Architected agile process, 8-member team • Size, type, and complexities • Product • E-services • Web content management system • JSP, MySQL, Tomcat

  13. Obtaining Data • Source code files retrieved from the Subversion server • Simulation of the assessment was done weekly • Both teams were closely involved • Provided estimates of module completion • Provided rationale
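The weekly assessment step can be sketched as follows: combine the SLOC counted so far per module (e.g., by running UCC against the Subversion checkout) with the team's reported percent completion to extrapolate each module's final size and the remaining work. This is a hypothetical sketch; the function and data layout are illustrative, not the paper's actual tool interface.

```python
def adjust_size_estimate(modules):
    """modules: list of (sloc_counted, pct_complete) pairs, one per module.

    Returns (projected_total_sloc, remaining_sloc), extrapolated from the
    team's weekly completion estimates.
    """
    projected_total = 0.0
    remaining = 0.0
    for sloc, pct in modules:
        if pct <= 0:
            continue  # no progress reported yet: nothing to extrapolate from
        final = sloc / (pct / 100.0)   # projected final size of this module
        projected_total += final
        remaining += final - sloc
    return projected_total, remaining

# Illustrative week: one module half done at 500 SLOC, one finished at 300 SLOC.
total, remaining = adjust_size_estimate([(500, 50), (300, 100)])
```

The projected size can then be fed back into the cost model each week, which is what lets the adjusted estimate converge on the accumulated actuals.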

  14. Outline • Introduction and Motivation • Framework Model • Experiment • Results • Conclusion and future work

  15. Results • [Charts for the two projects: initial estimate (~18% and ~50%), adjusted estimate, and accumulated effort over time]

  16. Results • Project progress reaches 100% • Reflects reality • Estimation errors reduced to 0%
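The convergence claimed above corresponds to a simple relative-error metric between the adjusted estimate and the accumulated actual effort. The weekly numbers below are made-up illustrations of the trend, not data from the experiment.

```python
def estimation_error(estimated, actual):
    """Relative estimation error as a fraction of the actual value."""
    return abs(estimated - actual) / actual

# Illustrative weekly (adjusted estimate, accumulated actual) pairs:
# as reassessment tracks reality, the error shrinks toward 0.
weekly = [(120, 80), (105, 90), (98, 96), (100, 100)]
errors = [estimation_error(e, a) for e, a in weekly]
```

Plotting these errors over time gives the narrowing gap that the paper presents as a tracked "cone of uncertainty."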

  17. Outline • Introduction and Motivation • Framework Model • Experiment • Results • Conclusion and future work

  18. Conclusion • Both teams demonstrated the same phenomenon • Gaps in estimation errors decrease • Representative of the “cone of uncertainty” • The estimation framework reflects the reality of the project’s progress • The assessment process was quick and simple • Requires few inputs • Little analysis needed • The assessment framework helps inexperienced teams improve project tracking and estimation

  19. Future Work • Tool development currently in progress • Determine the frequency of assessments required • The sweet spot • Observe prediction accuracies • Experiment on projects of larger scale • Experiment on projects of different types • Apply value-based concepts • Weight the calculation for each software module by its priority and criticality • Determine how to adjust COCOMO parameters

  20. References • Boehm, B., Abts, C., Brown, A. W., Chulani, S., Clark, B. K., Horowitz, E., Madachy, R., Reifer, D. J., and Steece, B. Software Cost Estimation with COCOMO II, Prentice-Hall, 2000. • Cohn, M. Agile Estimating and Planning, Prentice-Hall, 2005. • DeMarco, T. Controlling Software Projects: Management, Measurement, and Estimation, Yourdon Press, 1982. • Fleming, Q. W. and Koppelman, J. M. Earned Value Project Management, 2nd edition, Project Management Institute, 2000. • Jorgensen, M. and Boehm, B. “Software Development Effort Estimation: Formal Models or Expert Judgment?” IEEE Software, March-April 2009, pp. 14-19. • Nguyen, V., Deeds-Rubin, S., Tan, T., and Boehm, B. “A SLOC Counting Standard,” COCOMO II Forum 2007. • Stutzke, R. D. Estimating Software-Intensive Systems, Pearson Education, Inc., 2005.

  21. Backup Slides

  22. Related Work • Software estimation methods • Estimating Software-Intensive Systems [Stutzke, 2005] • Expert judgment vs. parametric models [Jorgensen, 2009] • Agile estimation [Cohn, 2005] • Software estimation uncertainty • PERT sizing methods [Nguyen, 2007] • Wideband Delphi estimate distributions [Boehm, 2000] • Software project tracking methods • Controlling Software Projects [DeMarco, 1982] • Earned Value Management [Fleming, 2000]
