
Measurement and Estimating Models for Software Maintenance Workshop (COCOMO Forum)


Presentation Transcript


  1. Measurement and Estimating Models for Software Maintenance Workshop (COCOMO Forum) November 2011

  2. Workshop Goals • Three goals for the workshop: • Expose members of the cost estimating community to recent Army software maintenance study findings and recommendations • Gather feedback from the community on our software maintenance WBS and initial influence factor analysis, and build consensus on the most important factors via a Delphi survey • Build a community of interest in maintenance that supports development of new measures and models for improved software maintenance budgeting, estimation and management

  3. Workshop Agenda • Introductions • Summary of study findings and recommendations • Study current activities/next steps • WBS review • Influence factor review • Delphi survey • Roundtable discussions • Summary and break

  4. Background • The changing defense environment has placed a renewed emphasis on the performance of U.S. Army software maintenance, sustaining engineering, and operational support efforts • Accurate and objective cost estimates are required to ensure that sufficient resources are available to execute the work required to keep systems operational and mission capable • To develop accurate estimates, the Army has been working collaboratively with the Air Force and the Navy to collect and analyze past cost performance data and build a software maintenance cost database

  5. Goals of Army Maintenance Study • Goal: provide the Army with objective decision information to accurately estimate, budget, and allocate the software maintenance, sustaining engineering, and operational support resources (collectively referred to as software maintenance resources) required to meet evolving mission and service affordability requirements • Three years of effort to date, focusing on the work done and issues experienced by Army and Air Force life cycle support centers • Findings to date seem universal across the service weapons systems community

  6. Information Requirements • Accurate estimates of software maintenance resources, mainly for weapons systems, from multiple perspectives: • Product Perspective • Organizational Perspective • Enterprise Perspective • Project and Release Perspective • Objective portrayal of the dynamic PDSS/PPSS maintenance environment • Consistent with work being performed

  7. Information Requirements • Consistent description of software maintenance tasks • Software Maintenance • Software Sustaining Engineering • Software Support Infrastructure – Facilities • Program/Project Management • Quantitative understanding of the key performance factors that influence software maintenance resource estimation

  8. Study Approach • Direct interface with DoD and industry software support organizations performing software maintenance tasks under contract and at DoD Life Cycle Support Centers: • Task identification - definition - allocations • Historical cost data collection - budget submissions and actuals • Organizational context data - performance factors • Stakeholder collaboration - protected sharing of data, information, findings: • Air Force, Army, Navy, DoD agencies and potentially allies • Industry, academia and professional groups

  9. Study Approach • Develop approaches and mechanisms to capture cost data and context information architectures and data stores • Perform data modeling - analysis: • Parametric cost model calibration - CER development • Performance “meta” model - factor relationships • Policy and decision information analysis • Make recommendations for improvements and increased affordability • Develop software with maintenance in mind • Smooth the transition from development to maintenance, sustaining engineering, and operational support
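One of the analysis steps above is parametric cost model calibration and CER development. As a minimal sketch of what that involves, the following fits a power-law CER of the form Effort = a × Size^b to historical release data by log-linear least squares; the data points and the 30 KSLOC example are illustrative assumptions, not study data.

```python
# Minimal sketch of calibrating a cost estimating relationship (CER) of the
# form Effort = a * Size^b from historical release data. All numbers below
# are made-up placeholders, not values from the Army study.
import numpy as np

size_ksloc = np.array([12.0, 40.0, 8.5, 75.0, 22.0])     # changed size per release
effort_pm = np.array([55.0, 160.0, 40.0, 310.0, 95.0])   # person-months actually spent

# Fit log(Effort) = log(a) + b * log(Size) by least squares.
b, log_a = np.polyfit(np.log(size_ksloc), np.log(effort_pm), 1)
a = np.exp(log_a)

print(f"CER: Effort = {a:.2f} * Size^{b:.2f}")
print(f"Estimate for a 30 KSLOC release: {a * 30.0 ** b:.0f} person-months")
```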

  10. Army Study Approach • Investigate maintenance • What are the tasks? • Who does them? • What are the costs? • How are they estimated? • What impacts future costs? • Understand • Current costs and risks • Current estimating practice • Current budgeting approach • What changes are needed

  11. Expected Outputs • Context-driven software maintenance performance model - configurable to product, organization, and enterprise activities and scope • Calibrated cost models for maintenance and cost estimating relationships for ACAT 1A systems • Multiple task categories • Domain specific with validated cost relationships • Software maintenance information architecture - common service database • Changes to policy and budgetary guidance needed to support systems transitions and workload growth • Software maintenance reporting requirements like the SRDR • More accurate estimates for the POM

  12. Performance Meta-Modeling • Constraints and influence factors: who performs the work (contractor or government), complexity, number of releases done in parallel, funding approach, color of money, domain • Factors: project, process, people, product • Work model: maintenance workload, sustaining engineering tasks, infrastructure tasks, program/project management tasks

  13. Summary of Study Findings • Study scope: over 250 projects surveyed, eight Army and AF centers visited, over 100 interviews, industry consulted • Findings: • Distribution of work much different than expected • Testing is the major maintenance activity • Transition and transfer is done poorly • Estimates and budgets don't cover all the work (sustaining engineering, product field & user support, regression testing) • Efficiencies are needed to cope with workload • Maintenance centers do more than just updates and repairs • Products: • Reports, papers, briefings, etc. • Web site • Initial maintenance cost and quality database

  14. Army Study Findings • Maintenance is done differently by contractors than by government shops • Contractors develop to requirements; government supports testing and field support • At a minimum, four software releases are being prepared by maintenance shops in parallel during the calendar year: the development release, the fielded release, the to-be-fielded release, and the requirements release • Work for software maintenance differs from development – more test-directed and constrained by environment

  15. Current Work Distribution Notes • About seventy percent of maintenance shops' work involves: • Maintenance • Sustaining Engineering • Independent V&V • The other thirty percent is devoted to other tasks: • Acquisition management • Software development • Maintenance staff includes both government and in-house contractor personnel

  16. Testing is Primary Maintenance Activity • As much as 55-70% of the technical work done during maintenance supports retesting and qualifying the system • Testing is much harder when developers fail to transition and turn over the needed set of regression tests • Support tasks are performed to maintain system integrity and support field operations

  17. Not All Of The Work Is Funded • Estimates formulated based on effort needed to make updates and repairs • Other activities like sustaining engineering and testing not fully covered • Unfunded mandates like Information Assurance not adequately covered • Small projects done on a level-of-effort (LOE) basis • Licenses may need to be funded by the enterprise • Resulting budgets force maintenance staff to play backlog reduction games • Shops make the updates and repairs that they can with the resources allocated • Cost models & cost estimating relationships (CERs) in use perpetuate the status quo • Shortfalls in funding need to be corrected

  18. Development System Not Ready for Maintenance • Transition requirements often waived, avoided or delayed • Facilities, tactical equipment and tools often not available when needed • Ownership rights to tools and special test equipment often an issue • Development SIL seldom transitioned for maintenance • Many aspects of “technical debt” are not addressed • Contractor often the only resource available to maintain system

  19. On-Going Tasks • Data collection • Questionnaire/instruments • Maintenance cost and quality database • Data administration, protection and management • Architecture development • Information needs • Data modeling • Analysis • Gap analysis • Indicators • Stakeholder Program • “Working one-on-one” • Web site • Case studies • Outreach • Collaborators • Conferences • Presentations • Publications • Working groups • Project management • Status and progress reviews

  20. Going Forward • Characterize software maintenance, sustaining engineering, and operational support • Understand commonalities and differences among services, domains, programs, and maintenance organizations • Clarify the differences between perceptions and realities • Coordinate the efforts of the Services and industry to collect relevant information • Understand the characteristics of post deployment software activities

  21. Collaborative Working Group • Develop working group with government and industry to explore the identified issues and provide recommendations • Requires collecting data to support the findings • Data collection must be done hands-on to reduce noise and increase confidence in results • Develop viable software maintenance cost estimation methods, models, and practices • Work with academia and the DAU to improve the education of the workforce on the realities of software maintenance, sustainment, and operational support

  22. In Summary • We invite you to participate in our joint efforts • We ask you to help us populate our software maintenance cost database • We are interested in any success stories you would like to share

  23. Workshop Agenda • Introductions • Summary of study findings and recommendations • Study current activities/next steps • WBS review • Influence factor review • Delphi survey • Roundtable discussions • Summary and break

  24. WBS Revision • To understand the factors that impact maintenance cost, we are developing a performance meta-model • When discussing the meta-model, we will summarize tasks around four major activities: • Software maintenance • Software sustaining engineering • Software support infrastructure & facilities • Program/project management

  25. WBS Revision • Contractor: • Contracts typically mimic software development contracts • Often require delivery to government sites that handle distribution, certify the software, perform field support and perform test and evaluation • Government: • Performs maintenance with real operational equipment and boots on the ground • Maintenance is only part of the tasks they perform (often fix hardware, do acquisition support, etc.)

  26. WBS Revision: Software Maintenance
  1.1 Software maintenance
    1.1.1 Release requirements
    1.1.2 Release planning
    1.1.3 Architecture analysis
    1.1.4 Hardware defect repair
    1.1.5 Software defect repair
    1.1.6 Hardware enhancements
    1.1.7 Software enhancements
    1.1.8 Release integration & test
    1.1.9 Release qualification & delivery

  27. WBS Revision (continued)
  1.3 Independent test and verification
    1.3.1 Test planning
    1.3.2 Test preparation
    1.3.3 Test conduct
    1.3.4 Independent analysis & verification
    1.3.5 Certifications
  1.5 Information assurance
    1.5.1 Protection services
    1.5.2 DIACAP
    1.5.3 IAVA

  28. WBS Revision: Software Sustaining Engineering
  1.2 Sustaining engineering
    1.2.1 Analysis and studies
    1.2.2 Emergency repairs
    1.2.3 User training
    1.2.4 External support
  1.6 Acquisition support
  1.7 Operational support
  1.9 Field support

  29. WBS Revision: Software Support Infrastructure & Facilities
  1.4 Product support
    1.4.1 Configuration management
    1.4.2 Quality assurance
    1.4.3 Process management (peer reviews)
    1.4.4 Supplier management
    1.4.5 Security
  1.8 Facility support
    1.8.1 Maintenance facility sustainment
    1.8.2 SIL sustainment
    1.8.3 Equipment sustainment
    1.8.4 Specialized test equipment and tools
    1.8.5 Network operations and administration

  30. WBS Revision: Program/Project Management and Other Cost Items
  1.10 Management
    1.10.1 Release management
    1.10.2 Sustaining engineering management
    1.10.3 Risk management
    1.10.4 Measurement analysis
  1.11 Parts
  1.12 Spares
  1.13 Licenses
  1.14 Contractual capabilities set FY (XX/XX)
  1.15 Contractual system mission capability
  1.16 Cost item general
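Because the WBS above is hierarchical, cost roll-up is a natural operation over it. Below is a minimal sketch of one way to encode the WBS as a tree for roll-up; the structure and numbering follow the slides, while the element selection and cost figures are illustrative placeholders, not study data.

```python
# Minimal sketch of encoding the maintenance WBS for cost roll-up.
# Cost figures are invented placeholders.
from dataclasses import dataclass, field

@dataclass
class WbsElement:
    number: str
    name: str
    cost: float = 0.0                      # direct cost charged to this element
    children: list["WbsElement"] = field(default_factory=list)

    def total(self) -> float:
        """Roll up direct cost plus all descendant costs."""
        return self.cost + sum(child.total() for child in self.children)

maintenance = WbsElement("1.1", "Software maintenance", children=[
    WbsElement("1.1.5", "Software defect repair", cost=120.0),
    WbsElement("1.1.8", "Release integration & test", cost=80.0),
])
print(f"{maintenance.number} total: {maintenance.total():.1f}")  # 200.0
```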

  31. Workshop Agenda • Introductions • Summary of study findings and recommendations • Study current activities/next steps • WBS review • Influence factor review • Delphi survey • Roundtable discussions • Summary and break

  32. Influence Factors • Business Factors • Extent of policy coverage - governance • Low level technical & business decision autonomy • Diverse organizational task and activity portfolios • Program and domain characteristics • Product and data rights • Source and color of money • Resourcing business models • Estimation/budgeting approaches • Information system capabilities

  33. Influence Factors • Complexity factors • Legacy software architectures • Legacy software technologies • Backfit security requirements • Backfit safety and other certification requirements • System of system integration requirements

  34. Influence Factors • Resource and task alignment factors • Policies, budgets, resources, tasks and outputs • Autonomous personnel and funding decisions • Plan versus execution • Alignment of task and funding models • Management reserves (reduced allocations) • Top-level expectations versus realities • Overhead versus direct funded functions • Amount of “technical debt”

  35. Influence Factors • Execution factors • Event-driven requirements and reprioritizations • Short-term mission driven execution schedules • Multiple customers – direct user involvement • Multiple funding streams • No. of releases being done in parallel (using same resources) • Backlog at start of release • Uncertainty of planning parameters • Organizational capability – flexibility - stability

  36. Size Drivers • Release: • No. of change requests • No. of repairs • Defects by priority and type • No. of patches • Backlog (technical debt): defects by priority, age and type • No. of COTS packages updated • Enterprise: • No. of programs/projects competing for resources • No. of releases being supported by program/project • Stability of releases over time as measured by change rate • Stability of core funding
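As a minimal sketch of how release-level size drivers like these could feed an estimate, the following weights counts of each driver into a single workload figure; the driver weights and counts are assumptions made up for illustration and would in practice be calibrated from historical data.

```python
# Minimal sketch of turning release-level size drivers into one sizing input.
# Weights and counts are illustrative assumptions, not calibrated values.
driver_weights = {   # rough effort weight per item, in person-hours
    "change_requests": 40.0,
    "repairs": 16.0,
    "patches": 8.0,
    "cots_updates": 24.0,
}

release_counts = {"change_requests": 30, "repairs": 120, "patches": 15, "cots_updates": 4}

weighted_hours = sum(driver_weights[k] * release_counts[k] for k in driver_weights)
print(f"Weighted release workload: {weighted_hours:.0f} person-hours")
```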

  37. Maintenance Effort Multipliers • People • Analyst capability • Programmer capability • Personnel continuity • Applications experience • Platform experience • Language and tool experience • Product • Platform/domain type • Application type • Language type • Requirements volatility (change) • Product complexity • Database size • Required reliability • Degree of reuse • Documentation match to needs • Execution time constraint • Main storage constraint • Platform volatility

  38. Effort Multipliers • Project • Degree of precedentedness • Development flexibility • Architecture/risk resolution • Team cohesion • Use of software tools • Multi-site capability • Required development schedule • Resource dedication • Process • Acquisition method • Development method • Development standard • Use of modern programming practices • Process maturity • Process volatility (change) • Other
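In the COCOMO family this forum centers on, multipliers like those above scale a nominal size-based effort estimate. The sketch below shows the standard COCOMO II form, Effort = A × Size^E × ∏ EMᵢ; the constants, the chosen multipliers, and their values are illustrative and uncalibrated, and none of the numbers come from the Army study.

```python
# Minimal sketch of a COCOMO II-style effort calculation for a maintenance
# release. A, E, and the multiplier values are illustrative placeholders.
from math import prod

A, E = 2.94, 1.10          # nominal-style coefficients, not calibrated here
size_ksloc = 25.0          # equivalent size of the release's changes

effort_multipliers = {
    "analyst_capability": 0.85,   # capable team lowers cost below nominal
    "product_complexity": 1.17,   # complex legacy architecture raises cost
    "platform_volatility": 1.15,  # unstable target platform raises cost
}

effort_pm = A * size_ksloc ** E * prod(effort_multipliers.values())
print(f"Estimated effort: {effort_pm:.0f} person-months")
```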

  39. Effort Multipliers • Because of time limitations, we will not look at size and effort multipliers for: • Software sustaining engineering • Software support infrastructure and facilities • Program/project management • We will address these next year, either by email or at another conference such as PSM

  40. Delphi Survey • Goal – determine which factors you believe have the greatest impact on software maintenance projects • Scope: • Identify the tasks that your maintenance shop performs • Identify the factors to which its effort is most sensitive • Schedule is not considered because it is fixed in maintenance • Size of project is considered because many maintenance projects are small • Influence factors can be viewed as constraints imposed on either the enterprise or the project
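A Delphi survey converges over rounds by feeding group statistics back to the panel. As a minimal sketch of scoring one round, the following computes each factor's median rating and interquartile range and flags narrow-spread factors as consensus; the factor names, ratings, and consensus threshold are all invented for illustration, not actual workshop responses.

```python
# Minimal sketch of summarizing one Delphi round: median and interquartile
# range per influence factor, flagging converged factors. Ratings invented.
import statistics

ratings = {  # factor -> panelist ratings on a 1-5 impact scale
    "requirements_volatility": [5, 4, 5, 4, 5],
    "parallel_releases":       [4, 5, 3, 4, 4],
    "funding_stability":       [2, 5, 1, 4, 3],
}

for factor, scores in ratings.items():
    q1, med, q3 = statistics.quantiles(scores, n=4)
    consensus = (q3 - q1) <= 1.0   # narrow spread -> treat as consensus
    print(f"{factor:25s} median={med:.1f} IQR={q3 - q1:.1f} "
          f"{'consensus' if consensus else 'revisit next round'}")
```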

  41. Workshop Agenda • Introductions • Summary of study findings and recommendations • Study current activities/next steps • WBS review • Influence factor review • Delphi survey • Roundtable discussions • Summary and break

  42. Roundtable Discussion • What did you think were the three biggest influence factors? • What do you believe are the three drivers that the maintenance effort is most sensitive to?

  43. Workshop Agenda • Introductions • Summary of study findings and recommendations • Study current activities/next steps • WBS review • Influence factor review • Delphi survey • Roundtable discussions • Summary and break

  44. In Summary • We have summarized the results of our study • We have reviewed the maintenance WBS that we have developed • We have conducted a Delphi survey to identify the influence factors and cost drivers important to estimating • We will summarize findings and present them on Friday • If you wish to be on our distribution list, let us know [Slide photo captioned: "This is why Don lives in Prescott, AZ"]
