
Presentation Transcript


  1. Capers Jones & Associates LLC SOFTWARE SIZING AND COST ESTIMATING IN 2011 Capers Jones, President Quality Seminar: talk 4 Web: http://www.spr.com Email: Cpers.Jones3@gmail.com June 11, 2011

  2. FOUR KEY AREAS THAT INFLUENCE SOFTWARE COST ESTIMATES
     • Technology
     • Personnel
     • Processes
     • Environment
     Together these four areas drive software quality and productivity.

  3. COMPARISON OF TWO MAJOR TOOL CLASSES

     COST ESTIMATION                     PROJECT MANAGEMENT
     Expert systems (rule-based)         Calculation tools (algorithm-based)
     Embedded knowledge                  Requires expert users
     Software oriented                   General purpose
     Information systems                 Construction
     Military systems                    Engineering
     Embedded systems                    Manufacturing
     Commercial systems                  Defense
     Supports sizing                     No sizing support
     Supports quality                    No quality support
     Supports GANTT charts at phase      Supports PERT and CPM nets
     and activity levels                 down to the worker level

  4. SYNERGY BETWEEN TOOL CLASSES
     SOFTWARE COST ESTIMATING
     • Sizing of all project deliverables
     • Quality and reliability modeling
     • Tool and process modeling
     • Adjustments for programming languages
     PROJECT MANAGEMENT
     • Critical path analysis
     • Task-level scheduling
     • Individual job assignments
     • Progress monitoring
     • Cost accumulation

  5. HOW IMPORTANT IS PROJECT MANAGEMENT?

     ACTIVITY             SUCCESSFUL PROJECTS   CANCELED PROJECTS
     Sizing               Good                  Poor
     Planning             Very Good             Very Poor
     Estimating           Very Good             Very Poor
     Milestone tracking   Good                  Very Poor
     Measurement          Good                  Very Poor
     Change control       Excellent             Poor
     Quality              Excellent             Very Poor
     Risk analysis        Good                  Very Poor
     Overall              Very Good             Very Poor

  6. ESTIMATING PRINCIPLES: SIZING x ATTRIBUTES = ESTIMATES
     SIZING: function points, source code, documents, defects, defect removal
     ATTRIBUTES: project, constraints, team, technology, environment
     ESTIMATES: staffing, schedules, effort, costs, quality

  7. THIRTY-FIVE FACTORS THAT INFLUENCE SOFTWARE
     Classification Factors
     1. Project nature (new, enhancement, maintenance, etc.)
     2. Project scope (module, object, program, system, class, etc.)
     3. Project topology (stand-alone, client-server, distributed, etc.)
     4. Project class (civilian, military, in-house, contract, etc.)
     5. Project type (batch, interactive, systems, embedded, etc.)
     6. Hardware platform (mainframe, mini, PC, custom, etc.)
     7. Software platform (Linux, Mac OS, MVS, Windows Vista, etc.)

  8. THIRTY-FIVE FACTORS THAT INFLUENCE SOFTWARE
     Project Factors
     8. Size of the project (function points, LOC, deliverables)
     9. Complexity of the project (cyclomatic, algorithmic, etc.)
     10. Constraints of the project (schedule, security, etc.)
     11. Novelty of the project (new, repeat, hybrid, etc.)
     12. Value of the project (strategic, high, moderate, etc.)
     13. Risks of the project (litigation, competition, markets, etc.)
     14. Rate of creeping requirements (change % per month)

  9. THIRTY-FIVE FACTORS THAT INFLUENCE SOFTWARE
     Technology Factors
     15. Any formal methodology used (Extreme, RUP, etc.)
     16. Project management tools used for the application
     17. Development tools used for the application
     18. Defect prevention approaches used (JAD, QFD, TQM, etc.)
     19. Defect removal operations and tools used (design reviews, code inspections, tests, etc.)
     20. Programming language(s) used
     21. Volume of reusable materials available and used

  10. THIRTY-FIVE FACTORS THAT INFLUENCE SOFTWARE
     Social and Ergonomic Factors
     22. Experience and skill of the project managers
     23. Experience and skill of the development team
     24. Experience and cooperation of the clients
     25. Organization and specialization applied to the project
     26. Office space and office ergonomics
     27. Targets or goals set for the project by clients or executives
     28. SEI CMM capability level of the development organization

  11. THIRTY-FIVE FACTORS THAT INFLUENCE SOFTWARE
     When international projects are considered, a significant new set of factors
     is added to the list that must be evaluated and dealt with.
     International Factors
     29. Local laws or union regulations that affect software projects
     30. Communication channels between clients and developers
     31. Translations of screens and documents into multiple languages
     32. Variations in public holidays and vacation periods
     33. Variations in staff compensation levels in different countries
     34. Variations in national work habits in different countries
     35. Variations in currency exchange rates among countries

  12. LARGE SYSTEM SOFTWARE COST FACTORS • Applications > 10,000 function points: • Finding and fixing defects. • Producing paper documents. • Meetings and communication. • Coding or programming. • Project management. • Change control.

  13. SMALL PROGRAM SOFTWARE COST FACTORS • Applications < 1,000 function points: • Finding and fixing defects. • Coding or programming. • Project management. • Producing paper documents. • Meetings and communication. • Change control.

  14. AGILE SOFTWARE COST FACTORS • Applications < 1,500 function points: • Finding and fixing defects. • Coding or programming. • Meetings and communications. • Project management. • Change control. • Producing paper documents.

  15. SOFTWARE COST ESTIMATION TOOLS

     TOOL                     YEAR AVAILABLE   ORGANIZATION
     PRICE-S                  1973             RCA
     SEER                     1974             Hughes
     SLIM                     1979             Air Force
     COCOMO                   1981             TRW
     SPQR/20                  1985             SPR
     CHECKPOINT - CHECKMARK   1989             SPR
     KNOWLEDGEPLAN            1995             SPR
     COCOMO II                1995             USC
     ISBSG                    2005             ISBSG
     SOFTWARE RISK MASTER     2008             Capers Jones

  16. Software Function Point Metrics: 1979 to 2011
     • In 1979 IBM put function point metrics in the public domain.
     • Function points are now key metrics for sizing and estimating.
     • Function points are weighted totals of five external factors:

     Factor                     Number   Weight   Total
     Number of inputs              10   x    5  =    50
     Number of outputs             10   x    4  =    40
     Number of inquiries           50   x    5  =   250
     Number of logical files        5   x   10  =    50
     Number of interfaces          10   x    7  =    70
                                                  -----
     Unadjusted function points                     460
     Complexity adjustment multiplier             x 1.2
     Adjusted function points                       552
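
The arithmetic on this slide is mechanical enough to script. Below is a minimal Python sketch of the calculation exactly as shown, using the slide's own counts and weights; all names are illustrative, and real IFPUG counting assigns low/average/high complexity weights per factor, which this sketch omits.

```python
# Minimal sketch of the function point calculation on this slide.
# Counts and weights are the slide's example values; IFPUG counting
# actually distinguishes low/average/high weights per factor (omitted).

def function_points(factors, complexity_multiplier):
    """Return (unadjusted, adjusted) function point totals."""
    unadjusted = sum(count * weight for count, weight in factors.values())
    return unadjusted, unadjusted * complexity_multiplier

factors = {                  # factor: (number, weight)
    "inputs":        (10, 5),
    "outputs":       (10, 4),
    "inquiries":     (50, 5),
    "logical_files": (5, 10),
    "interfaces":    (10, 7),
}

unadjusted, adjusted = function_points(factors, complexity_multiplier=1.2)
print(unadjusted, adjusted)  # 460 552.0
```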

  17. KEY ESTIMATING DEFINITIONS
     ASSIGNMENT SCOPE            Amount of work assigned to one person
     PRODUCTION RATE             Amount of work accomplished in a standard time period
     OVERLAP                     The percent of a task not finished when a follow-on task begins
     SCHEDULE                    Calendar time required to complete a task
     REQUIREMENTS "CREEP"        Growth rate in unplanned requirements after sign-off
     DEFECT REMOVAL EFFICIENCY   Percentage of defects removed before delivery

  18. FUNDAMENTAL ESTIMATING EQUATIONS
     STAFF          = SIZE / ASSIGNMENT SCOPE
     EFFORT         = SIZE / PRODUCTION RATE
     SCHEDULE       = EFFORT / STAFF
     COST           = EFFORT x COMPENSATION
     BURDENED COST  = COST + OVERHEAD
     DEFECTS        = POTENTIAL - REMOVAL + BAD FIXES
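
These equations translate directly into code. Here is a minimal sketch (the function names are mine, not from the slide); the only care needed is keeping units consistent, e.g. size in function points and production rate in FP per staff month.

```python
# The slide's six estimating equations, transcribed directly.
# Keep units consistent: if size is in FP, assignment scope is FP per
# person and production rate is FP per staff month.

def staff(size, assignment_scope):
    return size / assignment_scope

def effort(size, production_rate):
    return size / production_rate           # staff months

def schedule(effort_months, staff_count):
    return effort_months / staff_count      # calendar months

def cost(effort_months, monthly_compensation):
    return effort_months * monthly_compensation

def burdened_cost(base_cost, overhead):
    return base_cost + overhead

def delivered_defects(potential, removed, bad_fixes):
    return potential - removed + bad_fixes

print(staff(10_000, 1_500))  # ~6.7; slide 20 rounds this to 7 analysts
```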

  19. ESTIMATING EXAMPLES

     TOPIC          ASSIGNMENT SCOPE      PRODUCTION RATE
     Requirements   1,500 FP              150 FP per month
     Design         1,000 FP              75 FP per month
     Coding         15,000 LOC (150 FP)   2,500 LOC per month (25 FP per month)
     Testing        200 test cases        60 tests per month (1 test per FP)
     Manuals        250 pages             50 pages per month (0.25 pages per FP)
     Personnel      8 employees           40 FP per month

  20. ASSIGNMENT SCOPE EXAMPLES
     PROJECT = 10,000 function points (1,000,000 LOC)
     Requirements:  10,000 FP / 1,500      =   7 analysts
     Design:        10,000 FP / 1,000      =  10 designers
     Coding:        1,000,000 LOC / 15,000 =  66 programmers
     Testing:       10,000 FP / 200        =  50 testers
     Manuals:       2,500 pages / 250      =  10 writers
     Management:    10,000 FP / 1,000      =  10 managers
     TOTAL                                   153 personnel

  21. PRODUCTION RATE EXAMPLES
     PROJECT = 1,000,000 LOC (10,000 function points)
     Requirements:  10,000 FP / 150       =    67 staff months
     Design:        10,000 FP / 75        =   133 staff months
     Coding:        1,000,000 LOC / 2,500 =   400 staff months
     Testing:       10,000 tests / 60     =   167 staff months
     Manuals:       2,500 pages / 50      =    50 staff months
     Management:    10,000 FP / 40        =   250 staff months
     TOTAL                                  1,067 staff months

  22. SCHEDULING EXAMPLES
     PROJECT = 1,000,000 LOC (10,000 function points)
     Requirements:   67 months / 7  =  9.5 calendar months
     Design:        133 months / 10 = 13.3 calendar months
     Coding:        400 months / 66 =  6.0 calendar months
     Testing:       167 months / 50 =  3.3 calendar months
     Manuals:        50 months / 10 =  5.0 calendar months
     Management:    250 months / 10 = 25.0 calendar months
     OVERALL SCHEDULE   36.5 calendar months
     OVERLAP            35%
     NET SCHEDULE       23.7 calendar months
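
Slides 20 through 22 chain the fundamental equations together; here is a self-contained sketch that reproduces them for the 10,000 FP project. Two hedges: the slide rounds staff counts to whole people before dividing, so its per-activity schedules differ slightly from the raw arithmetic below, and I assume management runs in parallel rather than adding its 25 months to the overall schedule (otherwise the slide's 36.5-month total cannot be reached).

```python
# Reproducing slides 20-22: staff = size / scope, effort = size / rate,
# schedule = effort / staff, then apply the 35% overlap. Values are the
# slides' own; the slide rounds staff to whole people first, so its
# figures differ slightly from these raw results (e.g. 23.7 vs ~24.5).

# activity: (size, assignment scope, production rate per staff month)
ACTIVITIES = {
    "Requirements": (10_000, 1_500, 150),        # function points
    "Design":       (10_000, 1_000, 75),
    "Coding":       (1_000_000, 15_000, 2_500),  # lines of code
    "Testing":      (10_000, 200, 60),
    "Manuals":      (2_500, 250, 50),
    "Management":   (10_000, 1_000, 40),
}

waterfall = 0.0
for name, (size, scope, rate) in ACTIVITIES.items():
    staff = size / scope
    effort = size / rate                # staff months
    sched = effort / staff              # calendar months
    print(f"{name:12s} staff={staff:6.1f} effort={effort:7.1f} schedule={sched:5.1f}")
    if name != "Management":            # assumed to run in parallel
        waterfall += sched

print(f"Waterfall schedule: {waterfall:.1f} months")
print(f"With 35% overlap:   {waterfall * (1 - 0.35):.1f} months")
```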

  23. SCHEDULING OVERLAP EXAMPLE
     (Gantt-style chart of overlapping phases: Requirements, Design, Coding,
     Testing, Manuals, Management; the bars are not reproduced in this transcript.)
     Schedule without overlap (waterfall model) = 36 months
     Schedule with overlap = 24 months

  24. COST VARIANCES
     AVERAGE SOFTWARE ENGINEER = $73,000 PER YEAR
     AVERAGE SOFTWARE MANAGER  = $90,000 PER YEAR
     Variance by country           = +/- 50%
     Variance by length of service = +/- 25%
     Variance by industry          = +/- 20%
     Variance by company size      = +/- 17%
     Variance by geographic region = +/- 15%
     Variance by occupation        = +/- 13%

  25. COST VARIANCES FOR THE SAME POSITION

     POSITION     INDUSTRY    CITY        ANNUAL SALARY
     Programmer   Banking     Geneva      $85,000
     Programmer   Banking     New York    $75,000
     Programmer   Telecom     Chicago     $70,000
     Programmer   Defense     St. Louis   $60,000
     Programmer   Retail      Tampa       $55,000
     Programmer   Education   Biloxi      $50,000
     Programmer   Software    Bombay      $15,000
     Programmer   Defense     Beijing     $10,000

  26. REQUIREMENTS "CREEP" FOR SOFTWARE

     SOFTWARE CLASS        MONTHLY CHANGE   MAXIMUM CREEP
     Outsourced software   1.0%             75%
     Information systems   1.5%             125%
     Systems software      2.0%             150%
     Military software     2.0%             200%
     Civilian government   2.5%             200%
     Commercial software   3.5%             250%

     Requirements creep is a common source of schedule and cost overruns,
     and often leads to litigation.
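
How much creep accumulates depends on the gap between requirements sign-off and delivery. A small sketch follows, assuming monthly compounding (the slide does not say whether creep compounds or grows linearly) and applying the class's maximum creep as a cap:

```python
# Sketch of requirements creep growth. Rates and caps are the slide's;
# monthly compounding is an assumption (linear growth is also plausible).

def grown_size(initial_fp, monthly_creep, months, max_total_creep):
    growth = min((1 + monthly_creep) ** months - 1, max_total_creep)
    return initial_fp * (1 + growth)

# Commercial software: 3.5% per month, capped at 250% total creep.
print(round(grown_size(1_000, 0.035, 24, 2.50)))  # ~2283 FP after 24 months
```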

  27. EXAMPLE OF SCHEDULE MACRO ESTIMATION
     Raise the function point total of the project to the following power.
     The result is the schedule in calendar months.

     CLASS OF SOFTWARE     BEST   AVERAGE   WORST
     Agile software        0.32   0.37      0.39
     OO software           0.33   0.38      0.43
     Extreme (XP)          0.33   0.39      0.43
     Outsource software    0.37   0.39      0.45
     MIS software          0.39   0.41      0.46
     Commercial software   0.39   0.43      0.47
     Systems software      0.43   0.44      0.48
     Military software     0.45   0.47      0.50
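
As a one-line rule this is easy to apply. A quick sketch using exponents from the table above:

```python
# The slide's macro schedule rule: calendar months = FP ** exponent,
# where the exponent comes from the table above.

def macro_schedule(function_points, exponent):
    return function_points ** exponent

# 10,000 FP systems software, average case (exponent 0.44):
print(round(macro_schedule(10_000, 0.44), 1))  # ~57.5 calendar months

# The same size project as agile, best case (exponent 0.32):
print(round(macro_schedule(10_000, 0.32), 1))  # ~19.1 calendar months
```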

  28. 50 MANUAL VS. 50 AUTOMATED COST ESTIMATES
     • Manual estimation works well below 500 function points.
     • Automation is more accurate above 500 function points.
     • Above 5,000 function points manual estimates are dangerous.
     • Automated estimates are usually accurate or conservative.
     • Manual estimates tend toward excessive optimism.

  29. ACCURACY RANGES FOR 50 MANUAL ESTIMATES
     (Projects between 1,000 and 10,000 function points)
     (Histogram not reproduced: accuracy bins run from >40% conservative
     through 0 to <-40% optimistic; the 50 manual estimates fall in groups
     of 4, 17, and 29 moving toward the optimistic end.)
     Manual estimates are usually very optimistic.

  30. ACCURACY RANGES FOR 50 AUTOMATED ESTIMATES
     (Projects between 1,000 and 10,000 function points)
     (Histogram not reproduced: on the same accuracy scale, the 50 automated
     estimates fall in groups of 3, 24, 22, and 1 moving from the
     conservative end toward the optimistic end.)
     Automated estimates are usually accurate or set to be conservative.

  31. LEVELS OF SOFTWARE ESTIMATION

     LEVEL OF DETAIL   ESTIMATING METHOD
     Project level     Macro estimation
     Phase level       Macro estimation
     Activity level    Micro estimation
     Task level        Micro estimation
     Sub-task level    Micro estimation

  32. PROJECT, PHASE, AND ACTIVITY LEVELS

     PROJECT LEVEL: Project

     PHASE LEVEL: 1. Requirements; 2. Analysis; 3. Design; 4. Coding;
     5. Testing; 6. Installation

     ACTIVITY LEVEL: 1. Requirements; 2. Prototyping; 3. Architecture;
     4. Planning; 5. Initial design; 6. Detail design; 7. Design review;
     8. Coding; 9. Reused code acquisition; 10. Package acquisition;
     11. Code inspection; 12. Independent verif. & valid.;
     13. Configuration control; 14. Integration; 15. User documentation;
     16. Unit test; 17. Function test; 18. Integration test; 19. System test;
     20. Field test; 21. Acceptance test; 22. Independent test;
     23. Quality assurance; 24. Installation; 25. Management

  33. SOFTWARE COST ESTIMATION SEQUENCE
     1) Start with sizing: function points, specifications, source code,
        test cases, user manuals
     2) Estimate quality: defect potentials, defect removal
     3) Estimate staffing
     4) Adjust for "soft" factors: experience, tools, process
     5) Estimate effort
     6) Estimate schedules
     7) Estimate costs

  34. MAJOR SIZING METHODS FOR SOFTWARE
     • Sizing by "patterns" from similar projects in similar industries.
     • Sizing from similar projects in your own enterprise.
     • Sizing deliverables using lines of code (LOC) metrics.
     • Sizing deliverables using function point metrics.
     • Early approximate sizing from partial function point data.
     • Sizing from external attributes (business units, staff days).

  35. DOCUMENT PAGES PER FUNCTION POINT

                                 SYSTEMS    MIS        MILITARY   COMMERCIAL
                                 SOFTWARE   SOFTWARE   SOFTWARE   SOFTWARE
     User requirements           0.45       0.50       0.85       0.30
     Functional specifications   0.80       0.55       1.75       0.60
     Logic specifications        0.85       0.50       1.65       0.55
     Test plans                  0.25       0.10       0.55       0.25
     User tutorial documents     0.30       0.15       0.50       0.85
     User reference documents    0.45       0.20       0.85       0.90
     Total document set          3.10       2.00       6.15       3.45
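
Multiplying these norms by project size gives a quick documentation estimate. A minimal sketch using the total-document-set row (names are illustrative):

```python
# Sketch: estimate total document pages from the slide's pages-per-FP norms.

PAGES_PER_FP = {   # total document set, from the slide's bottom row
    "Systems": 3.10, "MIS": 2.00, "Military": 6.15, "Commercial": 3.45,
}

def document_pages(function_points, software_type):
    return function_points * PAGES_PER_FP[software_type]

print(document_pages(10_000, "Military"))  # 61500.0 pages
print(document_pages(10_000, "MIS"))       # 20000.0 pages
```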

  36. CODE STATEMENTS PER FUNCTION POINT
     (Range of source code statements required to encode one function point)

     LANGUAGE              LOWEST   MEDIAN   HIGHEST
     Ada83                 60       71       80
     Assembly (macro)      130      213      300
     Basic (interpreted)   63       98       135
     C                     60       128      170
     C++                   30       90       145
     COBOL                 65       107      160
     Fortran               75       105      157
     Pascal                50       91       125
     Java                  35       52       87
     Program generators    10       16       20
     Smalltalk             12       21       28

     Note: Data are available for more than 600 programming languages and dialects.
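
These ratios support "backfiring": converting between LOC and function points in either direction. A sketch with a few of the slide's median values; given the wide lowest-to-highest ranges, the results are rough approximations at best.

```python
# Sketch of LOC <-> function point "backfiring" using the slide's
# median statements-per-FP values. Per-language ranges are wide, so
# treat the output as a rough approximation only.

STATEMENTS_PER_FP = {
    "Ada83": 71, "Assembly (macro)": 213, "C": 128, "C++": 90,
    "COBOL": 107, "Fortran": 105, "Java": 52, "Smalltalk": 21,
}

def loc_from_fp(function_points, language):
    return function_points * STATEMENTS_PER_FP[language]

def fp_from_loc(loc, language):
    return loc / STATEMENTS_PER_FP[language]

print(loc_from_fp(1_000, "Java"))        # 52000 statements
print(round(fp_from_loc(100_000, "C")))  # ~781 function points
```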

  37. U.S. NORMS FOR DEFECT REMOVAL EFFICIENCY
     (Defect potential data expressed as defects per function point)

     DEFECT ORIGINS   DEFECT POTENTIALS   REMOVAL EFFICIENCY   DELIVERED DEFECTS
     Requirements     1.00                77%                  0.23
     Design           1.25                85%                  0.19
     Coding           1.75                95%                  0.09
     Document         0.60                80%                  0.12
     Bad fixes        0.40                70%                  0.12
     TOTAL            5.00                85%                  0.75
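
The delivered-defects column is simply potential x (1 - removal efficiency) per origin; scaling the per-FP total by project size gives an expected delivered-defect count. A sketch using the slide's norms:

```python
# Sketch applying the slide's U.S. norms: delivered defects per FP =
# potential * (1 - removal efficiency), summed over defect origins.

DEFECT_NORMS = {   # origin: (potential per FP, removal efficiency)
    "Requirements": (1.00, 0.77),
    "Design":       (1.25, 0.85),
    "Coding":       (1.75, 0.95),
    "Document":     (0.60, 0.80),
    "Bad fixes":    (0.40, 0.70),
}

def delivered_defects(function_points):
    per_fp = sum(p * (1 - eff) for p, eff in DEFECT_NORMS.values())
    return function_points * per_fp   # per_fp ~= 0.745; slide rounds to 0.75

print(round(delivered_defects(10_000)))  # ~7450 delivered defects
```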

  38. IMPACTS OF FOUR KEY MANAGEMENT FACTORS
     SIXTEEN PERMUTATIONS OF FOUR FACTORS:
     • MANUAL VERSUS AUTOMATED COST ESTIMATING
     • MANUAL VERSUS AUTOMATED SCHEDULING
     • FORMAL VERSUS INFORMAL MILESTONE TRACKING
     • FORMAL VERSUS INFORMAL QUALITY CONTROL

  39. PROJECT MANAGEMENT IMPACT
     Worst-case scenario: probability of selected outcomes

                                    CANCEL   DELAYS   ON TIME   EARLY
     1) Manual estimates            40%      45%      15%       0%
        Manual plans
        Informal tracking
        Minimal quality control

     NOTE: Assumes a 10,000 function point system.

  40. PROJECT MANAGEMENT (cont.)
     Single-factor scenarios: probability of selected outcomes

                                    CANCEL   DELAYS   ON TIME   EARLY
     2) Manual estimates            37%      42%      20%       1%
        Automated plans
        Informal tracking
        Minimal quality control
     3) Manual estimates            35%      39%      24%       2%
        Manual plans
        Formal tracking
        Minimal quality control
     4) Automated estimates         33%      36%      28%       3%
        Manual plans
        Informal tracking
        Minimal quality control
     5) Manual estimates            30%      32%      34%       4%
        Manual plans
        Informal tracking
        Optimal quality control

  41. PROJECT MANAGEMENT (cont.)
     Two-factor scenarios: probability of selected outcomes

                                    CANCEL   DELAYS   ON TIME   EARLY
     6) Manual estimates            27%      28%      40%       5%
        Automated plans
        Formal tracking
        Minimal quality control
     7) Automated estimates         23%      26%      45%       6%
        Automated plans
        Informal tracking
        Minimal quality control
     8) Automated estimates         20%      23%      50%       7%
        Manual plans
        Formal tracking
        Minimal quality control

  42. PROJECT MANAGEMENT (cont.)
     Two-factor scenarios: probability of selected outcomes

                                    CANCEL   DELAYS   ON TIME   EARLY
     9) Manual estimates            18%      20%      54%       8%
        Automated plans
        Informal tracking
        Optimal quality control
     10) Manual estimates           16%      17%      58%       9%
        Manual plans
        Formal tracking
        Optimal quality control
     11) Automated estimates        13%      15%      62%       10%
        Manual plans
        Informal tracking
        Optimal quality control

  43. PROJECT MANAGEMENT (cont.)
     Three-factor scenarios: probability of selected outcomes

                                    CANCEL   DELAYS   ON TIME   EARLY
     12) Automated estimates        10%      12%      67%       11%
        Automated plans
        Formal tracking
        Minimal quality control
     13) Manual estimates           8%       10%      69%       13%
        Automated plans
        Formal tracking
        Optimal quality control
     14) Automated estimates        5%       8%       72%       15%
        Manual plans
        Formal tracking
        Optimal quality control
     15) Automated estimates        3%       6%       74%       17%
        Automated plans
        Manual tracking
        Optimal quality control

  44. PROJECT MANAGEMENT (cont.)
     Best-case scenario: probability of selected outcomes

                                    CANCEL   DELAYS   ON TIME   EARLY
     16) Automated estimates        1%       2%       78%       19%
        Automated plans
        Formal tracking
        Optimal quality control

     • Good project management is the key to software success.
     • Bad project management leads to software failures.

  45. FUNCTION POINTS AND PROJECT MANAGEMENT TOOLS

     SELECTED TOOLS            LAGGING   AVERAGE   LEADING
     1. Project planning       1,000     1,500     3,000
     2. Cost estimating        --        300       3,000
     3. Project office         --        --        3,000
     4. Statistical analysis   --        750       3,000
     5. Personnel mgt.         500       1,000     2,000
     6. Quality estimating     --        --        2,000
     7. Process assessment     --        500       2,000
     8. Risk analysis          --        --        1,500
     9. Value analysis         --        250       1,500
     10. Department budgets    500       700       1,000
     TOTALS                    2,000     5,000     22,000
     TOOLS USED                3         7         10

  46. STRUCTURES OF SOFTWARE PROJECTS
     PROJECT CLASSES, TYPES, AND SIZES DIFFER:
     • Military specifications are > 3 times larger than civilian specifications.
     • Military staffs are > 30% larger than civilian staffs for the same size project.
     • Systems software testing costs > 20% more than testing for MIS projects.
     • Systems software testing is > 20% more effective than MIS testing.
     • Outsource project staffs are > 10% larger than in-house project staffs.

  47. MAJOR SOFTWARE CLASSES IN 2008
     U.S. software production (pie chart, values paired in the order shown):
     Military software     26%
     Information systems   52%
     System software       22%

  48. MAJOR SOFTWARE FOCUS IN 2008
     U.S. software development vs. maintenance (pie chart, values paired in the order shown):
     New development   28%
     Enhancements      42%
     Maintenance       30%

  49. OVERALL U.S. SOFTWARE PRODUCTIVITY (Figure 3-3; chart not reproduced in this transcript)

  50. DISTRIBUTION OF U.S. IT SOFTWARE PRODUCTIVITY (Figure 3-4; chart not reproduced in this transcript)
