  1. Software Sizing Ali Afzal Malik & Vu Nguyen USC-CSSE CSCI 510 ©USC-CSSE

  2. Outline
  • Nature, value, and challenges of sizing
  • Sizing methods and tools
  • Sizing metrics
  • Maintenance sizing
  • Conclusions and references

  3. Nature and Value of Sizing
  • Definitions and general characteristics
  • Value provided by sizing
  • When does sizing add less value?

  4. Size: Bigness/Bulk/Magnitude
  • A liquid: measured in gallons
  • A house: measured in usable square feet
  • Software: measured in ???

  5. Sizing Characteristics
  • Generally considered additive: Size(A ∪ B) = Size(A) + Size(B)
  • Often weighted by complexity: some artifacts are “heavier” to put in place (e.g., requirements, function points)

  6. Software Size is an Important Metric
  • A key metric used to determine:
  • software project effort and cost: (effort, cost) = f(size, factors)
  • time to develop
  • staff
  • quality
  • productivity

  7. When Does Sizing Add Less Value?
  • Often easier to go directly to estimating effort
  • Imprecise size parameters: GUI builders; COTS integration
  • Familiar, similar-size applications: analogy to previous effort (“yesterday’s weather”)
  • When size is a dependent variable: time-boxing with prioritized features

  8. Sizing Challenges
  • Brooks’ factors: software system, product
  • Cone of uncertainty

  9. Brooks’ Factor of 9 for a Programming Systems Product (adapted from Brooks, 1995)
  • A Program becomes a Programming Product (x3) or a Programming System (x3); combining both yields a Programming Systems Product (x9)
  • Product: handles off-nominal cases, testing; well-documented
  • System: handles interoperability, data management, business management
  • Telecom example: 1/6 basic call processing; 1/3 off-nominals; 1/2 billing

  10. The Cost and Size Cone of Uncertainty (Boehm et al., 2000)
  • If you don’t know what you’re building, it’s hard to estimate its size or cost

  11. Outline
  • Nature, value, and challenges of sizing
  • Sizing methods and tools
  • Sizing metrics
  • Maintenance sizing
  • Conclusions and references

  12. Basic Methods, Strengths, and Weaknesses (adapted from Boehm, 1981)

  13. Comparison with Previous Projects
  • Comparable metadata: domain, user base, platform, etc.
  • Pair-wise comparisons
  • Differential functionality
  • Analogy; “yesterday’s weather”

  14. Group Consensus
  • Wideband Delphi
  • Anonymous estimates; a facilitator provides a summary
  • Estimators discuss results and rationale
  • Iterative process: estimates converge after revision in subsequent rounds
  • Improves understanding of the product’s nature and scope
  • Works when estimators are collocated
  • Planning Poker (Cohn, 2005)
  • Game played with a deck of cards
  • A moderator presents the item to be estimated
  • Participants privately choose an appropriate card from the deck
  • Divergence is discussed
  • Iterative: convergence in subsequent rounds
  • Ensures everyone participates
  • Useful for estimation in agile projects

  15. Probabilistic Methods
  • PERT (Putnam and Fitzsimmons, 1979)
  • 3 estimates: optimistic, most likely, pessimistic
  • Expected Size = (optimistic + 4 × most likely + pessimistic) / 6
  • Std. Dev. = (pessimistic − optimistic) / 6
  • Easy to use
  • The ratio of standard deviation to expected size reveals uncertainty
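The PERT formulas above take only a few lines of Python; the 8/9/12 KSLOC inputs below are made-up illustration values, not from the slides:

```python
def pert_size(optimistic, most_likely, pessimistic):
    """PERT (beta-distribution) size estimate and its standard deviation."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

# Hypothetical module sized in KSLOC: 8 optimistic, 9 most likely, 12 pessimistic
expected, sd = pert_size(8, 9, 12)
print(round(expected, 2), round(sd, 2))  # 9.33 0.67
```

Note how the standard deviation (0.67) relative to the expected size (9.33) gives a quick read on how uncertain the estimate is.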

  16. Why Do People Underestimate Size? (Boehm, 1981)
  • They are basically optimistic and desire to please
  • They have incomplete recall of previous experience
  • They are generally not familiar with the entire software job

  17. Outline
  • Nature, value, and challenges of sizing
  • Sizing methods and tools
  • Sizing metrics
  • Maintenance sizing
  • Conclusions and references

  18. Sizing Metrics vs. Time and Degree of Detail (Stutzke, 2005)
  • Process phases: Concept → Elaboration → Construction
  • Primary aids, in order of increasing time and detail: product vision, analogies → operational concept, context diagram → specification, feature list → architecture, top-level design → detailed design → code
  • Possible measures over the same progression: key features, user roles, use cases, subsystems → function points, screens, reports, files, application points, components, objects → source lines of code, logical statements

  19. Sizing Metrics and Methodologies
  • Requirements: Function Points, COSMIC Function Points, Use Case Points, Mark II Function Points
  • Architecture and design: CK Object-Oriented Metrics, Class Points, Object Points
  • Code: Source Lines of Code (SLOC), Module (file) Count

  20. Counting Artifacts
  • Artifacts: requirements, inputs, outputs, classes, use cases, modules, scenarios
  • Often weighted by relative difficulty
  • Easy to count at an initial stage
  • Estimates may differ based on level of detail

  21. Cockburn Use Case Level of Detail Scale (Cockburn, 2001)

  22. Function Point Analysis - 1
  • FPA measures software size by quantifying the functionality the software provides to users
  • Five function types
  • Data: Internal Logical File (ILF), External Interface File (EIF)
  • Transactions: External Input (EI), External Output (EO), External Query (EQ)
  • Each function is weighted by its complexity
  • Total FP is then adjusted using 14 general system characteristics (GSC)
  • Example diagram: counting MS Word as the system, a human user interacts with it through EI/EO/EQ transactions; MS Word’s own data are ILFs, while files maintained by MS Excel and MS Outlook are EIFs reached through EI/EO/EQ transactions
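The weighting-and-adjustment steps above can be sketched as follows. The sketch uses the standard IFPUG average complexity weights (EI=4, EO=5, EQ=4, ILF=10, EIF=7) and the value adjustment factor VAF = 0.65 + 0.01 × ΣGSC; a real count rates each function low/average/high rather than using averages, and the counts shown are invented:

```python
# IFPUG average complexity weights; a full count weights each
# individual function as low/average/high instead.
WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

def unadjusted_fp(counts):
    """Unadjusted function points from per-type function counts."""
    return sum(WEIGHTS[ftype] * n for ftype, n in counts.items())

def adjusted_fp(ufp, gsc_ratings):
    """Adjust UFP with the 14 general system characteristics (each rated 0-5)."""
    vaf = 0.65 + 0.01 * sum(gsc_ratings)  # value adjustment factor
    return ufp * vaf

ufp = unadjusted_fp({"EI": 5, "EO": 4, "EQ": 3, "ILF": 2, "EIF": 1})
print(ufp)                                  # 79
print(round(adjusted_fp(ufp, [3] * 14), 2)) # 84.53
```

With all 14 GSCs rated 3, the VAF is 1.07, so the adjustment moves the count by only a few percent; extreme ratings can move it ±35%.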

  23. Function Point Analysis - 2
  • Advantages
  • Independent of technology and implementation
  • Can be measured early in the project lifecycle
  • Can be applied consistently across projects and organizations
  • Disadvantages
  • Difficult to automate
  • Requires technical experience and estimation skills
  • Difficult to count the size of maintenance projects

  24. COSMIC FP
  • COSMIC measures the functionality of the software through the movement of data (entry, exit, read, write) across logical layers
  • Advantages
  • Same as FPA
  • Applicable to real-time systems
  • Disadvantages
  • Difficult to automate
  • Requires experience and estimation skills
  • Requires detailed architecture
  • Difficult for customers to understand

  25. Use Case Points
  • UCP counts the functional size of the software by measuring the complexity of its use cases
  • UCP = (#Actors + #Use cases, each weighted by complexity) × technical and environmental factors
  • Advantages
  • Independent of technology and implementation
  • Can be measured early in the project lifecycle
  • Easier to count than FPA
  • Disadvantages
  • Lack of standardization
  • Difficult to automate
  • High variance (low accuracy)
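A minimal sketch of the UCP formula, assuming Karner's commonly used weights (actors 1/2/3, use cases 5/10/15 by complexity) and his adjustment formulas TCF = 0.6 + 0.01 × TFactor and EF = 1.4 − 0.03 × EFactor; the example inputs are invented:

```python
# Karner-style weights (variants exist; these are the commonly cited values)
ACTOR_W = {"simple": 1, "average": 2, "complex": 3}
UC_W = {"simple": 5, "average": 10, "complex": 15}

def use_case_points(actor_kinds, use_case_kinds, tfactor, efactor):
    uaw = sum(ACTOR_W[a] for a in actor_kinds)    # unadjusted actor weight
    uucw = sum(UC_W[u] for u in use_case_kinds)   # unadjusted use-case weight
    tcf = 0.6 + 0.01 * tfactor                    # technical complexity factor
    ef = 1.4 - 0.03 * efactor                     # environmental factor
    return (uaw + uucw) * tcf * ef

ucp = use_case_points(["simple", "complex"], ["average"] * 3,
                      tfactor=30, efactor=10)
print(round(ucp, 2))  # 33.66
```

The weighting step is what distinguishes real UCP from the simplified "#Actors + #Use cases" form, and the two adjustment factors play the same role as FPA's GSC adjustment.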

  26. CK Object-Oriented Metrics
  • Six OO metrics (Chidamber and Kemerer, 1994)
  • Weighted Methods per Class (WMC)
  • Depth of Inheritance Tree (DIT)
  • Number of Children (NOC)
  • Coupling Between Object Classes (CBO)
  • Response for a Class (RFC)
  • Lack of Cohesion in Methods (LCOM)
  • Advantages
  • Correlated with effort
  • Can be automated using tools
  • Disadvantages
  • Difficult to estimate early
  • Useful only for object-oriented code
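To illustrate why these metrics are easy to automate, here is a sketch that computes two of them with Python's `inspect` module, using unit method weights for WMC (the `Shape`/`Circle` classes are hypothetical examples):

```python
import inspect

def wmc(cls):
    """Weighted Methods per Class, with every method weighted 1 (method count)."""
    return len(inspect.getmembers(cls, predicate=inspect.isfunction))

def dit(cls):
    """Depth of Inheritance Tree: longest path of base classes up to object."""
    return max((dit(base) + 1 for base in cls.__bases__), default=0)

# Hypothetical classes for illustration
class Shape:
    def area(self): ...

class Circle(Shape):
    def area(self): ...
    def circumference(self): ...

print(wmc(Circle), dit(Circle))  # 2 2
```

Production tools weight WMC by per-method complexity (e.g., cyclomatic complexity) rather than counting every method as 1.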

  27. Source Lines of Code (SLOC)
  • SLOC measures software size or length by counting the logical or physical number of source code lines
  • Physical SLOC = number of physical lines in the code
  • Logical SLOC = number of source statements
  • Advantages
  • Can be automated easily using tools (e.g., USC’s CodeCount)
  • Easy for customers and developers to understand
  • Reflects the developer’s view of the software
  • Supported by most cost estimation models
  • Disadvantages
  • Dependent on programming languages and programming styles
  • Difficult to estimate early
  • Inconsistent definitions across languages and organizations
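A rough sketch of the physical/logical distinction: real counters such as USC's CodeCount apply detailed per-language rules, whereas this toy counter just skips blank lines and `//` comments and approximates logical SLOC by counting statement terminators:

```python
def sloc(source):
    """Toy counter: physical SLOC = non-blank, non-comment lines;
    logical SLOC approximated by counting ';' statement terminators."""
    physical = logical = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("//"):
            continue  # skip blank lines and full-line comments
        physical += 1
        logical += stripped.count(";")
    return physical, logical

code = """\
// add two numbers
int add(int a, int b) {
    int c = a + b; return c;
}
"""
print(sloc(code))  # (3, 2)
```

The example shows why the two counts diverge: the two statements packed onto one physical line give 3 physical but 2 logical SLOC, which is exactly the kind of style dependence listed under the disadvantages.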

  28. SLOC Counting Rules (adapted from Madachy, 2005)
  • Standard definition for counting lines
  • Based on the SEI definition checklist from CMU/SEI-92-TR-20
  • Modified for COCOMO II

  29. Relationship Among Sizing Metrics
  • Two broad categories of sizing metrics
  • Implementation-specific, e.g., Source Lines of Code (SLOC)
  • Implementation-independent, e.g., Function Points (FP)
  • Need to relate the two categories
  • e.g., SLOC/FP backfire ratios

  30. SLOC/FP Backfiring Table (Jones, 1996); other backfire ratios up to 60% higher
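Backfiring itself is just a table lookup and a multiplication. The ratios below are commonly cited approximate values, not necessarily those in Jones's table, and as the slide notes, published ratios vary by up to 60%:

```python
# Approximate SLOC-per-function-point backfire ratios (illustrative values;
# consult Jones (1996) or SPR tables for authoritative, language-level data).
SLOC_PER_FP = {"assembly": 320, "c": 128, "cobol": 107, "c++": 53, "java": 53}

def backfire_sloc(fp, language):
    """Convert a function-point count to an approximate SLOC estimate."""
    return fp * SLOC_PER_FP[language.lower()]

print(backfire_sloc(100, "C"))     # 12800
print(backfire_sloc(100, "Java"))  # 5300
```

The same 100 FP backfires to very different SLOC totals per language, which is why backfired sizes should be treated as rough conversions, not measurements.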

  31. Multisource Estimation: Implementation-Independent vs. Implementation-Dependent
  • Implementation-independent estimators: use cases, function points, requirements
  • Advantage of implementation independence: good for productivity comparisons when using VHLLs, COTS, or reusable components
  • Weakness of implementation independence: gives the same estimate whether using VHLLs, COTS, reusable components, or 3GL development
  • Multisource estimation reduces risk

  32. Outline
  • Nature, value, and challenges of sizing
  • Sizing methods and tools
  • Sizing metrics
  • Maintenance sizing
  • Conclusions and references

  33. Sizing in Reuse and Maintenance
  • Diagram: across project execution from Release 1 to Release 2, the existing system’s code comprises reused code, adapted code, new code, and acquired code

  34. Calculating Total Equivalent SLOC - 1
  • Adaptation Adjustment Factor: AAF = 0.4 × DM + 0.3 × CM + 0.3 × IM
  • DM = % design modified, CM = % code modified, IM = % integration and test needed
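Using the COCOMO II adaptation adjustment factor AAF = 0.4·DM + 0.3·CM + 0.3·IM, a simplified equivalent-SLOC computation looks like the sketch below. It ignores the full model's SU, AA, and UNFM terms, and the 10,000/50,000 SLOC figures are invented:

```python
def aaf(dm, cm, im):
    """Adaptation Adjustment Factor; dm/cm/im are percentages (0-100)."""
    return 0.4 * dm + 0.3 * cm + 0.3 * im

def equivalent_sloc(adapted_sloc, dm, cm, im):
    # Simplified: the full COCOMO II reuse model also folds in software
    # understanding (SU), assessment/assimilation (AA), and unfamiliarity (UNFM)
    return adapted_sloc * aaf(dm, cm, im) / 100

# Hypothetical release: 10,000 new SLOC plus 50,000 adapted SLOC
# with 20% design modified, 10% code modified, 30% integration/test needed
total = 10_000 + equivalent_sloc(50_000, dm=20, cm=10, im=30)
print(int(total))  # 20000
```

Here the 50,000 adapted SLOC count as only 10,000 equivalent SLOC (AAF = 20%), so the release is sized and costed as 20,000 SLOC rather than 60,000.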

  35. Calculating Total Equivalent SLOC - 2

  36. Adaptation Adjustment Modifier (AAM)
  • Graph: relative cost vs. relative amount of modification (AAF)

  37. Conclusions
  • Size plays an important role in estimation and project-management activities
  • Software estimation is a “garbage in, garbage out” process: bad size in, bad cost out
  • A number of challenges make size estimation difficult
  • Different sizing metrics: FP, COSMIC FP (cfu), Use Case Points, SLOC, CK OO metrics
  • A plethora of sizing methods exists; each method has strengths and weaknesses

  38. References
  • Books
  • Boehm, Barry W., Software Engineering Economics, Prentice Hall, 1981.
  • Boehm, Barry W., et al., Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
  • Brooks, Frederick P., Jr., The Mythical Man-Month, Addison-Wesley, 1995.
  • Cockburn, A., Writing Effective Use Cases, Addison-Wesley, 2001.
  • Cohn, M., Agile Estimating and Planning, Prentice Hall, 2005.
  • Jones, C., Applied Software Measurement: Assuring Productivity and Quality, McGraw-Hill, 1996.
  • Stutzke, Richard D., Estimating Software-Intensive Systems, Addison-Wesley, 2005.
  • Journals
  • Chidamber, S. and C. Kemerer, “A Metrics Suite for Object Oriented Design,” IEEE Transactions on Software Engineering, 1994.
  • Putnam, L.H., and Fitzsimmons, A., “Estimating Software Costs,” Datamation, 1979.
  • Tutorials/Lectures/Presentations
  • Boehm, Barry W., and Malik, Ali A., “System & Software Sizing,” COCOMO Forum Tutorial, 2007.
  • Boehm, Barry W., “Cost Estimation with COCOMO II,” CSCI 577a Lecture, 2004.
  • Madachy, Ray, “COCOMO II Overview,” CSCI 510 Lecture, 2005.
  • Valerdi, Ricardo, “The Constructive Systems Engineering Cost Model,” INCOSE Presentation, 2005.
  • Websites
  • http://www.crisp.se/planningpoker/
  • http://www.ifpug.org/
  • http://www.spr.com/
