
Software Process and Project Metrics



  1. Software Process and Project Metrics
     Outline:
     • In the Software Metrics Domain:
       • product metrics
       • project metrics
       • process metrics
     • Software Measurement
       • size-oriented metrics
       • function-oriented metrics
     • Metrics for Software Quality

  2. Measure, Metrics, and Indicator
     • Measure -- provides a quantitative indication of the extent, amount, dimensions, capacity, or size of some attribute of a product or process.
     • Metric -- a quantitative measure of the degree to which a system, component, or process possesses a given attribute.
     • Software metrics -- refers to a broad range of measurements for computer software.
     • Indicator -- a metric or combination of metrics that provides insight into the software process, a software project, or the product itself.

  3. In the Process and Project Domains
     • Process indicators
       • provide insight into the efficacy of an existing process
       • help assess the current work status
       • Goal -- to lead to long-term software process improvement
     • Project indicators
       • assess the status of an ongoing project
       • track potential risks
       • uncover problem areas before they “go critical”
       • evaluate the project team’s ability to control product quality

  4. Measurement
     • What to measure?
       • errors uncovered before release
       • defects delivered to and reported by end users
       • work products delivered
       • human effort expended
       • calendar time expended
       • schedule conformance
     • At what level of aggregation?
       • By team?
       • By individual?
       • By project?

  5. Privacy Issues
     • Should metrics be used for personnel evaluation?
     • Some issues:
       • Privacy?
       • Is the individual’s total assignment being measured?
       • Are the items being measured the same as those measured for other individuals?
       • Are the conditions of measurement the same across individuals?
     • However, metrics can be useful for individual improvement.

  6. Use of Software Metrics
     • Use common sense and organizational sensitivity.
     • Provide regular feedback to individuals and teams.
     • Don’t use metrics to appraise individuals.
     • Set clear goals and metrics.
     • Never use metrics to threaten individuals or teams.
     • Problems are not “negatives” -- the data are merely indicators for process improvement.
     • Don’t obsess over a single metric to the exclusion of other important metrics.
     • Don’t rely on metrics to solve your problems.
     • Beware of people performing to the metrics rather than to product quality or safety.

  7. Typical Causes of Product Defects

  8. Project Metrics
     • Software project measures are tactical
       • used by a project manager and a software team
       • to adapt project work flow and technical activities
     • The intent of project metrics is twofold:
       • to minimize the development schedule by avoiding delays and mitigating potential problems and risks
       • to assess product quality on an ongoing basis and modify the technical approach to improve quality
     • Production rates:
       • pages of documentation
       • review hours
       • function points
       • delivered source lines
       • errors uncovered during software engineering

  9. Software Metrics
     • Direct measures
       • cost and effort applied (in the software engineering process)
       • lines of code (LOC) produced
       • execution speed
       • CPU utilization
       • memory size
       • defects reported over a given period of time
     • Indirect measures
       • functionality, quality, complexity, efficiency, reliability, maintainability

  10. Software Measurement
      • Size-oriented metrics
        • are derived by normalizing quality and/or productivity measures by the “size” of the software that has been produced
        • lines of code (LOC) are often used as the normalization value

      project    LOC      effort   $(000)   pp. doc   errors   defects   people
      alpha      12,100   24       168      365       134      29        3
      beta       27,200   62       440      1,224     321      86        5
      gamma      20,200   43       314      1,050     256      64        6
      ...

  11. Typical Size-Oriented Metrics
      • Errors per KLOC (thousand lines of code)
      • Defects per KLOC
      • Dollars per KLOC
      • Pages of documentation per KLOC
      • Errors per person-month
      • LOC per person-month
      • Dollars per page of documentation
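
A minimal Python sketch of these size-oriented metrics, applied to the “alpha” row of the table on slide 10; it assumes effort is recorded in person-months and cost in thousands of dollars, since the table does not state units:

    # Size-oriented metrics for the "alpha" project (slide 10).
    # Assumption: effort is in person-months, cost is in thousands of dollars.
    loc = 12_100
    effort_pm = 24          # person-months (assumed unit)
    cost_dollars = 168 * 1000
    pages_doc = 365
    errors = 134
    defects = 29

    kloc = loc / 1000
    print(f"errors per KLOC:          {errors / kloc:.1f}")        # ~11.1
    print(f"defects per KLOC:         {defects / kloc:.1f}")       # ~2.4
    print(f"dollars per KLOC:         {cost_dollars / kloc:.0f}")  # ~13884
    print(f"pages of doc per KLOC:    {pages_doc / kloc:.1f}")     # ~30.2
    print(f"errors per person-month:  {errors / effort_pm:.1f}")   # ~5.6
    print(f"LOC per person-month:     {loc / effort_pm:.0f}")      # ~504
    print(f"dollars per page of doc:  {cost_dollars / pages_doc:.0f}")  # ~460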

  12. Software Measurement
      • Function-oriented metrics
        • use the “functionality” delivered by the software as the measure
        • are derived from the “function point” (FP)
        • use an empirical relationship
        • are based on countable (direct) measures of the software’s information domain and assessments of software complexity
      • Uses of function-oriented metrics
        • measuring the scale of a project
        • normalizing other metrics, e.g., $/FP, errors/FP

  13. Function Point Calculation
                                               Weighting Factor
      measurement parameter           count    simple  average  complex
      number of user inputs            ___  *    3        4        6     = ___
      number of user outputs           ___  *    4        5        7     = ___
      number of user inquiries         ___  *    3        4        6     = ___
      number of files                  ___  *    7       10       15     = ___
      number of external interfaces    ___  *    5        7       10     = ___
      count_total                                                        = ___
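
A short Python sketch of the unadjusted count: multiply each information-domain count by its chosen weight and sum. The project counts below are hypothetical, and the weights are taken from the “average” column of the table above:

    # Unadjusted function-point count (count_total) from the weighting table.
    # The project counts are hypothetical; weights use the "average" column.
    average_weights = {
        "user inputs": 4,
        "user outputs": 5,
        "user inquiries": 4,
        "files": 10,
        "external interfaces": 7,
    }
    counts = {
        "user inputs": 32,
        "user outputs": 60,
        "user inquiries": 24,
        "files": 8,
        "external interfaces": 2,
    }

    count_total = sum(counts[p] * average_weights[p] for p in counts)
    print(count_total)   # 32*4 + 60*5 + 24*4 + 8*10 + 2*7 = 618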

  14. Function Point Calculation (cont’d): Computing Function Points
      Rate each of the 14 adjustment factors (Fi) on a scale of 0 to 5:
        0 = no influence   1 = incidental    2 = moderate
        3 = average        4 = significant   5 = essential
      1. Does the system require reliable backup and recovery?
      2. Are data communications required?
      3. Are there distributed processing functions?
      4. Is performance critical?
      ........
      14. Is the application designed to facilitate change and ease of use by the user?

  15. Function-Oriented Metrics
      FP = count_total * [0.65 + 0.01 * sum(Fi)]
      Outcomes:
      • errors per FP
      • defects per FP
      • $ per FP
      • pages of documentation per FP
      • FP per person-month
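
A small Python sketch of the full calculation, continuing the hypothetical count_total (618) from the sketch above and assuming made-up ratings for the 14 adjustment factors Fi:

    # Adjusted function points: FP = count_total * (0.65 + 0.01 * sum(Fi)),
    # where each of the 14 Fi is rated 0..5 (slide 14). Ratings are hypothetical.
    count_total = 618
    fi = [4, 3, 0, 5, 3, 4, 2, 3, 1, 2, 0, 3, 4, 2]    # 14 assumed ratings
    assert len(fi) == 14

    fp = count_total * (0.65 + 0.01 * sum(fi))
    print(f"sum(Fi) = {sum(fi)}, FP = {fp:.1f}")        # sum(Fi) = 36, FP = 624.2

    # FP can then normalize other measures, e.g.:
    errors, cost_dollars = 134, 168_000                 # hypothetical totals
    print(f"errors per FP: {errors / fp:.3f}")          # ~0.215
    print(f"$ per FP:      {cost_dollars / fp:.2f}")    # ~269.15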

  16. Reconciling Different Metrics

  17. Measures of Software Quality
      • Correctness
        • the degree to which the software performs its required function
        • the most common measure of correctness is defects per KLOC
      • Maintainability
        • the ease with which a program can be
          • corrected if an error is encountered
          • adapted if the environment changes
          • enhanced if the customer desires changes in requirements
        • based on the time-oriented measure mean time to change (MTTC)
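
A brief Python sketch of the mean-time-to-change idea, assuming MTTC is taken as the average elapsed time from receiving a change request to distributing the tested change; the durations are made up:

    # Mean time to change (MTTC): a time-oriented maintainability measure.
    # Assumption: each entry is elapsed days from change request to
    # distribution of the tested change (hypothetical data).
    change_durations_days = [3.5, 6.0, 2.0, 8.5, 4.0]

    mttc_days = sum(change_durations_days) / len(change_durations_days)
    print(f"MTTC = {mttc_days:.1f} days")   # 4.8 days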

  18. Measures of Software Quality (Cont’d)
      • Integrity
        • measures a system’s ability to withstand attacks (both accidental and intentional) on its security
        • threat and security are defined for each attack type: threat is the probability that an attack of a given type will occur within a given time, and security is the probability that such an attack will be repelled
        • integrity = sum [ 1 - threat * (1 - security) ]
      • Usability -- an attempt to quantify “user friendliness”
        • the physical/intellectual skill required to learn the system
        • the time required to become moderately efficient with the system
        • the net increase in productivity
        • user attitudes toward the system
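
A small Python sketch of the integrity formula, using an assumed threat and security probability for a single attack type:

    # integrity = sum[ 1 - threat * (1 - security) ] over the attack types considered.
    # threat   = probability an attack of a given type occurs within a given time
    # security = probability an attack of that type is repelled
    # The probabilities below are assumed for illustration.
    attack_types = [
        {"name": "unauthorized access", "threat": 0.25, "security": 0.95},
    ]

    integrity = sum(1 - a["threat"] * (1 - a["security"]) for a in attack_types)
    print(f"integrity = {integrity:.4f}")   # 1 - 0.25 * 0.05 = 0.9875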

  19. Defect Removal Efficiency
      • A quality metric that provides benefit at both the project and process level
      • DRE = E / ( E + D )
        • E = number of errors found before delivery of the software to the end user
        • D = number of defects found after delivery
      • More generally, DREi = Ei / ( Ei + Ei+1 )
        • Ei = number of errors found during software engineering activity i
        • Ei+1 = number of errors found during activity i+1 that are traceable to errors not discovered in activity i
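
A minimal Python sketch of DRE at the project level and for a single activity; the project-level counts reuse the “alpha” error/defect totals from slide 10 for illustration, and the activity-level counts are assumed:

    # Defect removal efficiency: DRE = E / (E + D); per activity,
    # DRE_i = E_i / (E_i + E_{i+1}).
    def dre(found_now: int, found_later: int) -> float:
        """Fraction of defects removed before they escaped to the next stage or user."""
        return found_now / (found_now + found_later)

    # Project level: errors found before delivery vs. defects reported after
    # delivery (values taken from the "alpha" row on slide 10, for illustration).
    print(f"project DRE = {dre(134, 29):.2f}")   # 134 / (134 + 29) ~ 0.82

    # Activity level: errors found during design review vs. design errors
    # found later, during the next activity (hypothetical counts).
    print(f"design DRE  = {dre(40, 10):.2f}")    # 40 / 50 = 0.80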

  20. METRICS
      • CLCS metrics philosophy
        • Phase 1: Provide a mandatory, nearly automated metrics foundation to track lines of code and errors.
        • Phase 2: Provide additional high-return metrics with recognized value:
          • schedule metrics (milestones)
          • additional S/W problem metrics (actuals, trends, prediction)
          • defect correction metrics
          • run-time analysis metrics (McCabe tools, automated, COTS)
        • Phase 3: Be driven to additional metrics only by absolute need.

  21. METRICS
