Metrics

Presentation Transcript


  1. Metrics "A science is as mature as its measurement tools." -- Louis Pasteur

  2. Starter Questions • What can we measure? • What value can those numbers have?

  3. Why Measure? • accurate estimation • how productive are we • quality improvement • what do we do well • what do we do poorly

  4. Types of Metrics • Product Metrics • direct measures - number of bugs, LOC • indirect measures - usability, maintainability • Project and Process Metrics • direct measures - costs, LOC per month • indirect measures - quality assurance, reliability

  5. Code Metrics • Size • Lines of Code • Function Points • Efficiency • Big-O • Complexity • Cyclomatic Complexity • Halstead's complexity metrics (next slide) • Maintainability
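
As a rough illustration of how one of these code metrics can be automated, the sketch below estimates cyclomatic complexity as one plus the number of branching constructs in a piece of Python source. This is a simplification offered only as an example; real tools also count boolean operators, comprehensions, and other decision points, and the classify function is made up.

```python
import ast

# Simplified cyclomatic-complexity estimate: 1 + number of branching
# constructs found in the parsed source.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

sample = '''
def classify(n):
    if n < 0:
        return "negative"
    for d in range(2, n):
        if n % d == 0:
            return "composite"
    return "no small factors"
'''
print(cyclomatic_complexity(sample))  # 1 + if + for + if = 4
```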

  6. Halstead's Complexity Metrics
  • n1 = the number of distinct operators
  • n2 = the number of distinct operands
  • N1 = the total number of operators
  • N2 = the total number of operands
  • Program length: N = N1 + N2
  • Program vocabulary: n = n1 + n2
  • Volume: V = N * log2(n)
  • Difficulty: D = (n1 / 2) * (N2 / n2)
  • Effort: E = D * V
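
A direct transcription of these formulas into Python, assuming the operator/operand counts have already been gathered by some other tool; the counts passed in below are made up.

```python
import math

def halstead(n1: int, n2: int, N1: int, N2: int) -> dict:
    """Halstead's metrics from distinct (n1, n2) and total (N1, N2)
    operator/operand counts, using the formulas on the slide."""
    N = N1 + N2                 # program length
    n = n1 + n2                 # program vocabulary
    V = N * math.log2(n)        # volume
    D = (n1 / 2) * (N2 / n2)    # difficulty
    E = D * V                   # effort
    return {"length": N, "vocabulary": n, "volume": V,
            "difficulty": D, "effort": E}

# Hypothetical counts for a small routine:
print(halstead(n1=10, n2=7, N1=24, N2=19))
```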

  7. McCall's Quality Factors • Product Operations - Correctness, Reliability, Efficiency, Integrity, Usability • Product Revision - Maintainability, Flexibility, Testability • Product Transition - Portability, Reusability, Interoperability

  8. McCall's quality criteria and the factors they map onto (shown as a diagram on the original slide) • Criteria - Operability, Training, Communicativeness, Input/Output volume, Input/Output rate, Access Control, Access Audit, Storage Efficiency, Execution Efficiency, Traceability, Completeness, Accuracy, Error Tolerance, Consistency, Simplicity, Conciseness, Instrumentation, Expandability, Generality, Self-Descriptiveness, Modularity, Machine Independence, Software System Independence, Communications Commonality, Data Commonality • Factors - Usability, Integrity, Efficiency, Correctness, Reliability, Maintainability, Testability, Flexibility, Reusability, Portability, Interoperability

  9. ISO 9126 (Quality Characteristics and Guidelines for Their Use) • Quality Factors • Functionality • Reliability • Usability • Efficiency • Maintainability • Portability

  10. Design Metrics • Fan In • Fan Out • Morphology • based on number of nodes, depth, width
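
Fan-in is the number of modules that call a given module, and fan-out is the number of modules it calls; both fall out of a static call graph. A minimal sketch, using a made-up call graph represented as a dictionary:

```python
from collections import defaultdict

# Hypothetical static call graph: module -> modules it calls.
calls = {
    "ui":     ["auth", "report"],
    "auth":   ["db"],
    "report": ["db", "format"],
    "format": [],
    "db":     [],
}

fan_out = {m: len(callees) for m, callees in calls.items()}

fan_in = defaultdict(int)
for m, callees in calls.items():
    for callee in callees:
        fan_in[callee] += 1

for m in calls:
    print(f"{m}: fan-in={fan_in[m]}, fan-out={fan_out[m]}")
```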

  11. Module Design Metrics • Cohesion • how many functions does a module perform • coincidental • logical - e.g., does all the output • temporal - e.g., all the startup work • procedural - elements must execute in a particular order • communicational - elements operate on the same data • functional • Coupling • how is the module connected to other modules • global variables, parameters, or standing alone

  12. Object-Oriented Metrics • Weighted Methods per Class • not only how many methods per class are there, but also how complex are they • Depth of Inheritance Tree • Number of Children • how many child classes does a class have • Response for Class • number of local methods, plus number of methods they call • Lack of Cohesion Metric • number of pairs of methods that don't use the same instance variables
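
For lack of cohesion, one common formulation (Chidamber and Kemerer's LCOM) counts the method pairs that share no instance variables and subtracts the pairs that do, flooring the result at zero. A small sketch with a made-up account class:

```python
from itertools import combinations

# Hypothetical class: method name -> set of instance variables it touches.
method_vars = {
    "deposit":   {"balance"},
    "withdraw":  {"balance"},
    "set_owner": {"owner"},
    "get_owner": {"owner"},
}

disjoint = sharing = 0
for (_, vars_a), (_, vars_b) in combinations(method_vars.items(), 2):
    if vars_a & vars_b:
        sharing += 1
    else:
        disjoint += 1

# LCOM (CK formulation): disjoint pairs minus sharing pairs, floored at 0.
lcom = max(disjoint - sharing, 0)
print(f"disjoint={disjoint}, sharing={sharing}, LCOM={lcom}")  # 4, 2, 2
```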

  13. Lack of Cohesion Example • Module A calls Module B • B accesses Variable X • C and D access Y • D calls E • This should be split into two classes.
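
To see why the example splits into two classes, the sketch below groups the slide's modules and variables into connected components of the access/call graph using union-find; each component is a candidate class.

```python
# Undirected access/call graph from the slide: A-B, B-X, C-Y, D-Y, D-E.
edges = [("A", "B"), ("B", "X"), ("C", "Y"), ("D", "Y"), ("D", "E")]

parent = {}

def find(x):
    """Find the representative of x's component, with path halving."""
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

for a, b in edges:
    parent[find(a)] = find(b)   # union the two endpoints' components

groups = {}
for node in list(parent):
    groups.setdefault(find(node), set()).add(node)

print(list(groups.values()))    # two components: {A, B, X} and {C, D, E, Y}
```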

  14. Project Metrics • LOC or FP per month • Errors per LOC (aka Defect Density) • Defect Removal Efficiency • Time required to make changes • Test coverage • Required Skills
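
Two of these project metrics are simple ratios: defect density is defects divided by size, and defect removal efficiency is the fraction of all known defects (pre- plus post-release) that were caught before release. A minimal sketch; the numbers below are made up.

```python
def defect_density(defects: int, loc: int) -> float:
    """Defects per line of code (often scaled to defects per KLOC)."""
    return defects / loc

def defect_removal_efficiency(found_before: int, found_after: int) -> float:
    """Fraction of all known defects removed before release."""
    return found_before / (found_before + found_after)

print(defect_density(40, 5000))           # 0.008 defects/LOC (8 per KLOC)
print(defect_removal_efficiency(40, 10))  # 0.8
```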

  15. Obviously, Beware of Statistics

                     Version 2                       Version 1
  Module#    Defects    LOC   Defects/LOC    Defects     LOC   Defects/LOC   Relation
  1               12    777       0.01544          3      55       0.05455      <
  2                5    110       0.04545          6     110       0.05455      <
  3                2    110       0.01818          3     110       0.02727      <
  4                3    110       0.02727          4     110       0.03636      <
  5                6   1000       0.00600         70   10000       0.00700      <
  Sum             28   2107       0.01329         86   10385       0.00828      >

  Source: http://irb.cs.tu-berlin.de/~zuse/sme.html
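
The table is an instance of Simpson's paradox: the left-hand column group has the lower defect density in every single module, yet the higher density overall, because the totals are weighted by very different module sizes. A quick check of the table's numbers:

```python
# (defects, LOC) per module, copied from the two column groups above.
left  = [(12, 777), (5, 110), (2, 110), (3, 110), (6, 1000)]
right = [(3, 55), (6, 110), (3, 110), (4, 110), (70, 10000)]

for (d1, l1), (d2, l2) in zip(left, right):
    print(f"{d1 / l1:.5f} < {d2 / l2:.5f}")                  # holds for every module

total_left  = sum(d for d, _ in left)  / sum(l for _, l in left)
total_right = sum(d for d, _ in right) / sum(l for _, l in right)
print(f"{total_left:.5f} > {total_right:.5f}")               # yet the totals reverse
```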

  16. Basic Questions • What are the basic metrics that managers need to track? • How do we gather all these numbers? • When do we process all these numbers?

  17. SEI CMM Level 2 (repeatable) • Software Requirements Management • status of allocated requirements • number of changes to requirements • Software Project Planning • completion of milestones compared to the plan • work completed, funds expended, … compared to plan • Software Project Tracking and Oversight • resources expended to conduct oversight

  18. SEI CMM Level 3 (defined) • Training Program • number of training waivers approved • actual attendance vs projected attendance • results on post-training tests • Software Product Engineering • numbers, types, and severity of defects by stage • effort to analyze proposed changes • number of changes by category

  19. Summary To do something well, we must understand what we are doing. To understand something, we must be able to measure it. We can measure what we are building and we can measure our building process.

  20. Next Topics… • Managing the testing process • Configuration Management • Exam #2
