
Software Metrics



  1. Software Metrics Software Engineering

  2. Overview • Software process and project metrics are quantitative measures that enable software engineers to gain insight into the efficacy of the software process and of the projects conducted using the process as a framework. • In software project management, we are primarily concerned with productivity and quality metrics. • There are four reasons for measuring software processes, products, and resources: • to characterize • to evaluate • to predict • to improve

  3. SOFTWARE METRICS FOR PROCESS AND PROJECTS • Software process metrics and project metrics are quantitative measures that enable software professionals to gain insight into the efficacy of the software process and of the projects that are conducted using the process as a framework. • Quality and productivity data are collected, analyzed, and compared against past averages to: • pinpoint/uncover problem areas before they "go critical" • track potential risks • assess productivity improvements • assess the status of an ongoing project • adjust workflow or tasks • evaluate the project team's ability to control the quality of software work products.

  4. Measures that are collected by a project team and converted into metrics for use during a project can also be transmitted to those responsible for software process improvement. • For this reason, many of the same metrics are used in both the process and project domains.

  5. SOFTWARE METRICS FOR PROCESS AND PROJECTS WHO DOES IT? Software measures are often collected by software engineers/practitioners. Software metrics are analyzed and assessed by software managers. WHY ARE METRICS IMPORTANT? If you do not measure, your judgment can be based only on subjective evaluation. With measurement: - trends can be spotted, - better estimates can be made, - true improvement can be accomplished

  6. SOFTWARE METRICS FOR PROCESS AND PROJECTS WHAT ARE THE STEPS? 1. Define a limited set of process and project measures that are easy to collect. 2. The results are analyzed and compared to past averages for similar projects performed within the organization. 3. Trends are assessed and conclusions are generated. WORK PRODUCT? A set of software metrics that provide insight into the process and understanding of the project. - Productivity metrics - Quality metrics

  7. Definitions • Measure - a quantitative indication of the extent, amount, dimension, capacity, or size of some attribute of a product or process. • E.g., number of errors • Metric - a quantitative measure of the degree to which a system, component, or process possesses a given attribute. "A handle or guess about a given attribute." • E.g., number of errors found per person-hour expended
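
To make the distinction concrete, here is a minimal sketch (in Java, to match the code examples later in the deck) that turns two raw measures, errors found and person-hours expended, into the derived metric "errors per person-hour". The class name and sample numbers are invented for illustration; they are not part of the original slides.

  // Illustrative sketch: converting raw measures into a derived metric.
  // Class name and sample values are hypothetical.
  public class MeasureVsMetric {
      public static void main(String[] args) {
          int errorsFound = 12;          // measure: a raw count of errors
          double personHours = 160.0;    // measure: effort expended
          // metric: relates the raw counts to express a degree of an attribute
          double errorsPerPersonHour = errorsFound / personHours;
          System.out.printf("Errors per person-hour: %.3f%n", errorsPerPersonHour);
      }
  }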

  8. Why Measure Software? • Determine the quality of the current product or process • Predict qualities of a product/process • Improve quality of a product/process

  9. Motivation for Metrics • Estimate the cost & schedule of future projects • Evaluate the productivity impacts of new tools and techniques • Establish productivity trends over time • Improve software quality • Forecast future staffing needs • Anticipate and reduce future maintenance needs

  10. Example Metrics • Defect rates • Error rates • Measured by: • individual • module • during development • Errors should be categorized by origin, type, cost

  11. Metric Classification • Products • Explicit results of software development activities • Deliverables, documentation, by products • Processes • Activities related to production of software • Resources • Inputs into the software development activities • hardware, knowledge, people

  12. Product vs. Process • Process Metrics • Insight into the process paradigm, software engineering tasks, work products, or milestones • Lead to long-term process improvement • Product Metrics • Assess the state of the project • Track potential risks • Uncover problem areas • Adjust workflow or tasks • Evaluate the team's ability to control quality

  13. Types of Measures • Direct Measures (internal attributes) • Cost, effort, LOC, speed, memory • Indirect Measures (external attributes) • Functionality, quality, complexity, efficiency, reliability, maintainability

  14. Size-Oriented Metrics • Size of the software produced • LOC - Lines Of Code • KLOC - 1000 Lines Of Code • SLOC –Statement Lines of Code (ignore whitespace) • Typical Measures: • Errors/KLOC, Defects/KLOC, Cost/LOC, Documentation Pages/KLOC
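
As an illustration of how these size-oriented metrics are normalized by program size, here is a minimal sketch; the figures (LOC, errors, defects, cost, documentation pages) are invented sample values, not data from the slides.

  // Hypothetical project figures used only to show normalization by LOC/KLOC.
  public class SizeOrientedMetrics {
      public static void main(String[] args) {
          int loc = 12_100;            // delivered lines of code
          int errors = 134;            // errors found before release
          int defects = 29;            // defects reported after release
          double costDollars = 168_000.0;
          int docPages = 365;

          double kloc = loc / 1000.0;
          System.out.printf("Errors/KLOC    = %.2f%n", errors / kloc);
          System.out.printf("Defects/KLOC   = %.2f%n", defects / kloc);
          System.out.printf("Cost/LOC       = $%.2f%n", costDollars / loc);
          System.out.printf("Doc pages/KLOC = %.2f%n", docPages / kloc);
      }
  }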

  15. Usage of Code Complexity Metrics • Predict critical information about reliability, portability, maintainability, etc. • Design and execute test suites. • Provide continuous feedback throughout a software project. • Reward good programmers.

  16. LOC Metrics • Easy to use • Easy to compute • Language & programmer dependent

  17. Complexity Metrics • LOC - a function of complexity • Language and programmer dependent • Halstead’s Software Science (entropy measures) • n1 - number of distinct operators • n2 - number of distinct operands • N1 - total number of operators • N2 - total number of operands

  18. Example if (k < 2) { if (k > 3) x = x*k; } • Distinct operators: if ( ) { } > < = * ; • Distinct operands: k 2 3 x • n1 = 10 • n2 = 4 • N1 = 13 • N2 = 7

  19. Halstead's Metrics • Amenable to experimental verification [1970s] • Program length: N = N1 + N2 • Program vocabulary: n = n1 + n2 • Estimated length: N̂ = n1 log2 n1 + n2 log2 n2 • A close estimate of length for well-structured programs • Difficulty: D = (n1/2) × (N2/n2) • Volume: V = N × log2 n • Effort: E = D × V • Purity ratio: PR = N̂/N
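
A minimal sketch that evaluates these formulas for the counts obtained from the slide-18 fragment (n1 = 10, n2 = 4, N1 = 13, N2 = 7); the class name is invented for the example.

  // Halstead metrics for the slide-18 fragment: n1=10, n2=4, N1=13, N2=7.
  public class HalsteadExample {
      static double log2(double x) { return Math.log(x) / Math.log(2); }

      public static void main(String[] args) {
          int n1 = 10, n2 = 4;    // distinct operators / operands
          int N1 = 13, N2 = 7;    // total operators / operands

          int N = N1 + N2;                                  // program length (20)
          int n = n1 + n2;                                  // program vocabulary (14)
          double estLength = n1 * log2(n1) + n2 * log2(n2); // estimated length (~41.2)
          double V = N * log2(n);                           // volume (~76.1)
          double D = (n1 / 2.0) * ((double) N2 / n2);       // difficulty (8.75)
          double E = D * V;                                 // effort (~666)
          double PR = estLength / N;                        // purity ratio (~2.06)

          System.out.printf("N=%d n=%d N^=%.1f V=%.1f D=%.2f E=%.1f PR=%.2f%n",
                  N, n, estLength, V, D, E, PR);
      }
  }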

  20. Program Complexity • Volume: V = N log2 n • The number of bits needed to provide a unique designator for each of the n items in the program vocabulary. • Difficulty: D = (n1/2) × (N2/n2) • Program effort: E = D × V • This is a good measure of program understandability

  21. McCabe’s Complexity Measures • McCabe’s metrics are based on a control flow representation of the program. • A program graph is used to depict control flow. • Nodes represent processing tasks (one or more code statements) • Edges represent control flow between nodes

  22. Flow Graph Notation (figure: flow graph fragments for the sequence, if-then-else, while, and until constructs; not reproduced in this transcript)

  23. What is Cyclomatic Complexity? • A software metric used to measure the complexity of software • Developed by Thomas McCabe • Described (informally) as the number of decision points + 1

  24. Cyclomatic Complexity V(G) Computing the cyclomatic complexity: the number of simple decisions + 1, or the number of enclosed areas + 1. For the flow graph on the original slide (not reproduced in this transcript), V(G) = 4. From Pressman Slides - Software Engineering: A Practitioner's Approach, 6/e

  25. Graph Complexity (Cyclomatic Complexity) A number of industry studies have indicated that the higher V(G), the higher the probability of errors. (Figure: chart of modules plotted against V(G); modules in the high-V(G) range are more error prone.) From Pressman Slides - Software Engineering: A Practitioner's Approach, 6/e

  26. Basis Path Testing (flow graph with nodes 1-8; figure not reproduced in this transcript) Next, we derive the independent paths. Since V(G) = 4, there are four paths: Path 1: 1,2,3,6,7,8 Path 2: 1,2,3,5,7,8 Path 3: 1,2,4,7,8 Path 4: 1,2,4,7,2,4,...,7,8 Finally, we derive test cases to exercise these paths. From Pressman Slides - Software Engineering: A Practitioner's Approach, 6/e

  27. Example: What is the complexity?
      public void howComplex() {
          int i = 20;
          while (i < 10) {
              System.out.printf("i is %d", i);
              if (i % 2 == 0) {
                  System.out.println("even");
              } else {
                  System.out.println("odd");
              }
          }
      }

  28. What is the complexity V(G)?
      public void howComplex() {
          int i = 20;
          while (i < 10) {
              System.out.printf("i is %d", i);
              if (i % 2 == 0) {
                  System.out.println("even");
              } else {
                  System.out.println("odd");
              }
          }
      }
      V(G) = 2 enclosed areas + 1 = 3

  29. Cyclomatic Complexity • Set of independent paths through the graph (basis set) • V(G) = E – N + 2 • E is the number of flow graph edges • N is the number of nodes • V(G) = P + 1 • P is the number of predicate nodes
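
A small sketch that applies both formulas to the flow graph used on slides 31-32 (7 nodes, 9 edges, 3 predicate nodes); the counts come from slide 32, and the class name is invented.

  // Cyclomatic complexity of the slide-31/32 flow graph, computed two ways.
  public class CyclomaticComplexity {
      public static void main(String[] args) {
          int edges = 9;       // E: flow graph edges
          int nodes = 7;       // N: flow graph nodes
          int predicates = 3;  // P: predicate (decision) nodes

          int vFromGraph = edges - nodes + 2;    // V(G) = E - N + 2
          int vFromPredicates = predicates + 1;  // V(G) = P + 1

          System.out.println("V(G) = E - N + 2 = " + vFromGraph);      // 4
          System.out.println("V(G) = P + 1     = " + vFromPredicates); // 4
      }
  }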

  30. Example
      i = 0;
      while (i < n-1) do
          j = i + 1;
          while (j < n) do
              if A[i] < A[j] then
                  swap(A[i], A[j]);
          end do;
          i = i + 1;
      end do;

  31. Flow Graph (figure: flow graph for the slide-30 example, nodes 1-7; not reproduced in this transcript)

  32. Computing V(G) • V(G) = 9 – 7 + 2 = 4 • V(G) = 3 + 1 = 4 • Basis Set (Testing Chapter) • 1, 7 • 1, 2, 6, 1, 7 • 1, 2, 3, 4, 5, 2, 6, 1, 7 • 1, 2, 3, 5, 2, 6, 1, 7

  33. Another Example (figure: a flow graph with nodes 1-9; not reproduced in this transcript) What is V(G)?

  34. Meaning • V(G) is the number of (enclosed) regions/areas of the planar graph • The number of regions increases with the number of decision paths and loops • A quantitative measure of testing difficulty and an indication of ultimate reliability • Experimental data shows that the value of V(G) should be no more than 10; testing is very difficult above this value

  35. Metrics and Software Quality FURPS • Functionality - features of the system • Usability - aesthetics, documentation • Reliability - frequency of failure, security • Performance - speed, throughput • Supportability - maintainability

  36. Measures of Software Quality • Correctness - degree to which a program operates according to its specification • Defects/KLOC • A defect is a verified lack of conformance to requirements • Failures/hours of operation • Maintainability - degree to which a program is open to change • Mean time to change • Time from change request to new version (analyze, design, etc.) • Cost to correct • Integrity - degree to which a program is resistant to outside attack • Fault tolerance, security & threats • Usability - ease of use • Training time, skill level necessary to use, increase in productivity, subjective questionnaire or controlled experiment
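
As a small illustration, the sketch below computes two of these measures, correctness as defects/KLOC and maintainability as mean time to change (MTTC); the sample figures are invented, not taken from the slides.

  // Hypothetical figures illustrating correctness and maintainability measures.
  import java.util.Arrays;

  public class QualityMeasures {
      public static void main(String[] args) {
          int defects = 18;        // verified non-conformances to requirements
          double kloc = 42.5;      // delivered size in KLOC
          double[] changeDays = {3.0, 7.5, 2.0, 11.0, 4.5};  // days per change request

          double defectsPerKloc = defects / kloc;                         // correctness
          double mttc = Arrays.stream(changeDays).average().orElse(0.0);  // maintainability

          System.out.printf("Correctness: %.2f defects/KLOC%n", defectsPerKloc);
          System.out.printf("Maintainability (MTTC): %.1f days%n", mttc);
      }
  }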

  37. McCall's Triangle of Quality (figure: triangle relating the three groups of quality factors; not reproduced in this transcript) • Product Revision (maintainability, flexibility, testability): ability to undergo changes • Product Transition (portability, reusability, interoperability): adaptability to new environments • Product Operation (correctness, reliability, efficiency, integrity, usability): operational characteristics

  38. A Comment McCall's quality factors were proposed in the early 1970s. They are as valid today as they were then. It's likely that software built to conform to these factors will exhibit high quality well into the 21st century, even if there are dramatic changes in technology.

  39. Quality Model (figure: the quality factors - correctness, reliability, efficiency, integrity, usability, maintainability, flexibility, testability, portability, reusability - grouped under product operation, product revision, and product transition, each linked to metrics; not reproduced in this transcript)

  40. Quality Model • Product revision includes • maintainability: the effort required to locate and fix a fault in the program within its operating environment • flexibility: the ease of making changes required by changes in the operating environment • testability: the ease of testing the program, to ensure that it is error-free and meets its specification • Product transition is all about • portability: the effort required to transfer a program from one environment to another • reusability: the ease of reusing software in a different context • interoperability: the effort required to couple the system to another system • Quality of product operations depends on • correctness: the extent to which a program fulfils its specification • reliability: the system's ability not to fail • efficiency: further categorized into execution efficiency and storage efficiency, and generally meaning the use of resources (e.g., processor time, storage) • integrity: the protection of the program from unauthorized access • usability: the ease of using the software

  41. High-Level Design Metrics • Structural Complexity • Data Complexity • System Complexity • Card & Glass '90 • Structural complexity S(i) of a module i: • S(i) = [fout(i)]^2, the square of the module's fan-out • Fan-out is the number of modules immediately subordinate to module i (directly invoked by it), as shown in the sketch below.
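
A brief sketch of the structural-complexity computation; the module names and fan-out counts below are hypothetical, chosen only to show that S(i) is the square of the fan-out.

  // Structural complexity S(i) = fout(i)^2 for a few hypothetical modules.
  import java.util.Map;

  public class StructuralComplexity {
      public static void main(String[] args) {
          // module name -> number of modules it directly invokes (fan-out)
          Map<String, Integer> fanOut = Map.of("ui", 3, "core", 5, "io", 2);
          fanOut.forEach((module, f) ->
                  System.out.printf("S(%s) = %d%n", module, f * f));
      }
  }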

  42. Using Metrics • The Process • Select appropriate metrics for the problem • Apply the metrics to the problem • Assessment and feedback • Formulate • Collect • Analyze • Interpret • Feed back

  43. Metric tools • McCabe & Associates (founded by Tom McCabe, Sr.) • The Visual Quality ToolSet • The Visual Testing ToolSet • The Visual Reengineering ToolSet • Metrics calculated • McCabe Cyclomatic Complexity • McCabe Essential Complexity • Module Design Complexity • Integration Complexity • Lines of Code • Halstead

  44. More tools on Internet • A Source of Information for Mission Critical Software Systems, Management Processes, and Strategies http://www.niwotridge.com/Resources/PM-SWEResources/SWTools.htm • Defense Software Collaborators (by DACS) http://www.thedacs.com/databases/url/key.hts?keycode=3 http://www.qucis.queensu.ca/Software-Engineering/toolcat.html#label208 • Object-oriented metrics http://me.in-berlin.de/~socrates/oo_metrics.html • Software Metrics Sites on the Web (Thomas Fetcke) • Metrics tools for C/C++ (Christofer Lott) http://www.chris-lott.org/resources/cmetrics/
