
Misleading Metrics and Unsound Analyses

Authors: Barbara Kitchenham, David Ross Jeffery, and Colin Connaughton. Presenter: Gil Hartman. IEEE Software 24(2), pp. 73-78, Mar-Apr 2007.



Presentation Transcript


  1. Authors: Barbara Kitchenham, David Ross Jeffery, and Colin Connaughton Misleading Metrics and Unsound Analyses Presenter: Gil Hartman IEEE Software 24(2), pp. 73-78, Mar-Apr 2007

  2. About the authors • Barbara Kitchenham - Professor of quantitative software engineering at Keele University, UK • David Ross Jeffery - Professor of software engineering at the University of New South Wales, Australia • Colin Connaughton - Metrics consultant for IBM’s Application Management Services, Sydney

  3. Introduction • Software project management – predicting and monitoring software development projects • Measurement is a valuable software-management support tool • Unfortunately, some of the “expert” advice can encourage the use of misleading metrics

  4. Metrics in AMS • Data are from the Application Management Services delivery group of IBM Australia • A CMM level 5 organization using standard metrics and analyses • The program was intended to confirm each project’s productivity and to set improvement targets for future projects

  5. ISO/IEC 15939 Software Measurement Process • Indicator: Average productivity • Function: Divide each project’s lines of code by that project’s hours of effort • Model: Compute the mean and standard deviation of all project productivity values • Decision criteria: Confidence intervals computed from the standard deviation
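As a sketch of what the “model” and “decision criteria” on slide 5 amount to in practice, the snippet below (with made-up productivity values, not the AMS data) computes the mean, standard deviation, and a normal-approximation 95% confidence interval over per-project productivity values:

```python
# Sketch only: the naive mean / standard deviation / confidence-interval
# analysis described on slide 5. The productivity values are invented.
import math
import statistics

# Hypothetical per-project productivity values (size units per effort hour)
productivity = [0.8, 1.1, 0.9, 1.3, 0.7, 5.2, 1.0, 0.6]

mean = statistics.mean(productivity)
sd = statistics.stdev(productivity)        # sample standard deviation
n = len(productivity)

# 95% confidence interval for the mean, assuming (often wrongly, as the
# following slides show) roughly normally distributed productivity values
half_width = 1.96 * sd / math.sqrt(n)
print(f"mean = {mean:.2f}, sd = {sd:.2f}")
print(f"95% CI: [{mean - half_width:.2f}, {mean + half_width:.2f}]")
```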

  6. Non-normal data distributions • Frequency plot of the AMS productivity data over four years. • The simple average isn’t a good estimate of a typical project’s productivity.
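To illustrate the point about non-normal distributions, here is a small sketch using simulated, right-skewed data (not the AMS data set): the mean is pulled upward by the long tail, while the median stays close to a “typical” project.

```python
# Sketch only: with right-skewed productivity data, the simple average is not
# a good estimate of a typical project's productivity. Data are simulated.
import random
import statistics

random.seed(1)
# Lognormal values mimic a right-skewed productivity distribution
productivity = [random.lognormvariate(0.0, 1.0) for _ in range(200)]

print(f"mean   = {statistics.mean(productivity):.2f}")    # inflated by the tail
print(f"median = {statistics.median(productivity):.2f}")  # nearer a typical project
```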

  7. Productivity for application 1 • The standard deviation for all projects is very large. • The mean and standard deviation of the total data don’t necessarily relate to a specific application.

  8. Application 2 • What can we conclude from the standard run plot?

  9. Scatter plot vs run chart

  10. Scatter plot vs run chart Productivity = Function points / Effort
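For readers without the original figures, the sketch below (hypothetical project data; matplotlib is assumed to be available) draws the two views being contrasted on slides 9-10: a run chart of productivity in delivery order and a scatter plot of effort against size.

```python
# Sketch only: run chart vs. scatter plot for invented project data.
import matplotlib.pyplot as plt

# Hypothetical projects: (function points, effort hours), in delivery order
projects = [(120, 400), (300, 850), (80, 310), (500, 1200), (60, 150), (250, 700)]
size = [fp for fp, _ in projects]
effort = [h for _, h in projects]
productivity = [fp / h for fp, h in projects]   # Productivity = Function points / Effort

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Run chart: productivity against delivery order
ax1.plot(range(1, len(productivity) + 1), productivity, marker="o")
ax1.set_xlabel("Project (delivery order)")
ax1.set_ylabel("Productivity (FP/hour)")
ax1.set_title("Run chart")

# Scatter plot: effort against size shows the underlying relationship
ax2.scatter(size, effort)
ax2.set_xlabel("Size (function points)")
ax2.set_ylabel("Effort (hours)")
ax2.set_title("Scatter plot")

plt.tight_layout()
plt.show()
```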

  11. Application 3

  12. Run charts • Advantages • Can identify productivity trends over time • Provide a comparison with overall mean values • Disadvantages • Actual productivity values are difficult to interpret • Mean and standard deviation can be inflated by high-productivity values for small, unimportant projects

  13. Lessons learned - DO • Base all analyses of project data on data from similar projects • Use graphical representations of productivity data • Use the relationship between effort and size to develop regression models • Logarithmic transformations • Actual effort vs. predicted effort • Statistical confidence intervals
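As a sketch of the regression-based approach recommended on slide 13 (invented project data, standard library only), the snippet below fits log(effort) against log(size) by least squares and then compares actual with predicted effort:

```python
# Sketch only: effort-vs-size regression after a logarithmic transformation,
# then actual vs. predicted effort, per the DO list. Data are invented.
import math

# Hypothetical similar projects: (function points, effort hours)
projects = [(100, 380), (220, 700), (150, 520), (400, 1150), (90, 300), (310, 980)]

log_size = [math.log(fp) for fp, _ in projects]
log_effort = [math.log(h) for _, h in projects]

# Least-squares fit of log(effort) = a + b * log(size)
n = len(projects)
mx = sum(log_size) / n
my = sum(log_effort) / n
b = sum((x - mx) * (y - my) for x, y in zip(log_size, log_effort)) / \
    sum((x - mx) ** 2 for x in log_size)
a = my - b * mx

for fp, actual in projects:
    predicted = math.exp(a + b * math.log(fp))
    print(f"size={fp:4d} FP  actual={actual:5d} h  predicted={predicted:7.1f} h")
```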

  14. Lessons learned - DON’T • Use the mean and standard deviation for either monitoring or prediction purposes • Analyze projects that are dissimilar simply to get more data • Use any metrics that are constructed from the ratio of two independent measures unless you’re sure you understand the measure’s implications
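The last DON’T is easy to see with a small worked example (numbers invented): for a ratio metric such as productivity = function points / effort, the mean of per-project ratios weights every project equally, so one tiny high-productivity project can dominate it, and it generally differs from the aggregate ratio of total size to total effort.

```python
# Sketch only: one pitfall of ratio metrics. The mean of per-project
# productivities differs from total size / total effort. Data are invented.
# Hypothetical projects: (function points, effort hours)
projects = [(1000, 4000), (1200, 4500), (900, 3800), (20, 10)]

per_project = [fp / h for fp, h in projects]
mean_of_ratios = sum(per_project) / len(per_project)
aggregate = sum(fp for fp, _ in projects) / sum(h for _, h in projects)

print(f"mean of per-project productivities: {mean_of_ratios:.3f} FP/hour")
print(f"total size / total effort:          {aggregate:.3f} FP/hour")
```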

  15. Conclusion • Charts and metrics can sometimes be misleading. • But they often help present statistics and data in an accessible way.
