
Measurement in the Systems Domain



  1. Measurement in the Systems Domain Ronan Fitzpatrick School of Computing, Dublin Institute of Technology. March 2007

  2. Overview • Quotations • Overview of measurement • Mainstream systems metrics • Categories of Knowledge Management metrics • Deriving and validating metrics • Positioning measurement.

  3. Quotations • “Measure what is measurable, and make measurable what is not so”. (Galileo Galilei)

  4. Quotations • “I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in your thoughts advanced to the state of science, whatever the matter may be”. (Lord Kelvin, 1883)

  5. Quotations • “You can’t control what you can’t measure”. (DeMarco, 1982; p3) • “We must be bold in our attempts at measurement. Just because no one has measured some attribute of interest does not mean that it cannot be measured satisfactorily”. (Fenton and Pfleeger, 1996; p20)

  6. Measurement – The vocabulary • Measure • Indirect measure • Metric • Software metrics

  7. A structural model of measurement – Kitchenham et al. (1995)

  8. Software measurement • Applying the formality of scientific measurement to software products. • Definition • Software measurement [is] using numerical ratings to measure the complexity and reliability of source code, the length and quality of the development process and the performance of the application when completed. (Online computing dictionary by Farlex).

  9. Software measurement • Predictive • Assessment

  10. Software metrics • Tom Gilb (1976) explains that in his book “the term ‘metrics’ simply means measures … that are quantified numerically and have useful accuracy and reliability”.

  11. Halstead’s Elements of software science • Proposed by Halstead in 1972 & 1975 • Hypothesised that algorithms, considered as distillations of thought, may possess a general structure which obeys physical laws. • Based on the number of operators and operands in a program. • Measurement of small algorithms yields data suitable for estimating the time required to program them.
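As an illustrative sketch (the exact formulas are not on the slide), the classic Halstead measures are derived from four counts: distinct operators n1, distinct operands n2, and their total occurrences N1 and N2. The function name and the example counts below are hypothetical:

```python
import math

def halstead(n1, n2, N1, N2):
    """Halstead's software-science measures.
    n1: distinct operators, n2: distinct operands,
    N1: total operator occurrences, N2: total operand occurrences."""
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)   # program "size" in bits
    difficulty = (n1 / 2) * (N2 / n2)         # proneness to error
    effort = difficulty * volume              # mental effort to implement
    return volume, difficulty, effort

# A tiny routine with 4 distinct operators, 3 distinct operands,
# 7 operator occurrences and 5 operand occurrences (invented numbers):
V, D, E = halstead(4, 3, 7, 5)
```

Effort can in turn be converted to an estimated programming time, which is the use the slide alludes to.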

  12. Gilb’s (1976) Bebugging • Gilb (1976) devised bebugging to estimate the number of errors in a program. • Intentional errors were seeded into the program. • From the percentage of these seeded errors that testers found, he argued for an estimate of how many real errors the program contained.
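One way to formalise the estimate (a capture-recapture style calculation; the slide does not spell out the arithmetic, so the helper name and numbers are illustrative):

```python
def bebugging_estimate(seeded_total, seeded_found, real_found):
    """If testers find the same fraction of real errors as of seeded ones,
    estimate the real total as real_found * seeded_total / seeded_found."""
    if seeded_found == 0:
        raise ValueError("no seeded errors found; the ratio is undefined")
    return real_found * seeded_total / seeded_found

# 20 errors seeded; testers find 15 of them plus 30 genuine errors,
# so roughly 30 * 20/15 = 40 genuine errors are estimated in total.
estimate = bebugging_estimate(20, 15, 30)
```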

  13. McCabe’s (1976) Software complexity measure • Uses a simple formula.
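The formula is McCabe’s cyclomatic complexity, V(G) = E − N + 2P for a control-flow graph with E edges, N nodes and P connected components. A minimal sketch (the function name and example graph are illustrative):

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's V(G) = E - N + 2P for a control-flow graph."""
    return edges - nodes + 2 * components

# A single if/else: 4 nodes, 4 edges, one connected component.
vg = cyclomatic_complexity(edges=4, nodes=4)   # -> 2 paths through the code
```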

  14. Yin and Winchester’s (1978) Graph structure complexity measurement • Uses formulae where: • Ci measures network complexity • Ri measures the tree-impurity of level i against level 0 • Di measures the tree-impurity of level i against level i−1 • Ni = number of modules from level 0 to level i • Ai = number of module network arcs from level 0 to level i • Ti = Ni − 1 = number of module tree arcs from level 0 to level i • N′i = number of modules and database references from level 0 to level i • A′i = number of module and database network arcs from level 0 to level i • T′i = number of module and database tree arcs from level 0 to level i

  15. Albrecht’s (1979) Function Point analysis Function Points = UFP × (0.65 + 0.01 × Σ(DI1 to DI14))

  16. Albrecht’s (1979) Function Point analysis • DI values: Not present or no influence = 0; Insignificant influence = 1; Moderate influence = 2; Average influence = 3; Significant influence = 4; Strong influence throughout = 5 • Assuming DI = 3 for all 14 characteristics: Σ(DI1 to DI14) = 14 × 3 = 42 • FP = 523 × [0.65 + 0.01 × 42] = 523 × [0.65 + 0.42] = 523 × 1.07 ≈ 560
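The worked example above can be reproduced directly; `function_points` is a hypothetical helper name, not an API from any library:

```python
def function_points(ufp, di_ratings):
    """Albrecht's adjusted function points:
    FP = UFP * (0.65 + 0.01 * sum of the 14 DI ratings)."""
    if len(di_ratings) != 14:
        raise ValueError("exactly 14 degree-of-influence ratings expected")
    return ufp * (0.65 + 0.01 * sum(di_ratings))

# The slide's example: UFP = 523, DI = 3 for all 14 characteristics.
fp = function_points(523, [3] * 14)   # 523 * 1.07, roughly 560
```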

  17. Albrecht’s (1979) Function Point analysis • Having calculated the function points • Use the number of lines of code (LOC) per function point to calculate the total lines of code in a project. • LOC per function point varies by programming language.

  18. COCOMO - (1981) • Originally named COCOMO, the COnstructive COst MOdel • Devised by Barry Boehm in 1981 as a method for estimating project cost, effort, and schedule. • Has since been re-designated COCOMO 81. • The metrics of COCOMO 81 are styled Person-Months (PM), Time to Develop (TDEV) and Thousands of Delivered Source Instructions (KDSI).

  19. COCOMO - formulae The general COCOMO 81 formulae for all modes are PM = a(KDSI)^b and TDEV = c(PM)^d. The constants established by Boehm for Basic COCOMO 81 are: Organic PM = 2.4(KDSI)^1.05, TDEV = 2.5(PM)^0.38 Semidetached PM = 3.0(KDSI)^1.12, TDEV = 2.5(PM)^0.35 Embedded PM = 3.6(KDSI)^1.20, TDEV = 2.5(PM)^0.32
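The three modes can be tabulated and evaluated in a few lines; `cocomo81` is an illustrative helper, and the 32 KDSI project size in the example is an invented figure:

```python
# Basic COCOMO 81 constants per mode: PM = a * KDSI**b, TDEV = c * PM**d
MODES = {
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def cocomo81(kdsi, mode="organic"):
    """Return (person-months, development time in months) for Basic COCOMO 81."""
    a, b, c, d = MODES[mode]
    pm = a * kdsi ** b
    tdev = c * pm ** d
    return pm, tdev

# An organic-mode project of 32 KDSI (32,000 delivered source instructions):
pm, tdev = cocomo81(32, "organic")   # roughly 91 person-months over 14 months
```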

  20. Henry and Kafura’s (1981) Information flow complexity • Based on a formula.
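The formula is not shown on the slide; it is usually given as length × (fan-in × fan-out)² per procedure. A minimal sketch under that assumption, with an invented example procedure:

```python
def information_flow_complexity(length, fan_in, fan_out):
    """Henry and Kafura's complexity of a procedure:
    length * (fan_in * fan_out) ** 2, where fan-in counts information
    flows into the procedure and fan-out counts flows out of it."""
    return length * (fan_in * fan_out) ** 2

# A 10-line procedure with fan-in 2 and fan-out 3:
c = information_flow_complexity(10, 2, 3)   # 10 * (2*3)**2 = 360
```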

  21. DeMarco’s (1982) Bang per Buck (BPB) • Proprietary.

  22. Modern measurement • Usability • Effectiveness, Efficiency, Safety and Satisfaction • Heuristic evaluation • OO measurement • Internet measurement • Website measurement • Accessibility

  23. Underpinning motivation • Generally based on the notion that • some elements of an entity can be measured • a formula can be derived that uses those measures • a value can be calculated that is a reliable predictor of some attribute of that entity.

  24. Categories of KM metrics (Bose, 2004) • Human capital • Structural capital • Customer capital • Organisational capital • Innovation capital • Process capital • Intellectual capital • Other intangible assets

  25. Deriving and Validating metrics – models, methods and methodology • Models • COCOMO • Function Points • Factor-Criteria-Metric • Goal/Question/Metric paradigm

  26. Deriving and Validating metrics – models, methods and methodology • Methods • Theoretical validation • “is concerned with demonstrating that a measure is [mathematically] measuring the concept it is purporting to measure”. (Briand et al., 1998) • Empirical validation • “validation of prediction systems involves experimentation and hypothesis testing. Rather than being a mathematical proof, validation involves confirming or refuting the hypothesis”. (Fenton and Pfleeger, 1996; p104)

  27. Methodology Stages of the metrics methodology (Shepperd and Ince, 1993)

  28. Using metrics • Complexity measurement • Systems sizing and estimating • Production control • Quality assurance

  29. A model for a metric validation study - Fenton & Pfleeger (1996; p125) • Conception • e.g., a measure of a website’s design is a valid predictor of a visitor’s engagement experience when visiting the website. • Design • State an hypothesis • Preparation • e.g., the size of the study (websites and visitors), the study environment, the empirical validation team, the timescale • Execution – data gathering • Analysis • Using statistical methods and data analysis techniques to show the validity of the predictor. • Documentation and decision making

  30. Statistical analysis

  31. Positioning measurement • Numbers - Scientific domain • Ancient civilisations • Numbers in the Arts? • Van Gogh • Read & Write • Numbers in nature? • What have they got to do with a mother and her newborn babe?

  32. Beware the term • It is advocated that the term ‘metric’ should be avoided. • Card (2003) writes: “Avoid use of [the term] metric (a term not used in CMM or CMMI)”.

  33. References • IEEE Std 1061 (1998) IEEE Standard for a Software Quality Metrics Methodology, IEEE Computer Society, Institute of Electrical and Electronics Engineers, Inc., 345 East 47th Street, New York, NY 10017, USA • ISO/IEC TR 9126-4 (2004) International Standard. Software engineering – Product quality, Part 4: Quality in use metrics, British Standards Institution, 389 Chiswick High Road, London, UK • Shepperd, M.J. and Ince, D. (1993) Derivation and Validation of Software Metrics, Clarendon Press, Oxford, UK • Schneidewind, N.F. (1992) Methodology for validating software metrics, IEEE Transactions on Software Engineering, IEEE Computer Society, Los Alamitos, CA, USA, Vol 18(2), p410-422 • Schneidewind, N.F. (1994) Validating Metrics for Ensuring Space Shuttle Flight Software Quality, Computer, IEEE Computer Society, Los Alamitos, CA, USA, Vol 27(8), p50-57

  34. Conclusion • Quotations • Overview of measurement • Mainstream systems metrics • Categories of Knowledge Management metrics • Deriving and validating metrics • Positioning measurement.

  35. Questions?
