
Object-Oriented Metrics

Renaat Verbruggen

School of Computing,

Dublin City University

Introduction and Welcome
  • My Background
    • Esprit Object-oriented Project 1986
    • Lecturer in Software Engineering
    • Research in formal OO
    • Consultant to Industry on OO
Agenda
  • What is different about Object-oriented projects?
  • What is measurement in software development?
  • What are typical OO metrics?
  • What other guidelines exist?
Differences in OO projects 1
  • The code produced.
    • Encapsulation
    • Data Abstraction (Classes)
    • Inheritance
    • Polymorphism
  • Not just Lines of Code.
Differences in OO projects 2
  • Reuse a priority
    • Use (reuse) of libraries or Frameworks
    • Reuse through Inheritance
    • Reuse above code level
      • Patterns
      • Business objects
Differences in OO projects 3
  • Reuse changes process
    • Build reusable components
      • Frameworks and libraries
      • Abstraction, generalisation
      • Cost → Investment
    • Find and reuse components
      • Saving → return on investment
Differences in OO projects 4
  • Development process iterative
    • Often the major difference !
    • Growth of software over iterations
    • Reuse-based
    • Change considered explicitly
    • Support for risk management
  • ⇒ Need for early and updated metrics
Reasoning 1
  • Tom DeMarco:
    • ''You cannot control what you cannot measure.''
  • Clerk Maxwell:
    • ''To measure is to know.''
Reasoning 2
  • Lord Kelvin:
    • ''The degree to which you can express something in numbers is the degree to which you really understand it.''
  • Louis Pasteur:
    • ''A science is as mature as its measurement tools.''
Experience 1
  • Lowell Arthur:
    • ''Better a little caution than a great regret.''
  • Victor Basili:
    • ''Measurement is an excellent abstraction mechanism for learning what works and what doesn't.''
Experience 2
  • Frederick Brooks:
    • ''Adding manpower to a late software project makes it later.''
  • Tom Gilb:
    • ''Projects without clear goals will not achieve their goals clearly.''
Experience 3
  • Law of Parkinson:
    • ''Work expands to fill the available time.''
Measurement
  • What is measurement?

"Measurement is the process by which numbers or symbols are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules." (Norman Fenton)

Measurement
  • Units of Measurement
      • Measure a person’s temperature
        • Celsius
        • Fahrenheit
        • “Feverish” - too hot (analogy)
      • Accuracy
      • Replicability
  • How do we measure software?
Measurable Project Goals
  • Are the following measurable goals?
    • The software will not fail
    • The software will be easy to use
    • The project will be completed by June 30
    • The product will meet all requirements
  • What makes a goal measurable?
Setting Measurable Goals
  • Metric Definition
    • Clarity
    • Non-ambiguous
    • Common Understanding
    • Replicability
    • Accuracy?
    • Examples
Setting up Measures 1
  • Establish why you are measuring:
    • Goal-Question-Metric(GQM)
      • 1. define goal or purpose
      • 2. Break down into questions
      • 3. Pick suitable metrics
    • Create a Metrics programme within company
    • Choose metrics already developed
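The three GQM steps above can be sketched as a small data structure. This is an illustrative sketch only: the goal, questions, and metric choices below are hypothetical examples, not taken from the slides.

```python
# A minimal GQM breakdown: one goal, broken into questions,
# each answered by one or more metrics. All names are hypothetical.
gqm = {
    "goal": "Improve maintainability of the class library",
    "questions": {
        "How complex are our classes?": ["WMC", "Depth of Inheritance Tree"],
        "How coupled is the design?": ["CBO", "RFC"],
        "How cohesive are the classes?": ["LCOM"],
    },
}

def metrics_for_goal(breakdown):
    """Step 3: collect the metrics picked to answer the goal's questions."""
    return sorted({m for ms in breakdown["questions"].values() for m in ms})

print(metrics_for_goal(gqm))
```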
Setting up Measures 2
  • Create your own metrics?
    • Define the metric as completely as possible
      • Define the properties of attribute to be measured
      • Define the object to be measured, the metric’s domain
      • Define the metric.
    • Formality is essential
Setting up Measures 3
  • Validate the metric theoretically
    • Prove that the properties are met
    • Prove that the dimensions of the metric are sound
    • Determine the scale for the metric
Setting up Measures 4
  • Validate the metric practically
    • devise best means to measure
      • level of automation
      • minimum disruption for developers
    • Use metric in several practical places.
    • Promote metric
Setting up Measures 5
  • Example:
    • Attribute to be measured: product size
    • Essential property: positive, additive
    • Metric domain: set of lines ending with ‘\n’
    • Metric name: #LOC
  • Theory
    • LOC is an absolute scale type
    • Fulfils essential property of product size
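The slide's #LOC definition (the count of lines ending in ‘\n’) is simple enough to state directly in code. This sketch also demonstrates the additive property the slide names as essential.

```python
def loc(source: str) -> int:
    """#LOC exactly as defined above: the number of lines ending in '\n'."""
    return source.count("\n")

a = "x = 1\n"
b = "y = 2\nprint(x + y)\n"
print(loc(a + b))       # 3
print(loc(a) + loc(b))  # 3 -- additive, as the essential property demands
```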
Setting up Measures 6
  • Yet LOC has problems - why?
  • Because it is modelled simplistically
  • Newline characters (‘\n’) are just one element of product size.
  • Used to try to capture too much about the software.
Software Metrics Validation
  • "Validation of a software measure is the process of ensuring that the measure is a proper numerical characterisation of the claimed attribute; this means showing that the representation condition is satisfied.”
  • Norman Fenton, City University London
Weighted Methods Per Class: (as sum of the McCabe numbers)
  • The number of methods and the complexity of methods involved is an indicator of how much time and effort is required to develop and maintain a class.
  • The larger the number of methods in a class the greater the potential impact on children, since children will inherit all the methods defined in the class.
  • Classes with large numbers of methods are likely to be more application specific, limiting the possibility of reuse.
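One way to realise WMC for Python source is sketched below, taking each method's weight to be an approximate McCabe number (1 + decision points). The set of node types counted is a deliberate simplification, not a complete cyclomatic-complexity implementation.

```python
import ast

# Decision-point node types; each adds one to a method's McCabe number.
# This list is a simplification (e.g. it ignores comprehension conditions).
DECISIONS = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)

def mccabe(func: ast.FunctionDef) -> int:
    """Approximate cyclomatic complexity: 1 + number of decision points."""
    return 1 + sum(isinstance(n, DECISIONS) for n in ast.walk(func))

def wmc(class_source: str) -> int:
    """Weighted Methods per Class as the sum of the methods' McCabe numbers."""
    tree = ast.parse(class_source)
    cls = next(n for n in ast.walk(tree) if isinstance(n, ast.ClassDef))
    return sum(mccabe(n) for n in cls.body if isinstance(n, ast.FunctionDef))

src = """
class Stack:
    def push(self, x):
        self.items.append(x)
    def pop(self):
        if not self.items:
            raise IndexError
        return self.items.pop()
"""
print(wmc(src))  # push contributes 1, pop contributes 2 -> 3
```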
McCabe’s Cyclomatic Complexity
  • Based on the control graph of the program
  • Can be used to decide on basis path testing etc.
  • no. of linearly independent paths V(G) =

no. of edges − no. of nodes + 2 × no. of connected components (modules)
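The formula V(G) = E − N + 2P can be applied directly to a small control-flow graph. The graph below, for a single if/else, is a hypothetical example.

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """V(G) = E - N + 2P: edges minus nodes plus twice the number of
    connected components (P, the 'modules' on the slide)."""
    return len(edges) - len(nodes) + 2 * components

# Control-flow graph of a single if/else: entry branches, both paths rejoin.
nodes = ["entry", "then", "else", "exit"]
edges = [("entry", "then"), ("entry", "else"),
         ("then", "exit"), ("else", "exit")]
print(cyclomatic_complexity(edges, nodes))  # 4 - 4 + 2 = 2
```

Two linearly independent paths match the two branches of the if/else, which is also the minimum number of test cases for basis path testing.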

Depth of Inheritance Tree
  • The deeper a class is in the hierarchy, the greater the number of methods it is likely to inherit, making it more complex to predict its behaviour.
  • Deeper trees constitute greater design complexity, since more methods and classes are involved.
  • The deeper a particular class is in the hierarchy, the greater the potential reuse of inherited methods.
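For single-inheritance Python hierarchies, DIT can be read off the method resolution order, as sketched below; under multiple inheritance the MRO length only approximates tree depth. The classes are hypothetical.

```python
def dit(cls) -> int:
    """Depth of Inheritance Tree: ancestors between cls and the root.
    Exact for single inheritance; an approximation under multiple
    inheritance, where the MRO is longer than any single path."""
    return len(cls.__mro__) - 2  # drop cls itself and object

class A: pass
class B(A): pass
class C(B): pass

print(dit(A), dit(C))  # 0 2
```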
Number of Children:(as number of immediate sub-classes)
  • The greater the number of children, the greater the reuse, since inheritance promotes reuse.
  • The greater the number of children, the greater the likelihood of improper abstraction of the parent class. If a class has a large number of children, it may be a case of misuse of sub-classing.
  • The number of children gives an idea of the potential influence a class has on the design. If a class has a large number of children, it may require more testing of the methods in that class.
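In Python, NOC can be read off `__subclasses__()`, which lists immediate subclasses only. The hierarchy below is a hypothetical example.

```python
class Shape: pass
class Circle(Shape): pass
class Square(Shape): pass
class Ellipse(Circle): pass   # a grandchild, not a child, of Shape

def noc(cls) -> int:
    """Number of Children: immediate subclasses only."""
    return len(cls.__subclasses__())

print(noc(Shape), noc(Circle))  # 2 1
```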
Response For a Class: (as number of used methods)
  • If a large number of methods can be invoked in response to a message, the testing and debugging of the class becomes more complicated, since it requires a greater level of understanding on the part of the tester.
  • The larger the number of methods that can be invoked from a class, the greater the complexity of the class.
  • A worst case value for possible responses will assist in appropriate allocation of testing time.
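A rough static approximation of RFC is sketched below: the class's own methods plus the distinct methods they invoke. This ignores transitive calls and dynamic dispatch, so it is an illustration, not a faithful RFC tool; the example class is hypothetical.

```python
import ast

def rfc(class_source: str) -> int:
    """Response For a Class, approximated as own methods plus the distinct
    methods invoked in the body (calls of the form anything.name(...))."""
    tree = ast.parse(class_source)
    cls = next(n for n in ast.walk(tree) if isinstance(n, ast.ClassDef))
    own = {n.name for n in cls.body if isinstance(n, ast.FunctionDef)}
    called = {
        n.func.attr
        for n in ast.walk(cls)
        if isinstance(n, ast.Call) and isinstance(n.func, ast.Attribute)
    }
    return len(own | called)

src = """
class Report:
    def render(self):
        self.header()
        self.body.format()
    def header(self):
        pass
"""
print(rfc(src))  # {render, header, format} -> 3
```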
Coupling Between Objects 1
  • Excessive coupling between objects is detrimental to modular design and prevents reuse. The more independent an object is, the easier it is to reuse it in another application.
  • In order to improve modularity and promote encapsulation, inter-object couples should be kept to a minimum. The larger the number of couples, the higher the sensitivity to changes in other parts of the design and therefore maintenance is more difficult.
Coupling Between Objects 2
  • A measure of coupling is useful to determine how complex the testing of various parts of a design are likely to be. The higher the inter-object coupling, the more rigorous the testing needs to be.
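One crude approximation of CBO is to count the other known classes a class mentions by name, as sketched below. Real CBO tools resolve types rather than names; the class and name set here are hypothetical.

```python
import ast

def cbo(class_source: str, all_classes: set) -> int:
    """Coupling Between Objects, approximated as the number of *other*
    known classes that this class's body mentions by name."""
    tree = ast.parse(class_source)
    cls = next(n for n in ast.walk(tree) if isinstance(n, ast.ClassDef))
    used = {
        n.id for n in ast.walk(cls)
        if isinstance(n, ast.Name) and n.id in all_classes
    }
    return len(used - {cls.name})

src = """
class Invoice:
    def total(self):
        return Money(sum(Line.amount(l) for l in self.lines))
"""
print(cbo(src, {"Invoice", "Money", "Line", "Customer"}))  # Money, Line -> 2
```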
Lack of Cohesion in Methods: (disjunctive instance variables)
  • Cohesiveness of methods within a class is desirable, since it promotes encapsulation.
  • Lack of cohesion implies classes should probably be split into two or more subclasses.
  • Any measure of disparateness of methods helps identify flaws in the design of classes.
  • Low cohesion increases complexity, thereby increasing the likelihood of errors during the development process.
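The "disjunctive instance variables" idea in the heading can be sketched with the Chidamber–Kemerer formulation: P counts method pairs sharing no instance variables, Q counts pairs sharing at least one, and LCOM = max(P − Q, 0). The class and its variable usage below are hypothetical.

```python
from itertools import combinations

def lcom(method_vars: dict) -> int:
    """LCOM: P = method pairs with disjoint instance-variable sets,
    Q = pairs sharing at least one variable; LCOM = max(P - Q, 0)."""
    p = q = 0
    for a, b in combinations(method_vars.values(), 2):
        if a & b:
            q += 1
        else:
            p += 1
    return max(p - q, 0)

# Hypothetical class: which instance variables each method touches.
usage = {
    "deposit":  {"balance"},
    "withdraw": {"balance"},
    "set_name": {"name"},
}
print(lcom(usage))  # P=2 disjoint pairs, Q=1 sharing pair -> 1
```

A non-zero LCOM here hints that `set_name` belongs in a separate class from the balance-handling methods.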
Design Metrics And Experience 1
  • From Mark Lorenz
  • 1. The average method size should be less than 8 LOC for Smalltalk and 24 LOC for C++. Bigger averages indicate O-O design problems (i.e. function-oriented coding).
  • 2. The average number of methods per class should be less than 20. Bigger averages indicate too much responsibility in too few classes.
Design Metrics And Experience 2
  • 3. The average number of instance variables per class should be less than 6. Similar in reasoning as the previous point - more instance variables indicate that one class is doing more than it should.
  • 4. The class hierarchy nesting level should be less than 6. Start counting at the level of any framework classes that you use or the root class if you don't.
  • 5. The number of subsystem-to-subsystem relationships should be less than the average number of class-to-class relationships within a subsystem.
Design Metrics And Experience 3
  • 6. The number of class-to-class relationships within a subsystem should be relatively high.
  • 7. The instance variable usage by the methods within a class can be used to look for possible design problems.
  • 8. The average number of comment lines should be greater than 1. Smaller averages indicate too little documentation with the (small) methods.
  • 9. The number of problem reports per class should be low.
Design Metrics And Experience 4
  • 10. The number of times a class is reused across the original application and in other applications might indicate a need to redesign it.
  • 11. The number of classes and methods thrown away should occur at a steady rate throughout most of the development process.
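Lorenz's guideline 1 above can be turned into a simple threshold check, as sketched below; the sample method sizes are hypothetical.

```python
# Guideline 1: average method size should be under 8 LOC (Smalltalk)
# or 24 LOC (C++). Larger averages suggest function-oriented coding.
METHOD_LOC_LIMIT = {"smalltalk": 8, "cpp": 24}

def avg(xs):
    return sum(xs) / len(xs)

def check_method_size(method_locs, language):
    """Return the average method size and whether it trips the limit."""
    a = avg(method_locs)
    return a, a >= METHOD_LOC_LIMIT[language]

print(check_method_size([30, 20, 40], "cpp"))  # (30.0, True)
print(check_method_size([4, 6], "smalltalk"))  # (5.0, False)
```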
Other Experience 1
  • 1. A prototype class has 10 to 15 methods, each with 5 to 10 lines of code, and takes 1 person-week to develop.
  • 2. A production class has 20 to 30 methods, each with 10 to 20 lines of code, and takes 8 person-weeks to develop. In both these cases, development includes documentation and testing.
  • 3. C++ will have 2 to 3 times the lines of code of Smalltalk.
  • 4. Code volume will expand in the first half of the project and decline in the second half, as reviews clean up the system.
Other Experience 2
  • 5. Deeply nested classes are more complex, due to inheritance.
  • 6. A class or group of classes (e.g., a framework) with a low amount of coupling to other classes will be more reusable.
  • 7. A class has higher cohesion if its methods utilise similar sets of instance variables.
Project Completion Metrics And Experience 1
  • 1. The average number of support classes per key class ... will help you to estimate the total number of classes in the final system.
  • 2. The average man-days per application class ... to estimate the amount of human resources you need to complete the project.
  • 3. The average number of classes per developer ... will help you decide what staffing level is needed to develop the application.
Project Completion Metrics And Experience 2
  • 4. The number of major iterations ... will help you schedule times when early-release drivers can be given to customers and human factors staff to verify requirements and usability.
  • 5. The number of subsystems should relate to major functionally-related portions of the total business's system.
Establishing A Metrication Programme 1
  • Barbara Kitchenham:
    • 1. Define your goals (which are likely to include requirements for measurements to support basic management activities).
    • 2. Identify measures that will support the monitoring and achievement of those goals.
    • 3. Define and plan the metrication programme.
Establishing A Metrication Programme 2
  • 4. Establish a data collection system to support the metrication programme.
  • 5. Analyse your collected data to support your management activities and monitor achievement of your goals.
  • 6. Review and update your goals in the light of your achievements.
Establishing A Metrication Programme 3
  • 7. Update your data collection in the light of your changing goals and management requirements."
Software Metrics Application 1
  • "What do the successful companies do:
  • They have 'decriminalized' errors. People talk openly about what they have done wrong as a means of self-improvement. There is no need to hide failure; management is not allowed to, or simply does not, use it against you.
Software Metrics Application 2
  • Measurement is part of 'how we do business.' That is, there is no management mandate or policy that causes measurement to happen, but rather a common understanding that it is the only reasonable way to build product."
Software Metrics Application 3
  • They tend to have achieved an SEI process level of 4 or 5 (very good) without ever having passed through level 3. That is, they measure and use feedback to improve their software process without ever having invoked a defined process! (That is, of course, the epitome of technologist/experimentation vs. management/control.)
Object Oriented Metrics
  • The process of development tends to be different.
  • The project should not be penalised for this
  • Or allowed too much free rein!
  • Metrics are a very useful addition to an object-oriented project.
New Guidelines
  • Warning Signs
  • RFC > 100
  • RFC > 5 × NMC
  • CBO > 5
  • WMC > 100
  • NMC > 40
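The warning signs above can be encoded as a simple checklist over a class's metric values, as sketched below. NMC is read here as the number of methods per class, which is an assumption about the slide's acronym; the sample values are hypothetical.

```python
# Warning-sign thresholds from the slide, as predicates over a dict of
# metric values. NMC is assumed to mean number of methods per class.
WARNINGS = {
    "RFC > 100":     lambda m: m["RFC"] > 100,
    "RFC > 5 * NMC": lambda m: m["RFC"] > 5 * m["NMC"],
    "CBO > 5":       lambda m: m["CBO"] > 5,
    "WMC > 100":     lambda m: m["WMC"] > 100,
    "NMC > 40":      lambda m: m["NMC"] > 40,
}

def warning_signs(metrics: dict) -> list:
    """Return the name of every threshold the class trips."""
    return [name for name, check in WARNINGS.items() if check(metrics)]

sample = {"RFC": 120, "NMC": 20, "CBO": 4, "WMC": 90}
print(warning_signs(sample))  # ['RFC > 100', 'RFC > 5 * NMC']
```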
Overall
  • Far more important to validate current metrics empirically than propose new ones
  • Aim to make link to productivity, quality and project management