
A Hierarchical Model for Object-Oriented Design Quality Assessment by J. Bansiya and C.G. Davis


Presentation Transcript


1. A Hierarchical Model for Object-Oriented Design Quality Assessment by J. Bansiya and C.G. Davis

2. Rationale for this Research
• The OO paradigm requires a reassessment of the elements used to evaluate software quality:
• Encapsulation
• Inheritance
• Polymorphism
• There is a need for metrics and a model that can be applied to an earlier phase of the project, such as the design, so that the quality of the product can still be improved (not assessed only after it is coded).

3. Started with the ISO 9126 Attributes and Modified Them
• 6 quality attributes result:
• 1) Functionality – kept
• Reliability – excluded (implementation oriented)
• 2) Efficiency – changed to Effectiveness (of the design)
• Usability – excluded (implementation oriented)
• 3) Maintainability – changed to Understandability
• 4) Portability – changed to Extendibility
• 5) Reusability – added
• 6) Flexibility – added

4. 11 O.O. Design Properties & Associated Metrics (two of the ratios are sketched in code below)
• Design Size – number of classes
• Hierarchies – number of class hierarchies
• Abstraction – average number of ancestors for the classes
• Encapsulation – ratio of private attributes to total attributes
• Coupling – number of classes that pass messages or share attributes
• Cohesion – ratio of the intersection of the parameters of a class's methods to the set of all parameters in the class
• Composition – number of data declarations whose type is a user-defined class
• Inheritance – ratio of the number of methods inherited by a class to the total number of methods accessible in the class
• Polymorphism – number of methods that exhibit polymorphic behavior
• Messaging – number of public methods in a class
• Complexity – number of methods defined in a class (my comment: how about attributes in the class?)
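To make the ratio-style properties concrete, here is a minimal Python sketch of two of them. The ClassInfo record and its field names are hypothetical stand-ins for data a design-analysis tool would extract; they are not from the paper.

```python
from dataclasses import dataclass

@dataclass
class ClassInfo:
    private_attrs: int
    total_attrs: int
    inherited_methods: int
    accessible_methods: int  # inherited plus locally defined methods

def encapsulation(c: ClassInfo) -> float:
    # Ratio of private attributes to total attributes declared in the class.
    return c.private_attrs / c.total_attrs if c.total_attrs else 0.0

def inheritance(c: ClassInfo) -> float:
    # Ratio of inherited methods to all methods accessible in the class.
    return c.inherited_methods / c.accessible_methods if c.accessible_methods else 0.0

# A class with 3 of its 4 attributes private, inheriting 5 of the 12
# methods accessible through it:
c = ClassInfo(private_attrs=3, total_attrs=4,
              inherited_methods=5, accessible_methods=12)
print(encapsulation(c))  # 0.75
print(inheritance(c))    # 0.4166...
```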

5. Quality Attribute vs. Design Properties Indexing Scheme
Each quality attribute's index is computed from the design properties:
• Reusability = -.25*Coupling + .25*Cohesion + .5*Messaging + .5*Design Size
• Flexibility = .25*Encapsulation - .25*Coupling + .5*Composition + .5*Polymorphism
• Understandability = -.33*Abstraction + .33*Encapsulation - .33*Coupling + .33*Cohesion - .33*Polymorphism - .33*Complexity - .33*Design Size
• Functionality = .12*Cohesion + .22*Polymorphism + .22*Messaging + .22*Design Size + .22*Hierarchies
• Extendibility = .5*Abstraction - .5*Coupling + .5*Inheritance + .5*Polymorphism
• Effectiveness = .2*Abstraction + .2*Encapsulation + .2*Composition + .2*Inheritance + .2*Polymorphism
• Note that the weights for the design properties add up to ±1 for each quality attribute.
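Since the indexing scheme is just a set of weighted sums, it can be sketched in a few lines. The Python below transcribes the weights from the table above; the property names and the assumption that the inputs are normalized design property scores (see slide 6) are illustrative choices, not the authors' code.

```python
# Weights transcribed from the indexing scheme above.
WEIGHTS = {
    "Reusability":       {"coupling": -0.25, "cohesion": 0.25,
                          "messaging": 0.5, "design_size": 0.5},
    "Flexibility":       {"encapsulation": 0.25, "coupling": -0.25,
                          "composition": 0.5, "polymorphism": 0.5},
    "Understandability": {"abstraction": -0.33, "encapsulation": 0.33,
                          "coupling": -0.33, "cohesion": 0.33,
                          "polymorphism": -0.33, "complexity": -0.33,
                          "design_size": -0.33},
    "Functionality":     {"cohesion": 0.12, "polymorphism": 0.22,
                          "messaging": 0.22, "design_size": 0.22,
                          "hierarchies": 0.22},
    "Extendibility":     {"abstraction": 0.5, "coupling": -0.5,
                          "inheritance": 0.5, "polymorphism": 0.5},
    "Effectiveness":     {"abstraction": 0.2, "encapsulation": 0.2,
                          "composition": 0.2, "inheritance": 0.2,
                          "polymorphism": 0.2},
}

def quality_indices(props):
    # Each quality attribute index is a weighted sum of property values.
    return {attr: sum(w * props[p] for p, w in ws.items())
            for attr, ws in WEIGHTS.items()}

# A base release (every property normalized to 1.0) yields each attribute's
# weight sum: 1.0 everywhere except Understandability (-0.99).
base = {p: 1.0 for ws in WEIGHTS.values() for p in ws}
print(quality_indices(base))
```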

6. Performed Several Studies
• Individual quality attribute validation over two sets of products:
• 5 releases of MFC
• 4 releases of the Borland ObjectWindows library
• Except for Understandability, all quality attributes increased in value (improved) from release to release.
• Computation is based on computing the metrics for all 11 design properties for each release (table 8).
• Normalized the metrics using the first release as the base (table 9) – see the sketch below.
• Computed the index for each of the 6 quality attributes for each release (table 10).
• Plotted the quality attributes' indices across the releases (figs. 2 and 3).
• Understandability decreased because more functions are added in each release; the index bears that out (table 10).
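A minimal sketch of the normalization step, assuming metrics are kept as per-release dictionaries. The values below are invented for illustration, not the MFC numbers from the paper.

```python
# Divide every release's metric values by the first (base) release, so the
# base scores 1.0 on each property and later releases show relative change.
def normalize(releases):
    base = releases[0]
    return [{m: r[m] / base[m] for m in base} for r in releases]

releases = [
    {"design_size": 100, "coupling": 40},  # release 1 (base)
    {"design_size": 130, "coupling": 44},  # release 2
]
print(normalize(releases))
# [{'design_size': 1.0, 'coupling': 1.0}, {'design_size': 1.3, 'coupling': 1.1}]
```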

7. Evaluated the Model as a Whole
• Took 14 projects.
• Assigned 13 "experts" to evaluate the 14 projects on the 6 quality attributes and asked them to rank the projects (table 11).
• Evaluated the 14 projects on the 11 design properties (table 12).
• Normalized the table 12 information (table 13).
• Computed the indices for the 6 quality attributes of each of the 14 projects (table 14).
• Ranked the 14 projects by the indices computed for each of the 6 quality attributes (table 11, QMOOD ranking, based on table 14) – a ranking sketch follows below.
• Performed a correlation analysis of the QMOOD ranking against the 13 evaluators' rankings.
• *** 11 of the 13 evaluators correlated well with the QMOOD ranking (table 15). *** (See the next chart for how table 15 is computed.)
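A small sketch of the ranking step for a single quality attribute, assuming a higher index means better quality. The index values are invented for illustration.

```python
# The project with the largest index gets rank 1, the next gets rank 2, etc.
def ranks(indices):
    order = sorted(range(len(indices)), key=lambda i: indices[i], reverse=True)
    out = [0] * len(indices)
    for position, project in enumerate(order, start=1):
        out[project] = position
    return out

reusability_indices = [1.8, 0.9, 1.4]  # projects 1..3
print(ranks(reusability_indices))      # [1, 3, 2]
```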

8. How to Get to Table 15
• Build a table with one row per project (1 through 14) and one rank column per ranking: Eval. 1, Eval. 2, ..., Eval. 13, and QMOOD.
• For each project, d is the difference between the evaluator's rank and the QMOOD rank; square it to get d².
• For Evaluator 1, the sum of d² across the 14 projects is 180.
• r = 1 − [(6 × Σd²) / (n(n² − 1))] = 1 − (1080 / 2730) ≈ .604
• Since r = .604 > .55, Evaluator 1's ranking and the QMOOD ranking correlate.
• Do this for each evaluator.
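The computation on this slide can be reproduced in a few lines of Python:

```python
# Spearman rank correlation: r = 1 - 6*sum(d^2) / (n*(n^2 - 1)), where d is
# the per-project difference between the two rankings being compared.
def spearman(sum_d2, n):
    return 1 - (6 * sum_d2) / (n * (n ** 2 - 1))

# Evaluator 1 vs. QMOOD: sum of d^2 = 180 over n = 14 projects.
r = spearman(180, 14)
print(round(r, 3))  # 0.604 -- above the 0.55 threshold, so the rankings agree
```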

9. Conclusion
• While there is room for argument (especially over the indexing scheme):
• This model provides a way to define OO design properties and their associated metrics.
• This model provides definitions of the quality attributes and relates them to the design properties.
• The model was tested against several products and projects.
• *** More study is needed *** before we can claim victory!
