CSC 395 – Software Engineering


Presentation Transcript


  1. CSC 395 – Software Engineering Lecture 32: Even More Metrics -or- More Ways To View A Project

  2. In This Lecture • Examine metrics and why they are important • Review metrics from last lecture • Evaluate which ones are most used and why • Theorize about proper use of these metrics • More touchy-feely manager issues • Conventional wisdom on middle management does not examine life as a non-manager • Example of real metric collection and analysis

  3. Midterm Part 2 Coffee Talk • What should go in a use-case? • What starts this program? What ends it? • How should event flow states be connected? • Is software engineering worth it? • The Holy Roman Empire was neither holy, nor Roman, nor an empire

  4. Code Monkeys • Do not want to trust code monkeys • Often untrained & unskilled programmers • Also includes new graduates starting their first job • Internal candidate pool for engineer positions • Nobody and nothing is perfect • At implementation start, the design contains faults • Code monkeys must find faults • Often will just fix the small ones • Good monkeys will clean & optimize the ugly design bits • Managers must identify & nurture these diamonds

  5. NASA Metrics Suite http://satc.gsfc.nasa.gov/support/STC_APR98/apply_oo/apply_oo.html • Cyclomatic Complexity (CC) • Lines of Code (LOC) • Comment Percentage (CP) • Weighted Methods per Class (WMC) • Response For a Class (RFC) • Lack of Cohesion Of Methods (LCOM) • Coupling Between Objects (CBO) • Depth of Inheritance Tree (DIT) • Number Of Children (NOC)
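The size metrics in the suite are straightforward to compute. A minimal sketch of LOC and Comment Percentage (CP), assuming the source being measured is Python; real tools would also count docstrings and inline comments:

```python
# Illustrative sketch: compute LOC and Comment Percentage (CP) for a
# Python source string. Blank lines are ignored; a line counts as a
# comment when its first non-space character is '#'.

def loc_and_cp(source: str) -> tuple[int, float]:
    lines = [ln.strip() for ln in source.splitlines()]
    lines = [ln for ln in lines if ln]               # drop blank lines
    comments = sum(1 for ln in lines if ln.startswith("#"))
    loc = len(lines)
    cp = 100.0 * comments / loc if loc else 0.0
    return loc, cp

sample = """\
# add two numbers
def add(a, b):
    return a + b
"""
loc, cp = loc_and_cp(sample)
print(loc, round(cp, 1))  # -> 3 33.3
```

Even this tiny definition shows why LOC and CP need stated counting rules before two teams can compare numbers.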

  6. Objectives For Each Metric

  7. Weighted Methods per Class • Most classes are (relatively) simple • Complex classes must be tested heavily • Should be examined for potential revision
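WMC sums a complexity weight over a class's methods. A sketch using Python's `ast` module, weighting each method by a simplified cyclomatic-complexity estimate (1 plus the number of decision points); the set of branch node types counted here is illustrative, not exhaustive:

```python
import ast

# Sketch: Weighted Methods per Class (WMC) for a Python class, with
# each method weighted by 1 + number of decision points it contains.

DECISIONS = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)

def method_complexity(fn: ast.FunctionDef) -> int:
    return 1 + sum(isinstance(n, DECISIONS) for n in ast.walk(fn))

def wmc(class_src: str) -> int:
    cls = ast.parse(class_src).body[0]
    return sum(method_complexity(n)
               for n in cls.body if isinstance(n, ast.FunctionDef))

src = """
class Stack:
    def push(self, x):
        self.items.append(x)
    def pop(self):
        if not self.items:
            raise IndexError
        return self.items.pop()
"""
print(wmc(src))  # push contributes 1, pop contributes 2 -> 3
```

A class whose WMC dwarfs its peers is exactly the "complex class" the slide says to test heavily and consider revising.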

  8. Response For a Class • Most methods call few other methods • Higher values require complex & extensive testing • Also good candidates for revision
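RFC counts the class's own methods plus the distinct methods they can invoke. A rough sketch for Python classes, approximating "methods called" by the names appearing in call expressions (a real tool would resolve call targets properly):

```python
import ast

# Sketch: Response For a Class (RFC) = |own methods U methods called|.
# Call targets are approximated by the invoked name, so dynamic
# dispatch and external calls are counted only by name.

def rfc(class_src: str) -> int:
    cls = ast.parse(class_src).body[0]
    methods = {n.name for n in cls.body if isinstance(n, ast.FunctionDef)}
    called = set()
    for node in ast.walk(cls):
        if isinstance(node, ast.Call):
            f = node.func
            if isinstance(f, ast.Attribute):
                called.add(f.attr)       # e.g. self.header()
            elif isinstance(f, ast.Name):
                called.add(f.id)         # e.g. len(...)
    return len(methods | called)

src = """
class Report:
    def render(self):
        return self.header() + self.body()
    def header(self):
        return "h"
    def body(self):
        return "b"
"""
print(rfc(src))  # {render, header, body} union {header, body} -> 3
```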

  9. Coupling Between Objects • Coupling is just plain bad • Used by lazy, less experienced, & incompetent programmers

  10. Using Multiple Metrics • Combining metrics identifies problem regions
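One simple way to combine metrics is to flag any class that exceeds thresholds on two or more of them, so a single outlier score does not trigger a false alarm. The threshold values below are illustrative assumptions, not cutoffs from the NASA suite:

```python
# Sketch: flag "problem region" classes by combining metrics.
# Thresholds and class data are made-up examples.

THRESHOLDS = {"WMC": 20, "RFC": 50, "CBO": 5}

def flag_classes(measurements: dict[str, dict[str, int]]) -> list[str]:
    """Return classes exceeding the threshold on 2+ metrics."""
    flagged = []
    for cls, metrics in measurements.items():
        exceeded = sum(metrics.get(m, 0) > t for m, t in THRESHOLDS.items())
        if exceeded >= 2:
            flagged.append(cls)
    return flagged

data = {
    "Parser":   {"WMC": 35, "RFC": 80,  "CBO": 3},   # two metrics high
    "Logger":   {"WMC": 4,  "RFC": 10,  "CBO": 1},   # all fine
    "GodClass": {"WMC": 60, "RFC": 120, "CBO": 14},  # all three high
}
print(flag_classes(data))  # -> ['Parser', 'GodClass']
```

Requiring agreement among several metrics is what lets the combination identify genuine problem regions rather than statistical noise.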

  11. Inheritance is Mixed Bag • Higher DIT increases reuse • Also makes code more complex & harder to maintain • Additional metric could show if inheritance is good
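DIT and NOC can be read directly off a class hierarchy. A sketch using Python introspection on a toy hierarchy (the shape classes are invented for illustration):

```python
# Sketch: Depth of Inheritance Tree (DIT) and Number Of Children (NOC)
# computed by introspection on a toy hierarchy.

class Shape: pass
class Polygon(Shape): pass
class Triangle(Polygon): pass
class Circle(Shape): pass

def dit(cls: type) -> int:
    # Steps from cls up to the root, excluding cls itself and `object`.
    return len(cls.__mro__) - 2

def noc(cls: type) -> int:
    # Direct subclasses only.
    return len(cls.__subclasses__())

print(dit(Triangle))  # Triangle -> Polygon -> Shape: depth 2
print(noc(Shape))     # Polygon and Circle: 2 children
```

The trade-off on the slide is visible here: deepening the tree (higher DIT) lets Triangle reuse everything above it, but a reader must now walk the whole chain to understand its behavior.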

  12. For Next Lecture • Continue looking at metrics • Changing the hows, whens, & whys of measurement • Will be discussing chapter 9 of the book • Do your reading • Looking at how to plan and measure a project • Programmers universally underestimate needs • Important when bidding on contracts, charging for changes, & other business functions
