
Identifying and Using a Project’s Key Subprocess Metrics


Presentation Transcript


  1. Identifying and Using a Project’s Key Subprocess Metrics Jeff S. Holmes BTS Fort Worth

  2. Everyone Loves a Hero

  3. Heroes Come Through! • Fireman – Saves Baby in Burning House • Policeman – Catches Bad Guy • Athlete – Hits Game-Winning Home Run

  4. Software Engineering Heroes! • All night coding! • Debugging over the weekend! • THIS SHOULD NOT BE THE NORM!

  5. How To Minimize “Fire Drills”? • Preventative Actions • Proper wiring can prevent fires. • Locking your car can prevent theft. • Don’t get behind in the ball game. • Understand project status earlier.

  6. Metrics, metrics, everywhere… • Req. Churn • PCE/PSE • KLOC/Hr • CRUD • Defect/KLOC • Inspection Rates • Schedule Adherence • ODC • Problem Backlog • Packet Size

  7. But What is Really Important? • Customer Wants • Functionality • Zero Defects • On Time • What software metrics map to these? • How can we optimize these outputs?

  8. BTS Fort Worth Approach • Selected DMAIC to Improve Process • Identified Project with Two Years of Data • Performed Statistical Analysis • Conducted Pilot • Currently in “Control” Phase

  9. DMAIC : Define • Identify “what is important” • BTS FW Monitors • Productivity (KLOC/Hour)* • Quality (Post Release Defects/KLOC) • Schedule Adherence • These are BTS FW “Big X’s”

  10. DMAIC : Measure • The “Simple” View: Requirements + Resources → Software Development Life Cycle → Perfect Software!

  11. DMAIC : More Details • Requirements + Resources → Requirements Phase → Perfect Requirements • Perfect Requirements + Resources → Design Phase → Perfect Design, Perfect Models • Perfect Design + Perfect Models + Resources → Code Phase → Perfect Code • Perfect Code + Resources → Test Phase → Perfect Software!

  12. DMAIC : Subprocess Identification • BTS FW Identified Following Subprocesses • Planning Phase • Requirements Phase • Design Phase • Code Phase • Test Phase • Release Phase • Code Inspections

  13. DMAIC : Measured Data • BTS FW Uses Following Data: • # Requirements • # Developers on the project (Resources) • % Time in Planning • % Time in Requirements • % Time in Design • % Time in Code • % Time in Test • % Time in Release • Requirements Churn • Actual Size (KLOC) • Avg Defect Detection Rate (DDR) in Code Inspection
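The measured data above can be sketched as a per-release record. This is a minimal illustration only; the class name, field names, and sample values are assumptions, not the project's actual schema.

```python
from dataclasses import dataclass

@dataclass
class ReleaseRecord:
    """One release's subprocess measurements (field names are illustrative)."""
    req_count: int            # # Requirements (from DOORS)
    developers: int           # # Developers on the project (Resources)
    pct_planning: float       # % Time in Planning
    pct_requirements: float   # % Time in Requirements
    pct_design: float         # % Time in Design
    pct_code: float           # % Time in Code
    pct_test: float           # % Time in Test
    pct_release: float        # % Time in Release
    req_churn: float          # Requirements Churn
    kloc: float               # Actual Size (KLOC)
    avg_ddr: float            # Avg Defect Detection Rate in Code Inspection

# Hypothetical release: phase percentages should account for all project time.
r = ReleaseRecord(120, 8, 5.0, 15.0, 20.0, 35.0, 20.0, 5.0, 0.10, 42.5, 0.9)
phase_total = (r.pct_planning + r.pct_requirements + r.pct_design
               + r.pct_code + r.pct_test + r.pct_release)
assert abs(phase_total - 100.0) < 1e-9  # sanity check on the % Time fields
```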

  14. DMAIC : Data Sources • BTS FW Data • DOORS for Requirements • Project Plans for # Developers and % Times • ClearCase for Code Size • Inspection Database for DDR

  15. DMAIC : BTS Subprocess Metrics • Planning – Resources, Time in Phase • Requirements – Req Count, Time in Phase • Design – Req Churn, Time in Phase • Code – Req Churn, KLOC, Time in Phase • Code Inspections – DDR • Test – Time in Phase • Release – Time in Phase

  16. DMAIC : BTS Subprocess Metrics • Subprocess metrics: Planning – Resources, Time in Phase; Requirements – Req Count, Time in Phase; Design – Req Churn, Time in Phase; Code – Req Churn, KLOC, Time in Phase; Code Inspections – DDR; Test – Time in Phase; Release – Time in Phase • Outputs: Productivity (KLOC/Hour), Quality (PR Defects/KLOC), Schedule Adherence • The “?” marks in the diagram indicate that which subprocess metrics drive each output is not yet known

  17. DMAIC : BTS FW Analysis • Project Data • 8 Releases since 2002 • Similar work • “Stable” team • Used Step-wise Linear Regression to • Identify statistically significant factors • Develop prediction formulas for “Big X’s”

  18. CAUTION !! The following slides contain statistics that could be hazardous to your health! Persons who suffer from narcolepsy or “statisticitis” should consider leaving the room.

  19. DMAIC : Stepwise Linear Regression • Describes the relationship between one 'predicted' variable and 'predictor' variables • Goal – get the simplest equation with the best predictive power for • Productivity – KLOC/Hour • Quality – Post Release Defect/KLOC
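The talk presumably used a statistics package for its stepwise regression; a minimal forward-selection sketch in Python is shown below. It selects factors by R² gain rather than by p-value (an assumed simplification to keep the example self-contained), and the synthetic eight-release data set is entirely hypothetical.

```python
import numpy as np

def fit_r2(X, y):
    """Ordinary least squares; return R^2 (X must include an intercept column)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot

def forward_select(factors, y, threshold=0.01):
    """Greedy forward selection: repeatedly add the factor that most improves
    R^2, stopping when no remaining factor improves it by more than threshold."""
    n = len(y)
    chosen, best_r2 = [], 0.0
    remaining = dict(factors)
    while remaining:
        gains = {}
        for name, col in remaining.items():
            X = np.column_stack([np.ones(n)] + [factors[c] for c in chosen] + [col])
            gains[name] = fit_r2(X, y) - best_r2
        name = max(gains, key=gains.get)
        if gains[name] <= threshold:
            break
        chosen.append(name)
        best_r2 += gains[name]
        del remaining[name]
    return chosen, best_r2

# Synthetic data mirroring the project's small sample (8 releases).
rng = np.random.default_rng(0)
pct_req = rng.uniform(10, 25, 8)       # % Time in Requirements
avg_ddr = rng.uniform(0.5, 1.2, 8)     # Avg DDR in code inspection
noise_factor = rng.uniform(0, 1, 8)    # an irrelevant factor
productivity = 0.001 + 1e-4 * pct_req + 5e-4 * avg_ddr + rng.normal(0, 1e-5, 8)

chosen, r2 = forward_select(
    {"pct_req": pct_req, "avg_ddr": avg_ddr, "noise": noise_factor},
    productivity)
```

With this construction the dominant factor (`pct_req`) is selected first and the fitted model explains most of the variance, echoing the high R² values reported on the following slides.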

  20. DMAIC : Standard Least Squares Model accounts for 99.82% of variance.

  21. DMAIC : Significant Effects • Most significant effects • % Time in Req • Average DDR • Interaction between % Time in Code and Average DDR < 0.05 is significant

  22. DMAIC : Standard Least Squares Model accounts for 90.62% of variance.

  23. DMAIC : Significant Effects • Most significant effects • % Time in Requirements • Interaction between % Time in Requirements and Requirements Churn < 0.05 is significant

  24. DMAIC : Statistically Significant • # Requirements • # Developers on the project • % Time in Planning • % Time in Requirements • % Time in Design • % Time in Code • % Time in Test • % Time in Release • Requirements Churn • Actual Size (KLOC) • Avg Defect Detection Rate (DDR) in Code Inspection

  25. DMAIC : Key Subprocess Metrics • The slide-16 diagram revisited after the analysis: the subprocess metrics (Planning – Resources, Time in Phase; Requirements – Req Count, Time in Phase; Design – Req Churn, Time in Phase; Code – Req Churn, KLOC, Time in Phase; Code Inspections – DDR; Test – Time in Phase; Release – Time in Phase) mapped to the outputs Productivity (KLOC/Hour), Quality (PR Defects/KLOC), and Schedule Adherence

  26. DMAIC : Variation Analysis • Prediction formulas generated to identify: • Good and bad variance • Most significant factors • NOTE: The prediction formula uses all effects from the models, not just the significant ones; for example, % Time in Planning is included in the formula.

  27. DMAIC : Factor Weighting

  28. DMAIC : Factors’ Effects

  29. DMAIC : BTS FW Limits Green limit indicates direction a metric can deviate from the average and have desired results. Red indicates direction of undesired results.
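The green/red limit idea can be sketched as a small helper that classifies a factor's drift from its historical average by the direction of its effect. The function name, the example factor, and its effect direction are assumptions for illustration.

```python
def deviation_flag(value, average, effect_sign):
    """Classify a factor's deviation from its historical average.

    effect_sign: +1 if increasing the factor improves the predicted output
    (so the green direction is above average), -1 if increasing it degrades
    the output (green direction is below average).
    """
    if value == average:
        return "on average"
    drifted_up = value > average
    good = drifted_up if effect_sign > 0 else not drifted_up
    return "green" if good else "red"

# Illustrative: assume % Time in Requirements has a positive effect on quality.
print(deviation_flag(18.0, 15.0, +1))  # prints "green"
print(deviation_flag(12.0, 15.0, +1))  # prints "red"
```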

  30. Subprocess Metrics Notes • Initial Data Left Much to be “Desired” • Despite Poor Data, the Analysis Identified: • Which Metrics and Processes Are Significant • Prediction Formulas Based on Project’s Data • Insight into Factors’ Effects • Limits for Monitoring the Factors

  31. DMAIC : Pilot Confirmation • Used prediction formulas on other projects • Compared project actuals vs. predicted, using historical data from 5 projects • Unable to compare predicted quality versus actual; these projects have not been in the field long enough for CRUD to stabilize (only predicted LOC/Hr could be compared) • Interesting results found on predicted LOC/Hr

  32. DMAIC : Predicted vs. Actual LOC/Hr • Projects A, B, and C had huge deviations. • Projects D and E were within 20%.
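The within-20% comparison amounts to a simple percent-deviation check. The LOC/Hr figures below are hypothetical stand-ins, not the pilot's actual numbers.

```python
def pct_deviation(actual, predicted):
    """Absolute deviation of actual from predicted, as a percent of predicted."""
    return abs(actual - predicted) / predicted * 100.0

# Hypothetical: one project inside the 20% band, one far outside it.
within = pct_deviation(0.0024, 0.0022)   # like projects D and E
outside = pct_deviation(0.0051, 0.0022)  # like projects A, B, and C
```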

  33. DMAIC : Improve Performance? • So what? • How do you use this information? • Does Project Management have confidence in this analysis?

  34. DMAIC : Applying Analysis • More emphasis on statistically significant activities • Resulting in • Increased Productivity • On-Time Delivery • Desired Functionality Delivered • Improved Quality

  35. DMAIC : Agile Processes • BTS FW Adopted Agile Practices • Iterative Development • Prioritizes Requirements • Negates Requirements Churn • Pair Programming • Optimizes Coding and Inspection Time • Minimal Documentation • Moves effort away from activities that are not statistically significant

  36. DMAIC : Agile Pilot Results • Productivity • 0.00291 KLOC/Hr • 20% improvement from 0.002399 • Inspection Defect Detection Rate • 1.18 Defects/Hr Detected • 48% improvement from 0.8 • Quality • 0 Post Release Defects!
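The reported improvements can be verified with a one-line calculation: 0.00291 over the 0.002399 baseline works out to about 21% (rounded to 20% on the slide), and 1.18 over 0.8 is about 48%.

```python
def improvement_pct(new, old):
    """Relative improvement of `new` over baseline `old`, in percent."""
    return (new - old) / old * 100.0

productivity_gain = improvement_pct(0.00291, 0.002399)  # ~21%
ddr_gain = improvement_pct(1.18, 0.8)                   # ~48%
```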

  37. DMAIC : Agile Pilot Results • Customer Wants • Functionality – All functionality delivered • Zero Defects – No customer found defects • On Time – Product delivered 6 months early!

  38. DMAIC : Agile Monitoring • Monitor Iterations Not Phases • Refactoring Subprocess Monitoring • Two Agile projects in-work now

  39. DMAIC : Agile Monitoring • Monitoring • LOC per week • Defects caught per week by inspection • Defects caught per week by test • Time spent per week • Ratio of new work to correction work.
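The weekly monitoring list above could be captured in a simple per-week record from which the derived ratios fall out directly. The field names and values here are illustrative assumptions.

```python
# Hypothetical one week of iteration data (names are illustrative).
week = {
    "loc_added": 950,            # LOC per week
    "defects_by_inspection": 7,  # defects caught per week by inspection
    "defects_by_test": 3,        # defects caught per week by test
    "hours_spent": 160,          # time spent per week
    "hours_new_work": 120,       # hours on new work
    "hours_correction": 40,      # hours on correction work
}

# Derived weekly metrics.
new_to_correction = week["hours_new_work"] / week["hours_correction"]  # 3.0
loc_per_hour = week["loc_added"] / week["hours_spent"]
```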

  40. DMAIC : Agile Metrics

  41. Summary • Save your “heroes” for real crises. • Understand subprocesses • Monitor subprocesses • Seek to optimize key subprocesses

  42. Recommendations • Examine current project data, it could prove to be very valuable! • Improve data capture on important data. • Use the data as a guideline, but experience can never be discounted.

  43. THANK YOU! Jeff S. Holmes Principal Staff Software Engineer Motorola Six Sigma Black Belt Fort Worth BTS Development Team Fort Worth, TX 817-245-7053 J.Holmes@Motorola.com
