
Measure Quality on the Way In – Not Just on the Way Out



Presentation Transcript


  1. Measure Quality on the Way In – Not Just on the Way Out Author: Jan Fish Email: jan.fish@philips.com Division: Philips Lifeline IT September 2008

  2. For Your Consideration • Traditional Measurements for Test Organizations • Value Added Opportunities • 4 Phase Approach • Target versus Actual Progress • Bug Patterns within Builds / Deployments • Workload Assessment for Outstanding Bugs • Bug Injection Points and Bug Removal Points • Measure Quality on the Way In; Not Just on the Way Out Philips Lifeline J. Fish

  3. Traditional Measurements for Test Organizations • TEST METRICS: • Standard of measurement • Gauge effectiveness and efficiency • Gathered and interpreted throughout the test effort • Objectively measure success • PHILOSOPHY: • Keep it simple • Make it meaningful • Track it • Use it

  4. Traditional Measurements for Test Organizations BASE METRICS may include numbers and/or percentages for: - Test cases created - Test cases in review - Test cases to be executed - Test cases re-executed - Test cases executed - Total executes - Test cases passed - Total test cases passed - Test cases failed - Total test cases failed - Test cases blocked - Defect removal cost - Bad test cases - Bad fixes - Defects corrected - Test effectiveness (QA defects / (QA defects + Production defects))

  5. Value Added Opportunities • Forecast, Track and Respond: • Inputs: Number of Test Cases, Testers and Cycle Time • First Run Failure Rate: • Inputs: Failed Test Cases / Executed Test Cases • What is the Pattern: • Inputs: Known Patterns vs. Current Pattern • Workload Distribution: • Inputs: Number and Type of Bugs in DEV, QA and Resolved • Bug Injection and Removal Points: • Inputs: Where the Error Was Created and Where the Error Was Found • Plan It, Track It and Graph It: • Inputs: Actual Progress to Targeted Goal

  6. Target vs. Actual – Existing Project

  7. Target Planned vs. Passed vs. Failed Test Cases

  8. New and Improved Target versus Actual

  9. New and Improved Chart

  10. Target vs. Actual Math • Target Test Cases Executed = Target Test Cases Passed + Target Test Cases Failed • Actual Test Cases Executed = Actual Test Cases Passed + Actual Test Cases Failed • Pending Retests = Previous Pending Retests + Actual Test Cases Failed - Retests Done • Running Total = Actual Values + Previous Reported Values • Forecast Failure Rate = Target Test Cases Failed / Target Test Cases Executed • Actual Failure Rate = Actual Test Cases Failed / Actual Test Cases Executed • Forecast % Done = Current Target (Passed + Failed) / Final Target Test Cases Executed • Actual % Done = Current Actual (Passed + Failed) / Final Target Test Cases Executed
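The arithmetic above, rendered as code; all counts below are invented for illustration of one reporting period, not taken from the slides:

```python
# Invented counts for one reporting period.
target_passed, target_failed = 180, 20
actual_passed, actual_failed = 150, 30
prev_pending_retests, retests_done = 10, 25
final_target_executed = 400  # total executions planned for the whole cycle

target_executed = target_passed + target_failed                        # 200
actual_executed = actual_passed + actual_failed                        # 180
pending_retests = prev_pending_retests + actual_failed - retests_done  # 15
forecast_failure_rate = target_failed / target_executed                # 0.10
actual_failure_rate = actual_failed / actual_executed                  # ~0.167
forecast_pct_done = target_executed / final_target_executed            # 0.50
actual_pct_done = actual_executed / final_target_executed              # 0.45
```

Comparing `forecast_failure_rate` to `actual_failure_rate` each period is what flags a slipping build early, rather than at the end of the cycle.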

  11. The Added Values of Target vs. Actual TRACK to plan and REACT immediately • Is quality built in? • Is the build / deployment truly ready for test? • Is the test resource on schedule with each type of test execution? • Functional • Regression • Load / Performance / Security • Bug Fix Validation

  12. The Added Values of Target vs. Actual PREDICT how many test cases should be run in a given time period To date, our First Run Failure Rates are: • 7% - 12% maintenance of existing functionality • 20% - 35% added functionality to an existing application • 30% - 47% new application developed on-site • 10% - 25% new application developed off-site
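Those historical bands can be applied to a planned run to size the expected retest workload before execution starts. A sketch, assuming a rate table that mirrors the ranges quoted above (the keys and function are hypothetical):

```python
# First-run failure-rate bands (low, high), mirroring the slide's ranges.
FIRST_RUN_FAILURE_RATES = {
    "maintenance": (0.07, 0.12),
    "added_functionality": (0.20, 0.35),
    "new_app_onsite": (0.30, 0.47),
    "new_app_offsite": (0.10, 0.25),
}

def expected_failures(planned_cases, work_type):
    """Return the (low, high) expected first-run failure counts."""
    low, high = FIRST_RUN_FAILURE_RATES[work_type]
    return round(planned_cases * low), round(planned_cases * high)

# 200 planned cases against added functionality: expect 40-70 first-run failures.
print(expected_failures(200, "added_functionality"))  # prints "(40, 70)"
```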

  13. The Added Values of Target vs. Actual ADJUST time and / or staff on an immediate basis • Substantiate “Gut Feel” • Demonstrate facts • Moderate re-work and know if it fits the schedule • Adjust the plan and determine the level of effort well before the last quadrant of the test cycle

  14. The Added Values of Target vs. Actual SET realistic, track-record-based entrance criteria (as agreed upon with upstream partners) for what is or is not acceptable quality at the start of the project USE the results to document expectations in project contracts (internal or external) • Establish what constitutes an “acceptable quality level” • Set contract conditions for scaled fees based on exceeding, meeting or failing quality objectives

  15. The Added Values of Target vs. Actual PUBLISH and POST a printed copy in a common area • Eliminate the arguments • Eliminate the negative chatter • Document facts; don’t point fingers • Group each application/area’s failure rates into like sets • Review to determine likely process improvements • Continue to track each implemented improvement and assess whether it had a positive or negative impact

  16. Bug Pattern Recognition – Sample 1

  17. Pattern Recognition – Existing Project 2

  18. The Added Values of Pattern Recognition INSTITUTE a process change and know its effect INCREASE the precision of estimates IDENTIFY trends and observations supporting or hindering the test and project plans INFORM team and management of patterns found in a crisp, clean and simple manner such that the whole team can make “next step” decisions

  19. (chart; no transcript text)

  20. The Added Values of Bug Reporting by Group • WORKLOADS can be easily identified by management and team members • INFORMED DECISIONS can be made on next steps without drilling into details (who, when, what, how and where) • RE-WORK can be tracked and compared to planned work • PROGRESS toward resolving outstanding bugs can be seen at a glance
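The workload view above is just a tally of outstanding bugs by the group currently holding them. A minimal sketch, assuming a flat list of bug records with a DEV / QA / Resolved state field (the record shape is hypothetical):

```python
from collections import Counter

# Illustrative bug records; "state" is the group currently holding the bug.
bugs = [
    {"id": 101, "state": "DEV"}, {"id": 102, "state": "QA"},
    {"id": 103, "state": "DEV"}, {"id": 104, "state": "Resolved"},
    {"id": 105, "state": "QA"},
]

# Tally outstanding bugs per group, as in the workload-distribution chart.
workload = Counter(b["state"] for b in bugs)
print(dict(workload))  # prints "{'DEV': 2, 'QA': 2, 'Resolved': 1}"
```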

  21. Bug Injection/Removal Points

  22. Added Values of Bug Injection & Removal Points • BASIS for estimating the number of bugs to be found at each phase of the Software Development Lifecycle • IDENTIFY work cycles that would benefit from process improvements and inspection points • VALIDATE that improvements work • DEMONSTRATE that quality must be the goal of all team members, not just the responsibility of the test organization “Quality is never an accident; it is always the result of intelligent effort.” * *John Ruskin, English writer and critic of art, architecture and society, 1819 - 1900
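The injection/removal analysis boils down to a cross-tabulation: for each bug, the phase where it was created against the phase where it was found. A sketch under assumed data (the phase names and counts are invented for illustration):

```python
# Each tuple is (phase where the bug was injected, phase where it was removed).
bugs = [
    ("requirements", "system_test"),
    ("design", "code_review"),
    ("coding", "unit_test"),
    ("coding", "system_test"),
    ("requirements", "production"),
]

# Build the injection-vs-removal matrix: matrix[injected][removed] = count.
matrix = {}
for injected, removed in bugs:
    matrix.setdefault(injected, {}).setdefault(removed, 0)
    matrix[injected][removed] += 1

for injected, removals in matrix.items():
    print(injected, removals)
```

Cells far below the diagonal (e.g. requirements bugs removed in production) mark the work cycles that would benefit most from earlier inspection points.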

  23. Measure Quality on the Way In; Not Just on the Way Out • PEOPLE do not intentionally make a bad plan, but they may not be able to quickly adjust the plan to current circumstances and abate risks • PEOPLE do look for paths of least resistance and, if it is easy to blame others, they will • MEASURING quality once the product is in production tells the tale of what was not trapped and fixed, but not the tale of quality before production • QUALITY cannot be tested into a product, but you can measure the quality coming into your test organization • FOCUS on the level of quality at the project level
