
Defect Removal Effectiveness Model


Presentation Transcript


  1. Defect Removal Effectiveness Model
  • Product quality is based on many items:
    • People
    • Training
    • Tools
    • Activities
    • Process
  • The process should include:
    • Defect prevention activities
    • Defect removal activities

  2. Defect Prevention & Removal • Defect Prevention Activities • Reuse of Proven, Existing Material • “Defensive” Software Activities: • Documented Requirements and Tracking • Business Process and Usage • User Background • Constraints • Documented Design and Tracking • Data Flow • Functions • Inter-related Structures • Error Conditions • Commented Source • Context Sensitive Help • Pre and Post Condition Statements in Source Code • Use Available Programming Constructs • Switch vs multiple ifs • Functions vs Multi-Parameter Subroutine Calls • Configuration and Change Management

  3. Defect Prevention & Removal • Defect Removal • Pre-coding • Reviews and Inspections • Coding • Reviews and Inspections • Unit Test • Post-coding • Functional Test • Component Test • System and Regression Test • Post-release • Customer Support and fixes

  4. Reviews and Inspections
  • Mostly applied to code and pre-code materials
  • Involve one or more people other than the author
  • Require a certain amount of preparation
  • Examine the material for completeness and correctness:
    • Compare against the material from the previous activity(s)
    • Analyze the correctness of the result produced by the activity
  • Focused on discovering defects
  • The activity is static, i.e., non-execution oriented
  • Require recording of the problems found
  • Require follow-up on the fixes of the problems

  5. Testing
  • Applied to machine-executable material:
    • Code
    • On-line help
    • Messages and information boxes
    • User choices and defaults
  • Performed by the author(s) and, mostly, by others
  • Has several major steps:
    • Test planning and preparation
    • Development of test scenarios and test cases
    • Running the tests
    • Recording the problems found and managing the fixes
    • Analysis of the test results

  6. More on Testing
  • Three sets of material are involved:
    • Requirements & design specs
    • Executable code, help, messages, etc.
    • Test scenarios & test cases
  • How do these three sets interact and relate?
    • Size: the amount of material in each set
    • Overlaps: coverage of the specs by the executables, and of both by the tests

  7. White Box and Black Box Testing
  • Black box testing: functionally oriented; tests are derived from the specs, without looking at the inside of the actual executables
  • White box testing: coverage oriented; tests are derived after looking at the inside of the actual executables
  • In either case the executables may cover more than the specs, or less than the specs (see the sketch below)
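As a concrete illustration of the two styles (not from the slides): a hypothetical `clamp` function serves as the executable under test, with black-box assertions derived from the spec alone and white-box assertions chosen after reading the code so that every branch is executed.

```python
def clamp(value: int, low: int, high: int) -> int:
    """Spec: return value limited to the range [low, high]."""
    if value < low:
        return low
    if value > high:
        return high
    return value

# Black box: derived from the spec alone, without reading the code.
assert clamp(5, 0, 10) == 5
assert clamp(-3, 0, 10) == 0
assert clamp(99, 0, 10) == 10

# White box: derived after reading the code, aiming to execute every branch
# (value < low, value > high, and the fall-through return).
assert clamp(-1, 0, 10) == 0     # first branch
assert clamp(11, 0, 10) == 10    # second branch
assert clamp(3, 0, 10) == 3      # fall-through
```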

  8. The State We Would Like to Get To
  • Requirements & design specs, executable code (help, messages, etc.), and test scenarios & test cases in total overlap, or as close to it as possible.

  9. A Test Case Example
  • Test Case #:
  • Purpose:
  • Any Pre-Condition:
  • Input(s):
  • Expected Output(s):
  • Any Post-Condition:
  • Test Results:
    • Test Date
    • Test Person
    • Actual Result
    • Problem Description
    • Fix Status
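One way the template above could be captured as a record; a minimal sketch assuming Python dataclasses, where the field names simply mirror the slide and are not part of any standard.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestResult:
    # One execution of the test case (the "Test Results" block of the slide).
    test_date: str
    test_person: str
    actual_result: str
    problem_description: Optional[str] = None  # filled in only when a defect is found
    fix_status: Optional[str] = None           # e.g. "open", "fixed", "verified"

@dataclass
class TestCase:
    # Static definition of the test case (the upper part of the slide).
    case_id: str
    purpose: str
    inputs: List[str]
    expected_outputs: List[str]
    pre_condition: Optional[str] = None
    post_condition: Optional[str] = None
    results: List[TestResult] = field(default_factory=list)
```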

  10. Defect Removal Activity Model
  • Each defect removal activity can be described by four quantities:
    • Number of defects upon entering the activity
    • Number of errors introduced in the activity
    • Number of defects found and removed in the activity
    • Number of defects upon exiting the activity
  • How may we want to represent defect removal effectiveness (DRE)?
    • DRE = (# defects found & removed) / ((# defects on entry) + (# defects introduced))  (see the sketch below)
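A minimal sketch of the DRE formula above as a plain Python function; the parameter names are illustrative, not from the slides.

```python
def defect_removal_effectiveness(found_and_removed: int,
                                 on_entry: int,
                                 introduced: int) -> float:
    """DRE = defects found & removed / (defects on entry + defects introduced)."""
    total_present = on_entry + introduced
    if total_present == 0:
        raise ValueError("no defects were present in this activity")
    return found_and_removed / total_present

# Example: 30 defects carried in, 20 introduced, 35 found and removed -> 0.7
print(defect_removal_effectiveness(35, on_entry=30, introduced=20))
```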

  11. A Complete Set of Defect Removal Activities
  • Rows are defect removal activities; columns are the software activities in which the removed errors were introduced:

                     Req Gath   Hi-lev Design   Lo-lev Design   Code/Unit T   System Test   Total
    Req Insp         n(1,1)                                                                 n(1,.)
    Des Insp         n(2,1)     n(2,2)                                                      n(2,.)
    ...
    C Test           n(i,1)     ...             ...             ...                         n(i,.)
    Sys Test         n(j,1)     ...             ...             ...           n(j,k)        n(j,.)
    Total            n(.,1)     n(.,2)          n(.,3)          n(.,4)        n(.,5)        N

  • where n(i,j) is the number of errors removed by removal activity i that were introduced in software activity j

  12. Defect Removal Effectiveness Metrics
  • The effectiveness of defect removal activity i, DRA E(i), may be:
    • 1) E(i) = n(i,.) / ( (Sum of n(.,j) for j <= i) - (Sum of n(m,.) for m <= i-1) )
      i.e., the defects removed by activity i, divided by the defects still present when it runs (everything introduced up to and including activity i, minus what earlier removal activities already took out)
    • 2) E(i) = n(i,.) / N
  • Note that customer-found problems and “unfound” errors are not included in this metrics discussion. (A code sketch follows below.)
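A sketch of metrics 1) and 2) computed from an n(i,j) matrix like the one on slide 11. It assumes a rectangular matrix in which removal activity i is aligned with software activity i, as the triangular layout of the table suggests; indices are 0-based here.

```python
def removal_effectiveness(n, i):
    """Metric 1): defects removed by removal activity i, divided by the
    defects still present when it runs.

    n[i][j] = number of errors removed by removal activity i that were
    introduced in software activity j (rectangular matrix, 0-based).
    """
    removed_by_i = sum(n[i])                                    # n(i,.)
    introduced_so_far = sum(n[r][j]                             # Sum of n(.,j), j <= i
                            for r in range(len(n))
                            for j in range(i + 1))
    removed_earlier = sum(sum(n[m]) for m in range(i))          # Sum of n(m,.), m <= i-1
    return removed_by_i / (introduced_so_far - removed_earlier)


def removal_share(n, i):
    """Metric 2): share of all removed defects that removal activity i found."""
    grand_total = sum(sum(row) for row in n)                    # N
    return sum(n[i]) / grand_total
```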

  13. An Example with Numbers

                     Req Gath   Hi-lev Design   Lo-lev Design   Code/Unit T   System Test   Total
    Req Insp         45                                                                     45
    Des Insp         12         31                                                          43
    ...
    C Test           24         4               ...             ...                        ...
    Sys Test         ...        ...             ...             ...           ...          ...
    Total            95         85              ...             ...           ...          340

  • where n(i,j) is the number of errors removed by removal activity i that were introduced in software activity j (cells not shown on the slide are left as ...)

  14. Numerical Example
  • High-Level Design Inspection (Des Insp) effectiveness metrics:
    • 1) 43 / ((95 + 85) - 45) = 43/135 ≈ .32
    • 2) 43 / 340 ≈ .126
  • (Remember: not all potential defects are accounted for.)
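The same arithmetic as a quick plain-Python check of the two metrics; the full n(i,j) matrix is not shown on the slides, so the values are plugged in directly.

```python
# Des Insp removed 43 defects; 95 + 85 = 180 defects were introduced through
# high-level design, and Req Insp had already removed 45 of them.
metric_1 = 43 / ((95 + 85) - 45)   # ~ 0.319
metric_2 = 43 / 340                # ~ 0.126
print(round(metric_1, 2), round(metric_2, 3))   # 0.32 0.126
```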

  15. Is There Any Possibility of Projecting Field Problems?
  • Possibly, with 2 “big” assumptions
  • Definitions:
    • Let IP = problems found in all the inspections and reviews
    • Let TP = problems found in all the tests
    • Let FP = problems found by customers
    • Let UP = defects never found as problems
    • Let TTP = all the defects of the software
    • So TTP = IP + TP + FP + UP
  • Assumption 1: UP = 0
    • So TTP = IP + TP + FP
  • Assumption 2: effectiveness of inspections = effectiveness of tests
    • So IP/TTP = TP/(TTP - IP)
  • Derivation:
    • Or IP/TP = TTP/(TTP - IP)
    • Or IP/TP = (IP + TP + FP) / ((IP + TP + FP) - IP)
    • Or IP/TP = (IP + TP + FP) / (TP + FP)
    • Or IP(TP + FP) = TP(IP + TP + FP)
    • Or IP*TP + IP*FP = TP*IP + TP*TP + TP*FP
    • Or IP*FP = TP*TP + TP*FP
    • Or FP(IP - TP) = TP*TP
    • Or FP = (TP * TP) / (IP - TP)  **** a possible predictor of post-release quality!
  • But be careful when using this: note what happens when TP is greater than or equal to IP (the denominator becomes zero or negative). A code sketch follows below.
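A minimal sketch of the predictor as a function, valid only under the same two assumptions; the guard mirrors the slide's caution about TP >= IP, and the names are illustrative.

```python
def predicted_field_problems(ip: int, tp: int) -> float:
    """FP ~ TP^2 / (IP - TP), under the slide's two assumptions
    (no never-found defects; inspections as effective as tests)."""
    if tp >= ip:
        # Denominator would be zero or negative: the model breaks down when
        # tests find at least as many problems as inspections did.
        raise ValueError("predictor undefined when TP >= IP")
    return (tp * tp) / (ip - tp)

# Example: 180 problems found in inspections, 120 in tests
# -> predicted field problems = 120*120 / (180 - 120) = 240
print(predicted_field_problems(ip=180, tp=120))
```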

  16. More on the Predictor of FP
  • Consider a new term U = IP / TP, so IP = U*TP
  • From the previous slide, FP = (TP * TP) / (IP - TP)
  • Or FP = (TP * TP) / ((U*TP) - TP)
  • Or FP = (TP * TP) / (TP(U - 1))
  • Or FP = TP / (U - 1)
  • But remember the 2 assumptions!
  • Use this with great caution!
  • Note what happens when IP is less than or equal to TP, i.e., U <= 1: the predicted FP becomes negative or undefined. (A short check follows below.)
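The same example numbers expressed through U, as a quick consistency check (again only meaningful under the slide's two assumptions).

```python
# With IP = 180 and TP = 120, U = IP/TP = 1.5,
# so FP = TP/(U - 1) = 120/0.5 = 240, matching the
# TP^2/(IP - TP) form from the previous slide.
ip, tp = 180, 120
u = ip / tp
print(tp / (u - 1))   # 240.0
```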
