AsiaSTAR 2001 KyouGyou KeiKaku (An Industry Project)

Presentation Transcript

  1. AsiaSTAR 2001 KyouGyou KeiKaku (An Industry Project). Dr Clive Boughton, Thom Larner, Annette Vincent, Masayoshi Kuroda. INSPECTIONS

  2. DEFINITION. An inspection is a formal, rigorous, in-depth technical review intended to discover defects as soon as possible after they are introduced (Rakitin, 1997). Similar definitions appear in Gilb (1988) and Sommerville (1989).

  3. DEFINITION. An error is a problem that is discovered during the phase in which it originates. A defect is a problem that is discovered beyond the phase in which it originates. It is better to eliminate errors before they become defects.

  4. OBJECTIVES
  • Find defects as early as possible in the software development process.
  • Reach agreement on the extent of rework needed to redress defects.
  • Verify that completed rework meets predefined criteria.
  • Provide data on product quality and process effectiveness.
  • Increase the technical knowledge of software teams.
  • Improve the effectiveness of software validation.
  • Raise the standards of software engineering.
  • Discover properties of the software product and process that can be used to predict the level of product quality.

  5. PROCESS MATURITY & INSPECTIONS. [Diagram: an INSPECT step follows each lifecycle phase. Requirements Analysis produces the Software Requirements Specification (SRS); Software Design produces the Software Architecture Design Document (SADD) and Software Detailed Design Document (SDDD); Software Code & Unit Test produces compilable unit source code; Software Integration & Test produces compilable system source code and the compiled system. Each phase traces to the Test & Evaluation Plans, the Unit, System & Acceptance Test Specifications, and the final Unit and System Test Procedures & Records.]

  6. PROCESS MATURITY & INSPECTIONS: THE IMPOSSIBLE IDEAL. [Diagram: the same lifecycle, but every inspected output is perfect. The SRS has no errors; the SADD and SDDD have no errors and no SRS defects; the source code has no errors and no SADD, SDDD or SRS defects; the compiled system has no errors and no source code, SADD, SDDD or SRS defects.]

  7. PROCESS MATURITY & INSPECTIONS: THE REALISTIC. [Diagram: the same lifecycle with achievable outcomes. The SRS has few errors; the SADD and SDDD have few errors and fewer SRS defects; the source code has few errors and fewer SADD, SDDD and SRS defects; the compiled system has few errors and fewer source code, SADD, SDDD and SRS defects.]

  8. PROCESS IMMATURITY & INSPECTIONS: THE WORST. With no SRS, SADD or SDDD, the source code carries all errors, including requirements defects; the compiled system carries all errors, including source code and requirements defects; and only poor unit and system test procedures and records exist. This raises the questions:
  • Do developers know the requirements?
  • What does the code represent?
  • What is being tested? What is being inspected?
  • What are the errors? What are the defects?
  • How is the system to be maintained?
  • Why did this happen?

  9. INSPECTION REALISM (Fagan, Gilb)
  • 60% of defects exist before coding commences: they are introduced easily but removed with difficulty.
  • Inspection is about 80% effective in removing existing defects.
  • Testing is at most 50%-55% effective in identifying and removing defects for a single test process.
  • Inspection is but one method of controlling quality; it needs to be used in conjunction with other static and dynamic techniques.

  10. BENEFITS OF INSPECTIONS (Gilb)
  • Earlier delivery (25%-35%).
  • Reduced development costs (25%-35%).
  • Reduced maintenance costs (10-30 times less).
  • Much improved quality (10-100 times fewer defects).
  • Increased productivity (evident from the figures above).
  • Reduced testing and machine time (85% improvement).

  11. INSPECTION CHECKLISTS. Example Source Code Inspection Checklist (see Rakitin); language dependent.
  • Has the design been implemented completely and correctly?
  • Are there missing or extraneous functions?
  • Is each loop executed the correct number of times?
  • Will each loop terminate?
  • Are all possible loop exits correct?
  • Will the program terminate without the need to abort?
  • Are all CASE statements evaluated as expected?
  • Is there any unreachable code?
  • Are there any off-by-one iteration errors?
  • Are there any dangling else clauses?
  • Is pointer addressing used appropriately and correctly?
  • Are pointer parameters used as values and vice versa?
  • Are boundary conditions considered (low-level languages)?

  12. INSPECTION CHECKLISTS. Example Source Code Inspection Checklist cont’d.
  • Does the number of input parameters match the number of arguments?
  • Do parameter and argument attributes match?
  • Do the units of parameters and arguments match?
  • Are any input arguments altered?
  • Are global variable definitions consistent across modules?
  • Are any constants passed as arguments?
  • Are any functions called from which there is no return?
  • Are returned VOID values used?
  • Are all interfaces used correctly, as defined in the SADD/SDDD?
  • ... and more!
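Two of the checklist items above, off-by-one iteration errors and dangling else clauses, can be illustrated with a small hypothetical Java fragment (class and method names are invented for illustration, not from any inspected project):

```java
// Hypothetical illustrations of two checklist items: an off-by-one loop
// bound and a dangling else. Names are illustrative only.
public class ChecklistExamples {

    // Defective version (commented out): "<=" executes the loop
    // a.length + 1 times and reads past the end of the array.
    // static int sumDefective(int[] a) {
    //     int s = 0;
    //     for (int i = 0; i <= a.length; i++) s += a[i]; // off-by-one
    //     return s;
    // }

    // Corrected: the loop executes exactly a.length times.
    static int sum(int[] a) {
        int s = 0;
        for (int i = 0; i < a.length; i++) s += a[i];
        return s;
    }

    // Without braces, an else binds to the NEAREST if, so "negative"
    // would wrongly depend on y rather than x. Braces make the intended
    // pairing explicit.
    static String classify(int x, int y) {
        if (x >= 0) {
            if (y > 10) return "big";
        } else {
            return "negative";
        }
        return "small";
    }

    public static void main(String[] args) {
        System.out.println(sum(new int[] {1, 2, 3})); // 6
        System.out.println(classify(-1, 0));          // negative
    }
}
```

An inspector working through the checklist reads each loop bound and each if/else pairing against the intent stated in the design; the commented-out defective loop is exactly the kind of error a compiler accepts but an inspection catches.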

  13. JAVA CODE INSPECTIONS (Consortium). The deliverable was compilable system source code containing all errors and defects, with no SRS, SADD or SDDD, no test descriptions and no test results. Consequences:
  • We did not know the requirements!
  • We did not know the design!
  • We did not know the test results!
  • A traditional inspection was inappropriate.
  • It was not difficult to find simple coding errors, but it was difficult to find requirements defects.
  WHAT TO DO?

  14. ANATOMY OF Object-Oriented S/W SYSTEMS. Java packages typically contain many inter-related classes, and the main relationship between packages is dependency (use). [Diagram: package A depends on B, and B depends on A (a questionable cycle); B also depends on packages C and D.]

  15. ANATOMY OF Object-Oriented S/W SYSTEMS. An inter-package dependency represents one or more class relationships across the inter-package boundary. [Diagram: packages A, B, C and D with dependencies between them.]

  16. ANATOMY OF Object-Oriented S/W SYSTEMS. Software architecture may be complex when one package depends on many other packages and the number of inter-package class relationships is also high. HIGH BANDWIDTH IS AN INDICATOR OF POOR STRUCTURE (from work done in Ada).

  17. ANATOMY OF Object-Oriented S/W SYSTEMS. PACKAGES with HIGH BANDWIDTH
  • are (usually) internally complex
  • are typically difficult to change/maintain
  • typically do not match the design
  • are difficult to test adequately
  • contain many undiscovered errors/defects
  • often express a problem with the design

  18. ANATOMY OF Object-Oriented S/W SYSTEMS. Intra-package inter-class relationships in Java systems represent a more detailed view of a package. These inter-class relationships are also dependencies. [Diagram: classes A-H within one package, connected by USES and INHERITS relationships.]

  19. ANATOMY OF Object-Oriented S/W SYSTEMS. When a Java class depends on too many other classes, that is a clear indicator of class structural complexity. Dependency also includes inheritance, but the single-inheritance property of Java reduces the possibility of complex inheritance hierarchies. [Diagram: class A uses classes B, C, D, E, ...]

  20. ANATOMY OF Object-Oriented S/W SYSTEMS. CLASSES with MANY RELATIONSHIPS
  • are (usually) internally complex
  • can be difficult to change/maintain
  • typically do not match the design
  • are (usually) difficult to test adequately
  • can contain many undiscovered errors/defects
  • often express a problem with the detailed design
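As a sketch of how such classes might be flagged, the following hypothetical helper (an assumption for illustration, not part of any tooling described in this project) records class-to-class "uses" relationships and lists the classes whose dependency count reaches a chosen threshold:

```java
import java.util.*;

// A minimal sketch: record class-to-class "uses" dependencies and flag
// classes whose fan-out reaches a threshold. Class names A-D are invented.
public class DependencyCheck {
    private final Map<String, Set<String>> uses = new HashMap<>();

    void addUse(String from, String to) {
        // TreeSet de-duplicates: several relationships to the same class
        // still count as one dependency.
        uses.computeIfAbsent(from, k -> new TreeSet<>()).add(to);
    }

    // Classes depending on 'limit' or more other classes are
    // candidates for close inspection.
    List<String> candidates(int limit) {
        List<String> out = new ArrayList<>();
        for (var e : uses.entrySet())
            if (e.getValue().size() >= limit) out.add(e.getKey());
        Collections.sort(out);
        return out;
    }

    public static void main(String[] args) {
        DependencyCheck dc = new DependencyCheck();
        dc.addUse("A", "B"); dc.addUse("A", "C"); dc.addUse("A", "D");
        dc.addUse("B", "A");                  // a two-way (cyclic) dependency
        System.out.println(dc.candidates(3)); // [A]
    }
}
```

The same structure, applied to packages instead of classes, would flag the high-bandwidth packages of slide 17.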

  21. ANATOMY OF Object-Oriented S/W SYSTEMS. Inter-class relationships in Java systems are exemplified as method calls; intra-class method calls are also possible. [Diagram: class A making inter-class method calls to classes B, C, D, E, ... as well as intra-class method calls.]

  22. ANATOMY OF Object-Oriented S/W SYSTEMS. When a Java class method depends on too many other methods, that is a clear indicator of class-method procedural complexity. [Diagram: method A fans out to methods B, C, D, E, ...; the internal procedure of a method may itself be complex.]

  23. ANATOMY OF Object-Oriented S/W SYSTEMS. METHODS with HIGH FANOUT
  • are (usually) internally complex
  • can be difficult to change/maintain
  • typically do not match the design well
  • are (usually) difficult to test adequately
  • can contain many undiscovered errors/defects
  • usually indicate a problem with the detailed design

  24. ANATOMY OF Object-Oriented S/W SYSTEMS. The number of parameters indicates the level of data complexity. The number of decisions within the procedure of a method is a measure of procedural complexity. The number of calls a method makes is a measure of structural complexity. Research has established that there is usually a direct relationship between either data complexity and procedural complexity, or structural complexity and procedural complexity.
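The decision-count measure can be approximated crudely as below. This sketch is an assumption for illustration, not the measurement tool the project actually used: it counts branching keywords in a method body given as a string, whereas a real inspection would parse the source properly.

```java
// Crude sketch of a decision count (procedural complexity): count
// branching keywords in a method body. Illustrative only; a real tool
// would use a parser rather than a keyword scan.
public class DecisionCount {
    static final String[] KEYWORDS = {"if", "for", "while", "case", "catch"};

    static int decisions(String body) {
        int n = 0;
        // Split on non-word characters so an identifier like "iffy"
        // does not match the keyword "if".
        for (String tok : body.split("\\W+"))
            for (String kw : KEYWORDS)
                if (tok.equals(kw)) n++;
        return n;
    }

    public static void main(String[] args) {
        String body = "if (x > 0) { for (int i = 0; i < x; i++) { if (ok(i)) use(i); } }";
        System.out.println(decisions(body)); // 3
    }
}
```

Combined with a parameter count (data complexity) and a call count (structural complexity), this gives the three per-method figures the slides relate to defect levels.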

  25. THE NATURE OF JAVA SYSTEMS. Is there a relationship between the number of DEFECTS and
  • data complexity?
  • procedural complexity?
  • structural complexity?
  The hand inspection of Java code systems done by Software Improvements was aimed at determining such relationships.

  26. PASS RULES. Program Analysis and Style Systems, from the SQI web-site (under Project Tools).

  27. PASS Rules. Program Analysis and Style Systems, from the SQI web-site (under Project Tools).

  28. HEURISTICS

  29. OUR INSPECTIONS
  Data gathered during high-level inspections:
  • Package name
  • Class name
  • File name
  • Libraries imported
  • Super classes extended
  • Interfaces implemented
  • PASS Rule & Heuristic violations
  • File length (lines)
  • Lines of code (non-comment, non-blank)
  • Total lines of comments
  • Actual lines of comments
  Data gathered during detailed inspections: all of the above, plus
  • Number of variables (including modifiers)
  • Methods defined
  • Method length (lines)
  • Method lines of code (non-comment, non-blank)
  • Method comments
  • Method actual comments
  • Number of parameters to each method
  • Number of decisions made in each method
  • Number of variables in each method
  • Methods called by each method
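The per-method fields could be held in a structure like the hypothetical Java record below (the field names are illustrative, not the project's actual schema); it also shows how a derived figure such as comment density falls out of the raw counts:

```java
// Hypothetical container for per-method data gathered during a detailed
// inspection. Field names are illustrative, not the project's schema.
public class MethodMetrics {
    record Metrics(String className, String method,
                   int lines, int linesOfCode, int comments,
                   int parameters, int decisions, int variables, int callsMade) {

        // Derived figure: actual comment lines per non-blank, non-comment
        // line of code.
        double commentDensity() {
            return linesOfCode == 0 ? 0.0 : (double) comments / linesOfCode;
        }
    }

    public static void main(String[] args) {
        Metrics m = new Metrics("Parser", "parseExpr", 60, 45, 9, 3, 12, 7, 11);
        System.out.println(m.commentDensity()); // 0.2
    }
}
```

A table of such records per project is what the sample-data slides below summarise (fan-out distributions, decision counts, and so on).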

  30. HIGH-LEVEL INSPECTION SAMPLE DATA. Table of statistics gathered during high-level inspections.

  31. HIGH-LEVEL INSPECTION SAMPLE DATA. Table of PASS Rule violations gathered during high-level inspections. * Figures based on more files than shown here.

  32. SAMPLE DETAILED INSPECTION DATA. Example project statistics derived from detailed inspections.

  33. SAMPLE DETAILED INSPECTION DATA. Example of detailed class data from the project database.

  34. SAMPLE DETAILED INSPECTION DATA. Example fan-out data derived from the detailed project database. Methods that call 10 or more other methods are candidates for close inspection.

  35. SAMPLE DETAILED INSPECTION DATA. Decision point data derived from the detailed project database. Methods that contain 10 or more decisions are candidates for close inspection.

  36. SAMPLE DETAILED INSPECTION DATA. Decision count data derived from the detailed project database. INTERESTING!!

  37. SAMPLE DETAILED INSPECTION DATA. Decision count vs Method Size data derived from the detailed project database.

  38. SAMPLE DETAILED INSPECTION DATA. Fanout vs Parameters derived from the detailed project database.

  39. SAMPLE DETAILED INSPECTION DATA. Length vs Parameters derived from the detailed project database.

  40. TOTAL RULE VIOLATIONS vs CLASS DEPENDENCIES

  41. Number of Parameters vs Number of Rule Violations (params : rule violations).
  Project 9: 2.4:1, 2.3:2, 2.7:3, 3.6:4, 5.9:5.
  Project 3: 2:2.
  Project 5: 1.3:1, 3.0:2, 2.3:3, 4.2:4.
  Project 7: 2.6:1, 1.8:2, 1.5:3, 1.3:4.

  42. Fanout vs Number of Rule Violations (fanout : rule violations).
  Project 9: 23:1, 21:2, 32:3, 41:4, 53:5.
  Project 3: 30:2, 24:4.
  Project 5: 21:1, 22:2, 26:3, 29:4.
  Project 7: 23:1, 67:2, 93:3, 91:4.

  43. INSPECTION RATE DATA

  44. SAMPLE DETAILED INSPECTION DATA. Decision point data derived from detailed project database P0009. Methods that contain 10 or more decisions are candidates for close inspection.

  45. SAMPLE DETAILED INSPECTION DATA. Decision count data derived from detailed project database P0009. INTERESTING!!

  46. SAMPLE DETAILED INSPECTION DATA. Length vs Fanout derived from detailed project database P0009.

  47. REFERENCES
  • Software Verification and Validation: A Practitioner’s Guide. Steven Rakitin (Artech House, 1997).
  • Static Inspection: Tapping the Wheels of Software. Les Hatton (IEEE Software, 1995).
  • Principles of Software Engineering Management. Tom Gilb (Addison-Wesley, 1988).
  • Handbook of Walkthroughs, Inspections and Technical Reviews. Daniel Freedman and Gerald Weinberg (Dorset House, 1990).
  • Measuring Software Design Quality. D.N. Card and R.L. Glass (Prentice-Hall, 1990).
  • Software Engineering. Ian Sommerville (Addison-Wesley, 1989).