
Comprehensive Software Quality Assurance Guide

This guide covers requirements engineering, project management, software design, QA measures, fault prevention, testing phases and approaches, and constructive vs. analytic QA. It explains QA principles and test methods, including black-box testing, white-box testing, fault identification, and fault prevention strategies.



  1. Contents • Introduction • Requirements Engineering • Project Management • Software Design • Detailed Design and Coding • Quality Assurance • Maintenance

  2. Quality Assurance • Introduction • Testing • Testing Phases and Approaches • Black-box Testing • White-box Testing • System Testing

  3. What is Quality Assurance? QA is the combination of planned and systematic activities to ensure the fulfillment of predefined quality standards. • Constructive vs. analytic approaches to QA • Qualitative vs. quantitative quality standards • Measurement • Derive qualitative factors from measurable quantitative factors • Software metrics

  4. Approaches to QA • Constructive approaches: usage of methods, languages, and tools that ensure the fulfillment of some quality factors • Syntax-directed editors • Type systems • Transformational programming • Coding guidelines • ... • Analytic approaches: usage of methods, languages, and tools to observe the current quality level • Inspections • Static analysis tools (e.g. lint) • Testing • ...

  5. Fault vs. Failure • A human error can lead to a fault, which can lead to a failure • Different types of faults • Different identification techniques • Different testing techniques • Fault prevention and detection strategies should be based on the expected fault profile

  6. Fault Classification (HP) • Fault origin: WHERE? • Specification/requirements • Design • Code • Environment/support • Documentation • Other • Fault type: WHAT? • Specification/requirements: functionality, HW interface, SW interface, user interface, functional description • Design: (inter-)process communications, data definition, module design, logic description, error checking, standards • Code: logic, computation, data handling, module interface/implementation, standards • Environment/support: test HW, test SW, integration SW, development tools • Fault mode: WHY? • Missing, unclear, wrong, changed, better way

  7. Fault Profile of an HP Division • See [Pfleeger 98].

  8. Testing Phases • Pre-implementation test: based on design specifications, system functional requirements, other software requirements, the customer requirements specification, and the user environment • Unit test: component code → tested component (for each component) • Integration test: tested components → integrated modules • Function test: integrated modules → functioning system • Performance test: functioning system → verified software • Acceptance test: verified software → accepted, validated system • Installation test: accepted system installed in the user environment → system in use • Function, performance, acceptance, and installation tests together constitute the system test

  9. Pre-Implementation Testing • Inspections • Evaluation of documents and code prior to technical review or testing • Walkthroughs • In teams • Examine source code/detailed design • Reviews • More informal • Often done by document owners • Advantages • Effective • High learning effect • Distributes system knowledge • Disadvantages • Expensive

  10. Integration Testing • How to integrate and test the system (illustrated on a module hierarchy with modules A–G) • Top-down • Bottom-up • Critical units first • Functionality-oriented (threads) • Implications of build order • Top-down => stubs; tests the top level more thoroughly • Bottom-up => drivers; tests the bottom level more thoroughly • Critical first => stubs & drivers • Example bottom-up order: Test E; Test F; Test G; Test C; Test B,E,F; Test D,G; Test all
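The role of a stub can be sketched in Python. This is a minimal illustration, not from the slides: the class names `A` and `StubB` are hypothetical stand-ins for a top-level module and an unfinished lower-level module in the hierarchy.

```python
# Top-down integration sketch: component A calls component B, but B is
# not yet implemented, so a stub with canned answers stands in for it.

class StubB:
    """Stub for the unfinished component B: returns a fixed response."""
    def compute(self, x):
        return 42  # canned answer, no real logic

class A:
    """Top-level component under test; depends on a B-like object."""
    def __init__(self, b):
        self.b = b

    def run(self, x):
        return self.b.compute(x) + 1

# Driver-style check: exercise A against the stub.
result = A(StubB()).run(10)
print(result)
```

Once the real B is available, it replaces the stub in the next integration step without changing A.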

  11. Testing vs. Proving Correctness • Verification • Check the design/code against the requirements • Are we building the product right? • Validation • Check the product against the expectations of the customer • Are we building the right product? • Testing • Testing can neither prove that a program is error-free nor that it is correct Testing is the process in which a (probably unfinished) program is executed with the goal of finding errors. [Myers 76] Testing can only show the presence of errors, never their absence. [Dijkstra 69]

  12. Testing Principles • Construction of test suites • Plan tests under the assumption that errors will be found • Try typical and atypical inputs • Build classes of inputs and choose representatives of each class • Carrying out tests • Testers ≠ implementers • Define the expected results before running a test • Check for superfluous computation • Check test results thoroughly • Document tests thoroughly • Simplifying tests • Divide programs into separately testable units • Develop programs test-friendly • Each test must be reproducible
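A minimal Python sketch of these principles, using a hypothetical integer square root as the unit under test: the expected results are fixed before execution, the inputs mix typical and borderline values, and the run is fully reproducible.

```python
import math

def isqrt(n):
    """Unit under test (hypothetical example): integer square root."""
    if n < 0:
        raise ValueError("negative input")
    return math.isqrt(n)

# Expected results defined *before* running the test, per the principles.
cases = [
    (0, 0),    # borderline: smallest legal input
    (1, 1),    # typical
    (15, 3),   # typical: non-square, result rounds down
    (16, 4),   # borderline: exact square
]

for inp, expected in cases:
    assert isqrt(inp) == expected, f"isqrt({inp}) != {expected}"
print("all cases passed")
```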

  13. Test Methods • Structural testing (white-box, glass-box) • Uses code/detailed design to develop test cases • Typically used in unit testing • Approaches: • Coverage-based testing • Symbolic execution • Data flow analysis • ... • Functional testing (black-box) • Uses function specifications to develop test cases • Typically used in system testing • Approaches: • Equivalence partitioning • Border case analysis • ... • Over time: develop black-box test cases, develop white-box test cases, perform white-box testing, perform black-box testing

  14. Test Preparation • Exhaustive testing is impractical because of the combinatorial explosion of test cases • Example: for i := 1 to 100 do if a = b then X else Y; • With loop bound i there are 2^i paths: i = 1: X, Y (2 tests); i = 2: XX, XY, YX, YY (4 tests); i = 3: XXX, XXY, ... (8 tests); ...; i = 100: 2^100 tests • In total 2·2^100 − 2 > 2.5·10^30 paths; with 10^6 tests/sec this would take about 8·10^16 years • Therefore: choose representative test data (test cases)
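The slide's arithmetic can be checked directly. The loop bound (100), the branch factor (2), and the test rate (10^6 tests/sec) are taken from the slide; everything else is straightforward computation.

```python
# Paths for loop bounds i = 1..100, each iteration a two-way branch:
# sum of 2**i, which telescopes to 2**101 - 2 = 2*2**100 - 2.
total_paths = sum(2**i for i in range(1, 101))
assert total_paths == 2 * 2**100 - 2

tests_per_sec = 10**6
seconds = total_paths / tests_per_sec
years = seconds / (365 * 24 * 3600)

# total_paths is about 2.5e30; years is about 8e16, as the slide states.
print(f"{total_paths:.2e} paths, roughly {years:.1e} years at 1e6 tests/s")
```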

  15. Test Case Development • Problems: • A systematic way to develop test cases • Finding a satisfying set of test cases • Test case ≠ test data • Test data: inputs devised to test the system • Test case: • The situation to test • Inputs to test this situation • Expected outputs • Tests must be reproducible • Techniques: equivalence partitioning, coverage-based testing
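The test case vs. test data distinction can be made concrete with a small record type. The `withdraw` function and its field names are hypothetical, invented only to show that a test case bundles the situation, the inputs, and the expected output.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    situation: str   # what is being tested
    inputs: dict     # the test data devised for this situation
    expected: object # result fixed before the test runs

def withdraw(balance, amount):
    """Hypothetical unit under test."""
    return "rejected" if amount > balance else "ok"

case = TestCase(
    situation="withdrawal larger than balance is rejected",
    inputs={"balance": 100, "amount": 150},
    expected="rejected",
)

# Reproducible: the same case yields the same verdict on every run.
assert withdraw(**case.inputs) == case.expected
print("case passed:", case.situation)
```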

  16. Equivalence Partitioning • Input and output data can be grouped into classes where all members of a class behave in a comparable way • Define the classes • Choose representatives • A typical element • Borderline cases • Some input classes cause anomalous behaviour; some output classes reveal the presence of faults • Example: valid inputs x ∈ [25..100] yield three classes • class 1: x < 25 • class 2: x >= 25 and x <= 100 • class 3: x > 100
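The slide's three classes for x ∈ [25..100] can be exercised with one typical representative plus borderline cases per class. The `classify` function is a hypothetical stand-in for the system; the class boundaries are taken from the slide.

```python
def classify(x):
    """Hypothetical system: partitions inputs per the slide's classes."""
    if x < 25:
        return "too small"   # class 1: x < 25
    if x <= 100:
        return "valid"       # class 2: 25 <= x <= 100
    return "too large"       # class 3: x > 100

# Representatives: a typical element and the borderline cases of each class.
cases = {
    "too small": [10, 24],        # typical; just below the 25 border
    "valid":     [25, 60, 100],   # both borders and a typical value
    "too large": [101, 500],      # just above the 100 border; typical
}

for expected, xs in cases.items():
    for x in xs:
        assert classify(x) == expected, (x, expected)
print("all representatives behave as their class predicts")
```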

  17. Statement Coverage • Every statement is executed at least once in some test • Example (control-flow graph with nodes 1–8): 2 test cases: 1-2-4-6-7; 1-3-5-6-7
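Statement coverage on code (rather than on the slide's graph) can be shown with a small hypothetical function: one negative input already executes every statement, because the `if` body is the only conditionally executed code.

```python
def absolute(x):
    """Hypothetical unit under test: absolute value."""
    if x < 0:
        x = -x      # executed only for negative inputs
    return x

# A single test case suffices for statement coverage: it enters the
# if-body and reaches the return, so every statement runs once.
assert absolute(-5) == 5
print("statement coverage reached with one test case")
```

Note that this one test never exercises the path where the condition is false, which is exactly the gap branch coverage closes.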

  18. Branch Coverage • For every decision point in the graph, each branch is taken at least once • Example (same graph, nodes 1–8): 2 test cases: 1-2-4-6-7; 1-3-5-8
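In code terms, branch coverage requires both outcomes of every decision. The `sign` function below is a hypothetical example: a single test case cannot achieve branch coverage, because the one decision must evaluate both true and false.

```python
def sign(x):
    """Hypothetical unit under test: sign classification."""
    if x >= 0:
        return "non-negative"
    return "negative"

# Two test cases are needed: one per outcome of the decision point.
assert sign(7) == "non-negative"   # true branch taken
assert sign(-7) == "negative"      # false branch taken
print("both branches of the decision exercised")
```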

  19.–20. Path Coverage • Ensure that every distinct path in the control-flow graph is executed at least once in some test • Example (same graph, nodes 1–8): 4 test cases: 1-2-4-6-7; 1-3-5-8; 1-2-4-8; 1-3-5-6-7

  21. Data Flow Coverage (All-uses Coverage) • Each definition of a variable must reach each of its uses on some tested path • Example (graph with assignments x:=1, x:=2, x:=3, y:=2, r:=4 and uses z:=x+y, z:=2*r, z:=2*x-y): • The red path covers the defs y:=2; r:=4; x:=1 • The blue path covers y:=2; x:=3 • x:=2 is not covered by either path

  22. Coverage-based Testing • Advantages • Systematic way to develop test cases • Measurable results (the coverage) • Extensive tool support • Flow graph generators • Test data generators • Bookkeeping • Documentation support • Disadvantages • Code must be available • Does not (yet) work well for data-driven programs

  23. Testing Tools / Support • Debuggers • Manual code instrumentation • Inspect/trace variables • ... • File comparators • E.g. for regression testing • Test-stub/-driver generators • Simulate client or server components which are not yet available
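File-comparator-based regression testing can be sketched with the standard library's `filecmp` module. The program under test, the file names, and the "golden" output are all hypothetical; the pattern is: run the program, then compare its output byte-for-byte against a previously approved file.

```python
import filecmp
import os
import tempfile

def run_program(out_path):
    """Hypothetical system under test: writes its report to a file."""
    with open(out_path, "w") as f:
        f.write("report: 3 items\n")

workdir = tempfile.mkdtemp()
golden = os.path.join(workdir, "expected.txt")   # approved prior output
actual = os.path.join(workdir, "actual.txt")     # output of this run

with open(golden, "w") as f:
    f.write("report: 3 items\n")

run_program(actual)

# shallow=False forces a content comparison, not just a stat() check.
same = filecmp.cmp(golden, actual, shallow=False)
print("regression test", "passed" if same else "FAILED")
```

Any unintended change to the program's output makes the comparison fail, flagging a regression without hand-written per-field assertions.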
