
CSDP Preparation Course Module V: Software Testing


Presentation Transcript


  1. CSDP Preparation Course Module V: Software Testing

  2. Specifications
  The exam-specific topics covered in this module are listed below and form the basis for the outline of its content.
  A. Types of Tests
  B. Test Levels
  C. Testing Strategies
  D. Test Design
  E. Test Coverage of Code
  F. Test Coverage of Specifications
  G. Test Execution
  H. Test Documentation
  I. Test Management

  3. Objectives
  After completing this module, you should be able to:
  • Determine the purpose of testing
  • Identify the types of tests
  • Identify the process and strategies for performing:
    -- Unit testing
    -- Integration testing
    -- System testing
    -- Acceptance testing
  • Discuss test planning and other test documentation

  4. Organization
  The organization of information for each specification topic is as follows:
  • Topic Content Slides - detail the important issues concerning each topic and support the module objectives
  • Topic Reference Slides - detail the sources for the topical content and provide additional references for review
  • Topic Quiz Slides - allow students to prepare for the exam

  5. Introduction
  • Definition of Testing [SW04]: an activity performed for evaluating product quality, and for improving it, by identifying defects and problems.
  • Software testing consists of the dynamic verification of the behavior of a program on a finite set of test cases, suitably selected from the usually infinite execution domain, against the expected behavior.
  • The right attitude towards quality is now considered to be one of prevention: it is obviously much better to prevent problems than to correct them.

  6. Introduction - 2
  • Software Testing is the process of executing a computer program for the purpose of finding errors.
  • Software Testing is not:
    -- A means of determining that errors are not present in the program
    -- A proof that the software performs as required
    -- A demonstration that all logic paths are executed
    -- The only means of verifying that the system meets the requirement specifications
    -- The sole means of demonstrating the quality of the software

  7. Introduction - 3
  The Purpose of Testing
  • Testing is a process of executing a program with the intent of finding errors
    -- A good test is one that has a high probability of finding an undiscovered error
    -- A good test uses a minimum number of tests to find a maximum number of errors
    -- A successful test is one that finds an undiscovered error
  • Testing cannot show the absence of defects; it can only show that software errors are present

  8. Introduction - 4
  Major Issues in Software Testing
  • Convincing stakeholders that thorough testing is vital to the success of a software project
  • Convincing project managers not to cancel or reduce testing when the project is running short on time or money
  • Convincing programmers that independent testing is worthwhile
  • Faulty software requirements (resulting in faulty functional testing)
  • Testing is time consuming and expensive (and tempting to reduce to save money and time)
  • Errors are often difficult to reproduce (particularly without proper test planning)
  • Fixing errors found during testing (should be fixed as a separate effort)
  • Lack of explicit test planning, test procedures, and test cases

  9. Introduction - 5
  Testing Principles
  • All functional tests should be traceable to software requirements
  • Tests should be planned at the beginning of the project (requirements phase)
  • The Pareto principle applies (80% of the errors will be found in 20% of the code)
  • Testing begins with unit tests and ends with system tests
  • Exhaustive testing is not possible
  • Testing is more effective when conducted by a third party

  10. Introduction - 6
  Software Testing Fundamentals
  • Testing-Related Terminology
  a) Definitions of testing-related terminology [SW04, pp5-2]
  b) Faults vs. failures [SW04, pp5-2]

  11. Introduction - 7
  • Key Issues
  a) Test selection criteria / test adequacy criteria (or stopping rules) [SW04, pp5-3]
  b) Testing effectiveness / objectives for testing [SW04, pp5-3]
  c) Testing for defect identification [SW04, pp5-3]
  d) The oracle problem [SW04, pp5-3]
  e) Theoretical and practical limitations of testing [SW04, pp5-3]
  f) The problem of infeasible paths [SW04, pp5-3]
  g) Testability [SW04, pp5-3]
  • Relationships of testing to other activities

  12. Introduction References
  • [SW04] Guide to the Software Engineering Body of Knowledge - Chapter 5

  13. Introduction References - 2
  LIST OF STANDARDS
  • (IEEE610.12-90) IEEE Std 610.12-1990 (R2002), IEEE Standard Glossary of Software Engineering Terminology, IEEE, 1990.
  • (IEEE829-98) IEEE Std 829-1998, IEEE Standard for Software Test Documentation, IEEE, 1998.
  • (IEEE982.1-88) IEEE Std 982.1-1988, IEEE Standard Dictionary of Measures to Produce Reliable Software, IEEE, 1988.
  • (IEEE1008-87) IEEE Std 1008-1987 (R2003), IEEE Standard for Software Unit Testing, IEEE, 1987.
  • (IEEE1044-93) IEEE Std 1044-1993 (R2002), IEEE Standard for the Classification of Software Anomalies, IEEE, 1993.
  • (IEEE1228-94) IEEE Std 1228-1994, IEEE Standard for Software Safety Plans, IEEE, 1994.
  • (IEEE12207.0-96) IEEE/EIA 12207.0-1996 // ISO/IEC 12207:1995, Industry Implementation of International Standard ISO/IEC 12207:1995, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.

  14. Introduction Quiz
  • Two terms often associated with testing are
  a) Faults and Failures
  b) Verification and Validation
  c) Alpha and Beta
  d) Functional and Structural

  15. A. Types of Tests
  • The types of tests are:
    -- Unit testing
    -- Integration testing
    -- System testing
    -- Acceptance testing

  16. A. Types of Tests - 2
  V Model (Testing Model) (diagram not reproduced in the transcript)

  17. A. Types of Tests - 3
  X Model (Testing Model) (diagram not reproduced in the transcript)

  18. A. Types of Tests - 4
  Unit Testing
  • Unit testing is the testing of the smallest component of the software system, i.e., a unit or module
  • Unit testing is normally done by the programmer (coder) who coded the module
  • Unit tests may or may not be part of the test plan
  • Unit testing normally involves white-box testing
  • Unit testing can be supplemented or replaced by inspections, peer reviews, and similar static verification techniques

  19. A. Types of Tests - 5
  • Unit testing normally includes (illustrated in the sketch below):
    -- Statement testing: Every statement in the module is executed at least once
    -- Branch testing: Every decision in the code is exercised on both its true and false outcomes at least once
    -- Loop testing: Every loop is repeated at least two times
    -- Path testing: Every distinct path through the code is executed at least once
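
As an illustration of these coverage goals, here is a minimal Python sketch: the function, its threshold, and the test names are invented for the example, but the three tests together exercise every statement, both outcomes of each decision, and the loop zero, one, and many times.

```python
import unittest


def classify_batch(values, limit):
    """Return how many values exceed `limit`, or -1 for an empty batch."""
    if not values:           # decision point 1
        return -1
    count = 0
    for v in values:         # loop
        if v > limit:        # decision point 2
            count += 1
    return count


class ClassifyBatchTest(unittest.TestCase):
    def test_empty_batch(self):
        # Covers the `not values` branch; the loop runs zero times.
        self.assertEqual(classify_batch([], 10), -1)

    def test_single_value_over_limit(self):
        # Loop runs once; the inner decision takes its true side.
        self.assertEqual(classify_batch([15], 10), 1)

    def test_multiple_values_mixed(self):
        # Loop runs several times; the inner decision takes both sides.
        self.assertEqual(classify_batch([5, 20, 7, 30], 10), 2)


if __name__ == "__main__":
    unittest.main()
```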

  20. A. Types of Tests - 6
  Verifying the Software Unit
  Types of verification techniques:
  • Testing: The process of executing a computer program for the purpose of finding errors
  • Code walkthroughs: A peer review for the purpose of finding errors
  • Code inspections: A peer review for the purpose of finding errors (more formal than walkthroughs)
  • Code reviews: A management/technical review of a software project for the purpose of assessing progress
  • Formal proof techniques: Involves proving, using mathematical arguments, that a program is consistent with its specifications

  21. A. Types of Tests - 7
  Integration Testing
  • Integration testing has two purposes:
    -- Finding errors in implementing the design
    -- Finding errors in interfacing between components
  • Tests are derived from the architectural design of the system
  Integration testing methods (a driver/stub sketch follows this slide):
  • Bottom-up testing: The bottom components of the system are tested first – requires component drivers
  • Top-down testing: The top components of the system are tested first – requires component stubs
  • Big-bang testing: All components are tested at once – also called a “smoke test”
  • Sandwich testing: Combines a top-down strategy with a bottom-up strategy – requires the selection of a target area
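
To make the stub idea concrete, here is a small hypothetical Python sketch of top-down integration testing: a high-level component is exercised while its not-yet-integrated lower-level dependency is replaced by a stub. All class and method names are invented for illustration.

```python
import unittest
from unittest import mock


class OrderService:
    """Top-level component under test; depends on a lower-level pricing component."""

    def __init__(self, pricing):
        self.pricing = pricing

    def total(self, items):
        # Delegates per-item pricing to the lower-level component.
        return sum(self.pricing.price_of(item) for item in items)


class TopDownIntegrationTest(unittest.TestCase):
    def test_total_with_pricing_stub(self):
        # The real pricing component is not integrated yet, so a stub
        # stands in for it and returns canned prices.
        pricing_stub = mock.Mock()
        pricing_stub.price_of.side_effect = lambda item: {"a": 2.0, "b": 3.5}[item]

        service = OrderService(pricing_stub)

        self.assertEqual(service.total(["a", "b", "a"]), 7.5)
        # The interface between the components is also checked:
        self.assertEqual(pricing_stub.price_of.call_count, 3)


if __name__ == "__main__":
    unittest.main()
```

A bottom-up strategy would invert this picture: the low-level component would be real, and a small driver (here, the test itself) would exercise it directly.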

  22. A. Types of Tests - 8
  System Testing
  • Functional testing: Do the system’s functional requirements perform as specified in the requirements specifications?
  • Performance testing: Do the system’s nonfunctional requirements perform as specified in the requirements specifications? (Nonfunctional requirements are: performance, external interfaces, design constraints, and quality attributes)
  • Stress testing: The use of input data that is equal to or exceeds the capacity of the system
  • Regression testing: Selective retesting of a system or component to verify that modifications have not caused unintended effects and that the system or component still complies with its specified requirements [IEEE Std 610.12-1990]

  23. A. Types of Tests - 9
  Functional Testing
  • Functional testing compares the system’s performance with its requirements
  • Functional testing does “end-to-end” testing
  • Each function can be associated with those system components that accomplish it (sometimes called a thread)
  • The tests are normally done one function at a time
  • Functional tests should (see the sketch below):
    -- Have a high probability of detecting a fault
    -- Use an independent testing team
    -- Know the expected actions and outputs
    -- Test both invalid and valid inputs
    -- Have stopping criteria
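
The sketch below illustrates the last bullets: a black-box test written only against a stated requirement (say, "the system shall reject withdrawals that exceed the balance or are not positive"), with expected outputs known in advance and both valid and invalid inputs exercised. The `withdraw` function and its error behavior are assumptions made for the example.

```python
import unittest


def withdraw(balance, amount):
    """Hypothetical system function: returns the new balance or raises ValueError."""
    if amount <= 0 or amount > balance:
        raise ValueError("invalid withdrawal amount")
    return balance - amount


class WithdrawFunctionalTest(unittest.TestCase):
    """Derived from the requirement only, without looking at the implementation."""

    def test_valid_withdrawal_reduces_balance(self):
        self.assertEqual(withdraw(100, 40), 60)

    def test_withdrawal_exceeding_balance_is_rejected(self):
        with self.assertRaises(ValueError):
            withdraw(100, 150)

    def test_non_positive_amount_is_rejected(self):
        with self.assertRaises(ValueError):
            withdraw(100, 0)


if __name__ == "__main__":
    unittest.main()
```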

  24. A. Types of Tests - 10
  Types of Performance Tests
  • Stress tests: Evaluate the system when all variables are at their extreme settings
  • Volume tests: Evaluate the system for handling large amounts of data
  • Timing tests: Evaluate the requirements for the time to respond to a user and the time to perform a function (see the sketch below)
  • Recovery tests: Evaluate the system’s response to the presence of faults and the effects of failures
  • Quality tests: Evaluate the requirements for such quality attributes as reliability, maintainability, availability, usability, security, etc.
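
As one concrete case, a timing test can assert that an operation stays within a response-time budget. The function, the data size, and the 0.5-second budget below are invented for illustration; a real budget would come from the performance requirements.

```python
import time


def search_catalog(catalog, term):
    """Hypothetical operation whose response time is constrained by a requirement."""
    return [item for item in catalog if term in item]


def test_search_meets_response_time_budget():
    catalog = [f"product-{i}" for i in range(200_000)]

    start = time.perf_counter()
    results = search_catalog(catalog, "product-1999")
    elapsed = time.perf_counter() - start

    assert results, "expected at least one match"
    assert elapsed < 0.5, f"search took {elapsed:.3f}s, budget is 0.5s"


if __name__ == "__main__":
    test_search_meets_response_time_budget()
    print("timing test passed")
```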

  25. A. Types of Tests - 11
  Stress Testing
  • Modeled after hardware stress testing, in which excess voltage and excess speed are applied to determine what will break first, thereby showing the weak spots in the system
  • In software, stress testing can involve:
    -- Adding more terminals to the system than it was originally designed for
    -- Increasing the CPU load beyond “normal”
    -- Filling data structures beyond their intended capacity
  • This approach may reveal areas that are likely to fail under abnormal conditions

  26. A. Types of Tests - 12
  Regression Testing
  • Regression testing is a selective retesting of a system or component to verify that modifications have not caused unintended effects and that the system or component still complies with its specified requirements [IEEE Std 610.12-1990]
  • A regression test identifies new faults that may have been introduced as old faults are corrected (see the sketch below)
  • It is primarily used after system maintenance is performed
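
A common regression-testing practice is to capture each corrected fault as a permanent test case so the suite can be selectively rerun after later changes. The sketch below is hypothetical: `parse_price` once mishandled inputs with a currency symbol, and the added regression test pins the corrected behavior alongside the original tests.

```python
import unittest


def parse_price(text):
    """Convert a price string such as '19.99' or '$19.99' to a float."""
    cleaned = text.strip().lstrip("$")   # the fix: tolerate a leading '$'
    return float(cleaned)


class ParsePriceTests(unittest.TestCase):
    # Original tests, rerun unchanged to confirm the fix caused no unintended effects.
    def test_plain_number(self):
        self.assertEqual(parse_price("19.99"), 19.99)

    def test_whitespace(self):
        self.assertEqual(parse_price("  7.50 "), 7.50)

    # Regression test added for the corrected (hypothetical) fault:
    # it would fail again if the '$' handling were ever removed.
    def test_leading_currency_symbol(self):
        self.assertEqual(parse_price("$19.99"), 19.99)


if __name__ == "__main__":
    unittest.main()
```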

  27. A. Types of Tests - 13
  Acceptance Test
  • A test run by the customer, or on behalf of the customer, to determine acceptance or rejection of the system
  • Four types of acceptance tests:
    -- Benchmark test: A set of test cases that reflect expected uses of the system
    -- Pilot test: A system installed on an experimental basis
    -- Alpha test: An in-house system test with the developing staff assuming the role of the user
    -- Beta test: An external system test with a select subset of the eventual users of the system

  28. A. References
  • [SW04] Guide to the Software Engineering Body of Knowledge - Chapter 5
  • [IEEE Std 610.12-1990] IEEE Standard Glossary of Software Engineering Terminology

  29. A. Quiz
  • ________ refers to ensuring correctness from phase to phase of the software development cycle.
  a) Verification
  b) Validation
  c) Testing
  d) None of the above
  • ________ involves checking the software against the requirements.
  a) Verification
  b) Validation
  c) Testing
  d) None of the above

  30. B. Test Levels
  • Software testing is usually performed at different levels along the development and maintenance processes. That is to say, the target of the test can vary: a single module, a group of such modules (related by purpose, use, behavior, or structure), or a whole system. [SW04, pp5-3]
  • Three big test stages can be conceptually distinguished, namely Unit, Integration, and System.
  • Unit Testing: Unit testing verifies the functioning in isolation of software pieces which are separately testable. [SW04, pp5-3]

  31. B. Test Levels - 2
  • Integration testing is the process of verifying the interaction between software components. [SW04, pp5-4]
  • System testing is concerned with the behavior of a whole system. System testing is usually considered appropriate for comparing the system to the non-functional system requirements, such as security, speed, accuracy, and reliability. [SW04, pp5-4]

  32. B. Test Levels - 3
  • Objectives of Testing: Testing is conducted in view of a specific objective, which is stated more or less explicitly and with varying degrees of precision. Stating the objective in precise, quantitative terms allows control to be established over the test process. [SW04, pp5-4]
  • The sub-topics listed below are the kinds of testing most often cited in the literature:
  • Acceptance/qualification testing [SW04, pp5-4]
  • Installation testing [SW04, pp5-4]
  • Alpha and beta testing [SW04, pp5-4]

  33. B. Test Levels - 4
  • Conformance testing/functional testing/correctness testing [SW04, pp5-4]
  • Reliability achievement and evaluation [SW04, pp5-4]
  • Regression testing [SW04, pp5-4]
  • Performance testing [SW04, pp5-5]
  • Stress testing [SW04, pp5-5]

  34. B. Test Levels - 5
  • Back-to-back testing [SW04, pp5-5]
  • Recovery testing [SW04, pp5-5]
  • Configuration testing [SW04, pp5-5]
  • Usability testing [SW04, pp5-5]
  • Test-driven development [SW04, pp5-5]

  35. B. References - 1
  • [SW04] Guide to the Software Engineering Body of Knowledge - Chapter 5

  36. B. References - 2
  LIST OF STANDARDS
  • (IEEE610.12-90) IEEE Std 610.12-1990 (R2002), IEEE Standard Glossary of Software Engineering Terminology, IEEE, 1990.
  • (IEEE829-98) IEEE Std 829-1998, IEEE Standard for Software Test Documentation, IEEE, 1998.
  • (IEEE982.1-88) IEEE Std 982.1-1988, IEEE Standard Dictionary of Measures to Produce Reliable Software, IEEE, 1988.
  • (IEEE1008-87) IEEE Std 1008-1987 (R2003), IEEE Standard for Software Unit Testing, IEEE, 1987.
  • (IEEE1044-93) IEEE Std 1044-1993 (R2002), IEEE Standard for the Classification of Software Anomalies, IEEE, 1993.
  • (IEEE1228-94) IEEE Std 1228-1994, IEEE Standard for Software Safety Plans, IEEE, 1994.
  • (IEEE12207.0-96) IEEE/EIA 12207.0-1996 // ISO/IEC 12207:1995, Industry Implementation of International Standard ISO/IEC 12207:1995, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.

  37. B. Quiz
  • ________ is associated with formal proofs of correctness.
  a) Validation
  b) Verification
  c) Testing
  d) All of the above
  • ________ is concerned with executing the software with test data.
  a) Validation
  b) Verification
  c) Testing
  d) All of the above

  38. C. Testing Strategies
  Test Techniques
  • Based on the software engineer’s intuition and experience
  a) Ad hoc testing [SW04, pp5-5]
  b) Exploratory testing [SW04, pp5-5]
  • Specification-based techniques (a boundary-value sketch follows this slide)
  a) Equivalence partitioning [SW04, pp5-5]
  b) Boundary-value analysis [SW04, pp5-5]
  c) Decision table [SW04, pp5-6]
  d) Finite-state machine-based [SW04, pp5-6]
  e) Testing from formal specifications [SW04, pp5-6]
  f) Random testing [SW04, pp5-6]
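
To illustrate equivalence partitioning and boundary-value analysis together, the sketch below tests a hypothetical `grade` function whose specification partitions scores into fail (0-59), pass (60-89), and distinction (90-100); the chosen inputs sit inside each partition and on its boundaries. The function, ranges, and labels are assumptions for the example.

```python
def grade(score):
    """Hypothetical spec: 0-59 -> 'fail', 60-89 -> 'pass', 90-100 -> 'distinction'."""
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    if score < 60:
        return "fail"
    if score < 90:
        return "pass"
    return "distinction"


# One representative per equivalence class, plus values on each boundary.
BOUNDARY_CASES = [
    (0, "fail"), (30, "fail"), (59, "fail"),            # fail partition
    (60, "pass"), (75, "pass"), (89, "pass"),           # pass partition
    (90, "distinction"), (100, "distinction"),          # distinction partition
]

INVALID_CASES = [-1, 101]  # just outside the valid domain


def test_boundary_values():
    for score, expected in BOUNDARY_CASES:
        assert grade(score) == expected, (score, expected)
    for score in INVALID_CASES:
        try:
            grade(score)
        except ValueError:
            continue
        raise AssertionError(f"score {score} should have been rejected")


if __name__ == "__main__":
    test_boundary_values()
    print("all boundary-value cases passed")
```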

  39. C. Testing Strategies - 2
  Test Techniques
  • Code-based techniques
  a) Control flow-based criteria [SW04, pp5-6]
  b) Data flow-based criteria [SW04, pp5-6]
  c) Reference models for code-based testing (flow graph, call graph) [SW04, pp5-6]
  • Fault-based techniques [SW04, pp5-6]
  a) Error guessing [SW04, pp5-6]
  b) Mutation testing [SW04, pp5-6] (see the sketch below)
  • Usage-based techniques
  a) Operational profile [SW04, pp5-7]
  b) Software Reliability Engineered Testing [SW04, pp5-7]
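
Mutation testing judges a test suite by seeding small deliberate faults (mutants) and checking whether the tests fail (kill the mutant). Real tools generate mutants automatically; the sketch below hand-codes one hypothetical mutant of a `>=` operator just to show the idea.

```python
def is_adult(age):
    """Original code under test."""
    return age >= 18


def is_adult_mutant(age):
    """Hand-made mutant: the relational operator >= is mutated to >."""
    return age > 18


def run_suite(func):
    """A tiny test suite; returns True if every assertion holds."""
    cases = [(17, False), (18, True), (21, True)]
    return all(func(age) == expected for age, expected in cases)


if __name__ == "__main__":
    assert run_suite(is_adult), "suite must pass on the original code"
    # The boundary case (18, True) fails on the mutant, so the mutant is killed.
    # Without that case, the mutant would survive, revealing a weak suite.
    print("mutant killed:", not run_suite(is_adult_mutant))
```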

  40. C. Testing Strategies - 3
  Test Techniques
  • Techniques based on the nature of the application [SW04, pp5-7]:
    -- Object-oriented testing
    -- Component-based testing
    -- Web-based testing
    -- GUI testing
    -- Testing of concurrent programs
    -- Protocol conformance testing
    -- Testing of real-time systems
    -- Testing of safety-critical systems
  • Selecting and combining techniques [SW04, pp5-7]
  a) Functional and structural
  b) Deterministic vs. random

  41. C. Testing Strategies - 4
  Functional Testing
  • Functional testing addresses whether the program produces the correct output
    -- Focuses on the functional requirements of the software; also called “black-box testing”
  • The functional strategy uses only the requirements defined in the specification as the basis for testing
  • Attempts to find errors in the following categories:
    -- Incorrect or missing functions
    -- Interface errors
    -- Errors in data structures or external database access
    -- Performance errors
    -- Initialization and termination errors
  • Bases the test on the external view of the system

  42. C. Testing Strategies - 5
  Structural Testing
  • Structural testing derives test data from the structure of the system; the structural strategy is based on the detailed design
    -- Focuses on the control structure of the system design; also called “white-box” or “glass-box” testing
  • Focuses on (see the sketch below):
    -- Path testing: Exercising all independent paths within a module at least once
    -- Branch testing: Exercising all logical decisions on both their true and false sides
    -- Loop testing: Executing all loops at their boundaries and within their operational bounds
    -- Exercising internal data structures to ensure their correctness and validity
  • Bases the test on the internal structure of the software
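
The sketch below shows white-box path testing for a small hypothetical function with two decisions: its four distinct execution paths are read off the control structure, and one test input is chosen to force each path. The function and its fee rules are invented for the example.

```python
def shipping_fee(weight_kg, express):
    """Hypothetical function with two decisions -> four distinct paths."""
    fee = 5.0 if weight_kg <= 10 else 9.0      # decision 1
    if express:                                 # decision 2
        fee += 4.0
    return fee


# One test case per path, chosen by inspecting the control structure:
#   path 1: light, not express    path 2: light, express
#   path 3: heavy, not express    path 4: heavy, express
PATH_CASES = [
    ((8, False), 5.0),
    ((8, True), 9.0),
    ((12, False), 9.0),
    ((12, True), 13.0),
]


def test_all_paths():
    for (weight, express), expected in PATH_CASES:
        assert shipping_fee(weight, express) == expected, (weight, express)


if __name__ == "__main__":
    test_all_paths()
    print("all 4 paths exercised")
```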

  43. C. Testing Strategies - 6
  Static Analysis
  • A testing technique that does not involve executing the software with data; it directly analyzes the form and structure of a product without executing it
  • Static analysis tools scan the source text of a program and detect possible faults and anomalies (a toy scanner is sketched below)
  • Includes:
    -- Program proving
    -- Symbolic execution
    -- Anomaly analysis
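
As a toy illustration of anomaly analysis, the sketch below inspects Python source text without executing it and flags one well-known anomaly, a mutable default argument value. It is a deliberately minimal stand-in for what production static-analysis tools do far more thoroughly.

```python
import ast

SOURCE = """
def append_item(item, bucket=[]):   # anomalous: mutable default argument
    bucket.append(item)
    return bucket

def safe_append(item, bucket=None):
    return (bucket or []) + [item]
"""


def find_mutable_defaults(source):
    """Return (function name, line number) pairs whose defaults are mutable literals."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            for default in node.args.defaults + node.args.kw_defaults:
                if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                    findings.append((node.name, default.lineno))
    return findings


if __name__ == "__main__":
    for name, line in find_mutable_defaults(SOURCE):
        print(f"anomaly: function '{name}' has a mutable default (line {line})")
```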

  44. C. Testing Strategies - 7
  Dynamic Analysis
  • Dynamic analysis requires that the software be executed, and relies on instrumenting the program to measure internal data and logic states as well as outputs
  • The process of evaluating a program based on execution of the program
  • Involves execution or simulation of a development-activity product to detect errors by analyzing the response of the product to sets of input data
  • The software is exercised through the use of test cases
  • The resulting output is compared with the expected results to check for errors

  45. C. Testing Strategies - 8
  Dynamic Analysis
  • Dynamic functional: This technique executes test cases without giving consideration to the detailed design of the software
  • Classified into (a random-testing sketch follows this slide):
    -- Domain testing
    -- Random testing
    -- Adaptive perturbation testing
    -- Cause-effect graphing
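
Random testing, one of the dynamic functional techniques above, feeds randomly generated inputs to the program and checks each result against an oracle. The sketch below randomly tests a hypothetical `clamp` function against a simple property-style oracle; the function and the input ranges are invented for illustration.

```python
import random


def clamp(value, low, high):
    """Hypothetical function under test: restrict value to the range [low, high]."""
    return max(low, min(value, high))


def test_clamp_randomly(trials=1_000, seed=42):
    rng = random.Random(seed)          # fixed seed keeps failures reproducible
    for _ in range(trials):
        low = rng.uniform(-100, 100)
        high = low + rng.uniform(0, 100)
        value = rng.uniform(-200, 200)

        result = clamp(value, low, high)

        # Oracle: the result stays inside the range, and equals the input
        # whenever the input was already inside the range.
        assert low <= result <= high, (value, low, high, result)
        if low <= value <= high:
            assert result == value


if __name__ == "__main__":
    test_clamp_randomly()
    print("random testing: all trials passed")
```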

  46. C. Testing Strategies - 9
  Dynamic Analysis
  • Dynamic structural: This technique executes test cases that are created from an analysis of the software’s internal structure
  • Classified into:
    -- Domain testing
    -- Computation testing
    -- Automatic test data generation
    -- Mutation analysis

  47. C. References
  • [SW04] Guide to the Software Engineering Body of Knowledge - Chapter 5

  48. C. References - 2
  LIST OF STANDARDS
  • (IEEE610.12-90) IEEE Std 610.12-1990 (R2002), IEEE Standard Glossary of Software Engineering Terminology, IEEE, 1990.
  • (IEEE829-98) IEEE Std 829-1998, IEEE Standard for Software Test Documentation, IEEE, 1998.
  • (IEEE982.1-88) IEEE Std 982.1-1988, IEEE Standard Dictionary of Measures to Produce Reliable Software, IEEE, 1988.
  • (IEEE1008-87) IEEE Std 1008-1987 (R2003), IEEE Standard for Software Unit Testing, IEEE, 1987.
  • (IEEE1044-93) IEEE Std 1044-1993 (R2002), IEEE Standard for the Classification of Software Anomalies, IEEE, 1993.
  • (IEEE1228-94) IEEE Std 1228-1994, IEEE Standard for Software Safety Plans, IEEE, 1994.
  • (IEEE12207.0-96) IEEE/EIA 12207.0-1996 // ISO/IEC 12207:1995, Industry Implementation of International Standard ISO/IEC 12207:1995, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.

  49. C. Quiz
  • What kind of testing has been included under both structural and functional strategies?
  a) Computation testing
  b) Domain testing
  c) Random testing
  d) None of the above

  50. D. Test Design
  Attributes of a Test Design (a small test-design sketch follows this slide)
  • Determine the features to be tested (or not tested)
  • Select the test cases to be used
  • Select the process to test the features
  • Determine the pass/fail criteria
  • Design the software test as soon as possible after the establishment of requirements. This helps:
    -- Non-testable requirements to be found
    -- Quality to be built in
    -- Costs to be reduced
    -- Time to be saved
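
One lightweight way to record these attributes is a test-design table kept next to the tests: each row names a feature, an input, and an explicit pass/fail criterion, and it can be written down at requirements time. The sketch below shows that idea for a hypothetical password-validation feature; all names and rules are invented for the example.

```python
def validate_password(password):
    """Hypothetical feature under test: passwords must be 8+ characters with a digit."""
    return len(password) >= 8 and any(ch.isdigit() for ch in password)


# Test design recorded at requirements time:
#   (test id, feature, input, pass criterion: expected result)
TEST_DESIGN = [
    ("TC-01", "length rule", "abc1", False),       # too short -> reject
    ("TC-02", "digit rule", "abcdefgh", False),    # no digit -> reject
    ("TC-03", "happy path", "abcdefg1", True),     # meets both rules -> accept
]


def run_test_design():
    failures = []
    for test_id, feature, candidate, expected in TEST_DESIGN:
        actual = validate_password(candidate)
        if actual != expected:        # the pass/fail criterion is checked explicitly
            failures.append((test_id, feature, candidate, expected, actual))
    return failures


if __name__ == "__main__":
    failed = run_test_design()
    print("all test cases passed" if not failed else f"failures: {failed}")
```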
