
Test Cases



  1. Test Cases CSSE 376, Software Quality Assurance Rose-Hulman Institute of Technology March 13, 2007

  2. Outline • Test design process • Developing test cases

  3. Test Design Process • Design tests • elaborate test plan • based on requirements • Develop test cases • based on knowledge of system behavior • Verify test cases • verify procedures • debug test cases

  4. 1. Design Tests • Define test objectives • what, not how • based on requirements • Define input specifications • files, databases • not keyboard or mouse input

  5. 1. Design Tests (continued) • Define test configuration for each test • operating system version • network configuration • Review test designs for coverage and accuracy • all requirements covered? • all inputs identified? • all configurations identified?
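
  As a minimal sketch of what the output of this design step might record, here is a hypothetical TestDesign structure in Python; the class, field names, and sample values are illustrative assumptions, not part of the slides or any standard:

```python
from dataclasses import dataclass, field

@dataclass
class TestDesign:
    """Hypothetical record of one designed test (all fields illustrative)."""
    objective: str        # what is being verified, not how
    requirement_id: str   # requirement this test traces back to
    input_spec: str       # files/databases consumed, not keystrokes or clicks
    configuration: dict = field(default_factory=dict)  # OS, network, etc.

# Example entry a reviewer could check against the requirements list.
session_timeout = TestDesign(
    objective="Session expires after 15 minutes of inactivity",
    requirement_id="REQ-AUTH-07",
    input_spec="users.db seeded with one active account",
    configuration={"os": "Windows XP SP2", "network": "LAN"},
)
```

  A structure like this makes the coverage review mechanical: each requirement should appear in some record's requirement_id, and each record should name its inputs and configuration explicitly.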

  6. 2. Develop Test Cases • Develop detailed test procedures • use black-box techniques like equivalence partitioning and boundary value analysis • specify actions and expected results (see the sketch below) • Define test setup and cleanup, noting inter-case dependencies • restart computer at start of testing • reset system to known configuration at start of each test
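
  A minimal sketch of setup, action/expected-result, and cleanup using Python's unittest; ShoppingCart here is an invented stand-in for a real system under test, not anything from the course:

```python
import unittest

class ShoppingCart:
    """Minimal stand-in system under test (hypothetical)."""
    def __init__(self):
        self.items = []
    def add_item(self, name, price, quantity):
        self.items.append((name, price, quantity))
    def total(self):
        return sum(price * qty for _, price, qty in self.items)
    def clear(self):
        self.items = []

class ShoppingCartTest(unittest.TestCase):
    def setUp(self):
        # Reset the system to a known configuration before each test,
        # so no test depends on the one that ran before it.
        self.cart = ShoppingCart()

    def tearDown(self):
        # Clean up so no state leaks into the next test case.
        self.cart.clear()

    def test_add_item_updates_total(self):
        # Both the action and the expected result are stated explicitly.
        self.cart.add_item("widget", price=2.50, quantity=2)
        self.assertEqual(self.cart.total(), 5.00)

if __name__ == "__main__":
    unittest.main()
```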

  7. Equivalence Partitioning • Divide the input into independent categories (partitions) • use the requirements specification • consider type of each input • Select a value from each category • Why is this a good idea?
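
  A minimal sketch of equivalence partitioning, assuming a hypothetical validator that accepts ages 0 through 120: the input space splits into three partitions (below range, in range, above range), and one representative value is tested from each:

```python
# Hypothetical system under test: accepts ages 0..120 inclusive.
def is_valid_age(age: int) -> bool:
    return 0 <= age <= 120

# One representative value per equivalence partition.
partitions = [
    (-5,  False),  # partition 1: below the valid range
    (40,  True),   # partition 2: within the valid range
    (200, False),  # partition 3: above the valid range
]

for value, expected in partitions:
    assert is_valid_age(value) == expected, value
```

  If the program really does treat every value in a partition the same way, one representative exercises the whole partition, which is why this technique sharply reduces the number of test cases needed.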

  8. Cartoon of the Day (1/2)

  9. Cartoon of the Day (2/2)

  10. Boundary Value Analysis • For each input partition identified earlier: • pick values on the boundary • pick values on each side of the boundary • Why is this a good idea?
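
  Continuing the hypothetical age validator from the equivalence-partitioning sketch, boundary value analysis tests each partition edge plus the values immediately on either side of it, since off-by-one mistakes cluster at boundaries:

```python
def is_valid_age(age: int) -> bool:  # same hypothetical validator as above
    return 0 <= age <= 120

# Boundaries of the valid partition 0..120, and their neighbors.
boundary_cases = [
    (-1,  False),  # just below the lower boundary
    (0,   True),   # on the lower boundary
    (1,   True),   # just above the lower boundary
    (119, True),   # just below the upper boundary
    (120, True),   # on the upper boundary
    (121, False),  # just above the upper boundary
]

for value, expected in boundary_cases:
    assert is_valid_age(value) == expected, value
```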

  11. 3. Verify Test Cases • Review test procedures • are the inputs unambiguous? • are the outputs unambiguous? • Debug test cases (during first run) • run the test as specified • note flaws in test case as well as flaws in software

  12. Summary: Test Design Process • Design tests • elaborate test plan • based on requirements • Develop test cases • based on knowledge of system behavior • Verify test cases • verify procedures • debug test cases
