A beginner's guide to testing

Presentation Transcript


  1. A beginner's guide to testing Philip Johnson Collaborative Software Development Laboratory Information and Computer Sciences University of Hawaii Honolulu HI 96822

  2. Objectives • Understand the basic concepts, terminology, and approaches to software testing. • Be able to apply this information to improve the quality of your own testing efforts during software development.

  3. Why is testing important?

  4. First National Bank (1996) • Problem: • An error at First National Bank of Chicago inflated the balances of 800 customers by a total of $763 billion. • Reason: • Inadequate testing. The bank updated its ATM transaction software with new message codes, but the codes were not tested on all ATM protocols, so some ATMs interpreted them as huge increases to customer balances.

  5. Therac-25 (1985-1987) • Problem: • Six people were overexposed to radiation during cancer treatments by the Therac-25 radiation therapy machine. Three of them are believed to have died from the overdoses. • Reason: • Inadequate testing. Hardware safety locks were removed and replaced by software safety locks, which could be overcome by technician “type ahead”.

  6. Ariane 5 (1996) • Problem: • The Ariane 5 rocket exploded on its maiden flight. • Reason: • Inadequate testing. The navigation package was inherited from Ariane 4 without proper testing. The new rocket flew faster, producing larger values in some variables and ultimately an attempt to convert a 64-bit floating-point number into a 16-bit integer. The conversion error was caught, but the action taken was to shut down the navigation system.
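
The conversion failure is easy to reproduce in miniature. The sketch below is a hedged Java illustration (not the actual Ada flight code, which raised an exception rather than silently wrapping): a value that fits comfortably in a 64-bit floating-point variable is corrupted when narrowed to a 16-bit integer.

```java
public class NarrowingDemo {
  public static void main(String[] args) {
    // Illustrative number only, not actual flight data: a value that fits
    // easily in a 64-bit double but not in a 16-bit integer.
    double velocity = 70000.0;
    short converted = (short) velocity; // 16-bit target: range is -32768..32767
    System.out.println(converted);      // prints 4464, not 70000
  }
}
```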

  7. ICS 613, Fall 2011 • Problem: • A student got a bad grade for ICS 613. • Reason: • Inadequate testing. The student did not learn how to write good test cases for their code.

  8. Testing fits into Validation and Verification • Validation: • Establishing the fitness of a software product for its use. • “Are we building the right product?” • Requires interaction with customers. • Verification: • Establishing the correspondence between the software and its specification. • “Are we building the product right?” • Requires interaction with software.

  9. Static vs. Dynamic V&V • Static V&V: • Software inspections • Static analysis of source code • Control/data flow analysis • Dynamic V&V: • Defect testing • Looks for errors in functionality • Performance analysis • Looks for errors in scalability, performance, reliability

  10. Why is testing hard? • Exhaustive testing: • Execute program on all possible inputs and compare actual to expected behavior. • Could “prove” program correctness. • Not practical for any non-trivial program. • Practical testing: • Select a tiny % of all possible tests. • Goal: executing the tiny % of tests will uncover a large % of defects present! • A “testing method” is essentially a way to decide which tiny % to pick.

  11. How do we determine which tests to write? Selected examples: • Functional (black box) testing • Use specification to determine the tests • Structural (white box) testing • Use coverage to assess the quality of tests • Test-driven development • Write tests as specification • Use tests to determine the implementation! • Combinations of methods are acceptable!

  12. Black box testing

  13. Specification-based testing • Specification: • A mapping between the possible “inputs” to the program and the corresponding expected “outputs”. • Specification-based testing: • Design a set of test cases to see whether the inputs actually map to the expected outputs. • Does not require access to source code. • Differences from white-box (coverage) testing: • Can catch errors of omission. • Effectiveness depends upon the quality of the specification.
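
As a hedged sketch of the idea (the class and method names are hypothetical, not from the slides), suppose a specification says that `discount(total)` returns a 10% discount for totals of $100 or more and 0 otherwise. A specification-based test maps those inputs to their expected outputs without ever reading the implementation:

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

/** Hypothetical class under test, included only so the example compiles. */
class Discount {
  static double discount(double total) {
    return total >= 100.0 ? total * 0.10 : 0.0;
  }
}

public class DiscountTest {
  // Each test maps a specified input to its expected output; no knowledge
  // of Discount's source code is needed to write these cases.
  @Test
  public void totalsOfAtLeast100GetTenPercentOff() {
    assertEquals(10.0, Discount.discount(100.0), 0.001);
  }

  @Test
  public void totalsUnder100GetNoDiscount() {
    assertEquals(0.0, Discount.discount(99.99), 0.001);
  }
}
```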

  14. Equivalence classes Goal: Divide the possible inputs into categories such that testing one point in each category is equivalent to testing all points in that category. Provide one test case per category (one representative point from each). • Equivalence class definition is usually an iterative process that continues throughout development. Use heuristics to get started designing your test cases.
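
For instance (a hedged sketch built around a hypothetical `AgeClassifier.classify` method, not an example from the slides), a method that labels ages 0-17 as "minor", 18-64 as "adult", and 65 and up as "senior" has three valid equivalence classes plus an invalid class for negative ages; one representative test per class covers them all:

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

/** Hypothetical method under test, included only so the example compiles. */
class AgeClassifier {
  static String classify(int age) {
    if (age < 0) throw new IllegalArgumentException("negative age");
    if (age < 18) return "minor";
    if (age < 65) return "adult";
    return "senior";
  }
}

public class AgeClassifierTest {
  // One representative point from each valid equivalence class.
  @Test public void minorClass()  { assertEquals("minor",  AgeClassifier.classify(10)); }
  @Test public void adultClass()  { assertEquals("adult",  AgeClassifier.classify(40)); }
  @Test public void seniorClass() { assertEquals("senior", AgeClassifier.classify(80)); }

  // The invalid inputs (negative ages) form their own equivalence class.
  @Test(expected = IllegalArgumentException.class)
  public void negativeAgeIsRejected() { AgeClassifier.classify(-1); }
}
```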

  15. Unit test design heuristics • If input is a value: • maximum value • minimum value • empty value • typical value • illegal value • If input is a sequence: • Single element • Empty sequence • Max and min element values • Sequences of different sizes • Illegal elements • If I/O specification contains conditions: • true • false • If I/O specification contains iterations: • zero times • 1 time • > 1 time
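
Applying the sequence heuristics to a hypothetical `Stats.max` method (a hedged sketch, not from the slides) yields tests for a single element, a typical sequence, extreme element values, and the empty sequence:

```java
import static org.junit.Assert.assertEquals;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.junit.Test;

/** Hypothetical method under test: largest value in a non-empty list. */
class Stats {
  static int max(List<Integer> values) {
    if (values.isEmpty()) throw new IllegalArgumentException("empty list");
    int max = Integer.MIN_VALUE;
    for (int v : values) {
      if (v > max) max = v;
    }
    return max;
  }
}

public class StatsTest {
  // Sequence heuristics: single element, typical values, extreme element values.
  @Test public void singleElement() { assertEquals(7, Stats.max(Arrays.asList(7))); }
  @Test public void typicalValues() { assertEquals(9, Stats.max(Arrays.asList(3, 9, 4))); }
  @Test public void minimumElementValues() { assertEquals(-2, Stats.max(Arrays.asList(-5, -2, -8))); }
  @Test public void maximumElementValue() {
    assertEquals(Integer.MAX_VALUE, Stats.max(Arrays.asList(1, Integer.MAX_VALUE)));
  }

  // The empty sequence is an illegal input under the hypothetical specification.
  @Test(expected = IllegalArgumentException.class)
  public void emptySequenceIsRejected() { Stats.max(Collections.<Integer>emptyList()); }
}
```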

  16. Web app design heuristics • Every page is retrieved at least once • Prevent 404 errors. • Every link is followed at least once. • Prevent 404 errors. • All form input fields are tested with: • Normal values • Erroneous values • Maximum/minimum values • Always check response for appropriateness.
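
A minimal sketch of the first two heuristics, using only the JDK's HttpURLConnection (the URLs are hypothetical placeholders for your own application's pages and links):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class LinkChecker {
  /** Returns the HTTP status code for a URL, so tests can assert it is not 404. */
  static int statusOf(String url) throws Exception {
    HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
    connection.setRequestMethod("HEAD");
    return connection.getResponseCode();
  }

  public static void main(String[] args) throws Exception {
    // Hypothetical page list; replace with every page and link in your application.
    String[] pages = { "http://localhost:8080/", "http://localhost:8080/about" };
    for (String page : pages) {
      System.out.println(page + " -> " + statusOf(page));
    }
  }
}
```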

  17. Tool support: JUnit

  18. Tool Support: JUnitReport

  19. White box testing

  20. Statement coverage • For a test case to uncover a defect, the statement containing the defect must be executed. • Therefore, a set of test cases which guarantees all statements are executed might uncover a large number of the defects present. • Whether or not the defects are actually uncovered depends upon the program state at the time each statement is executed.
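
The program-state caveat is visible in a small hedged sketch (the method is hypothetical, not from the slides): the faulty statement below is executed by a test that achieves 100% statement coverage, yet the defect only surfaces for a particular input.

```java
public class CoverageCaveat {
  // Intended to compute the percentage of tests that passed.
  static int percentPassed(int passed, int total) {
    return passed * 100 / total;   // defect: no guard for total == 0
  }

  public static void main(String[] args) {
    // This call executes every statement, so statement coverage is 100%...
    System.out.println(percentPassed(9, 10));   // prints 90
    // ...yet the defect appears only in a different program state:
    // percentPassed(0, 0) would throw ArithmeticException (division by zero).
  }
}
```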

  21. Control flow coverage • Control flow coverage adds conditions to statement coverage to raise the odds of discovering defects. • Branch coverage: • Every conditional is evaluated as both true and false during testing. • Loop coverage: • Every loop is executed both 0 times and more than 1 time. • Path coverage: • Every distinct path through the program is taken.
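
A hedged sketch of what branch and loop coverage demand for one small method (the `Counter` class is hypothetical): the tests force the conditional to evaluate both true and false, and the loop to run zero times, once, and more than once.

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

/** Hypothetical method under test: counts the positive values in an array. */
class Counter {
  static int countPositives(int[] values) {
    int count = 0;
    for (int v : values) {   // loop coverage: 0 iterations, 1 iteration, > 1 iteration
      if (v > 0) {           // branch coverage: condition both true and false
        count++;
      }
    }
    return count;
  }
}

public class CounterTest {
  @Test public void loopRunsZeroTimes() { assertEquals(0, Counter.countPositives(new int[] {})); }
  @Test public void loopRunsOnce()      { assertEquals(1, Counter.countPositives(new int[] {5})); }
  @Test public void bothBranchesTaken() { assertEquals(2, Counter.countPositives(new int[] {3, -1, 7})); }
}
```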

  22. JaCoCo: A Java Coverage Tool

  23. JaCoCo: Drilldown to Package

  24. JaCoCo: Drilldown to Class

  25. JaCoCo: Red regions not tested

  26. “Coverage-driven test case design” • 1. Write some test cases to exercise your code in new ways. • 2. Run your coverage tool. • 3. If coverage is not 100%, go to 1.

  27. “Coverage-driven test case design” is bad! • Using coverage data to drive the design of your tests often results in a poorly designed test case suite. (See readings.) • Instead, use coverage to assess the quality of your test method. • In a nutshell: • Good test design method -> Good coverage • But not the other way around.

  28. Limitations of white-box testing • Can catch bugs in code that has been written. • Cannot catch bugs in code that has not been written! • Errors of omission: code that ought to have been written but wasn’t • Missing boolean conditions in IF statements • Missing exception handlers • To catch bugs in code that has not been written, you must compare behavior of program to its specification.
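
A hedged sketch of an error of omission (the `Account` class is hypothetical): the specification says a withdrawal must not exceed the balance, but the guard was never written, so even 100% coverage of the existing code says nothing about the missing condition; only comparing behavior against the specification reveals it.

```java
public class Account {
  private double balance;

  Account(double balance) { this.balance = balance; }

  /** Spec: a withdrawal must not exceed the current balance.
      The IF that enforces this was never written, so white-box testing of
      the code below cannot point at the missing condition. */
  void withdraw(double amount) {
    balance = balance - amount;   // error of omission: no "amount <= balance" check
  }

  double getBalance() { return balance; }

  public static void main(String[] args) {
    Account account = new Account(50.0);
    account.withdraw(100.0);                  // every statement and branch executed...
    System.out.println(account.getBalance()); // ...yet the balance is -50.0, violating the spec
  }
}
```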

  29. Quality assurance in this class • You will use a combination of manual and automated quality assurance. • Automated QA: • Use Checkstyle, PMD, FindBugs, and eliminate all warnings. • Manual QA: • Use JUnit to develop tests, use JaCoCo to check coverage. • Use Code Review to look for remaining errors.
