
Software Testing: A Brief Overview



  1. Software Testing: A Brief Overview
     Yih-Kuen Tsay
     Dept. of Information Management, National Taiwan University
     Based on [Patton 2006, Kaner et al. 2002]

  2. Infamous Software Error Cases
     • The Lion King Animated Storybook, 1994-1995
       • Didn't run on untested but common systems
     • Intel Pentium Floating-Point Division Bug, 1994
       • A bug that common users would rarely encounter, but one that the company mishandled
     • NASA Mars Polar Lander, 1999
       • A control bit that shuts off the thrusters was meant to be triggered by touchdown, but could also be set by mechanical vibration
     • The Y2K (Year 2000) Bug, circa 1974
       • Two digits used to represent the year (see the sketch below)
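     An illustrative sketch only, not from the slides: the Y2K-style mistake of storing the year as two digits, so a year after 2000 compares as "earlier" than a year before it.

     # Hypothetical example of the two-digit-year bug described above.
     def is_expired(expiry_yy: int, current_yy: int) -> bool:
         return current_yy > expiry_yy   # two-digit comparison wraps around at 2000

     print(is_expired(expiry_yy=1, current_yy=99))   # True: a 2001 expiry looks expired in 1999
     print(2001 < 1999)                              # False: four-digit years compare correctly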

  3. What Is a Software Bug (Error)?
     • The software doesn't do something that the product specification says it should do.
     • The software does something that the product specification says it shouldn't do.
     • The software doesn't do something that the product specification doesn't mention but should.
     • The software does something that the product specification doesn't mention.
     • The software is difficult to understand, hard to use, slow, or (in the software tester's eyes) will be viewed by the end user as just plain not right.

  4. Real Causes of Software Bugs
     • Four categories:
       • Specification
       • Design
       • Code
       • Others
     • How frequent? specification > design > code
     • How costly to fix? code > design > specification

  5. What Does a Software Tester Do?
     • The goal of a software tester is to find bugs.
     • The goal of a software tester is to find bugs and find them as early as possible.
     • The goal of a software tester is to find bugs, find them as early as possible, and make sure they get fixed.

  6. Some Axioms of Software Testing
     • It's impossible to test a program completely.
     • Software testing is a risk-based exercise.
     • Testing can't show that bugs don't exist.
     • The pesticide paradox: the more you test software, the more immune it becomes to your tests.
     • Not all the bugs you find will be fixed.
     • Software testers aren't the most popular members of a project team.

  7. Lessons Learned: The Role of the Tester
     • Beware of testing "completely." (L. 10)
     • You don't assure quality by testing. (L. 11)
       • The assurance results from the effort of the entire team.
     • Never be the gatekeeper! (L. 12)
       • That is the responsibility of the people who control the project.
     • Beware of the not-my-job theory of testing. (L. 13)
       • Take a more expansive view.

  8. Testing Fundamentals
     • Black-box vs. white-box testing (see the sketch below)
       • In black-box testing, the tester only knows what the software is supposed to do; he can't look in the box to see how it operates.
       • In white-box (or clear-box) testing, the tester has access to the program's code.
     • Static vs. dynamic testing (analysis)
       • Static testing refers to testing something that's not running: examining and reviewing it.
       • Dynamic testing is simply running and using the software (to see if it works as expected).
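     A minimal sketch contrasting the two views, using a hypothetical absolute-value function: the black-box test is derived from the specification alone, while the white-box test is written after reading the code and targets each branch.

     def absolute(x: int) -> int:
         if x < 0:           # branch 1: negative input
             return -x
         return x            # branch 2: zero or positive input

     def test_black_box():
         # Derived from the spec alone: "returns |x|".
         assert absolute(-3) == 3
         assert absolute(5) == 5

     def test_white_box():
         # Derived from the code: exercise both branches, including the x == 0 edge.
         assert absolute(-1) == 1   # covers the "x < 0" branch
         assert absolute(0) == 0    # covers the fall-through branch

     test_black_box()
     test_white_box()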

  9. Testing (Examining) the Specification
     • This is static black-box testing.
     • High-level techniques:
       • Pretend to be the customer
       • Research existing standards and guidelines
       • Review and test similar software
     • Low-level techniques:
       • Specification attributes checklist
       • Specification terminology checklist

  10. Good, Well-Thought-Out Specifications
     • Complete
     • Accurate
     • Precise, unambiguous, and clear
     • Consistent
     • Relevant
     • Feasible
     • Code-free
       • Not constraining the underlying design, architecture, or code
     • Testable

  11. Problem Words in a Specification (1/2)
     • Always, every, all, none, never
       • Look for exceptions
     • Certainly, therefore, clearly, obviously, evidently
       • Beware of accepting something as a given
     • Some, sometime, often, usually, ordinarily, customarily, most, mostly
       • Too vague to test
     • Etc., and so forth, and so on, such as
       • Incomplete listing, not testable

  12. Problem Words in a Specification (2/2)
     • Good, fast, cheap, efficient, small, stable
       • Unquantifiable terms, not testable
     • Handled, processed, rejected, skipped, eliminated
       • Hidden functionalities
     • If … then … (but missing else)
       • Ask yourself what will happen if the "if" doesn't happen (see the sketch below)
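     A hypothetical sketch of the missing-else problem: a spec that says "if the discount code is valid then apply 10% off" but never says what happens otherwise. The code mirrors that silence, and the tester's question becomes a test case.

     def apply_discount(price: float, code: str) -> float:
         if code == "SAVE10":              # the "if ... then ..." from the spec
             return round(price * 0.9, 2)
         return price                      # unspecified behaviour: is silently ignoring bad codes right?

     assert apply_discount(100.0, "SAVE10") == 90.0   # specified path
     print(apply_discount(100.0, "TYPO"))             # 100.0, but the spec never said this is correct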

  13. Test Cases and Testing Techniques
     • Doing black-box testing effectively requires some definition of what the software does.
     • Test cases are the specific inputs that you will try and the procedures that you will follow.
     • Selecting test cases is the single most important task that software testers do.
     • Testing techniques teach you how to select good test cases.
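     A minimal sketch of test cases written down as data: each case records the input you will try and the result you expect. The add() function and the cases themselves are hypothetical, chosen only to show the shape of a test case.

     def add(a: int, b: int) -> int:
         return a + b

     TEST_CASES = [
         # (description,            inputs,   expected)
         ("two small positives",    (1, 2),   3),
         ("negative plus positive", (-5, 3),  -2),
         ("zero plus zero",         (0, 0),   0),
     ]

     for name, (a, b), expected in TEST_CASES:
         actual = add(a, b)
         assert actual == expected, f"{name}: expected {expected}, got {actual}"
     print("all test cases passed")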

  14. Two Fundamental Approaches
     • Test-to-pass
       • Assure only that the software minimally works.
       • You don't push its capabilities.
     • Test-to-fail
       • Design and run test cases with the sole purpose of breaking the software.
       • Try things that should force the bugs out.
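     A sketch of the two approaches against a hypothetical parse_age() helper: test-to-pass stays on the happy path, while test-to-fail deliberately feeds input that should break it.

     def parse_age(text: str) -> int:
         value = int(text)                 # raises ValueError on non-numeric input
         if not 0 <= value <= 150:
             raise ValueError("age out of range")
         return value

     def test_to_pass():
         # Assure only that the software minimally works.
         assert parse_age("30") == 30

     def test_to_fail():
         # Try things that should force the bugs out.
         for bad in ["", "abc", "-1", "999", "  42  "]:
             try:
                 result = parse_age(bad)
             except ValueError:
                 continue                  # rejecting the input is acceptable behaviour
             assert 0 <= result <= 150     # if accepted, the value must at least be sane

     test_to_pass()
     test_to_fail()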

  15. Testing Techniques
     • Equivalence Partitioning
       • To systematically get a smaller, but equally effective, set of test cases (see the sketch below)
       • Examples:
         • Addition of small numbers vs. large numbers
         • Filenames with valid characters vs. invalid characters
         • Short vs. long filenames
     • Divide test work along the two parts of software:
       • The data
       • The program
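     A minimal sketch of equivalence partitioning for a hypothetical filename rule (1-8 characters, letters and digits only): instead of trying every filename, pick one representative per partition plus the interesting boundaries.

     import re

     def is_valid_filename(name: str) -> bool:
         return re.fullmatch(r"[A-Za-z0-9]{1,8}", name) is not None

     PARTITIONS = {
         "valid, typical length":  ("report7",   True),
         "valid, shortest (1)":    ("a",         True),
         "valid, longest (8)":     ("abcdefgh",  True),
         "too long (9)":           ("abcdefghi", False),
         "empty string":           ("",          False),
         "invalid characters":     ("bad*name",  False),
     }

     for label, (name, expected) in PARTITIONS.items():
         assert is_valid_filename(name) == expected, label
     print("one representative per partition is enough")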

  16. Data Testing
     • Boundary conditions
       • The first element of an array
     • Sub-boundary conditions
       • Powers of two, the ASCII table
     • Default, Empty, Blank, Null, Zero, and None
     • Invalid, Wrong, Incorrect, and Garbage Data
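     A sketch of data testing against a hypothetical percentage field (valid range 0-100): hit the boundaries, just outside them, and the usual empty/none/garbage values.

     def set_percentage(value):
         if not isinstance(value, int) or not 0 <= value <= 100:
             raise ValueError(f"invalid percentage: {value!r}")
         return value

     boundary_values = [0, 1, 99, 100]             # on and just inside the boundaries
     outside_values  = [-1, 101]                    # just outside the boundaries
     garbage_values  = [None, "", "fifty", 2**31]   # default, none, and garbage data

     for v in boundary_values:
         assert set_percentage(v) == v
     for v in outside_values + garbage_values:
         try:
             set_percentage(v)
             raise AssertionError(f"{v!r} was accepted but should be rejected")
         except ValueError:
             pass
     print("boundary and garbage data handled")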

  17. Program State Testing
     • Testing the software's logic flow
       • Create a state transition map (see the sketch below)
       • Reduce the number of states and transitions to test
     • Testing the states to fail
       • Race conditions and bad timing
       • Repetition, stress, and load
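     A minimal sketch of program state testing: model a hypothetical media player as a state transition map, walk every legal transition, and probe a transition the map does not allow.

     TRANSITIONS = {
         ("stopped", "play"):  "playing",
         ("playing", "pause"): "paused",
         ("playing", "stop"):  "stopped",
         ("paused",  "play"):  "playing",
         ("paused",  "stop"):  "stopped",
     }

     def next_state(state: str, event: str) -> str:
         try:
             return TRANSITIONS[(state, event)]
         except KeyError:
             raise ValueError(f"illegal transition: {event!r} while {state}")

     # Test every legal transition exactly once ...
     for (state, event), expected in TRANSITIONS.items():
         assert next_state(state, event) == expected

     # ... and test the states to fail with a move the map does not allow.
     try:
         next_state("stopped", "pause")
         raise AssertionError("illegal transition was accepted")
     except ValueError:
         print("state transition map holds")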

  18. Lessons Learned: Testing Techniques
     • People-based techniques focus on who does the testing. (L. 49)
       • Alpha testing, beta testing, eat your own dogfood, …
     • Coverage-based techniques focus on what gets tested. (L. 50)
     • Problems-based techniques focus on why you're testing (the risks you're testing for). (L. 51)
     • Activity-based techniques focus on how you test. (L. 52)
       • Regression testing, exploratory testing, …
     • Evaluation-based techniques focus on how to tell whether the test passed or failed.
       • Comparison with saved results, oracle-based testing, … (see the sketch below)
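     A sketch of an evaluation-based technique: compare the program's output with a saved ("golden") result captured from an earlier, trusted run. The report function and golden text are hypothetical.

     def render_report(total: int, items: int) -> str:
         return f"Items: {items}\nTotal: {total}\n"

     GOLDEN = "Items: 3\nTotal: 42\n"    # captured from a previously verified run

     actual = render_report(total=42, items=3)
     if actual != GOLDEN:
         raise AssertionError(f"output drifted from saved result:\n{actual}")
     print("matches saved result")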

  19. Applying Your Testing Techniques
     • Configuration Testing
     • Compatibility Testing
     • Foreign-Language Testing
     • Usability Testing
     • Testing the Documentation
     • Testing for Software Security
     • Website Testing
