
Testing Overview




  1. Testing Overview CS 4311 Hans Van Vliet, Software Engineering: Principles and Practice, 3rd edition, John Wiley & Sons, 2008. Chapter 13.

  2. Outline • V&V • Definitions of V&V terms • V&V and software lifecycle • Sample techniques • Testing • Basics of testing • Levels of software testing • Sample testing techniques

  3. Verification and Validation (V&V) • The textbook uses the term “testing” in a general, wider sense to mean V&V • Q: What is V&V in software testing? • Groups of 2 • What? Why? Who? Against what? When? How? • 5 minutes

  4. What is V&V? • The terms are used differently by different people, e.g., formal vs. informal and static vs. dynamic • Verification • Evaluation of an object to demonstrate that it meets its specification. (Did we build the system right?) • Evaluation of the work product of a development phase to determine whether the product satisfies the conditions imposed at the start of the phase. • Validation • Evaluation of an object to demonstrate that it meets the customer’s expectations. (Did we build the right system?)

  5. V&V and Software Lifecycle • Throughout software lifecycle, e.g., V-model

  6. Requirement Engineering • Determine general test strategy/plan (techniques, criteria, team) • Test requirements specification • Completeness • Consistency • Feasibility (functional, performance requirements) • Testability (specific; unambiguous; quantitative; traceable) • Generate acceptance/validation testing data

  7. Design • Determine system and integration test strategy • Assess/test the design • Completeness • Consistency • Handling scenarios • Traceability (to and from) • Design walkthrough, inspection

  8. Implementation and Maintenance • Implementation • Determine unit test strategy • Techniques (static vs. dynamic) • Tools, and bells and whistles (driver/harness, stub; see the sketch below) • Maintenance • Determine regression test strategy • Documentation maintenance (vital!)
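
To make the driver/stub vocabulary concrete, here is a minimal sketch in Java. All names (`WeatherService`, `WeatherStub`, `ReportGenerator`) are hypothetical: the stub stands in for a dependency that is unavailable or not yet built, and the driver exercises the unit and checks its output. In practice a framework such as JUnit would play the driver role.

```java
// Hypothetical dependency of the unit under test, not yet implemented for real.
interface WeatherService {
    int currentTempCelsius(String city);
}

// Stub: stands in for the real service and returns a canned answer.
class WeatherStub implements WeatherService {
    public int currentTempCelsius(String city) {
        return 21; // fixed response, enough to exercise the unit under test
    }
}

// Unit under test.
class ReportGenerator {
    private final WeatherService service;
    ReportGenerator(WeatherService service) { this.service = service; }
    String report(String city) {
        return city + ": " + service.currentTempCelsius(city) + "C";
    }
}

// Driver (harness): sets up the unit, feeds it input, checks the output.
public class ReportGeneratorDriver {
    public static void main(String[] args) {
        ReportGenerator gen = new ReportGenerator(new WeatherStub());
        String actual = gen.report("El Paso");
        System.out.println("El Paso: 21C".equals(actual) ? "PASS" : "FAIL: " + actual);
    }
}
```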

  9. Hierarchy of V&V Techniques • Dynamic techniques • Testing (in the narrow sense) • Symbolic execution • Static techniques • Formal analysis: static analysis, model checking, proofs • Informal analysis: walkthrough, inspection, reading • The techniques are complementary

  10. Definitions of V&V Terms • “Correct” program and specification • Program matches its specification • Specification matches the client’s intent • Error (a.k.a. mistake) • A human activity that leads to the creation of a fault • A human error results in a fault which may, at runtime, result in a failure • Fault (a.k.a. bug) • The physical manifestation of an error that may result in a failure • A discrepancy between what something should contain (in order for failure to be impossible) and what it does contain • Failure (a.k.a. symptom, problem, incident) • Observable misbehavior • Actual output does not match the expected output • Can only happen when a thing is being used
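
To see why fault and failure are kept apart, consider a small hypothetical Java sketch: the fault (an overflowing midpoint computation) is always present in the code, but a failure is observed only when the method runs on inputs that trigger it.

```java
public class FaultVsFailure {
    // Fault: (lo + hi) overflows int when the sum exceeds Integer.MAX_VALUE.
    // A correct version would be lo + (hi - lo) / 2.
    static int midpoint(int lo, int hi) {
        return (lo + hi) / 2;
    }

    public static void main(String[] args) {
        // The fault is present, but no failure: typical inputs never trigger it.
        System.out.println(midpoint(2, 6)); // prints 4, as expected

        // Failure: actual output differs from the expected 2000000000.
        System.out.println(midpoint(2_000_000_000, 2_000_000_000)); // large negative number
    }
}
```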

  11. Definitions • Fault identification and correction • Process of determining what fault caused a failure • Process of changing a system to remove a fault • Debugging • The act of finding and fixing program errors • Testing • The act of designing, debugging, and executing tests • Test case and test set • A particular set of input and the expected output • A finite set of test cases working together with the same purpose • Test oracle • Any means used to predict the outcome of a test
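
These definitions map directly onto code. In the minimal sketch below (the `square` function and the record are invented for illustration), each test case pairs an input with its expected output, the cases together form a test set, and the comparison against the precomputed expected value acts as the oracle.

```java
import java.util.List;

public class TestSetDemo {
    // A test case: a particular input paired with its expected output.
    record TestCase(int input, int expected) {}

    // Function under test (invented for illustration).
    static int square(int x) { return x * x; }

    public static void main(String[] args) {
        // A test set: a finite set of test cases with a shared purpose.
        List<TestCase> testSet =
            List.of(new TestCase(0, 0), new TestCase(3, 9), new TestCase(-4, 16));
        for (TestCase tc : testSet) {
            int actual = square(tc.input());
            // The oracle: here, a precomputed expected value to compare against.
            String verdict = (actual == tc.expected()) ? "pass" : "FAIL";
            System.out.printf("square(%d) = %d [%s]%n", tc.input(), actual, verdict);
        }
    }
}
```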

  12. Where Do the Errors Come From? • Q: What kinds of errors? Who? • Groups of 2 • 3 minutes

  13. Where Do the Errors Come From? • Kinds of errors • Missing information • Wrong information/design/implementation • Extra information • Facts about errors • To err is human (but different people have different error rates). • Different studies indicate 30 to 85 errors per 1000 lines of code; after extensive testing, 0.5 to 3 errors per 1000 lines remain. • The longer an error goes undetected, the more costly it is to correct.

  14. Types of Faults • List all the types and causes of faults: what can go wrong in the development process? • In groups of 2 • 3 minutes

  15. Sample Types of Faults • Algorithmic: the algorithm or logic does not produce the proper output for a given input • Syntax: improper use of language constructs • Computation (precision): a formula’s implementation is wrong, or the result is not computed to the required degree of accuracy • Documentation: the documentation does not match what the program does • Stress (overload): data structures are filled past capacity • Capacity: the system’s performance becomes unacceptable as activity reaches its specified limit • Timing: the code coordinating events is inadequate • Throughput: the system does not perform at the required speed • Recovery: the system does not behave correctly when a failure is encountered (two of these fault types are sketched in code below)
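
Two of these fault types are easy to demonstrate in a few lines of Java (both methods are hypothetical examples): an algorithmic fault, here an off-by-one loop bound, and a computation (precision) fault, here accumulated floating-point error.

```java
public class SampleFaults {
    // Algorithmic fault: off-by-one loop bound skips the last element.
    static int sum(int[] xs) {
        int total = 0;
        for (int i = 0; i < xs.length - 1; i++) { // should be i < xs.length
            total += xs[i];
        }
        return total;
    }

    // Computation (precision) fault: repeated float addition drifts from the exact value.
    static float tenthsTotal(int n) {
        float total = 0f;
        for (int i = 0; i < n; i++) {
            total += 0.1f; // 0.1 has no exact binary representation
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sum(new int[] {1, 2, 3})); // prints 3, expected 6
        System.out.println(tenthsTotal(1_000_000));   // far from the exact 100000.0
    }
}
```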

  16. Sample Causes of Faults • Requirements: incorrect or missing requirements • System design: incorrect translation of the requirements • Program design: incorrect design specification • Program implementation: incorrect design interpretation; incorrect documentation; incorrect semantics • Unit and system testing: incomplete testing; new faults introduced while correcting others

  17. Sample V&V Techniques • Applied across the lifecycle: requirements, design, implementation, operation, maintenance • Reviews: walkthroughs/inspections • Synthesis • Testing • Runtime monitoring • Model checking • Correctness proofs

  18. Outline • V&V • Definitions of V&V terms • V&V and software lifecycle • Sample techniques • Testing • Basics of testing • Levels of software testing • Sample testing techniques

  19. Question • How do you know your software works correctly?

  20. Question • How do you know your software works correctly? • Answer: Try it. • Example: I have a function, say f, of one integer input. I tried f(6). It returned 35. • Is my program correct? • Groups of 2 • 1 minute

  21. Question • How do you know your software works correctly? • Answer: Try it. • Example: I have a function, say f, of one integer input. I tried f(6). It returned 35. • My function is supposed to compute x*6-1. Is it correct? (See the sketch below.) • Groups of 2 • 1 minute
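
A sketch of why this single test is inconclusive: the two hypothetical implementations below disagree almost everywhere, yet both return 35 for the one input tried, since 6*6-1 = 35 as well.

```java
public class OneTestIsNotEnough {
    static int fIntended(int x) { return x * 6 - 1; } // the specified x*6-1
    static int fFaulty(int x)   { return x * x - 1; } // a wrong implementation

    public static void main(String[] args) {
        // Both implementations pass the single test f(6) == 35 ...
        System.out.println(fIntended(6) + " " + fFaulty(6)); // 35 35
        // ... but they disagree on almost every other input.
        System.out.println(fIntended(7) + " " + fFaulty(7)); // 41 48
    }
}
```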

  22. Goals of Testing • I want to show that my program is correct; i.e., it produces the right answer for every input. • Q: Can we write tests to show this? • Groups of 2 • 1 minute

  23. Goals of Testing Can we prove a program is correct by testing? Yes, if we can test it exhaustively: every combination of inputs in every environment.

  24. How Long Will It Take? • Consider X+Y for 32-bit integers. How many test cases are required? How long will it take? • 1 test per second: • 1,000 tests per second: • 1,000,000 tests per second: • Groups of 2 • 1 minute

  25. How Long? • Consider X+Y for 32-bit integers. How many test cases are required? 2^32 * 2^32 = 2^64 ≈ 1.8 * 10^19 (the universe is about 4 * 10^17 seconds old) • How long will it take? • 1 test per second: 580,000,000,000 years • 1,000 tests per second: 580,000,000 years • 1,000,000 tests per second: 580,000 years
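
The arithmetic on this slide is easy to reproduce; here is a small Java sketch using BigInteger (2^64 does not fit in a long):

```java
import java.math.BigInteger;

public class ExhaustiveAddition {
    public static void main(String[] args) {
        // Every pair of 32-bit operands: 2^32 * 2^32 = 2^64 test cases.
        BigInteger cases = BigInteger.TWO.pow(64);
        BigInteger secondsPerYear = BigInteger.valueOf(60L * 60 * 24 * 365);
        for (long testsPerSecond : new long[] {1L, 1_000L, 1_000_000L}) {
            BigInteger years =
                cases.divide(BigInteger.valueOf(testsPerSecond)).divide(secondsPerYear);
            System.out.printf("%,d tests/second -> about %,d years%n", testsPerSecond, years);
        }
    }
}
```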

  26. Another Example • A loop returns to A (figure: control-flow graph with nodes A, B, C). • We want to count the number of paths. • The maximum number of iterations of the loop is 20. How many paths are there?

  27. Another Example • Suppose the loop does not repeat: a single pass executes one of 5 distinct paths.

  28. Another Example • Suppose the loop repeats exactly once: 5*5 = 25 distinct paths. • If it repeats at most once: 5 + 5*5 = 30 paths.

  29. Another Example • What if it repeats exactly n times? 5^n paths.

  30. Another Example • What if it repeats at most n times? 5^n + 5^(n-1) + … + 5 paths. • For n = 20 that is about 1.2 * 10^14 paths: nearly 4 years at 1,000,000 tests per second (checked in the sketch below).
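
The path count and the time estimate can be verified with a few lines of Java (a sketch; the sum fits comfortably in a long):

```java
public class PathCount {
    public static void main(String[] args) {
        // Paths through at most 20 iterations of a loop whose body has 5 paths:
        // 5^1 + 5^2 + ... + 5^20.
        long paths = 0, power = 1;
        for (int i = 1; i <= 20; i++) {
            power *= 5;
            paths += power;
        }
        System.out.printf("paths = %,d%n", paths); // about 1.2 * 10^14

        // Time to execute every path at 1,000,000 tests per second.
        double years = (paths / 1_000_000.0) / (60.0 * 60 * 24 * 365);
        System.out.printf("about %.1f years%n", years); // roughly 3.8 years
    }
}
```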

  31. Yet Another Example • Consider testing a Java compiler. • How many test cases would be needed to cover every possible input, i.e., every Java program?

  32. Limits of Testing • You can’t test a program completely: • You can’t test all valid inputs. • You can’t test all invalid inputs. • You can’t test all edited inputs. • You can’t test in every environment. • You can’t test all variations on timing. • You can’t even test every path. (A path is the sequence of statements executed from start to finish.)

  33. Why Bother? • Testing cannot show the absence of faults. • But it can show their presence!

  34. Goals of Testing • Identify errors • Make errors repeatable (when do they occur?) • Localize errors (where are they?) • The purpose of testing is to find problems in programs so they can be fixed.

  35. Cost of Testing • Testing accounts for between 30% and 90% of the total cost of software. • Microsoft employs one tester for each developer. • We want to reduce the cost • Increase test efficiency: #defects found/test • Reduce the number of tests • Find more defects • How? • Organize!

  36. A Good Test: • Has a reasonable probability of catching an error • Is not redundant • Is neither too simple nor too complex • Is a success if it reveals a problem • Is a failure if it doesn’t reveal a problem

  37. Outline • V&V • Definitions of V&V terms • V&V and software lifecycle • Sample techniques • Testing • Basics of testing • Levels of software testing • Sample testing techniques

  38. Levels of Software Testing • Unit/Component testing • Integration testing • System testing • Acceptance testing • Installation testing

  39. Levels of Software Testing • Unit/Component testing • Verify implementation of each software element • Trace each test to detailed design • Integration testing • System testing • Acceptance testing • Installation testing

  40. Levels of Software Testing • Unit/Component testing • Integration testing • Combine software units and test until the entire system has been integrated • Trace each test to high-level design • System testing • Acceptance testing • Installation testing

  41. Levels of Software Testing • Unit/Component testing • Integration testing • System testing • Test integration of hardware and software • Ensure software as a complete entity complies with operational requirements • Trace test to system requirements • Acceptance testing • Installation testing

  42. Levels of Software Testing • Unit/Component testing • Integration testing • System testing • Acceptance testing • Determine whether test results satisfy the acceptance criteria of the project stakeholders • Trace each test to stakeholder requirements • Installation testing

  43. Levels of Software Testing • Unit/Component testing • Integration testing • System testing • Acceptance testing • Installation testing • Perform testing with application installed on its target platform

  44. Testing Phases: V-Model • Development (down the left-hand side): Requirements specification → System specification → System design → Detailed design → Unit code and test • Test plans drawn up on the way down: acceptance test plan (from the requirements specification), system integration test plan (from the system specification), sub-system integration test plan (from the system design) • Testing (up the right-hand side): Sub-system integration test → System integration test → Acceptance test → Service

  45. Hierarchy of Testing • Testing: ad hoc, program testing, system testing, acceptance testing • Program testing • Unit testing • Black box: equivalence, boundary, decision table, state transition, use case, domain analysis • White box: control flow, data flow • Integration testing: top down, bottom up, big bang, sandwich • System testing • Function • Properties: performance, reliability, availability, security, usability, documentation, portability, capacity • Acceptance testing: benchmark, pilot, alpha, beta • (Two black-box techniques are sketched below.)
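
Several of the black-box techniques named above translate directly into small test sets. Here is a sketch for a hypothetical `grade` function (scores 0..100, pass at 60): equivalence partitioning picks one representative per class, and boundary-value analysis probes the edges of each class.

```java
public class BlackBoxDemo {
    // Hypothetical unit under test: scores 0..100, pass at 60 or above.
    static String grade(int score) {
        if (score < 0 || score > 100) {
            throw new IllegalArgumentException("score out of range: " + score);
        }
        return score >= 60 ? "pass" : "fail";
    }

    public static void main(String[] args) {
        // Equivalence partitioning: one representative from each valid class.
        System.out.println("30 -> " + grade(30));  // class [0..59]   -> fail
        System.out.println("80 -> " + grade(80));  // class [60..100] -> pass

        // Boundary-value analysis: probe the edges of each class.
        for (int score : new int[] {0, 59, 60, 100}) {
            System.out.println(score + " -> " + grade(score));
        }

        // Edges of the invalid classes should be rejected.
        for (int score : new int[] {-1, 101}) {
            try {
                grade(score);
            } catch (IllegalArgumentException e) {
                System.out.println(score + " rejected as expected");
            }
        }
    }
}
```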
