
Ruthless Testing



Presentation Transcript


  1. Ruthless Testing
     CSSE 514 Programming Methods
     4/17/01

  2. Overview
     • Test Early, Test Often, Test Automatically
     • What to Test
       • Unit Testing
       • Integration Testing
       • Validation and Verification
       • Resource Exhaustion, Errors and Recovery
       • Performance Testing
       • Usability Testing
     • How to Test
       • Regression Testing
       • Test Data
       • Exercising GUI Systems
       • Testing the Tests
       • Testing Thoroughly
     • When to Test
     • Final Note
     Reference: Andrew Hunt and David Thomas, The Pragmatic Programmer, Addison-Wesley, 2000

  3. Test Early, Test Often, Test Automatically
     • Most developers hate testing
     • This often leads to avoiding the difficult areas of our code
     • Unit tests are likely to catch the smaller bugs, while integration tests often catch larger defects
     • Testing should start as soon as we have code
     • Automated tests that run with every build have a greater chance of success (see the sketch below)
     • The earlier in the development cycle a bug is found, the cheaper it is to fix
     • Projects that take testing seriously often have more test code than production code
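As a concrete illustration, here is a minimal sketch of a test that could run automatically with every build. It assumes the JUnit 3.x style that was current when this lecture was given; the Account class is hypothetical, included only so the example is self-contained.

```java
import junit.framework.TestCase;

// Hypothetical class under test, included to make the sketch self-contained.
class Account {
    private int balance;
    Account(int opening) { balance = opening; }
    int getBalance() { return balance; }
    void deposit(int amount) { balance += amount; }
    void withdraw(int amount) {
        if (amount > balance)
            throw new IllegalArgumentException("overdraw");
        balance -= amount;
    }
}

public class AccountTest extends TestCase {
    // One small behavior per test method keeps failures easy to localize.
    public void testDepositIncreasesBalance() {
        Account account = new Account(100);
        account.deposit(50);
        assertEquals(150, account.getBalance());
    }

    public void testWithdrawCannotOverdraw() {
        Account account = new Account(100);
        try {
            account.withdraw(200);
            fail("expected overdraw to be rejected");
        } catch (IllegalArgumentException expected) {
            // the guard fired, as it should
        }
    }
}
```

Because such tests require no human attention, a build script can run them after every compile and fail the build on any red bar.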

  4. What to Test
     There are several major types of software testing that you need to perform:
     • Unit testing
     • Integration testing
     • Validation and verification
     • Resource exhaustion, errors, and recovery
     • Performance testing
     • Usability testing

  5. Unit Testing
     • A unit test is code that exercises a module
     • Unit testing is a critical part of both refactoring and XP
     • Some principles of unit testing include:
       • Code a little, test a little, code a little, test a little...
       • Run your tests as often as possible, at least as often as you run the compiler.
       • Run all the tests in the system at least once per day (or night).
       • Begin by writing tests for the areas of code that you're most worried about breaking (see the sketch below).
       • Write tests that have the highest possible return on your testing investment.
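A small sketch of "test the code you're most worried about": leap-year logic is a classic source of off-by-one mistakes, so the tests target the century cases rather than the easy ones. The LeapYear class is hypothetical; JUnit 3.x style is assumed.

```java
import junit.framework.TestCase;

class LeapYear {
    static boolean isLeap(int year) {
        return (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
    }
}

public class LeapYearTest extends TestCase {
    public void testTypicalLeapYear()   { assertTrue(LeapYear.isLeap(1996)); }
    public void testTypicalCommonYear() { assertFalse(LeapYear.isLeap(1999)); }

    // The worrying cases: century years are the part most often coded wrong,
    // so they yield the highest return on the testing investment.
    public void testCenturyNotLeap()    { assertFalse(LeapYear.isLeap(1900)); }
    public void testFourCenturyIsLeap() { assertTrue(LeapYear.isLeap(2000)); }
}
```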

  6. Unit Testing
     • Principles of unit testing (cont.)
       • When you need to add new functionality to the system, write the tests first.
       • If you find yourself debugging using System.out.println(), write a test case instead (see the sketch below).
       • When a bug is reported, write a test case to expose the bug.
       • The next time someone asks you for help debugging, help them write a test.
       • Don't deliver software that doesn't pass all of its tests.
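A sketch of turning println() debugging into a test: instead of printing intermediate values and eyeballing them, record the expectation as an assertion that runs forever after. The WordCounter class and its bug report are hypothetical, included to make the example self-contained.

```java
import junit.framework.TestCase;

class WordCounter {
    static int count(String text) {
        String trimmed = text.trim();
        if (trimmed.length() == 0) return 0; // fix exposed by the bug report
        return trimmed.split("\\s+").length;
    }
}

public class WordCounterTest extends TestCase {
    // Before: System.out.println(WordCounter.count("  two words  "));
    // After: the same check, automated and repeatable.
    public void testLeadingAndTrailingSpaces() {
        assertEquals(2, WordCounter.count("  two words  "));
    }

    // Written when a (hypothetical) bug was reported: empty input once
    // returned 1, not 0. The test exposed the bug and now keeps it fixed.
    public void testEmptyStringHasNoWords() {
        assertEquals(0, WordCounter.count(""));
    }
}
```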

  7. Integration Testing
     • Involves showing that the major subsystems work together
     • Having good contracts between subsystems assists in the detection of integration issues (see the sketch below)
     • Often the single largest source of bugs in a system
     • Can be considered an extension of unit testing, but on a larger scale
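A sketch of an integration test over a contract: two hypothetical subsystems, OrderService and InventoryStore, are exercised together rather than in isolation, and the test checks the agreement between them (every shipped order must decrement stock, and rejected orders must leave stock untouched). JUnit 3.x style is assumed.

```java
import junit.framework.TestCase;

class InventoryStore {
    private int unitsInStock;
    InventoryStore(int units) { unitsInStock = units; }
    int available() { return unitsInStock; }
    void remove(int units) {
        if (units > unitsInStock)
            throw new IllegalStateException("contract violated: oversold");
        unitsInStock -= units;
    }
}

class OrderService {
    private final InventoryStore store;
    OrderService(InventoryStore store) { this.store = store; }
    boolean ship(int units) {
        if (store.available() < units) return false; // reject, don't oversell
        store.remove(units);
        return true;
    }
}

public class OrderInventoryIntegrationTest extends TestCase {
    public void testShippingDecrementsStock() {
        InventoryStore store = new InventoryStore(10);
        OrderService orders = new OrderService(store);
        assertTrue(orders.ship(4));
        assertEquals(6, store.available());
    }

    public void testCannotShipMoreThanStock() {
        InventoryStore store = new InventoryStore(2);
        OrderService orders = new OrderService(store);
        assertFalse(orders.ship(5));
        assertEquals(2, store.available()); // stock untouched on rejection
    }
}
```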

  8. Validation and Verification
     • Compares what the users say they want with what they actually need
     • Occurs once you have an executable in place
     • Asks the question: does this system meet the functional requirements specified?
     • Is concerned less with counting defects than with confirming the functional requirements
     • This is where we discover how the developers' test cases compare with end users' actual access patterns

  9. Resource Exhaustion, Errors and Recovery
     • Need to discover how the system performs under real-world conditions (see the sketch below)
     • Some of the limits your code may encounter include:
       • Memory
       • Disk space
       • CPU bandwidth
       • Wall-clock time
       • Disk bandwidth
       • Network bandwidth
       • Color palette
       • Video resolution
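One way to probe exhaustion behavior without actually filling a disk is to substitute a resource that runs out on cue. This sketch (all class names hypothetical) writes a logger against java.io.OutputStream, then hands it a stream that "fills up" after a few bytes and verifies the logger degrades gracefully instead of crashing.

```java
import java.io.IOException;
import java.io.OutputStream;
import junit.framework.TestCase;

// Simulates a full disk: accepts a fixed budget of bytes, then throws.
class FullDiskStream extends OutputStream {
    private int remaining;
    FullDiskStream(int capacity) { remaining = capacity; }
    public void write(int b) throws IOException {
        if (remaining == 0) throw new IOException("disk full");
        remaining--;
    }
}

class Logger {
    private final OutputStream out;
    private boolean failed = false;
    Logger(OutputStream out) { this.out = out; }
    boolean failed() { return failed; }
    void log(String message) {
        try {
            out.write(message.getBytes());
        } catch (IOException e) {
            failed = true; // note the failure rather than crash the app
        }
    }
}

public class ResourceExhaustionTest extends TestCase {
    public void testLoggerSurvivesFullDisk() {
        Logger logger = new Logger(new FullDiskStream(4));
        logger.log("first");          // five bytes exceed the 4-byte budget
        assertTrue(logger.failed());  // failure was noticed, not fatal
    }
}
```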

  10. Performance Testing
      • Also known as stress testing
      • Put software under a load and see how it reacts (see the sketch below)
      • Measure against performance requirements:
        • Transaction rates
        • Number of simultaneous users
        • Number of connections
        • Scalability
        • Other non-functional properties
      • Specialized testing hardware and/or software can simulate realistic loads
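A minimal load-test sketch, not a substitute for dedicated stress-testing tools: several threads hammer a hypothetical transaction handler concurrently, and the harness reports the achieved throughput for comparison against a stated performance requirement.

```java
public class LoadTest {
    // Stand-in for the system under test.
    static void handleTransaction() {
        // real work would go here
    }

    public static void main(String[] args) throws InterruptedException {
        final int threads = 10;
        final int transactionsPerThread = 10000;
        Thread[] workers = new Thread[threads];

        long start = System.currentTimeMillis();
        for (int i = 0; i < threads; i++) {
            workers[i] = new Thread(new Runnable() {
                public void run() {
                    // each simulated user issues its share of transactions
                    for (int t = 0; t < transactionsPerThread; t++)
                        handleTransaction();
                }
            });
            workers[i].start();
        }
        for (int i = 0; i < threads; i++) workers[i].join();
        long elapsed = System.currentTimeMillis() - start;

        long total = (long) threads * transactionsPerThread;
        System.out.println(total + " transactions in " + elapsed + " ms");
        // a real harness would fail here if the rate misses the requirement
    }
}
```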

  11. Usability Testing
      • Performed with real users
      • Under real environmental conditions
      • Looks at human factors, including:
        • Were there any misunderstandings during requirements analysis that need addressing?
        • Is the software a natural extension of the users' workflow?
      • As with validation and verification, usability testing should be performed as early in the life cycle as possible
      • It may be necessary to bring in human factors specialists

  12. How to Test
      • Having looked at what to test, it is important to spend time on how to test, including:
        • Regression testing
        • Test data
        • Exercising GUI systems
        • Testing the tests
        • Testing thoroughly

  13. Regression Testing
      • A regression test compares the output of the current test run against previous, known-good values (see the sketch below)
      • Need to know that the bugs you fixed today don't break things that were working yesterday
      • All the tests mentioned previously can be run as regression tests
      • Regression tests can be run to verify performance, contracts, validity, etc.
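A sketch of the "known values" idea: the current output of a hypothetical report generator is compared verbatim against previously approved output, so any change in behavior, intended or not, shows up as a failure to be reviewed. JUnit 3.x style is assumed.

```java
import junit.framework.TestCase;

class ReportGenerator {
    static String monthlySummary(int sales, int returns) {
        return "sales=" + sales + " returns=" + returns
             + " net=" + (sales - returns);
    }
}

public class ReportRegressionTest extends TestCase {
    // The known value, captured when the output was last verified by hand.
    // In practice this often lives in a checked-in "golden" file.
    private static final String APPROVED = "sales=120 returns=5 net=115";

    public void testSummaryMatchesApprovedOutput() {
        assertEquals(APPROVED, ReportGenerator.monthlySummary(120, 5));
    }
}
```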

  14. Test Data
      • Real-world data
        • Comes from some actual source
        • Often collected from an existing system, a competitor's system, or a prototype
        • Represents typical user data
      • Synthetic data
        • Artificially generated, perhaps under certain statistical constraints (see the sketch below)
        • You may need synthetic data for any of the following reasons:
          • You need a lot of data
          • You need data to stress the boundary conditions
          • You need data that exhibits certain statistical properties
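A sketch of synthetic data generation: transaction amounts are fabricated under simple statistical constraints (a chosen mean and spread), with boundary cases added deliberately rather than hoped for. The random seed is fixed so the "random" data is reproducible from run to run; all numbers here are illustrative.

```java
import java.util.Random;

public class SyntheticData {
    public static void main(String[] args) {
        Random random = new Random(42); // fixed seed: reproducible test runs

        // Bulk data: amounts normally distributed around 100 with spread 15.
        for (int i = 0; i < 5; i++) {
            double amount = 100 + random.nextGaussian() * 15;
            System.out.println("txn " + i + ": " + amount);
        }

        // Boundary data, injected explicitly to stress the edge conditions.
        double[] edges = { 0.0, 0.01, -0.01, Double.MAX_VALUE };
        for (int i = 0; i < edges.length; i++)
            System.out.println("edge case: " + edges[i]);
    }
}
```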

  15. Exercising GUI Systems
      • Often requires specialized testing tools
      • These tools can range from simple event capture/playback models to specially written scripts
      • Less sophisticated tools often enforce a high degree of coupling between the version of the software being tested and the test script itself
      • Decoupled code leads to more modular testing (see the sketch below)
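A sketch of the decoupling point: if the decisions behind a dialog live in a plain class rather than in button handlers, they can be unit tested with no capture/playback tool at all, leaving only a thin GUI layer for script-driven testing. The LoginValidator class is hypothetical.

```java
import junit.framework.TestCase;

// All the decision logic, with no GUI imports whatsoever.
class LoginValidator {
    static String validate(String user, String password) {
        if (user == null || user.length() == 0) return "user name required";
        if (password == null || password.length() < 8) return "password too short";
        return null; // null means the input is acceptable
    }
}

public class LoginValidatorTest extends TestCase {
    public void testMissingUserIsRejected() {
        assertEquals("user name required", LoginValidator.validate("", "longenough"));
    }
    public void testShortPasswordIsRejected() {
        assertEquals("password too short", LoginValidator.validate("alice", "short"));
    }
    public void testGoodInputIsAccepted() {
        assertNull(LoginValidator.validate("alice", "longenough"));
    }
}
```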

  16. Testing the Tests
      • Test software is subject to the same imperfections as any other software
      • After writing a test to detect a particular bug, cause the bug deliberately and make sure the test complains (see the sketch below)
      • Think of using a project saboteur
      • When writing tests, make sure that alarms sound when they should
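A sketch of playing saboteur: plant the very bug the test is supposed to catch and check that the alarm actually sounds. The sabotage is controlled by a flag here only so the idea fits in one self-contained file; in practice you would simply edit the code under test, watch the test fail, then revert. All names are hypothetical.

```java
import junit.framework.TestCase;

class Discount {
    static boolean SABOTAGE = false; // flip to true to plant the bug

    static double apply(double price) {
        if (SABOTAGE) return price;  // the deliberate bug: no discount
        return price * 0.9;          // the intended 10% discount
    }
}

public class DiscountTest extends TestCase {
    // The real test: does the discount get applied?
    public void testTenPercentDiscountApplied() {
        assertEquals(90.0, Discount.apply(100.0), 0.0001);
    }

    // The saboteur's check: with the bug planted, the behavior the real
    // test guards must visibly go wrong.
    public void testAlarmSoundsWhenBugIsPresent() {
        Discount.SABOTAGE = true;
        try {
            assertFalse(Discount.apply(100.0) == 90.0);
        } finally {
            Discount.SABOTAGE = false; // leave the system clean afterwards
        }
    }
}
```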

  17. Testing Thoroughly
      • How do you know when you have tested the code base thoroughly enough?
      • Coverage analysis tools can help
      • These tools watch your code during testing and keep track of which lines of code have been executed and which haven't:
        • Metrowerks Code Coverage Tool
        • Rational PureCoverage
        • Software Research TCAT/Java
      • More important than hitting each line of code is exercising the many states your program may be in (see the sketch below)
      • Test state coverage, not code coverage
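A sketch of why line coverage is not state coverage: the first two tests below together execute every line of the hypothetical divide() method, yet the state b == 0 is never exercised; the report says 100% and the crash still ships. Only the state-minded third test finds it.

```java
import junit.framework.TestCase;

class Calculator {
    static int divide(int a, int b) {
        return a / b; // one line, trivially "covered"
    }
}

public class CoverageTest extends TestCase {
    // These two tests already yield 100% line coverage of divide()...
    public void testPositiveDivision() { assertEquals(2, Calculator.divide(6, 3)); }
    public void testNegativeDivision() { assertEquals(-2, Calculator.divide(-6, 3)); }

    // ...but only thinking in states reaches the interesting case.
    public void testDivisionByZeroIsRejected() {
        try {
            Calculator.divide(1, 0);
            fail("expected ArithmeticException");
        } catch (ArithmeticException expected) {
            // the previously untested state, now tested
        }
    }
}
```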

  18. When to Test
      • Testing is too important to leave to the last minute
      • Most testing should be done automatically
      • Test as frequently as possible
      • Some tests are harder to run on a frequent basis:
        • Stress tests
      • The tests that can't be automated should be on the schedule, with the necessary resources allocated to the task

  19. Final Note
      • Find Bugs Once
      • If a bug slips through the net, you need to add a new test to trap it next time
      • Once a human tester finds a bug, it should be the last time a human tester finds that bug
      • The automated tests should be modified to check for that particular bug from then on, every time, with no exceptions, no matter how trivial
