
Software Testing 101




  1. Software Testing 101 By Rick Clements cle@cypress.com

  2. Overview • Terminology • Requirements • Configuration control • Test plan • Test cases • Test procedures • Bug tracking & ship decision • After the testing

  3. Terminology QA vs. QC • Quality assurance (QA) - The management of quality. The encouragement of best practices. QA is about preventing defects. • Quality control (QC) - Validation and verification of the product. This includes reviews, inspections and testing.

  4. Terminology Test Documents • Test plan (or validation plan) - The plan describing the general approach, the people and equipment required, and the schedule for testing and other validation activities. • Test cases - Specific data used to test the product. • Test procedures - The procedures to follow in testing the product. These may be manual or automated.

  5. Terminology Types of Tests • Black box testing - Testing by looking at the requirements to develop test cases. System level testing is often black box testing. • White box testing - Testing by looking at the structure of the program to develop test cases. White box testing often occurs in unit testing.

  6. Terminology Levels of Testing • Unit testing - The process of running small parts of the software. The design teams often handle unit testing. • Integration testing or Interface testing - The testing of pre-tested modules and hardware to determine that they work together. • System testing - The testing of the entire product to see that it meets its requirements. This occurs after the product has been integrated.
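
As a concrete illustration of the unit level, the smallest tests exercise one small piece of the software in isolation. A minimal sketch in pytest style (parse_price is a hypothetical function invented for this example):

```python
# Unit test sketch: exercises one small function in isolation.
# parse_price() is hypothetical, used only for illustration.

def parse_price(text: str) -> float:
    """Convert a price string like '$1,234.50' to a float."""
    return float(text.replace("$", "").replace(",", ""))

def test_parse_price_strips_symbols():
    assert parse_price("$1,234.50") == 1234.50

def test_parse_price_plain_number():
    assert parse_price("99") == 99.0
```

Run with `pytest` on the containing file; each test passes or fails independently of the others.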

  7. Software Development Process • Define Requirements → Software Design → Software Implementation & Debug → Software Test → Software Release • In parallel: Test Design → Test Implementation & Debug, feeding into Software Test

  8. Requirements • Why are they important? • How are they documented? • What’s important? • What if you don’t have any requirements?

  9. Requirements Why Are Requirements Important? • How do the designers know what to build? • How can you test that it was built correctly? • When you disagree, who’s right?

  10. Requirements Documenting Requirements • Data sheet • Requirements specification • Functional specification • Data base • Traceability matrix
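
The last item, a traceability matrix, links each requirement to the test cases that cover it, which makes missing coverage visible. A minimal sketch, with hypothetical requirement and test-case IDs:

```python
# Traceability matrix sketch: requirement ID -> covering test cases.
# All IDs here are hypothetical examples.

traceability = {
    "REQ-001": ["TC-010", "TC-011"],  # covered by two test cases
    "REQ-002": ["TC-020"],
    "REQ-003": [],                    # no coverage yet
}

# Flag requirements that no test case exercises.
uncovered = [req for req, tests in traceability.items() if not tests]
if uncovered:
    print("Requirements with no test coverage:", ", ".join(uncovered))
```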

  11. Requirements What’s Important • They exist • They are unambiguous and testable • They cover all of the customers (not just the final customer) • They are under configuration control

  12. Requirements No Documented Requirements • Ask the project manager • Ask the marketing representative • Has anything been sent to the customer? • Ask a domain expert • What are the designers building? • Write them down

  13. Configuration Control • Why is it a testing issue? • What to track • Build & Version Number

  14. Configuration Control Why Is It A Testing Issue? • Ship the version that was actually tested • Diagnose why a single test system fails • Catch modules accidentally reverting to an older version • Re-create the software and tests later

  15. Configuration Control What To Track • Requirements • Software • Hardware • Tests

  16. Configuration Control Version & Build Number • Simple to generate • Unique for each build or change • Readily visible and validated for correctness
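
One way to satisfy all three criteria is to derive the identifier automatically at build time and expose it in the UI and logs. A sketch assuming a git-based build (the format is an illustration, not the author's scheme):

```python
# Build identifier sketch: unique per build, simple to generate,
# and easy to surface in an About box or log header.
# Assumes the build runs inside a git checkout.

import subprocess
from datetime import datetime, timezone

VERSION = "2.1.0"  # hypothetical release version, set by hand

def build_id() -> str:
    """Return a unique, automatically generated build identifier."""
    commit = subprocess.check_output(
        ["git", "rev-parse", "--short", "HEAD"], text=True).strip()
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M")
    return f"{VERSION}+{stamp}.{commit}"

print(build_id())  # e.g. 2.1.0+202406151230.1a2b3c4
```

Recording this identifier in every test report ties each result to the exact build under test, which supports "ship the version that was tested".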

  17. Test Plan • What will be tested and not tested? • What approach will be taken? • What resources are needed? • Which people are needed? • Schedule

  18. Test Cases • Test boundary conditions • System interfaces • Where have other errors been found? • What will the users do most often? • What is most serious if it fails? • Usability • What is unique to your product's environment?

  19. Test Cases Boundary Conditions • Values at and away from the boundaries • Valid and invalid values • Both numeric values and strings • Minimum-1, minimum, maximum, maximum+1, a good middle value
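
These values map naturally onto a table-driven test. A sketch using pytest's parametrize, for a hypothetical input field that accepts integers from 1 to 100:

```python
# Boundary-value cases: at and just beyond each boundary, plus a
# good middle value. accepts() stands in for the validation under test.

import pytest

def accepts(value: int) -> bool:
    """Hypothetical validation: the field accepts integers 1..100."""
    return 1 <= value <= 100

@pytest.mark.parametrize("value, expected", [
    (0,   False),  # minimum - 1: invalid
    (1,   True),   # minimum: valid
    (50,  True),   # good middle value
    (100, True),   # maximum: valid
    (101, False),  # maximum + 1: invalid
])
def test_boundary_values(value, expected):
    assert accepts(value) is expected
```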

  20. Test Cases Where Have Errors Been Found? • Errors tend to cluster • Can a common error be fixed? • Would a code review be useful?

  21. Test Cases Usability • Often overlooked • Testers are the first to see the software from outside the design team • An interface may only make sense if you already know the design • You need to know your users

  22. Test Cases Interfaces • User interface • Interfaces between software modules • Interfaces to other programs • Hardware / software interfaces

  23. Test Cases What Will Users Do Most Often? • Errors in frequently used features impact the user more • Test heavily used areas most thoroughly • Less-used areas still can’t be ignored

  24. Test Cases What Failures Are Most Serious? • Areas where data could be lost • Errors with a large financial impact • Errors that could injure someone

  25. Test Cases Unique to Web Applications • Different browsers and browser versions • Different operating systems • Server capacity • Multiple servers: what happens when one fails?

  26. Test Cases Unique to GUI-Based Applications • System configuration changes • Size of the screen • Number of colors available • Default font type and size

  27. Test Cases Unique to Database Applications • Compatible with existing data • Testing on a copy of real data • Server capacity

  28. Test Cases Unique to Embedded Applications • Can multiple simultaneous stimuli be handled? • Are hardware failures handled correctly? • Running out of supplies • Components breaking or degrading • Communications errors between components • Can temperature changes cause the system to behave differently?
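
Taking the communications-errors point as an example, embedded tests often need the product to tolerate a flaky link between components. A sketch of the kind of retry-plus-checksum handling such tests exercise (read_sensor and the one-byte checksum are assumptions for illustration):

```python
# Sketch: tolerate intermittent communications errors by validating a
# trailing checksum byte and retrying a few times before giving up.

import time

class CommsError(Exception):
    pass

def read_with_retry(read_sensor, retries: int = 3, delay_s: float = 0.1) -> bytes:
    """read_sensor() returns payload bytes plus a trailing checksum byte."""
    for _ in range(retries):
        try:
            frame = read_sensor()
            if sum(frame[:-1]) % 256 != frame[-1]:  # checksum mismatch
                raise CommsError("bad checksum")
            return frame[:-1]                       # payload only
        except CommsError:
            time.sleep(delay_s)                     # brief back-off, then retry
    raise CommsError(f"no valid frame after {retries} attempts")
```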

  29. Bug Tracking & Ship Decision • Bug states • Bug Information • Is the software ready?

  30. Bug Tracking & Ship Decision [slide diagram: bug life-cycle states]

  31. Bug Tracking & Ship Decision [slide table: information recorded for each bug]
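
The original slides drew the bug life cycle as a diagram. A sketch of one typical set of states and legal transitions (the exact states are an assumption, not necessarily those on the slide):

```python
# Bug life-cycle sketch: states and the transitions a tracker allows.
# The state names are typical, not taken from the original diagram.

from enum import Enum

class BugState(Enum):
    NEW = "new"
    OPEN = "open"
    FIXED = "fixed"
    VERIFIED = "verified"
    REJECTED = "rejected"
    CLOSED = "closed"

ALLOWED = {
    BugState.NEW:      {BugState.OPEN, BugState.REJECTED},
    BugState.OPEN:     {BugState.FIXED, BugState.REJECTED},
    BugState.FIXED:    {BugState.VERIFIED, BugState.OPEN},  # reopened if the fix fails
    BugState.VERIFIED: {BugState.CLOSED},
    BugState.REJECTED: {BugState.CLOSED},
    BugState.CLOSED:   set(),
}

def transition(current: BugState, new: BugState) -> BugState:
    """Apply a state change, rejecting transitions the life cycle forbids."""
    if new not in ALLOWED[current]:
        raise ValueError(f"illegal transition: {current.value} -> {new.value}")
    return new
```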

  32. After the Testing • Known problems and solutions • Defects not fixed • Shorten the customer service learning curve • Test report • Tuned to the audience • Introduction & conclusion • Major defects • What was run on which versions • What tests couldn’t be run
