
High Level: Generic Test Process (from chapter 6 of your text and earlier lesson)


Presentation Transcript


  1. High Level: Generic Test Process (from chapter 6 of your text and earlier lesson)
  [Flow diagram: Test Planning & Preparation → Test Execution → “Goals met?”; NO leads to Analysis & Follow-up and back to further testing; YES leads to EXIT]
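A minimal sketch of this loop in Python, assuming made-up stand-ins for each step (plan_and_prepare, execute_tests, goals_met, and analyze_and_follow_up are illustrative names, not from the text):

```python
import random

def plan_and_prepare():
    # test planning & preparation: set goals, strategy, test cases
    print("planning & preparation done")

def execute_tests():
    # stand-in for running the prepared test cases
    return {"failures": random.randint(0, 2)}

def goals_met(results):
    # exit criterion taken from the test plan, e.g. zero open failures
    return results["failures"] == 0

def analyze_and_follow_up(results):
    print(f"analysis & follow-up: {results['failures']} failure(s) to fix")

plan_and_prepare()
while True:                         # the loop in the diagram
    results = execute_tests()
    if goals_met(results):          # "Goals met?" -> YES
        break                       # EXIT
    analyze_and_follow_up(results)  # NO -> analysis & follow-up, re-test
```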

  2. Test Planning & Preparation (important but not always fully performed)
  Two important items in a test plan include:
  • Specifying test goals & targets in terms of:
    • Reliability
    • Performance
    • Customer satisfaction
    • etc.
  • Setting the general test strategy, which includes:
    • What & how much to test
    • Techniques of testing
    • Tools needed
    • Measurements and analysis (details of)
    • People & skills
    • Schedule
  Setting goals also sets the exit criteria & possibly guides the testing strategy.
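Purely as an illustration, the two test-plan items above can be captured as a simple data structure; every field name and value below is a made-up example, not prescribed by the text:

```python
# Hypothetical test plan record; goals double as exit criteria.
test_plan = {
    "goals": {
        "reliability": "failure rate < 0.1 per KLOC in system test",
        "performance": "95% of requests under 200 ms",
        "customer_satisfaction": "no open severity-1 defects",
    },
    "strategy": {
        "scope": "all changed components plus regression suite",
        "techniques": ["usage-based", "white-box", "mutation"],
        "tools": ["coverage analyzer", "test management tool"],
        "measurements": ["defect counts", "coverage"],
        "people_and_skills": ["2 experienced planners", "4 testers"],
        "schedule": "6 weeks",
    },
}
```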

  3. Testing model
  • Part of the test strategy is to decide on a testing model by:
    • Collecting information to address & choose a test model:
      • Usage-based testing (info on customers’ actual usage pattern)
      • White-box testing (guided by black-box testing results)
      • Mutation testing (programming language, past experience, etc.)
    • Analyzing and constructing the appropriate test model & techniques
    • Validating the model and incrementally improving it for continuous usage
  • From the test model, then develop test cases:
    • Static test cases: whole/partial program; inputs; etc.
    • Dynamic test runs: whole/partial execution; artificial stop points; etc.
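As one hedged example of usage-based testing, test targets can be drawn at random according to an operational profile of customers' actual usage; the operations and probabilities below are invented for illustration:

```python
import random

# Hypothetical operational profile: probabilities of customers' actual
# operations, as gathered from usage data. All numbers are made up.
operational_profile = {
    "login": 0.50,
    "search": 0.30,
    "checkout": 0.15,
    "admin_report": 0.05,
}

def pick_test_targets(n):
    # draw operations with probability proportional to real usage, so
    # heavily used operations receive proportionally more test cases
    ops = list(operational_profile)
    weights = list(operational_profile.values())
    return random.choices(ops, weights=weights, k=n)

print(pick_test_targets(10))
```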

  4. Sample 1: Model from “past” data of your group
  [Bar chart: percentage breakdown of problem discovery and fixes; problems detected across Reviews, Functional test, Component test, System test, and customer reported, with bars of 40%, 25%, 15%, 10%, and 10%]
  How would you use this for planning?

  5. Sample 2: Model from “past” data of your group
  [Bar chart: percentage breakdown of problem discovery and fixes; problems detected via test Technique 1, test Technique 2, test Technique 3, and customer reported, with bars of 50%, 30%, 10%, and 10%]
  How would you use this for planning?

  6. Sample 3: Distribution of Error-Prone Areas
  [Chart: distribution of Release 1 customer-reported bugs (problems detected) across functional areas F1–F5]
  How would you use this for planning Release 2 testing?
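One plausible way to use samples like these for planning, sketched under assumed numbers: allocate the next release's test effort in proportion to where problems were historically found. All figures below are illustrative stand-ins for your group's data:

```python
# Hypothetical historical discovery percentages for in-house stages;
# customer-reported problems are excluded since they are escapes,
# not test effort we can allocate.
past_discovery_pct = {
    "reviews": 40,
    "functional_test": 25,
    "component_test": 15,
    "system_test": 10,
}

budget_hours = 400  # assumed total test budget for the next release
total = sum(past_discovery_pct.values())

for stage, pct in past_discovery_pct.items():
    # spend more effort where history says problems are actually found
    print(f"{stage}: {budget_hours * pct / total:.0f} hours")
```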

  7. Test Suite and Test Procedure Preparation
  • Test case development is based on, besides the test model, a broader functional area such as a usage scenario, which may be composed of several steps --- thus several related test cases need to be prepared as a test suite.
  • Test suites need to be “managed” (with test management tools):
    • Categorized
    • Stored
    • Reused
  • Test procedure preparation (especially important for component & system tests):
    • Sequencing of test case execution: based on some relationship (safety hazards to avoid accidents, usage scenario, etc.) --- see the sketch below
    • Problem fixes between related test cases: ensure timely fixes for a problem found in the middle of a sequence of test cases or a test suite. Developers and testers need to be in sync --- no early release of developers.
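A minimal sketch of an explicitly sequenced test suite using Python's unittest, where execution order follows a usage scenario rather than alphabetical order; the scenario and test names are made up:

```python
import unittest

class OrderScenario(unittest.TestCase):
    # three related test cases forming one usage-scenario suite
    def test_1_create_order(self):
        self.assertTrue(True)   # placeholder for the real check

    def test_2_take_payment(self):
        self.assertTrue(True)

    def test_3_ship_order(self):
        self.assertTrue(True)

def scenario_suite():
    # explicit sequencing: cases run in scenario order
    suite = unittest.TestSuite()
    for name in ("test_1_create_order", "test_2_take_payment",
                 "test_3_ship_order"):
        suite.addTest(OrderScenario(name))
    return suite

if __name__ == "__main__":
    unittest.TextTestRunner(verbosity=2).run(scenario_suite())
```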

  8. Don’t Forget: Test Risks and Contingencies
  • Examples of risks:
    • Changes to original requirements or design
    • Lack of available and qualified personnel
    • Lack of (or late) delivery from the development organization:
      • Requirements
      • Design
      • Code
    • Late delivery of tools and other resources
  • State the contingencies if the risk item occurs:
    • Move the schedule
    • Do less testing
    • Change/relax the goals and exit criteria

  9. Sample: Recording Test Cases & Test Suites
  • Test case number (use some scheme, as your textbook shows)
  • Test case author
  • A general description of the test purpose
  • Pre-condition: includes test results of other prerequisite modules
  • Test inputs
  • Expected outputs (if any)
  • Post-condition
  • Test case history:
    • Test execution date
    • Test execution person
    • Test execution result(s): Pass / Fail
    • If failed: failure information; fix status
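As a hedged illustration, the recording scheme above maps naturally onto a small data structure; the field names mirror the slide and all values are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Execution:
    # one entry in the test case history
    date: str
    person: str
    result: str             # "pass" / "fail"
    failure_info: str = ""  # failure details and fix status, if failed

@dataclass
class TestCaseRecord:
    number: str             # per your numbering scheme, e.g. "TC-PAY-003"
    author: str
    purpose: str            # general description of the test purpose
    precondition: str       # incl. test results of prerequisite modules
    inputs: list
    expected_outputs: list
    postcondition: str
    history: list = field(default_factory=list)

tc = TestCaseRecord("TC-001", "a.tester", "verify valid login",
                    "user account exists", ["user", "pw"],
                    ["welcome page"], "session is open")
tc.history.append(Execution("2024-01-15", "a.tester", "pass"))
print(tc.number, tc.history[-1].result)
```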

  10. Test Execution
  • Allocating (actually a matter of both preparation & execution):
    • Test time
    • People resources, and
    • Tools (test tools and test systems)
  • Execution:
    • Load the system to be tested
    • Invoke and run the test cases (input data as needed)
    • Collect test result data and measurements
  • Results analysis:
    • Identify failure/success
    • Collect & record failure information
    • Submit failed test cases to development (coder/designer)
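A minimal sketch of the execute-and-collect steps with Python's unittest; the smoke test itself is a placeholder standing in for loading and exercising the real system:

```python
import unittest

class SmokeTest(unittest.TestCase):
    def test_startup(self):
        self.assertEqual(1 + 1, 2)   # stand-in for a real system check

loader = unittest.TestLoader()
suite = loader.loadTestsFromTestCase(SmokeTest)
result = unittest.TextTestRunner().run(suite)

# collect result data and measurements for the analysis step
print("run:", result.testsRun,
      "failures:", len(result.failures),   # submit these to development
      "errors:", len(result.errors))
```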

  11. Test Result Checking & Reporting
  • Basically comparing against the expected test result as stated in the test case description.
  • Some specific behavior or sequence may require one to refer back to the requirements document or even the user (especially if the test case description is not complete).
  • “Good” problem descriptions can help in:
    • Clearer understanding of the problem
    • Problem diagnosis and locating the problem
    • Possibly even the problem fix
  • Non-functional problems such as “consistency or looks” may require an experienced tester to analyze.
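As an illustration, a simple checker that compares actual against expected output and, on mismatch, emits a “good” problem description; the helper name and fields are assumptions, not from the text:

```python
def check_result(case_id, expected, actual, context=""):
    # compare against the expected result from the test case description
    if actual == expected:
        return f"{case_id}: PASS"
    # a problem description that aids understanding, diagnosis, and fix
    return (f"{case_id}: FAIL\n"
            f"  expected: {expected!r}\n"
            f"  actual:   {actual!r}\n"
            f"  context:  {context}")  # steps/data that help locate the fault

print(check_result("TC-007", "order confirmed", "order pending",
                   context="checkout with expired coupon, step 3"))
```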

  12. Collecting Test Measurements
  • In-project measurements:
    • Help us decide if we have reached the goal and can stop testing
    • Tell us what and how much to test for system regression
  • Post-project measurements help classify:
    • Defect types
    • Defect severity
    • Impact areas
    • etc.
  The measurements and analysis provide us with assessments of product reliability, test coverage, etc.
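A minimal sketch of post-project classification, tallying invented defect records by type, severity, and impact area:

```python
from collections import Counter

# Hypothetical defect records collected during testing; all made up.
defects = [
    {"type": "logic", "severity": 2, "area": "checkout"},
    {"type": "interface", "severity": 3, "area": "search"},
    {"type": "logic", "severity": 1, "area": "checkout"},
]

# classify along each dimension named on the slide
for dimension in ("type", "severity", "area"):
    print(dimension, Counter(d[dimension] for d in defects))
```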

  13. Test - Human Resources (QA Professional)
  • Test planning (experienced people)
  • Test scenario and test case design (experienced people, educated in the test discipline)
  • Test execution (semi-experienced to inexperienced)
  • Test result analysis (experienced people)
  • Test tool support (experienced people)
  • Test organization:
    • Vertical model, by product (small organizations)
    • Horizontal model (large enterprises)
  • May include external people:
    • Users
    • Industry experts (for COTS)

  14. “Clearing a Career Path for Software Testers” by E. J. Weyuker, T. J. Ostrand, J. Brophy and R. Prasad, IEEE Software, March/April 2000
  • Describes a software test professional development program at AT&T
  • Basic knowledge includes: computing systems and architecture, software development process, operating systems, databases & quality principles
  • Tester skills in:
    • Test planning
    • Test case design
    • Test execution
    • Test data adequacy analysis
    • Test automation
  • Career path:
    • Test Apprenticeship
    • Test Engineer (mastery of some skills)
    • Test Specialist (expert in some skill)
    • Test Leader
    • Test Manager or Enterprise Tester

  15. Test Automation
  • Test case generation – hardest area
  • Test plan and management – semi-automatic versions exist
  • Test execution – keystroke capture and replay has been popular for years
  • Test result analysis – some tools compare results against expected values to designate a simple pass/fail
  • Test coverage – lots of tools
  Check out some test tools, such as HP/Mercury and others, via an on-line search.
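As a small illustration of the compare-style result analysis mentioned above: run a command, capture its output, and compare against an expected value to designate a simple pass/fail. The command and expected string are assumptions for the sketch:

```python
import subprocess

def run_and_check(cmd, expected_output):
    # capture the command's stdout and compare against the expected value
    actual = subprocess.run(cmd, capture_output=True, text=True).stdout
    return "PASS" if actual == expected_output else "FAIL"

# hypothetical check; any real tool would diff richer outputs
print(run_and_check(["echo", "hello"], "hello\n"))
```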
