How to Create a Test Strategy




  1. How to Create a Test Strategy Paul Gerrard paul@gerrardconsulting.com Twitter: @paul_gerrard Web: gerrardconsulting.com

  2. Agenda • Your test strategy challenges? • What is test strategy? • Test strategy approach • Test axioms as thinking tools • Using the First Equation of Testing • Testing in staged projects • Goals, risks and designing the test process • Goals, risks and coverage-based test reporting • Communicating test strategies • Case study exercise • Your challenges revisited (if we have time). • NB some slides are hidden and won’t be presented today.

  3. Overview • This is a full-day course (or two days, customised for clients) • I won’t talk over every slide • We might not get through the whole Case Study exercise in detail • But you WILL get some practice through the Q&A and conversations about test strategy.

  4. What Are Your Test Strategy Challenges?

  5. What Are Your Test Strategy Challenges? Ever written a test strategy that no one read?

  6. What is Test Strategy? Dig a Hole? Test a system?

  7. Test strategy answers questions • Before you can plan a test, you need to have a lot of questions answered • The strategy… • Presents some decisions that can be made ahead of time • Defines the process or method or information that will allow decisions to be made (in project) • Sets out the principles (or process) to follow for uncertain situations or unplanned events.

  8. Test Strategy ‘evolution’ • Success-based: test to show it works • Defect-based: test to find bugs • Coverage-based: analyse requirements and code to achieve test coverage targets • Risk-based: use risk to focus testing and to inform risk assessment • Goal-based: use business goals and risk to focus testing and support decision-making. FOCUS: from programmer → tester → stakeholder

  9. Who are the stakeholders? • Those who focus on risk and business goals: • Sponsors, project stakeholders • Business Users • Project management • Those who focus on contractual aspects, stage payments etc: • Software suppliers • Contracts people.

  10. Who are the stakeholders? 2 • Those who focus on their risks and responsibilities: • Suppliers • Developers • System Test Team • UAT Team • Those who focus on meeting business and technical requirements: • Technical Architects • Operations • Technical Support.

  11. Test Strategy and Approach Strategy is a thought process not a document

  12. Test Strategy ToC from 1994
  1. Introduction: 1.1 Version History, 1.2 Purpose, 1.3 Scope, 1.4 Background, 1.5 Assumptions and Dependencies, 1.6 Summary
  2. Overview of User Testing: 2.1 Objectives, 2.2 User Testing and the Overall Project, 2.3 Functional Requirements Testing, 2.4 The Need for Technical Requirements Testing, 2.5 Starting Early - Front-loading
  3. User Test Policy: 3.1 Baseline for Testing, 3.2 Contract (3.2.1 Acceptance Criteria, 3.2.2 XXX and Supplier Responsibilities, 3.2.3 Testing and QA, 3.2.4 Quality Plan), 3.3 Testing Criteria, 3.4 Risk Criteria, 3.5 Starting Criteria, 3.6 Policy for Re-Testing, 3.7 Policy for Regression Testing, 3.8 Completion Criteria, 3.9 Handover to Production, 3.10 Documentation Plan (3.10.1 User Test Strategy, 3.10.2 Test Plan, 3.10.3 Test Log, 3.10.4 Incident Log, 3.10.5 Error Log, 3.10.6 Test Report)
  4. Functional Requirements Testing: 4.1 Approach, 4.2 Process, 4.3 Special Application Needs
  5. Technical Requirements Testing: 5.1 Usability Testing (5.1.1 Requirements, 5.1.2 Conducting Usability Tests), 5.2 Performance Testing (5.2.1 Requirements for Performance Testing, 5.2.2 Performance Test Cases), 5.3 Conversion Testing, 5.4 Security Testing (5.4.1 Security Tests, 5.4.2 Security Test Cases), 5.5 Documentation Testing, 5.6 Volume Testing, 5.7 Stress Testing, 5.8 Storage Testing, 5.9 Recovery Testing, 5.10 Installation Testing, 5.11 Reliability Testing, 5.12 Serviceability Testing, 5.13 Portability Testing, 5.14 Tests Not Required by the Users
  6. User Test Infrastructure: 6.1 Test Environment (6.1.1 Support, 6.1.2 Roles and Responsibilities, 6.1.3 Test Environment), 6.2 Tools and Automation (6.2.1 Comparators, 6.2.2 Test Data Generators, 6.2.3 Capture/Replay Tools, 6.2.4 Testing Information Systems, 6.2.5 Database Query and Maintenance Facilities, 6.2.6 Transaction Simulators)
  7. Schedule: 7.1 Milestone Plan, 7.2 Activities to be Resourced, 7.3 Skills Required
  8. User Test Execution: 8.1 Acceptance Test Procedure (8.1.1 Pre-Test Meeting, 8.1.2 During the Test, 8.1.3 Post-Test Meeting), 8.2 Software Delivery, 8.3 Testing to Plan, 8.4 Handling Failures, 8.5 Logging Tests, 8.6 Error Classification, 8.7 Controlling Releases of New Versions, 8.8 Regression Testing, 8.9 Documentation
  This is the table of contents of a 51-page document for an acceptance test of an outsourced development (written by me). A large safety-related system might have 150 pages of test strategy supported by 10-20 other risk-related documents. “Does size matter?”

  13. Contexts of Test Strategy [Diagram: “Test Strategy” at the centre, surrounded by its contexts: Axioms, Communication, Early Testing, Risks, De-Duplication, Opportunities, Goals, Automation, Culture, Contract, User involvement, Constraints, Human resource, Artefacts, Skills, Environment, Process (lack of?), Timescales]

  14. Introducing the Test Axioms

  15. Test Axioms • Formulated as a context-neutral set of rules for testing systems • They represent the critical thinking processes required to test any system • There are clear opportunities to advance the practice of testing using them • Tester’s Pocketbook: testers-pocketbook.com • Test Axioms website: test-axioms.com

  16. How can we use Test Axioms? • Test Axioms are not beginners’ guides • They can help you to think critically about testing • They expose flaws in other people’s thinking and their arguments about testing • They generate some useful by-products • They help you to separate context from values • Interesting research areas! • First Equation of Testing, Testing Uncertainty Principle, Quantum Theory, Relativity, Exclusion Principle... • You can tell I like physics

  17. The Axioms are thinking tools

  18. The Axioms [Diagram: the sixteen axioms: Stakeholder, Basis, Oracle, Fallibility, Scope, Value, Coverage, Never-Finished, Delivery, Good-Enough, Environment, Repeat-Test, Event, Design, Prioritisation, Sequencing]

  19. The three axiom groups [Slide lays the axioms out in three groups: Stakeholder, Value, Scope, Fallibility, Good-Enough | Delivery, Repeat-Test, Sequence, Environment, Event, Never-finished | Design, Basis, Coverage, Prioritisation, Oracle]

  20. Testing needs stakeholders (p64) Summary: Identify and engage the people or organisations that will use and benefit from the test evidence we are to provide Consequence if ignored or violated: There will be no mandate or authority for testing. Reports of passes, fails or enquiries have no audience. Questions: • Who are they? • Whose interests do they represent? • What evidence do they want? • What do they need it for? • When do they want it? • In what format? • How often?

  21. Test design is based on models (p68) Summary: Choose test models to derive tests that are meaningful to stakeholders. Recognise the models’ limitations and the assumptions that the models make Consequence if ignored or violated: Test design will be meaningless and not credible to stakeholders. Questions: • Are design models available to use as test models? Are they mandatory? • What test models could be used to derive tests from the Test Basis? • Which test models will be used? • Are test models to be documented or are they purely mental models? • What are the benefits of using these models? • What simplifying assumptions do these models make? • How will these models contribute to the delivery of evidence useful to the acceptance decision makers? • How will these models combine to provide sufficient evidence without excessive duplication? • How will the number of tests derived from models be bounded?
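
To make the "models" point concrete, here is a minimal sketch (not from the deck; the order workflow, its states and events are invented) of deriving tests from a state-transition model, one test per transition:

    # Minimal sketch (hypothetical model): deriving test cases from a
    # state-transition model, one test per transition.
    transitions = {
        # (state, event) -> next state, for an imaginary order workflow
        ("draft",    "submit"):  "pending",
        ("pending",  "approve"): "approved",
        ("pending",  "reject"):  "draft",
        ("approved", "ship"):    "shipped",
    }

    # Each transition becomes one test: start state, stimulus, expected state.
    tests = [
        {"given": state, "when": event, "then": target}
        for (state, event), target in transitions.items()
    ]

    for t in tests:
        print(f"GIVEN {t['given']:9} WHEN {t['when']:8} THEN {t['then']}")

Note the simplifying assumptions this model makes (it ignores data values, timing and concurrency), which is exactly what the axiom asks you to recognise and state.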

  22. Test execution requires a known, controlled environment (p77) Summary: Establish the need and requirements for an environment and test data to be used for testing, including a mechanism for managing changes to that environment – in good time. Consequence if ignored or violated: Environments are not available in time or are unsuitable for testing. This will delay testing or cause tests to be run in the wrong environment and undermine the credibility of evidence produced. Questions: • Who is responsible for the acquisition, configuration and support of test environments? • What assumptions regarding test environments do our test models make? • How will requirements for test environments be articulated, negotiated? • How will the validity and usability of test environments be assured? • How will changes to environments be managed, consistent with changes in requirements and other deliverables under test? • How will the state of environments, including backed up and restored versions, be managed?
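
One way to articulate and negotiate environment requirements is to declare them as data and check a candidate environment against the declaration before test execution starts. A minimal sketch, with invented attribute names:

    # Minimal sketch (attribute names invented): declaring what a test stage
    # needs from its environment and checking a candidate against it.
    required = {"os": "linux", "dbms": "postgres-14", "test_data": "anonymised-prod"}
    provided = {"os": "linux", "dbms": "postgres-13", "test_data": "anonymised-prod"}

    mismatches = {
        key: (want, provided.get(key))
        for key, want in required.items()
        if provided.get(key) != want
    }

    if mismatches:
        # Running in the wrong environment undermines the credibility of
        # the evidence, so report and stop rather than test regardless.
        for key, (want, got) in mismatches.items():
            print(f"environment mismatch: {key}: wanted {want}, got {got}")
    else:
        print("environment matches stage requirements")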

  23. Creating a Test Strategy

  24. First Equation of Testing [Diagram: Axioms + Context + Values + Thinking → Approach] • Not an equation in the mathematical sense • Need to consider three key aspects and do a lot of thinking

  25. One context, multiple approaches • Given a context, practitioners can promote different approaches based on their values • Values are preferences or beliefs • Pre-planned v exploratory • Predefined v custom process • Requirements-driven v goal-based • Standard documentation v face-to-face comms. • Some contexts preclude certain practices • “No best practices”

  26. Testing in Staged Projects The V-Model, W-Model and Goal-Based Approaches

  27. V-Model [Diagram: Requirements ↔ User Acceptance Test, Functional Specification ↔ System Test, Physical Design ↔ Integration Test, Program Specification ↔ Unit Test] • Is there ever a one-to-one relationship between baseline documents and testing? • Where is the static testing (reviews, inspections, static analysis etc.)?

  28. Work products • Project documents: • schedule, quality plan, test strategy, standards • Deliverables: • requirements, designs, specifications, user documentation, procedures • software: custom built or COTS components, sub-systems, systems, interfaces • infrastructure: hardware, O/S, network, DBMS • transition plans, conversion software, training...

  29. What do we mean by testing? • Testing is the process of evaluating the deliverables of a software project • detect faults so they can be removed • demonstrate products meet their requirements • gain confidence that products are ready for use • measure and reduce risk • Testing includes: • static tests: reviews, inspections etc. • dynamic tests: unit, system, acceptance tests etc.

  30. W-Model [Diagram: the left leg descends through development activities (Write Requirements, Specify System, Design System, Write Code); the right leg ascends through test stages (Unit Test, Integration Test, System Test, Acceptance Test), each paired with a static test of its baseline (Test the Design, Test the Specification, Test the Requirements), alongside the build path (Build Software, Build System, Install System)]

  31. W-Model and static testing [Diagram: the W-Model annotated with static techniques at each stage: Requirements Animation, Scenario Walkthroughs and Early Test Case Preparation against the requirements; Reviews and Inspections against the specification and design; Static Analysis and Inspection against the code]

  32. W-Model and dynamic testing [Diagram: the W-Model annotated with dynamic techniques at each stage: Path Testing at Unit Test; Equivalence Partitioning, Boundary Value Testing, Exploratory Testing and Security Testing around Integration Test; Performance Testing and Usability Testing at System Test; Business Integration Testing and System Integration Testing at Acceptance Test]
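
The pairing shown on the last three slides can be captured as a simple lookup and used as the skeleton of a strategy section. This sketch only encodes labels taken from the slides; the exact allocation of techniques to stages will vary by project:

    # The W-Model pairing from the slides: each development activity has
    # static tests (test the document) and dynamic tests (test the product).
    w_model = {
        "Write Requirements": {
            "static":  ["Requirements animation", "Scenario walkthroughs",
                        "Early test case preparation"],
            "dynamic": ["Acceptance Test"],
        },
        "Specify System": {
            "static":  ["Reviews", "Inspections"],
            "dynamic": ["System Test"],
        },
        "Design System": {
            "static":  ["Reviews", "Inspections"],
            "dynamic": ["Integration Test"],
        },
        "Write Code": {
            "static":  ["Static analysis", "Inspection"],
            "dynamic": ["Unit Test"],
        },
    }

    for activity, tests in w_model.items():
        print(f"{activity}: static={tests['static']}, dynamic={tests['dynamic']}")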

  33. It’s usually more complicated: a real high-level test plan [Diagram: a programme-managed test plan]

  34. Goals • The fundamental business objectives of the system(s) to be built, implemented and used • The benefits of undertaking the project • The payoff(s) that underpin and justify the project • Risks are what threaten the goals of a project.

  35. Goal-Based Test Strategy • The test strategy must set out how: • Achievements (the goals) of a project are evidenced or demonstrated • The risks that threaten goals will be explored, re-assessed and deemed acceptable (or not) • We need to understand the goals and how achievement will be measured • We need to understand risks (in particular, product risks) and how they are explored and exposed.

  36. A goal network (aka results chain or logic model) • Every project has a network of dependent interim and ultimate goals, threatened by risks • Your strategy will identify the test activities that will measure goal achievement and evidence these risks [Diagram: a network of GOAL nodes, each threatened by RISK nodes, converging on the ultimate business goal]
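
A goal network is easy to represent as data, and doing so allows a mechanical check that every risk threatening an in-scope goal is addressed by some test activity. A minimal sketch, with all goals, risks and activities invented:

    # Minimal sketch (all names invented): goals depend on interim goals and
    # are threatened by risks; the strategy check is that every risk on an
    # in-scope goal has at least one test activity against it.
    goals = {
        "reduce call-handling time": {"depends_on": ["agents adopt new UI"],
                                      "risks": ["UI too slow", "workflow gaps"]},
        "agents adopt new UI":       {"depends_on": [],
                                      "risks": ["poor usability"]},
    }

    test_activities = {
        "performance test": ["UI too slow"],
        "usability test":   ["poor usability"],
        # NB nothing yet addresses "workflow gaps"
    }

    covered = {risk for risks in test_activities.values() for risk in risks}
    for goal, info in goals.items():
        for risk in info["risks"]:
            status = "covered" if risk in covered else "NO TEST ACTIVITY"
            print(f"goal '{goal}': risk '{risk}': {status}")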

  37. Introduction to Risk

  38. The definition of risk • Italian dictionary: Risicare, “to dare” • Simple generic definition: • “The probability that undesirable events will occur” • In this tutorial, we will use this definition: “A risk threatens one or more of a project’s goals and has an uncertain probability”

  39. Three types of software risk • Project Risk: resource constraints, external interfaces, supplier relationships, contract restrictions (primarily a management responsibility) • Process Risk: variances in planning and estimation, shortfalls in staffing, failure to track progress, lack of quality assurance and configuration management (planning and the development process are the main issues here) • Product Risk: lack of requirements stability, complexity, design quality, coding quality, non-functional issues, test specifications • Testers are mainly concerned with Product Risk • Requirements risks are the most significant risks reported in risk assessments.
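
For product risks, one common convention (illustrative here, not mandated by the deck) is to score exposure as likelihood times impact and let the ranking focus test effort. A minimal sketch:

    # Minimal sketch: scoring product risks to focus testing.
    # Exposure = likelihood x impact is one common, illustrative convention.
    product_risks = [
        # (risk, likelihood 1-5, impact 1-5) - values are made up
        ("Requirements are unstable",        4, 5),
        ("Payment calculation is complex",   3, 5),
        ("Third-party interface may change", 3, 3),
        ("Report layout may be wrong",       2, 1),
    ]

    # Rank by exposure so the riskiest areas are tested first and deepest.
    ranked = sorted(product_risks, key=lambda r: r[1] * r[2], reverse=True)
    for risk, likelihood, impact in ranked:
        print(f"exposure={likelihood * impact:2}  {risk}")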

  40. Risk response planning • Do nothing! • Pre-emptive risk reduction measures: information buying, process model, risk influencing, contractual transfer [callout: this is where testing fits in] • Reactive risk reduction measures: contingency plans, insurance • “But this all sounds highly theoretical – we could never get this to work in my company!”

  41. Even penguins know how to manage risk!

  42. Goals and Risks and Designing the Test Process

  43. Test activities overlay the goal network (not all goals in scope) [Diagram: Test Phase/Activity boxes overlaid on the GOAL/RISK network]

  44. Risks, deliverables and test types [Diagram: deliverables (Requirements, HL Design, Tech Design, Prog. Spec., Code, Sub-System, System) linked via goals/risks to test types] • Walkthrough • Review • Inspect • Prototype • Early test preparation • Unit Test • Static analysis • Integration Test • System Test • Acceptance Test • Non-functional: Security, Performance, Usability, Backup/recovery, Failover/restart, Volume, Stress, etc.

  45. Risks, objectives and test stages [Diagram: each Goal/Risk is paired with a Test Objective and a Technique, and allocated to a test stage such as Sub-System Testing or System Testing]
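
Because each test traces back to a goal or risk, progress can be reported per risk rather than as raw test counts, which is the coverage-based test reporting the agenda mentions. A minimal sketch with invented risks and results:

    # Minimal sketch (risks and results invented): aggregating test results
    # by the goal/risk they provide evidence for.
    tests = [
        # (goal/risk, test stage, technique, status)
        ("payment errors", "System Testing",     "boundary values", "pass"),
        ("payment errors", "System Testing",     "boundary values", "fail"),
        ("slow checkout",  "System Testing",     "performance",     "pass"),
        ("data loss",      "Sub-System Testing", "recovery",        "not run"),
    ]

    by_risk = {}
    for risk, stage, technique, status in tests:
        by_risk.setdefault(risk, []).append(status)

    for risk, statuses in by_risk.items():
        verdict = "addressed" if all(s == "pass" for s in statuses) else "open"
        print(f"{risk}: {len(statuses)} tests, {verdict} {statuses}")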

  46. Testing Uncertainty • Planning relies on predictions of the future, but how can you predict test status at a future date? • The answer is … you can’t • The Testing Uncertainty Principle: • One can predict test status, but not when it will be achieved; • One can predict when a test will end, but not its status.

  47. Acceptance Criteria

  48. Acceptance Criteria • Represent the overall readiness to commit to going live, considering: • The readiness of the solution • The readiness of the business • The ability to implement (and roll back, if necessary) • The ability to live with the difficulties of early days • The ability to support the system in its early days • The need to be compliant • Here’s a generic but comprehensive set of Acceptance Criteria for a Large Programme.
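
The "overall readiness" framing implies that the acceptance decision aggregates several criteria. A minimal sketch (the criteria wording and RAG statuses are illustrative) of turning criterion statuses into a recommendation:

    # Minimal sketch (criteria illustrative): any red blocks go-live; amber
    # items need a documented workaround or explicit risk acceptance.
    criteria = {
        "solution ready (tests complete, open bugs acceptable)": "green",
        "business ready (users trained, procedures in place)":   "amber",
        "can implement and roll back if necessary":              "green",
        "early-life support in place":                           "green",
        "compliance obligations met":                            "green",
    }

    reds   = [c for c, s in criteria.items() if s == "red"]
    ambers = [c for c, s in criteria.items() if s == "amber"]

    if reds:
        print("NO-GO:", "; ".join(reds))
    elif ambers:
        print("GO with caveats:", "; ".join(ambers))
    else:
        print("GO")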

  49. Level 1 Criteria (example)

  50. Level 2 Criteria (example)
