PUTTING THEORY INTO PRACTICE: TEST ANALYST
A two-day workshop presented by Richard Ede


  1. PUTTING THEORY INTO PRACTICE: TEST ANALYST A two-day workshop presented by Richard Ede

  2. Putting Theory Into Practice: Test Analyst
Course sections and key objectives:
Day 1
Section 1: System development life cycle and introduction to the case study
  • Become familiar with the system under test
Section 2: The test process and test documentation
  • Review and understand the test strategy and test plan
Section 3: Low level test plan
  • Create low level test plan
Day 2
Section 4: Test analysis
  • Create test objectives list from specification documentation
Section 5: Test design
  • Create test scripts to meet test objectives
Section 6: Test schedule, test execution and test reporting

  3. Putting Theory Into Practice: Test Analyst
Assessment and Certificate of Competence
• The following documents are retained by the presenter for assessment
  • Low level test plan
  • Test objectives list
  • Test scripts
• The work of each candidate is judged by the QBIT board of assessors
• A certificate will be awarded to successful candidates
• A merit or distinction will be awarded for excellent performance
• Candidates are judged on quality, not quantity
• All material will be returned

  4. Presentation 1: System development life cycle and introduction to the case study
Contents
1.1 The system development life cycle
1.2 Activities defined in ISO 12207
1.3 The ‘V model’ and ISO 12207
1.4 Team structure
1.5 The team structure for the case study
1.6 Case study documentation
1.7 V, V & T and the 123 Review
Exercise 1: Case study documentation

  5. 1.1 The system development life cycle
ISO 12207: Software life cycle processes
• Framework covering the conceptualisation of ideas through to their retirement
• Also covers processes for control and improvement
• It is a scalable process
• Purpose
  • establish a common framework
  • establish well-defined terminology
  • define, control and improve the process
• Field of application
  • acquisition, supply, development and maintenance of software systems
  • the relationship of the acquirer and the supplier
  • 5 activities of the acquirer
  • 7 activities of the supplier
  • 13 activities of the developer

  6. 1.2 Activities defined in ISO 12207

  7. 1.3 The ‘V Model’ and ISO 12207
The V model, as defined by ISO 12207, pairs each specification activity (left side of the V) with the test activity that confirms it (right side):
• Specify business requirements ↔ Acceptance & completion (the acquirer)
• System requirements analysis ↔ System qualification testing (the supplier)
• System architectural design ↔ System integration
• Software requirements analysis ↔ Software qualification testing
• Software architectural design ↔ Software integration
• Software detailed design ↔ Software component testing
• Software coding sits at the base of the V

  8. 1.4 Team structure
Teams overlaid on the V model (acquirer and supplier sides as before):
• The users, including the user acceptance test team (specify business requirements, acceptance & completion; the acquirer)
• System qualification test team (system requirements analysis, system qualification testing; the supplier)
• System design team and software configuration team (system architectural design, system integration)
• Software qualification test team (software requirements analysis, software qualification testing)
• Software design team (software architectural design, software integration)
• Software development team, including the software test team (software detailed design, software coding, software component testing)
Which team will you be in for this case study?

  9. 1.5 Team structure for the case study
1.5.1 User acceptance test team
(Diagram: the user acceptance test team positioned against specify business requirements and acceptance & completion, drawing on the system requirements analysis and system architectural design; the system arrives via a test item transmittal report.)

  10. 1.5 Team structure for the case study
1.5.2 System qualification test team
(Diagram: the system qualification test team positioned against system requirements analysis and system qualification testing, drawing on the specify business requirements and system architectural design activities; the system arrives via a test item transmittal report.)

  11. 1.6 Case study documentation
1.6.1 Documentation delivered by activities of the acquirer and supplier of the system, required by the test teams:
• Business Requirements Specification
• System Requirements Specification
• System Architecture Specification
These documents feed the creation of the test material.

  12. 1.7 V, V & T and the 123 Review
1.7.1 Verification
• The process of evaluating a system or component to determine whether the products of the given development phase satisfy the conditions imposed at the start of that phase (IEEE 610)
• Is the product or document complete, consistent, unambiguous, accurate, compliant and to standard?
• Have we constructed the product or document correctly?

  13. 1.7 V, V & T and the 123 Review
1.7.2 Validation
• The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements (IEEE 610)
• Check the product or document against previous documentation, for instance the requirements, business analyses, risk assessment, quality requirements and system of change
• Have we built the right thing for its specific intended use?

  14. 1.7 V, V & T and the 123 Review
1.7.3 Testing (certification)
• The process of exercising software to verify that it satisfies specified requirements and to detect errors (BS 7925-1)
• Analysing and deriving tests from previously verified and validated documentation so that the product is certified to move on to the next stage of development
• Does it do what we said it will do?

  15. 1.7 V, V & T and the 123 Review
1.7.4 The 123 Review
A specification or design document passes through three review steps before test material is produced:
1. Verification: is it complete, consistent, unambiguous, accurate, compliant, to standard, testable and traceable?
2. Validation: check it against the business analysis, risk assessment, quality requirements and system of change.
3. Testing (certification): decide what we are going to test and when.
The outputs are the low level test plan, the test objectives list and the test script.

  16. Putting Theory Into Practice: Test Analyst
Summary of presentation 1
• ISO 12207 describes a software life cycle process
  • 5 activities of the acquirer
  • 7 activities of the supplier
  • 13 activities of the developer
• The ‘V Model’
• Team structure
• Specification documents required to create test material
• Verification, validation and testing and the 123 Review

  17. Exercise 1: Case study documentation
Objectives of exercise 1
• Become familiar with the system under test

  18. Presentation 2: The test process and test documentation
Contents
2.1 A Generic Test Process (GTP)
2.2 Test status reporting
2.3 The test strategy
2.4 The test plan
2.5 Test documentation
Exercise 2: The test strategy and the test plan

  19. 2.1 The Generic Test Process
2.1.1 ISO 12207 and the test process
• ISO 12207 tells us what to test and when to test
• ISO 12207 does not tell us how to test
• For this we need to define a test process: a Generic Test Process

  20. 2.1 The Generic Test Process (GTP)
2.1.2 The seven tasks of a Generic Test Process
1. TEST STATUS REPORTING
2. TEST STRATEGY
3. TEST PLAN
4. TEST ANALYSIS
5. TEST DESIGN
6. TEST SCHEDULING
7. TEST EXECUTION

  21. 2.2 Test status reporting
2.2.1 Test status reporting
• Establishes references to all test objects and deliverables
• Establishes the mechanism to enable traceability
• Provides management with the means for progress monitoring, an essential review that management requires
• Status reporting and traceability methods will be defined in the project strategy or test strategy

  22. 2.2 Test status reporting
2.2.2 Test status indicators, by GTP task:
1. TEST STATUS REPORTING: test reference assigned
2. TEST STRATEGY: business objectives identified; critical success indicators established; test strategy defined
3. TEST PLAN: test objects identified; test objectives set; objects and objectives prioritised
4. TEST ANALYSIS: attributes defined; criteria / conditions specified; criteria / conditions prioritised
5. TEST DESIGN: method identified; method created; data created; test procedure created
6. TEST SCHEDULING: environment set up; data assigned; resources allocated; pre-conditions met; test ready to run
7. TEST EXECUTION: re-test needed; regression test needed; accept with qualifications; accepted
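
The indicators above are labels that a test analyst updates per test reference. As an illustration only (not part of the course material), a minimal Python sketch of tracking one task's indicators might look like this; all names are hypothetical:

```python
from enum import Enum

# Hypothetical enum for the task 7 (test execution) indicators; the
# indicator wording comes from the slide, everything else is invented.
class ExecutionStatus(Enum):
    RETEST_NEEDED = "re-test needed"
    REGRESSION_TEST_NEEDED = "regression test needed"
    ACCEPTED_WITH_QUALIFICATIONS = "accept with qualifications"
    ACCEPTED = "accepted"

# Task 1 assigns a test reference; later tasks update its status.
status_by_reference: dict[str, ExecutionStatus] = {}
status_by_reference["TS.3"] = ExecutionStatus.RETEST_NEEDED
print(status_by_reference["TS.3"].value)  # re-test needed
```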

  23. 2.3 The test strategy
The test strategy defines:
• Scope and approach to the testing work
• Processes and activities to be undertaken
• Critical issues and risks to be contained
• Compliance to standards and company strategy, including audit and product quality requirements
• Commencement and completion criteria
• Deliverables
• Roles and responsibilities
• Test status reporting
  • procedures for management reporting
  • procedures for problem or change management

  24. 2.4 The test plan
The test plan defines:
• High level test objectives
• Test objects
• Test environment
• Tasks or activities needed to meet the test strategy and high-level test objectives
• Identification of analysis, design, schedule and execution work to be done
• Information required for scheduling: constraints, dependencies, critical paths, estimates of duration
• Functional and non-functional attributes to be tested, and which ones are not to be tested
• It must also be traceable back to the test strategy

  25. 2.5 Test documentation
Documents flowing through the seven GTP tasks (inputs: the project doc spec and the test items):
1. TEST STATUS REPORTING
2. TEST STRATEGY: test strategy document
3. TEST PLAN: high level test plan, plus one low level test plan per test item
4. TEST ANALYSIS: test objectives list
5. TEST DESIGN: test script
6. TEST SCHEDULING: test schedule; test item transmittal report
7. TEST EXECUTION: test log; test item transmittal report; test incident report; test summary report

  26. Putting Theory Into Practice: Test Analyst
Summary of presentation 2
• 7 tasks of the QBIT Generic Test Process
• Test status reporting
• Test strategy
• Test plan
• Test documentation

  27. Exercise 2: The test strategy and the test plan
Objectives of exercise 2
• Review and understand the test strategy
• Review and understand the test plan

  28. Putting Theory Into Practice: Test Analyst
(Recap: the GTP test documentation map from slide 25.)

  29. Presentation 3: The low level test plan
Contents
3.1 Low level test plan
3.2 Test status indicators
Exercise 3: The low level test plan

  30. 3.1 Low level test plan
3.1.1 Purpose of the low level test plan
• The test strategy and test plan are project-wide
• The low level test plan is specific to the current phase, or part of the current phase, of testing
The test strategy and high level test plan sit above the phase-specific low level plans: the software component test plan, software qualification test plan, system qualification test plan and user acceptance test plan.

  31. 3.1 Low level test plan
3.1.2 A low level test plan will contain information regarding:
• Test plan identifier
• References
• Introduction
• Test objects
• Deliverables
• Resources
• Constraints
• Test objectives
• Tasks to be undertaken
• A Gantt chart

  32. 3.2 Test status indicators
Test status reporting
• On completion of the low level test plan the test status indicators will be updated

  33. Putting Theory Into Practice: Test Analyst
Summary of presentation 3
• Low level test plan
• Test status indicators

  34. Exercise 3: The low level test plan
Objectives of exercise 3
• Create the low level test plan

  35. Putting Theory Into Practice: Test Analyst
(Recap: the GTP test documentation map from slide 25.)

  36. Presentation 4: Test analysis
Contents
4.1 The test objectives list
4.2 Traceability
4.3 Test status indicators
Exercise 4: The test objectives list

  37. 4.1 The test objectives list
4.1.1 Test objectives
• A measurable statement derived from documentation
• Might start with “To show that…”
• Must be prioritised
• The test objectives list describes the system under test from the tester’s point of view
• The test objectives list might be part of the contractual agreement and could be signed off by
  • the project manager
  • the project sponsor
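
As a sketch of what one entry in such a list might look like, a record could pair the measurable statement with its priority and references. The field names and the example statement below are assumptions for illustration, not the course template:

```python
from dataclasses import dataclass

# Illustrative record for one test objective; field names and the example
# statement are hypothetical, not taken from the case study documentation.
@dataclass
class TestObjective:
    reference: str    # e.g. "TO.1.2.3", traceable forward to test scripts
    source: str       # e.g. "SYR.3.2.2", traceable back to the specification
    statement: str    # a measurable statement, typically "To show that..."
    priority: int     # 1 = highest priority, driven by business risk

objective = TestObjective(
    reference="TO.1.2.3",
    source="SYR.3.2.2",
    statement="To show that an invalid input is rejected with a message",
    priority=1,
)
```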

  38. 4.1 The test objectives list
4.1.2 Describing the system
• Break higher level test objectives into lower level test objectives
• Sub-divide functional areas into sub-functions to create a hierarchy (sometimes called a functional decomposition)
• Each function or test objective has associated non-functional attributes; these might have to be dealt with separately
• The result is a complete description of the system under test
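
A functional decomposition can be sketched as a nested structure; the system and function names below are invented purely to illustrate the hierarchy:

```python
# Hypothetical functional decomposition: functional areas sub-divided into
# sub-functions; each leaf would get its own lower level test objectives.
decomposition = {
    "1 Order handling": {
        "1.1 Create order": {},
        "1.2 Amend order": {
            "1.2.1 Change quantity": {},
            "1.2.2 Cancel line item": {},
        },
    },
    "2 Reporting": {
        "2.1 Daily summary": {},
    },
}

def leaves(tree: dict) -> list[str]:
    """Collect the leaf functions, i.e. the lowest level of the hierarchy."""
    out: list[str] = []
    for name, sub in tree.items():
        out.extend(leaves(sub) if sub else [name])
    return out

print(leaves(decomposition))
```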

  39. 4.1 The test objectives list
4.1.3 Functional and non-functional attributes
• A function is what the system does
• A non-functional attribute is how well it does it
Examples of non-functional attributes: load, volume, stress, security, storage, recovery, reliability, sensitivity, usability, performance, installability, serviceability, documentation, maintainability, archivability, re-usability

  40. 4.2 Traceability
4.2.1 All items must be traceable
• Test plans to test strategy
• Test objectives to requirements
• Test scripts to test objectives
• Test results to test scripts
• Issues to
  • test results
  • test scripts
  • test objectives
  • requirements

  41. 4.2 Traceability
4.2.2 Referencing
• System Requirements Specification, Section 3.2.2 is referenced as SYR.3.2.2
• Test Objective no. 1.2.3 (“To show that…”) is referenced as TO.1.2.3 and traces back to SYR.3.2.2
• Test Script no. 3 is referenced as TS.3 and traces back to TO.1.2.3
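
A minimal sketch of resolving these references in code, assuming the prefix conventions above (SYR, TO, TS); the lookup tables are invented for illustration:

```python
# Illustrative traceability tables; in practice these live in the test
# documentation, not hard-coded in a script.
objective_to_spec = {"TO.1.2.3": "SYR.3.2.2"}   # test objective -> spec section
script_to_objectives = {"TS.3": ["TO.1.2.3"]}   # test script -> objectives met

def trace_script_to_spec(script_ref: str) -> list[str]:
    """Trace a test script back to the specification sections it covers."""
    return [objective_to_spec[obj] for obj in script_to_objectives[script_ref]]

print(trace_script_to_spec("TS.3"))  # ['SYR.3.2.2']
```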

  42. 4.3 Test status indicators
Test status reporting
• On completion of the test objectives list the test status indicators will be updated

  43. Putting Theory Into Practice: Test Analyst
Summary of presentation 4
• Test objectives list
  • functions and non-functional attributes
• Traceability
• Test status indicators

  44. Exercise 4: The test objectives list
Objectives of exercise 4
• Analyse the specification documentation to derive test objectives
• Organise test objectives into a coherent hierarchy
• Reference test objectives so that they are traceable back to documentation and forward to test scripts
• Prioritise test objectives based on business risk
• Produce a test objectives list to be signed off by the test manager and business analyst

  45. Putting Theory Into Practice: Test Analyst
(Recap: the GTP test documentation map from slide 25.)

  46. Presentation 5: Test design
Contents
5.1 The test script
5.2 Test design techniques:
  • Equivalence partitioning
  • Boundary value analysis
  • Classification trees for partition testing
  • State transition testing
5.3 Test status reporting
Exercise 5: The test script

  47. 5.1 The test script
5.1.1 A test script should contain:
• Test script identifier
• References
• Introduction
• Identification of the test object
• Platform
• Commencement criteria
• Completion criteria
• Test objectives to be met
• Test procedure (the test steps necessary to meet the objectives)
  • input
  • expected outcome

  48. 5.1 The test script
5.1.2 For test script execution, a test script should also contain:
• Date of script execution
• Test environment
• Test results (actual outcome)
• Issues raised
• References to all documentation generated as a result of running the test script
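
Taken together, 5.1.1 and 5.1.2 suggest a record along these lines. This is a sketch with assumed field names, not the course's script template:

```python
from dataclasses import dataclass, field

@dataclass
class TestStep:
    input: str                # what the tester does or enters
    expected_outcome: str     # derived from the specification (5.1.1)
    actual_outcome: str = ""  # recorded at execution time (5.1.2)

@dataclass
class TestScript:
    identifier: str            # e.g. "TS.3"
    objectives_met: list[str]  # e.g. ["TO.1.2.3"]
    commencement_criteria: str
    completion_criteria: str
    steps: list[TestStep] = field(default_factory=list)
    execution_date: str = ""   # filled in when the script is run
    issues_raised: list[str] = field(default_factory=list)
```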

  49. 5.2 Test design techniques
5.2.1 Formal techniques
• Test case techniques should be formally defined
  • either as defined in BS 7925-2
  • or defined in a similar way to BS 7925-2
• Some examples include:
  • equivalence partitioning
  • boundary value analysis
  • classification trees for partition testing
  • state transition testing

  50. 5.2 Test design techniques
5.2.2 Equivalence partitioning
• Input and output values of a component can be partitioned such that a single value can represent each partition
• That single value is considered equivalent to all others in that partition
• The partitions are derived from the specification
continues…
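
To make the technique concrete, here is a worked sketch for a hypothetical component that accepts an integer age in the range 18 to 65 inclusive. The component and its range are invented; the technique itself is as defined in BS 7925-2:

```python
def is_valid_age(age: int) -> bool:
    """Hypothetical component under test: accepts ages 18-65 inclusive."""
    return 18 <= age <= 65

# Three partitions derived from the specification, with one representative
# value per partition; any value in a partition is considered equivalent
# to the representative chosen here.
assert is_valid_age(40) is True    # valid partition: 18 <= age <= 65
assert is_valid_age(10) is False   # invalid partition: age < 18
assert is_valid_age(80) is False   # invalid partition: age > 65
```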
