
User Acceptance Testing The Hard Way


Presentation Transcript


  1. User Acceptance Testing The Hard Way Graham Thomas BCS SIGIST 10th May 1996

  2. CONTENTS • Background • Test Method • Test Environment • Test Execution • Implementation • Measures of Success • Lessons Learnt

  3. BACKGROUND • The Project • Project Structure • The Environment • Start Point

  4. The Project • Link 3 computer systems • Sales & Marketing • Registration • Billing • In 3 separate business areas • With 3 different product lifecycles • Supported by 3 competing suppliers

  5. The Environment

  6. Project Structure

  7. Start Point

  8. TEST METHOD • Method • Test Planning • Test Analysis • Test Scripting • Data Definition

  9. Method

  10. Test Planning • Plans • Pre-determined end date • Stress & volume testing required • Re-usable test environment to be built • Users want to see bills produced • Resources • 2 testers for 10 weeks • 1 strategist for 10 days

  11. Test Planning (2) • Proposed Strategy • Structured testing - driven by User Requirements Spec. • Involve User Representatives • Data Tidy & User Procedures to be in place for test execution • Build a regression test environment • Extra time required • Additional resource required

  12. Test Analysis • Requirements Spec • A technical document • Not understood by users • Not understood by testers • Technical Design Specs • Written by individual suppliers • Difficult to interpret without access to system design docs.

  13. Test Analysis (2) • Requirements Spec rewritten in English • 200+ requirements extracted • Workshopped business scenarios • Business scenarios reviewed by suppliers
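  The rewritten requirements and workshopped scenarios lend themselves to a simple traceability check. Purely as a hedged illustration (the identifiers, field names and uncovered() helper below are invented for this sketch, not part of the project), a few lines of Python can track which extracted requirements each business scenario exercises:

    # Hypothetical traceability sketch: each extracted requirement gets an id,
    # and each workshopped business scenario lists the requirement ids it exercises.
    from dataclasses import dataclass, field

    @dataclass
    class Requirement:
        req_id: str      # e.g. "UR-042"
        text: str        # plain-English restatement of the requirement

    @dataclass
    class BusinessScenario:
        name: str
        covers: list[str] = field(default_factory=list)  # requirement ids exercised

    def uncovered(requirements, scenarios):
        """Return requirement ids not yet exercised by any scenario."""
        covered = {rid for s in scenarios for rid in s.covers}
        return [r.req_id for r in requirements if r.req_id not in covered]

    # Illustrative data only: one sign-up scenario covering two requirements.
    reqs = [Requirement("UR-001", "A new sale creates a registration record"),
            Requirement("UR-002", "A registration generates a first bill")]
    scens = [BusinessScenario("New customer sign-up", covers=["UR-001", "UR-002"])]
    print(uncovered(reqs, scens))   # -> []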

  14. Test Scripting • Legacy systems lacked design documentation • Design documentation for enhancements not delivered • No one had knowledge of how all three systems would interface • Management only interested in the number of scripts, not their content

  15. Test Scripting (2) • Mgmt. view that the Test Team could not ‘cut the mustard’ • Suppliers’ view that only they could test their systems • Brought members of suppliers’ development teams on board • Suppliers not paid until completion of testing

  16. Data Definition • User Representatives limit their involvement to a review capacity • Pragmatic decisions taken to: • Generate test data from limited set supplied by User Reps. • Satisfy more than one requirement with a single script • Reported this as a risk through to the Project Board
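  One pragmatic way to stretch a limited seed set is to generate variants mechanically. The sketch below is an assumption-laden illustration (the field names, products and regions are invented), showing how a handful of records supplied by user representatives could be expanded so that a single script can exercise several requirements:

    # Hypothetical sketch: expand a small seed data set into variants by
    # permuting a few fields. All names and values here are illustrative only.
    import copy
    from itertools import product

    seed_customers = [
        {"name": "A. Smith", "product": "Standard", "region": "North"},
    ]

    def expand(seed, products=("Standard", "Premium"), regions=("North", "South")):
        """One record per product/region combination for each seed record."""
        generated = []
        for record in seed:
            for prod, region in product(products, regions):
                variant = copy.deepcopy(record)
                variant.update(product=prod, region=region)
                generated.append(variant)
        return generated

    test_data = expand(seed_customers)
    print(len(test_data), "records generated from", len(seed_customers), "seed record(s)")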

  17. TEST ENVIRONMENT • Determine requirements • Specify environment • Then Get Real ! • Complete copy of production data for all three systems • Beg, borrow and steal ! • ‘Virtual Environment’

  18. TEST EXECUTION • Problems • Problems, problems, problems . . . • Resource Requirements • Progress Monitoring

  19. Problems • Delayed by late delivery of Code • Incident Reporting System Required • Test Harness didn’t work • Project Board intervention required to bring User Reps. back ‘On Side’ and commit more of their time • Changes !
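  An incident reporting system had to be put in place during execution. As a rough sketch of the sort of record such a system tracks (the fields, severity scale and status workflow below are assumptions, not the project's actual scheme):

    # Hypothetical incident record for failures found during test execution.
    # Field names, severity scale and status workflow are assumptions.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Incident:
        incident_id: int
        raised_on: date
        system: str        # e.g. "Sales & Marketing", "Registration", "Billing"
        script: str        # test script that exposed the problem
        severity: int      # 1 = blocks testing ... 3 = cosmetic
        status: str = "open"   # open -> fixed -> retested -> closed

    # Illustrative data only.
    log = [Incident(1, date(1996, 1, 8), "Billing", "BILL-014", severity=1),
           Incident(2, date(1996, 1, 9), "Registration", "REG-003", severity=3)]
    blockers = [i for i in log if i.status == "open" and i.severity == 1]
    print(len(blockers), "open blocking incident(s)")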

  20. More Problems • Additional testers but no accommodation, hardware or software • Systems Integration found wanting • System not stable enough to benefit from automation tools • Short term planning !

  21. Resource Usage

  22. Progress Monitoring

  23. Progress Monitoring (2)
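  Progress monitoring of this kind typically compares scripts planned against scripts executed and passed, week by week. Purely as an illustration of the figures such a chart plots (all numbers invented), a minimal summary might look like:

    # Illustrative only: cumulative scripts planned, executed and passed per week.
    weekly_progress = [
        # (week, planned_to_date, executed_to_date, passed_to_date)
        (1, 20, 12, 9),
        (2, 40, 30, 24),
        (3, 60, 41, 33),
    ]

    for week, planned, executed, passed in weekly_progress:
        pass_rate = passed / executed if executed else 0.0
        print(f"Week {week}: {executed}/{planned} scripts run, "
              f"{pass_rate:.0%} of runs passing, {planned - executed} behind plan")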

  24. IMPLEMENTATION • Roll-out plan • Three days • Round the clock • Multi-site co-ordination • Power outage • Tape drive failure • Unforeseen system interaction

  25. MEASURES OF SUCCESS • Objectives met • Suppliers’ view • Users change operating practice • Structured releases • Everything tested first • Full documentation produced

  26. LESSONS LEARNT • Plan testing at project inception • Start testing early • Expect the worst • Gather metrics • Measure, Monitor & Manage • Be prepared to change • Testing is not Development contingency ! ! !
