
Operation Successful Patient Dead



Presentation Transcript


  1. Operation Successful Patient Dead Margaret Dineen, Encompass Testing

  2. Let’s Test Oz 2014 • Learning • Meeting new people • Growing • Sharing …

  3. Sharing … • I’d like to share an “Experience” with you • Experience = what you get when you don’t get what you want!

  4. Before we start … • Walkthrough of a project I worked on recently • I’d like you to think about times you’ve found yourself in a similar situation and how you handled things • There will be bribes – sorry, prizes – for sharing!

  5. Is this your experience? • Perfect, thoughtfully-written, complete requirements • Wonderful, supportive team – your manager bakes cakes for you • Set your own deadlines – why don’t you take twice that time for testing – just in case! • Daily group hugs • Saved the world by finding the right bugs at the right time • On time, under budget, no bugs in production • Customers LOVED the software • You got an award for being just so AWESOME

  6. It is? • REALLY???? • Can I get a job with you please?!

  7. Is the following closer to the truth? • Technical challenges • Interpersonal challenges • Questioned your sanity and your ability • Questioned everybody else’s sanity and their ability • Crazy stuff .. You want what? • Always prioritising and juggling based on limited time available

  8. Let’s talk about one such project • Financial organisation • Authentication device • Background: change of manufacturer based on high failure rate of previous devices • New device, minimal changes to firmware, application or back-end system

  9. Client mandate • We want you to run a basic test which we will define (register device, login, perform transaction, logout) • We want you to run the test across 48 operating system/anti-virus combinations • We will provide the provisioned devices • We will provide the test environment • You have 10 days to complete testing
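A test matrix like the one mandated above is typically the Cartesian product of the platform variables. The talk doesn't name the 48 combinations, so the sketch below uses hypothetical platform names (8 operating systems × 6 anti-virus products) purely to illustrate how such a matrix is enumerated:

```python
from itertools import product

# Hypothetical platform lists -- the talk only says "48 OS/anti-virus
# combinations"; these names are illustrative, not from the project.
operating_systems = [
    "Windows XP", "Windows Vista", "Windows 7 32-bit", "Windows 7 64-bit",
    "Windows 8", "macOS 10.8", "macOS 10.9", "Ubuntu 12.04",
]
antivirus_products = ["AV-A", "AV-B", "AV-C", "AV-D", "AV-E", "AV-F"]

# The basic client-defined test to run on every combination.
TEST_STEPS = ["register device", "login", "perform transaction", "logout"]

# Cartesian product: every OS paired with every AV product.
test_matrix = list(product(operating_systems, antivirus_products))
assert len(test_matrix) == 48  # 8 OSes x 6 AV products

for os_name, av_name in test_matrix:
    # In practice each entry would map to a provisioned VM snapshot;
    # here we just print the planned run.
    print(f"{os_name} + {av_name}: {', '.join(TEST_STEPS)}")
```

With 10 days for 48 combinations, this works out to roughly five platform runs per day, which is part of why the time-boxing bites so hard later in the story.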

  10. A bit of clarification… • Us: Who is going to provision these devices? Is provisioning part of the project scope? • Client: That’s outside of the test scope; it’s a simple process and it’s all under control • Us: OK, and another thing – you’re asking us to test physical devices on a virtual environment, which seems a bit risky. How do we know the environment will give the same results as the real production environment? • Client: We know everything. Don’t worry your head about this stuff, we’ve got it sorted!

  11. How hard could it be?

  12. Our approach • Client expectations set up-front ✔ • Risks identified up-front ✔ • Mission of testing clearly identified and agreed ✔ • Communication strategy agreed up-front ✔ • Hardware devices provided to us up front ✔ • Test environment provided to us up front ✔ • We ran the tests, we reported on the results ✔

  13. So ..

  14. Anything missing from the picture? • What did we miss? • What would you have done differently? • Any red flags yet? • SHARING TIME

  15. Why the odd title? • Actual feedback from the project manager at the end of the project • Yes – it did suck • BIG time!

  16. Outcome … • I screwed up • I misread signals, ignored red flags, crashed straight into icebergs and, from the client’s perspective, failed to provide value • As a tester, not a good place to be • I thought I was doing a great job, so I hadn’t expected such bad feedback

  17. What failure felt like • Dark • Self-doubt • Confidence-crushing • Numerous replays of situation and how I could have handled things differently • Why?

  18. Reality • Once project was under way, each team was so focused on their own component delivery that communication ground to a halt (visualise hamsters running on a little wheel) • Testing was seen as “will be managed later” • Risks were not managed; they were outside of our control; small project risks had huge impact on testing • Hardware devices were incorrectly provisioned but we didn’t know that! • Test environment was unstable and outside of our control • Back-end system was unstable and outside of our control

  19. Put on the brakes! • What were we thinking??? • Why were we thinking it?!

  20. Test headspace at start • Clean slate • Everything new, everything is awesome • Lots of questions; formed a conceptual picture of project • Clarity and understanding • Good communication; everyone has time to talk about doing things right • Relationships already starting to form • One big happy family working TOGETHER with a COMMON FOCUS

  21. Test headspace during project • Must meet objectives – must get the job done • Time’s running out; must work faster • Just meet the objectives and report on findings • No time to dig too much into detail • BUT …

  22. Getting the job done was based on the conceptual picture formed at the start of the project • Was “getting the job done” still the best place to focus our effort? What should we have done?

  23. Test headspace at end of project • It’s all bad • The sky is falling • The hardware is terrible • The software is shocking • Pull the plug .. Pull the plug

  24. Client perspective at end of project • Testers always cry wolf, silly testers • PM: project delivered successfully, almost on time and almost on budget • What a hero!

  25. Why such a discrepancy between stakeholder and tester? • What do you think? • Where do you think it started to go wrong? • SHARING TIME

  26. Where did it start to go wrong? • It started to go wrong when I didn’t listen to my gut • Evidence found during testing would have meant something completely different if • We had really MANAGED our risks • We had been more confident about our misgivings • Exercise was so tightly time-boxed there was little time to react

  27. The downhill slide • Not listening to my gut started a chain of events leading to failure on my part • Failure = • Σ(chain of little mishaps each of which could have been avoided) • No one catastrophic event • Once you’re on a downhill slide, the only thing you can really change is how quickly you get to the bottom

  28. Why did I miss the obvious? • What do you think? • SHARING TIME

  29. Some of the reasons for failure • Cognitive dissonance • Poor risk management • Complacency – lost the “context” out of “context driven” • I didn’t take enough time to really understand the problem, the steps required to solve it, or what was going on around me so that I could react appropriately

  30. Does this sound familiar? • Cognitive dissonance • Mental stress or discomfort experienced by an individual who holds two or more contradictory beliefs, ideas, or values at the same time, or is confronted by new information that conflicts with existing beliefs, ideas or values • Leon Festinger’s theory of cognitive dissonance – we strive for internal consistency. When dissonance is present, we try to reduce it and actively avoid information and situations which would likely increase the dissonance. • Let’s talk about aliens!

  31. It also affects Dilbert!

  32. So how does this relate to the project? • Initial conceptual model had: • Correctly provisioned devices • Stable environment • When bugs appeared, I couldn’t even see that these two areas might be the cause of the bugs – my mind was focused elsewhere • I was blind to the facts because my brain wanted to reduce the dissonance

  33. Poor risk management • Risks don’t just disappear all by themselves • They need someone to love them: • Recognise them • Own them • Have pre-defined action plans to manage them • We only did the first step because we thought someone else had them all under control.

  34. Complacency • I failed to remember that every project is different because the people dynamics are different • I can’t say I’m truly context-driven if I am regurgitating the same solution / process without first even recognising that this is a different problem to be solved • A half-baked attempt at CDT is easy but real CDT is hard!

  35. The impact of failure on me as a person • Questioned my ability as a tester • Questioned my desire to remain a tester • Wanted to run for the hills • I chose flight rather than fight • These feelings didn’t magically disappear overnight

  36. What I learned • Project direction and objectives may be set at the start of a project but testing must be able to anticipate and react to changes in circumstances during the project • When I feel “that’s a bit odd” I need to question • Communication strategies must be agile • I need to step back, defocus and question my beliefs – they may be wrong

  37. Attitude • If I’d had a different attitude, would I have gotten a different outcome? • Does anyone have any insights as to how they react in situations like this? • Are there any tools you use? • SHARING TIME

  38. Additions to my toolkit • Understand my scope of control • Regular defocussing; I need to look at all the evidence; not just one piece • I need to ask “Am I adding value” and not be afraid to deviate from “getting the job done” • I need to listen to my gut and acknowledge distress signals • I need to ask questions; they might not be as stupid as I think

  39. My enhanced toolkit (contd) • Self-evaluation / notebook of woe • How I feel during the project • Deviations from the plan or my conceptual model and why • How well is it going? • What can I do better? • Down-time • Gotchas • Every experience adds something to my tester’s toolset. The tools are always there for me to use, but I need to have the wisdom to look through my existing tools or to develop new ones depending on the CONTEXT.

  40. It’s true that • Failure provides an opportunity for both learning and growth • BUT • It’s not comfortable and it’s not easy. • Maybe it’s worth sharing our failures as often as we do our successes. It may help us cope better with the hard stuff and become better testers.

  41. And finally … • Thank you for listening to my tale of woe.

  42. Questions/Comments
