Synchronizing Software Testing with Agile Requirements Practices

Presentation Transcript


  1. Synchronizing Software Testing with Agile Requirements Practices Jean McAuliffe, Dean Leffingwell May 2005

  2. Background on Speakers • Jean McAuliffe: Product Manager, Rally Software Development. Two years of agile development; Certified Scrum Master. Former Senior QA Manager for Rational RequisitePro. Over 15 years of experience in all aspects of software development (defining, developing, testing, training, and supporting) for software development, bio-engineering, and aerospace companies. • Dean Leffingwell: Advisor and coach to a number of development-stage software businesses. Former Senior VP of Rational Software Corporation, where he was responsible for the commercial introduction of the Rational Unified Process. Lead author of Managing Software Requirements: Second Edition: A Use Case Approach, Addison Wesley, 2003.

  3. Abstract • Agile requirements practices generally defer commitment to artifacts such as requirements until the “last responsible moment.” This challenges the test team to stay continuously synchronized with the product owners and developers, since testers have little lead time, if any, between when requirements become available and when the test artifacts must be in place. • This presentation describes the organizational and technical challenges associated with just-in-time agile requirements practices, along with techniques test teams can use to address them.

  4. Agenda • Context for Agile Testing • Technical Challenges • Organizational Challenges • Keys for Success

  5. Popular Agile Methods

  6. Excerpts from the Agile Manifesto • http://agilemanifesto.org/principles.html • Our highest priority is to satisfy the customer through early and continuous delivery of valuable software. • Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage. • Working software is the primary measure of progress. • Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale. • Business people and developers must work together daily throughout the project. • Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.

  7. A Generalized Agile Release Process • [Diagram: a release backlog of features (Feature 1, Feature 2, Feature 3, …) feeds a sequence of iterations (Iteration 1, Iteration 2, Iteration 3, …). Each iteration pulls prioritized work items (“Do Feature 1,” “Do Feature 3a,” “Do Feature 4b,” …) from the backlog, while remaining features (Feature 8, Feature 9, Feature 10, …) wait in the backlog for later iterations.]

  8. Agile Iteration Cadence • [Diagram: requirements are refined continuously across iterations. In Iteration N-1, requirements receive initial elaboration along with their tests. Iteration N opens with detailed iteration planning and design; features are then developed in priority order (Priority 1 through Priority 5), each with automated tests and an acceptance step. The iteration closes with a demo and retrospective before Iteration N+1 begins.]

  9. What’s Different about Testing in Agile? • Just-in-Time Requirements Elaboration • No SRS-level waterfall documents to drive the test plan • Requirements and test cases developed in parallel, or a test-first strategy • More Frequent Iterations, More Frequent Releases • Testing needs to happen early and often • Frequent to continuous regression testing • High need to automate nearly everything • Everyone needs to test • Two Levels of Testing • Iteration vs. release testing patterns

  10. Technical Challenges • Requirements are changing fast. How does test keep up? • Test early and often. How exactly do we move testing forward? • We need to move off manual testing and into more automation. How does this happen? • Different kinds of testing need to happen at different times. How do these get managed?

  11. Requirements are Changing • [Diagram: across repeated “code & deliver” cycles, use case / software requirement (UC/SR) updates drive the test cycle: generate test cases, run them, fail and update the test cases as the requirements change, and finally pass and accept.]

  12. Requirements Changing is a Good Thing? • Probably the hardest agile principle for the team to embrace • The feature still needs to be elaborated ahead of time • There is minimal time for the team to review it before the iteration starts • Sometimes you have to rewrite • Bottom line: everyone collaborates to make the feature as useful for the customer as possible

  13. Requirements to Test Cases • Use case scenario tests make perfect acceptance tests • Use Case A: Scenario 1 maps to Test Case 1, Scenario 2 maps to Test Case 2 (a sketch follows below) • Declarative requirements that further refine the use case may be better suited to going directly to automation • Have one test case be the container for all of the automation results • All automated tests have to pass before the test case passes
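
As an illustration of the scenario-to-test-case mapping above, here is a minimal sketch in JUnit 3 style (the slides contain no code); the LoginService class, its methods, and the scenario details are assumptions made only for the example.

```java
import junit.framework.TestCase;

// Hypothetical system under test, included only so the sketch compiles.
class LoginService {
    private final java.util.Map<String, String> users = new java.util.HashMap<String, String>();
    void register(String user, String password) { users.put(user, password); }
    boolean logIn(String user, String password) {
        return password != null && password.equals(users.get(user));
    }
}

// Use Case A, Scenario 1 ("a registered user logs in with valid credentials")
// expressed as an executable acceptance test; Scenario 2 becomes a second test.
public class UseCaseAScenario1Test extends TestCase {
    public void testRegisteredUserCanLogIn() {
        LoginService service = new LoginService();
        service.register("pat", "s3cret");                  // scenario precondition
        assertTrue("valid credentials should be accepted",  // scenario postcondition
                   service.logIn("pat", "s3cret"));
    }

    public void testUnknownUserIsRejected() {
        assertFalse(new LoginService().logIn("nobody", "x"));
    }
}
```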

  14. Need to Test early and often • Need to test early in the Iteration – do not want mini-waterfalls • Need to test on check-in – Don’t break the build • Need to test nightly – Don’t wait for a Regression Iteration
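
One common way to support “test on check-in” and “test nightly” (the slides don’t prescribe a particular tool) is to keep two JUnit suites: a fast smoke suite the continuous build runs on every check-in, and a larger regression suite run nightly. The suite and test class names below are assumptions for the sketch; UseCaseAScenario1Test is the acceptance test sketched earlier.

```java
import junit.framework.Test;
import junit.framework.TestSuite;

// Fast suite the continuous build runs on every check-in ("don't break the build").
public class CheckInSmokeSuite {
    public static Test suite() {
        TestSuite suite = new TestSuite("check-in smoke tests");
        suite.addTestSuite(UseCaseAScenario1Test.class);
        // suite.addTestSuite(OtherFastTest.class);  // keep this suite minutes, not hours
        return suite;
    }
}

// Larger suite the nightly build runs, so regressions surface the next morning
// instead of waiting for a dedicated regression iteration.
class NightlyRegressionSuite {
    public static Test suite() {
        TestSuite suite = new TestSuite("nightly regression");
        suite.addTest(CheckInSmokeSuite.suite());   // nightly includes everything run on check-in
        // suite.addTestSuite(SlowerEndToEndTest.class);
        return suite;
    }
}
```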

  15. Mike Cohn’s Testing Pyramid • [Diagram: a three-layer pyramid of test types] • GUI acceptance tests (top): keep these to a small number • FitNesse tests (middle): automate many, and find the right ones • Unit tests (base): the largest numbers; they foster test-driven design
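
To make the FitNesse layer concrete: FitNesse tests are wiki tables backed by small fixture classes. The sketch below is an assumed example (no code appears in the slides); it uses the standard fit.ColumnFixture base class, and the shipping-cost rule and all names are invented for illustration.

```java
import fit.ColumnFixture;

// Fixture backing a FitNesse table such as:
//   !|ShippingCostFixture         |
//   |orderTotal|destination|cost()|
//   |25.00     |domestic   |4.95  |
//   |250.00    |domestic   |0.00  |
// FitNesse sets the public input fields from each row, then checks the value
// returned by cost() against the expected column.
public class ShippingCostFixture extends ColumnFixture {
    public double orderTotal;    // input column
    public String destination;   // input column

    public double cost() {       // calculated column checked by the table
        if ("domestic".equals(destination) && orderTotal >= 100.0) {
            return 0.0;          // free domestic shipping over $100 (invented rule)
        }
        return 4.95;
    }
}
```

The useful split is that testers write and maintain the table, while a developer supplies the fixture once.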

  16. Break the Manual Testing Paradigm • Manual GUI acceptance tests: easy to create; very familiar (what we always do); typically tedious; how do we know coverage? • Automated GUI tests: need automation specialists; automation is good for performance; seems like we always rewrite; sometimes fragile • Unit tests: what is Dev testing? how do we know what these are? how do we know when they fail?
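
As one way to move a GUI acceptance check off the manual script, here is a hedged sketch using Selenium RC as an example driver (the slide doesn’t name a tool); the application URL, element locators, and expected text are all invented.

```java
import junit.framework.TestCase;

import com.thoughtworks.selenium.DefaultSelenium;
import com.thoughtworks.selenium.Selenium;

// Drives a browser through the same steps a manual login check would follow.
// Assumes a Selenium RC server is running locally on port 4444 and the
// application under test is reachable at http://localhost:8080/.
public class LoginPageGuiTest extends TestCase {
    private Selenium selenium;

    protected void setUp() {
        selenium = new DefaultSelenium("localhost", 4444, "*firefox", "http://localhost:8080/");
        selenium.start();
    }

    public void testValidLoginShowsWelcomeMessage() {
        selenium.open("/login");
        selenium.type("id=username", "pat");
        selenium.type("id=password", "s3cret");
        selenium.click("id=submit");
        selenium.waitForPageToLoad("30000");
        assertTrue(selenium.isTextPresent("Welcome, pat"));
    }

    protected void tearDown() {
        selenium.stop();
    }
}
```

Tests like this sit in the “sometimes fragile” layer: keep them few, and push detail down into FitNesse and unit tests.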

  17. Manual Testing Conundrum • “You can never have too many manual acceptance tests” • Manual tests are like cute little bunnies: before you know it, you have hundreds or thousands in your regression suite • You inadvertently dig a hole you can never get out of • The whole team has to help run the regression suite • Defect counts are typically high • Most defects are found as the manual tests are elaborated • Regression runs typically don’t find many defects; the ones they do find are the things we didn’t think of

  18. Better, But Not Perfect Testing Architecture • [Diagram: the testing pyramid revisited, with layers for manual GUI acceptance tests, automated GUI tests & FitNesse, and unit tests] • Manual GUI acceptance tests: still too many here • Automated GUI tests & FitNesse: add FitNesse, increase coverage • Unit tests: increase capability

  19. Testing Types and Scheduling

  20. Keys to Overcome the Technical Challenges • Continuous builds • Nightly regression testing • Find a way to increase FitNesse testing at the application layer (http://fitnesse.org) • Make unit testing a priority • From found defects, create automated tests that go into the regression suite (a sketch follows below)
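
For the last point, a common pattern is to pin each fixed defect with a small automated test so the same bug cannot quietly return. The defect number, the DiscountCalculator class, and its behavior below are invented for illustration.

```java
import junit.framework.TestCase;

// Hypothetical code under test, included only so the sketch compiles.
// Defect #1234 was "discount applied when the cart is empty".
class DiscountCalculator {
    double discountFor(int itemCount, double subtotal) {
        if (itemCount == 0) {
            return 0.0;                               // the fix: no discount on an empty cart
        }
        return subtotal >= 100.0 ? subtotal * 0.10 : 0.0;
    }
}

// Regression test created from the found defect; it joins the nightly
// regression suite so the fix stays fixed.
public class Defect1234RegressionTest extends TestCase {
    public void testEmptyCartGetsNoDiscount() {
        assertEquals(0.0, new DiscountCalculator().discountFor(0, 0.0), 0.0001);
    }

    public void testLargeOrderStillGetsTenPercentDiscount() {
        assertEquals(15.0, new DiscountCalculator().discountFor(3, 150.0), 0.0001);
    }
}
```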

  21. Organizational Challenges • Dev as testers and testers as dev: how does that happen? • Resistance to change: how do we get the team to welcome and embrace changes and not feel threatened? • Testers are an integral part of the team: do we need to re-organize to make this happen?

  22. I’m a Developer, Not a Tester • Pretty typical to hear pushback from developers that they • Don’t have time to do all of this testing • The number of features delivered will go down • Don’t really want to do all this testing • Testers can help • Provide guidance on how to break software, the art of creative destruction • Pair testing with developers works well • Have developers help out with manual regression testing; before long they ask, “Can’t I write a test for this instead of running it manually?”

  23. I’m a Tester, Not a Developer • Pretty typical to hear from testers • That they don’t feel comfortable or knowledgeable about coding • That maybe they won’t be needed anymore • Developers can help • Developers can create the fixtures (the code that runs the test) needed to make FitNesse testing work • Developers can make it easier to auto-test the code at the GUI level

  24. Resisting Change • Resistance is common • It is easier to do what is familiar than to risk something new • Time pressure may keep you doing things the old way • Fear of failing keeps you in the status quo • Get the whole team involved in trying to change • The team needs to figure out what works best • Don’t feel like you have to do everything all at once • Keep learning and adapting

  25. Testers on the Team • Your organization may have testing as a separate group – look for ways to integrate testers into the team • Creating feature or component teams composed of all disciplines is one way • Co-location is a great way to hear and share information • Daily stand-ups with the whole team keep the information current

  26. Keys to Overcome the Organizational Challenges • Have Dev help run manual regression tests • Pair Dev and Test on unit and FitNesse testing • Co-locate the whole team • Hold daily stand-ups • Do retrospectives

  27. Summary • Agile pulls testing forward • You need to change your tools and approaches to move testing forward with it • You might need to change the model/structure of your team • With agile, you will create faster release cycles, shorter iterations, more satisfied customers, and team members who enjoy what they are doing

  28. Useful References • Beck, Kent, Test-Driven Development: By Example, Addison Wesley, 2003 • Cohn, Mike, User Stories Applied: For Agile Software Development, Addison Wesley, 2004 • Crispin, Lisa, and House, Tip, Testing Extreme Programming, Addison Wesley, 2003 • Leffingwell, Dean, and Widrig, Don, Managing Software Requirements: Second Edition: A Use Case Approach, Addison Wesley, 2003

  29. Thank You • Contact Info • Jean: jean.mcauliffe@rallydev.com • Dean: dleffing@earthlink.net • Questions? • Copyright 2003-2005, Rally Software Development Corp.
