
Construction of testing processes from scratch. History of one project




Presentation Transcript


  1. Construction of testing processes from scratch. History of one project. Anna Tymofeeva and Ivanka Mykulovych

  2. About Anna: almost 12 years in IT, 6 years as a QA Lead; 4 years at SoftServe; QMO Partner; graduate of the SoftServe Leadership Development Program; Hillel IT School coach; volunteer with the “Open Eyes” Foundation; mother of a teenage girl. E-mail: atymof@gmail.com, Skype: ann_alen

  3. About Ivanka: 2+ years in IT, working at SoftServe; discovering the test automation world during the last two years; working with the team on adapting interesting and useful things to project needs; main passion is travelling. E-mail: x-iva@ukr.net, Skype: ivano4ka_hiii

  4. Why Test Automation? The client’s perspective: reduce operational cost, increase revenue.

  5. Deadline - yesterday

  6. “We have Automated Tests”!!! Tests start failing randomly.

  7. “We have Automated Tests”!!! When you need the tests to run, they never work.

  8. Problem, eh? Test execution is slow; tests are fragile; tests are not reusable; tests need more infrastructure; a test works on my box but fails in the daily run.

  9. What? More time is spent figuring out why the tests are failing; resources are tied up just reviewing failures; maintenance takes way too much time.

  10. Result

  11. How to fix it? Prepare the project for automation; cooperate with the manual QC teams.

  12. Automation Architecture

  13. Acceptance criteria
      • Precondition is generated separately
      • Script covers all of the scenario’s steps
      • Automation test is stable (it is better to have 50 stable test cases than 500 fragile ones that break regularly)
      • Automation test can be run in parallel
      • Test runs in not more than the accepted time (set for every test by the QA Lead / Manual QA / Product Owner)
      • Test is independent and isolated
      • Test passes during the first regression without issues

  14. Precondition is generated separately: split test steps from test preconditions; use an API (if possible) to generate test preconditions (see the sketch below).
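
      A minimal sketch of that idea as a pytest fixture: the precondition is created through the backend API instead of UI steps. The endpoint, payload and token below are hypothetical placeholders, not the project’s real API.

      # conftest.py -- generate the test precondition via the API, not via the UI
      import pytest
      import requests

      @pytest.fixture
      def existing_order():
          """Setup: create the order the test needs through the API; teardown: remove it."""
          response = requests.post(
              "https://app.example.com/api/orders",                # hypothetical endpoint
              json={"product": "demo-item", "quantity": 1},        # hypothetical payload
              headers={"Authorization": "Bearer <token>"},
              timeout=10,
          )
          order = response.json()
          yield order                                              # the test body runs here
          requests.delete(f"https://app.example.com/api/orders/{order['id']}", timeout=10)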

  15. Automation test is stable
      • avoid repeating the same UI steps in a bunch of tests
      • split a long test into a few smaller ones
      • move frequently used hardcoded test data out to a test data file (see the sketch below)
      • use test fixtures (setup, teardown, etc.)
      • clean up after yourself!
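
      A small sketch of moving hardcoded values into a test data file; the file name and keys are assumptions for illustration.

      # test_data.json (assumed file) holds the values that used to be hardcoded in the tests:
      # {"valid_user": {"login": "demo", "password": "demo123"}, "search_term": "laptop"}
      import json
      from pathlib import Path

      TEST_DATA = json.loads(Path("test_data.json").read_text())

      def test_login_with_valid_user():
          user = TEST_DATA["valid_user"]          # no credentials hardcoded in the test body
          assert user["login"] and user["password"]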

  16. Test fixtures eliminate code duplication (test setup and teardown) and provide a fixed test environment.
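
      A minimal pytest sketch: everything before the yield is setup, everything after it is teardown, so the duplicated code lives in one place and every test gets the same fixed environment. The temporary-folder detail is only an illustrative example.

      import shutil
      import tempfile

      import pytest

      @pytest.fixture
      def work_dir():
          path = tempfile.mkdtemp()       # setup: a fixed, known environment for the test
          yield path                      # the test body runs here
          shutil.rmtree(path)             # teardown: clean up after yourself

      def test_creates_report(work_dir):
          # any test that needs a scratch folder reuses the fixture instead of duplicating setup code
          assert work_dir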

  17. Test is independent and isolated: test = test + precondition for the next test; a separate test ENV for each test.

  18. Automation test can be run in parallel
      • parallel and non-parallel test scope
      • rerun parallel-scope failures after the first run, in one flow (see the sketch below)
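
      A sketch of that two-pass run driven from Python, assuming pytest-xdist is installed and a "parallel" marker names the parallel-safe scope; the marker name and worker count are assumptions.

      # run_parallel_scope.py
      import pytest

      # first pass: run the parallel-safe scope on four workers (pytest-xdist)
      exit_code = pytest.main(["-m", "parallel", "-n", "4"])

      # second pass: rerun only the failures from the first pass in one flow (no workers)
      if exit_code != 0:
          pytest.main(["-m", "parallel", "--last-failed", "-n", "0"])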

  19. Not more than the accepted time: smoke vs. regression test scope (per app / per feature); a limited number of test steps; test tags (see the sketch below).
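
      One way to enforce the accepted time per test is a timeout marker; this sketch assumes the pytest-timeout plugin is installed and a 60-second budget has been agreed for this particular test.

      import pytest

      @pytest.mark.timeout(60)        # the accepted time agreed with the QA Lead / Product Owner
      def test_checkout_smoke():
          ...                         # the test fails automatically if it runs longer than 60 seconds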

  20. Tags
      • feature
      • environment
      • test type
      • browser
      • parallel / non-parallel
      1 test = a few tags
      • keep the test-marks list file in the project
      • align test marks with test suites; it will make life easier (see the sketch below)
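
      A sketch of how such tags map to pytest marks; the mark names below are only examples of the categories above, not the project’s real list.

      # conftest.py -- keep the project's marks list registered in one place
      import pytest

      def pytest_configure(config):
          for mark in ("smoke", "regression", "staging", "chrome", "parallel"):
              config.addinivalue_line("markers", f"{mark}: see the team's test-marks list")

      # one test usually carries a few tags at once
      @pytest.mark.smoke
      @pytest.mark.chrome
      @pytest.mark.parallel
      def test_login_page_opens():
          ...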

  21. Tests are reusable: browser/OS support; test parameterization (see the sketch below).
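
      A parameterization sketch: one test body reused across a browser matrix and several users; the browser list and user names are placeholders.

      import pytest

      BROWSERS = ["chrome", "firefox", "ie"]          # placeholder matrix, see slide 31

      @pytest.fixture(params=BROWSERS)
      def browser_name(request):
          return request.param                        # the same test is repeated per browser

      @pytest.mark.parametrize("username", ["plain_user", "admin_user"])
      def test_profile_page_opens(browser_name, username):
          # in the real suite this would drive a WebDriver for `browser_name`
          assert browser_name in BROWSERS and username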

  22. Test passes during the first regression without issues. 2b|!2b point

  23. Test Planning
      Project roadmap creation. Documentation development:
      • collect typical use cases and associated scenarios
      • formalize functional and non-functional requirements for the system
      • develop a Heat Map with a business value per feature
      • create the list of scenarios for automation (only critical test cases)
      • develop a Test Plan per sprint for automation
      • formalize the story definition of done:
        - all issues are fixed
        - a test scenario is created (could be used for automation)
        - the test scenario is added to the regression/functional scope
        - a unit test is developed
        - the heat map is updated
      Estimates: estimate the time for developing automation tests according to the automation test plan.

  24. Automation: critical UI regression tests. Set the priority for automation from the regression scope; automated tests will be run every regression; manual tests will be run using the Heat Map.

  25. Automation process
      1. The test scenario is developed by the QA Lead / Manual QA
      2. The script is developed by an Automation QA
      3. The developed script is reviewed by the QA Lead
      4. The test is added to the automation regression scope
      5. New tests are run in the regression scope
      6. When a test is stable in the parallel run, it can be added to the smoke scope

  26. Automation run result: a Test Plan should be created for every regression; the automation run result should be recorded in the Test Plan; every failure should be analyzed and a bug should be created in the bug tracking system; the system will track the load time of every page (see the sketch below).
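
      A minimal sketch of measuring page load time from the browser; it assumes Selenium WebDriver and the standard Navigation Timing API, since the slides do not show the project’s actual tool.

      from selenium import webdriver

      def page_load_time_ms(driver):
          """Browser-reported load time of the current page, in milliseconds."""
          return driver.execute_script(
              "const t = window.performance.timing;"
              "return t.loadEventEnd - t.navigationStart;"
          )

      driver = webdriver.Chrome()
      driver.get("https://example.com")
      print(page_load_time_ms(driver))    # in the real suite this value would go into the run report
      driver.quit()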

  27. Test Plan

  28. Automation Result TOOL

  29. Implementation: test teardown; custom report methods; tool API endpoints; a switch to enable tool reporting; APIClient (see the sketch below).
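
      A minimal sketch of such a client; the base URL, endpoint and payload are hypothetical, and the "switch" is modeled as an environment variable because the slides do not show the real tool API.

      import os
      import requests

      REPORTING_ENABLED = os.getenv("ENABLE_TOOL_REPORTING", "false").lower() == "true"   # the switch

      class APIClient:
          def __init__(self, base_url: str, token: str):
              self.base_url = base_url.rstrip("/")
              self.session = requests.Session()
              self.session.headers["Authorization"] = f"Bearer {token}"

          def report_test_result(self, test_plan_id: str, test_name: str, status: str) -> None:
              if not REPORTING_ENABLED:
                  return                      # switch is off: run tests without touching the tool
              self.session.post(
                  f"{self.base_url}/test-plans/{test_plan_id}/results",    # hypothetical endpoint
                  json={"test": test_name, "status": status},
                  timeout=10,
              )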

  30. Usage: report the test run status to the Test Plan; report test failures to the bug tracking tool (see the sketch below).
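
      A hedged usage sketch: a pytest hook pushes every test outcome to the (hypothetical) APIClient from the previous sketch; a failed outcome is where a bug-tracker call would be added. The module name and test plan id are placeholders.

      # conftest.py
      import pytest
      from reporting_client import APIClient    # hypothetical module holding the sketch above

      client = APIClient("https://reporting.example.com/api", token="***")

      @pytest.hookimpl(hookwrapper=True)
      def pytest_runtest_makereport(item, call):
          outcome = yield
          report = outcome.get_result()
          if report.when == "call":
              # "passed" / "failed" goes to the Test Plan; a failure would also open a bug here
              client.report_test_result("TP-123", item.nodeid, report.outcome)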

  31. Supported browsers: Chrome is the main browser for smoke tests; Firefox and IE are run during regression; Safari should be run for Mac (can be run locally) (see the sketch below).
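
      A sketch of wiring that browser matrix into the suite; it assumes Selenium and a --browser command-line option, which is an illustration rather than the project’s real setup.

      # conftest.py
      import pytest
      from selenium import webdriver

      def pytest_addoption(parser):
          parser.addoption("--browser", default="chrome", help="chrome | firefox | ie | safari")

      @pytest.fixture
      def driver(request):
          name = request.config.getoption("--browser")
          factories = {
              "chrome": webdriver.Chrome,     # main browser for smoke
              "firefox": webdriver.Firefox,   # regression
              "ie": webdriver.Ie,             # regression
              "safari": webdriver.Safari,     # runs locally on a Mac only
          }
          drv = factories[name]()
          yield drv
          drv.quit()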

  32. Timelines

  33. QUESTIONS AND ANSWERS
