
Calibration of Performance Test tooling Roland van Leusden


Presentation Transcript


  1. Calibration of Performance Test Tooling. Roland van Leusden.

  2. Introduction. Senior Automation Specialist. 10+ years of experience in TA & PT. Passion for Technology. e: roland.van.leusden@squerist.nl m: 06 - 30 143 539

  3. Contents • What triggered this investigation • Research definition & scope • First findings • Test environment setup • Test plan & scenario • Results • Calibration • Reference • Tool behaviour & solutions • Results after calibration • Conclusion • When to calibrate? • Next steps

  4. What triggered the investigation. Historical International Prototype Metre bar, made of an alloy of platinum and iridium, which was the standard from 1889 to 1960. Q: Do performance test tools emulate the exact same behaviour a real user would show when accessing the application?

  5. Research Definition • Scope: • 4 commonly used test tools, installed with default settings • Mix of commercial and open source tools • 1 virtual user to execute the test • 1 browser • 1 OS • Goal: • Identify differences. • Adjust settings (calibration).

  6. First Findings • Tooling doesn't emulate browser behaviour such as JavaScript execution. • Different behaviour at network level: number of TCP sessions. • Different behaviour in request methods and number of requests.

  7. Test Environment Setup • VMware XP images: • Tool A • Tool B • Tool C • Tool D • Clean image. Server: bootable Linux CD, TurnKey Linux & osCommerce, http://www.turnkeylinux.org/oscommerce

  8. Test Plan & Scenario • 1. Execute the test scenario manually; this is the reference. • 2. Record the scenario with the test tool. • 3. Run the scenario with the test tool. • Scenario: • Add the DVD "Speed" on the front page to the cart • Search for a DVD with the word "Mary" • Add the DVD "There's Something About Mary" to the cart • Checkout • Create a new account • Complete the checkout & shipping • Return to the front page and log out
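
To make the comparison concrete, the following is a minimal sketch of what this scenario looks like when it is replayed as plain HTTP requests, here with Python's requests library. The server address, osCommerce paths, parameter names and product ids are assumptions for illustration; an actual recording would contain the exact URLs the tool captured. Replaying only these top-level requests is exactly where a tool starts to differ from a real browser, which also executes JavaScript and fetches embedded resources.

```python
# Sketch: replaying the recorded scenario as plain HTTP requests.
# The base URL, osCommerce paths, parameters and product ids are assumptions
# for illustration; a real recording contains the exact captured requests.
import requests

BASE = "http://oscommerce.local"          # assumed test server address
session = requests.Session()              # keeps cookies, like one virtual user

# Step 1: add the DVD "Speed" from the front page to the cart
session.get(f"{BASE}/index.php")
session.get(f"{BASE}/index.php", params={"action": "buy_now", "products_id": "17"})

# Step 2: search for a DVD with the word "Mary"
session.get(f"{BASE}/advanced_search_result.php", params={"keywords": "Mary"})

# Step 3: add "There's Something About Mary" to the cart
session.get(f"{BASE}/advanced_search_result.php",
            params={"keywords": "Mary", "action": "buy_now", "products_id": "10"})

# Steps 4-6: checkout, create a new account, complete checkout & shipping (POSTs)
session.post(f"{BASE}/create_account.php",
             data={"firstname": "Test", "lastname": "User",
                   "email_address": "test@example.com", "password": "secret"})
session.post(f"{BASE}/checkout_confirmation.php", data={})

# Step 7: return to the front page and log out
session.get(f"{BASE}/logoff.php")
session.get(f"{BASE}/index.php")
```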

  9. Results of the test. Results are the average of 10 runs.

  10. Calibration • Create the reference by analysing user behaviour with the application. • Analyse tool behaviour with the application. • Calibrate the tool. • Validate the calibration by rerunning the test.

  11. Reference with Firebug. Use the profiler to determine JavaScript execution time.
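
Firebug's profiler is read manually in the slide; as an alternative sketch, client-side processing time can also be approximated from the browser's Navigation Timing data, for example via Selenium. The URL is an example and the split into "network" and "processing" time is a simplification.

```python
# Sketch: approximating client-side processing time with the Navigation Timing API,
# as an alternative to reading it from the Firebug profiler by hand.
# Assumes Selenium and a Firefox driver are installed; the URL is an example.
from selenium import webdriver

driver = webdriver.Firefox()
driver.get("http://oscommerce.local/index.php")

timing = driver.execute_script("return window.performance.timing")
network_ms = timing["responseEnd"] - timing["navigationStart"]
processing_ms = timing["loadEventEnd"] - timing["responseEnd"]   # parsing, scripts, rendering

print(f"network + server time : {network_ms} ms")
print(f"client-side processing: {processing_ms} ms")
driver.quit()
```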

  12. Reference with Wireshark.

  13. Reference TCP sessions: 5 sessions.
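
How such a session count can be reproduced from a capture is sketched below with scapy. The capture file name is an assumption, and counting client SYNs is a simplification of what Wireshark's conversation statistics show.

```python
# Sketch: counting TCP sessions in a packet capture of the reference run.
# A new session is identified by a SYN (without ACK) from the client.
# The capture file name is an assumption for illustration.
from scapy.all import rdpcap, TCP, IP

packets = rdpcap("reference_run.pcap")
sessions = set()
for pkt in packets:
    if TCP in pkt and IP in pkt:
        flags = pkt[TCP].flags
        if flags & 0x02 and not flags & 0x10:          # SYN set, ACK not set
            sessions.add((pkt[IP].src, pkt[TCP].sport,
                          pkt[IP].dst, pkt[TCP].dport))

print(f"TCP sessions opened: {len(sessions)}")          # reference browser here: 5
```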

  14. Tool behaviour and solutions

  15. Compensate for browser processing. Problem identified: tooling doesn't do processing like a browser (JavaScript, rendering). Solution: compensate in the tool for client-side processing and, if needed, execute the scenario manually during a load test to capture the real user experience.
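
A minimal sketch of this compensation idea, assuming the client-side processing time per page has already been measured in the reference (for example with the Firebug profile above). The values, URL and helper function are illustrative, not taken from any specific tool.

```python
# Sketch: compensating for client-side processing that the tool does not perform.
# CLIENT_PROCESSING_MS comes from the reference measurement; the values and URL
# below are assumptions for illustration.
import time
import requests

CLIENT_PROCESSING_MS = {"index.php": 180, "advanced_search_result.php": 95}

def timed_request(session, url, page):
    start = time.perf_counter()
    session.get(url)
    network_ms = (time.perf_counter() - start) * 1000

    # Add the processing time a real browser would spend on this page
    processing_ms = CLIENT_PROCESSING_MS.get(page, 0)
    time.sleep(processing_ms / 1000)                 # keeps the pacing realistic
    return network_ms + processing_ms                # report the compensated time

session = requests.Session()
total_ms = timed_request(session, "http://oscommerce.local/index.php", "index.php")
print(f"compensated response time: {total_ms:.0f} ms")
```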

  16. POST and GET requests: Reference, Tool A, Tool B.

  17. Adjust GET & POST request settings. Problem identified: different number of GET and POST requests. Possible solution: adjust the settings for how resources are retrieved.
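
In most tools this is a recording or runtime option such as "retrieve embedded resources". The sketch below only illustrates the idea behind that setting: parse the returned HTML and issue one GET per embedded resource, the way a browser would. The URLs are examples.

```python
# Sketch: fetching embedded resources the way a browser does, so the number of
# GET requests matches the reference. In most tools this is a checkbox; the
# parsing below only illustrates the idea.
from html.parser import HTMLParser
from urllib.parse import urljoin
import requests

class ResourceCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.resources = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("href"):
            self.resources.append(attrs["href"])

session = requests.Session()
page_url = "http://oscommerce.local/index.php"          # example URL
html = session.get(page_url).text

collector = ResourceCollector()
collector.feed(html)
for ref in collector.resources:                          # one GET per embedded resource
    session.get(urljoin(page_url, ref))

print(f"page + {len(collector.resources)} embedded resources requested")
```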

  18. Adjust for caching. Problem identified: some content is cached within the browser. Solution: retrieve cached content only once for every VU. Without browser caching / With browser caching. Compensation for cached content using an "Only Once" loop controller and Cached_Content_StartPage.csv.
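
The "only once" construction boils down to warming each virtual user's cache a single time and skipping those requests in later iterations. A minimal sketch of that idea, assuming the CSV from the slide lists the cacheable URLs:

```python
# Sketch: emulating browser caching by requesting cacheable content only once
# per virtual user, mirroring the "only once" construction from the slide.
# Assumes Cached_Content_StartPage.csv lists one cacheable URL per line.
import csv
import requests

def load_cached_urls(path="Cached_Content_StartPage.csv"):
    with open(path, newline="") as f:
        return [row[0] for row in csv.reader(f) if row]

class VirtualUser:
    def __init__(self, cached_urls):
        self.session = requests.Session()
        self.cached_urls = cached_urls
        self.first_iteration = True

    def iteration(self):
        if self.first_iteration:                 # "only once": warm the cache
            for url in self.cached_urls:
                self.session.get(url)
            self.first_iteration = False
        self.session.get("http://oscommerce.local/index.php")   # example page

# vu = VirtualUser(load_cached_urls())
# vu.iteration()   # first call fetches the cached content, later calls skip it
```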

  19. Network behaviour: Reference vs. Tool.

  20. TCP sessions. Reference: 5 sessions. Tools: 8 to 32 sessions.

  21. Adjust settings for connections. Problem identified: different behaviour at network level, caused by a different number of connections being used. Possible solution: adjust the setting for the number of connections per virtual user.
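
A sketch of the same idea outside the tools: pinning the number of connections a virtual user may open to the reference value (5 in this setup). With Python's requests this is the adapter's pool size; in the tools themselves it is usually a runtime or recording setting.

```python
# Sketch: limiting the connections one virtual user may open so it matches the
# reference browser (5 TCP sessions in this setup). With requests this is the
# HTTP adapter's pool size; the URL is an example.
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
adapter = HTTPAdapter(pool_connections=5, pool_maxsize=5, pool_block=True)
session.mount("http://", adapter)
session.mount("https://", adapter)

# All requests from this virtual user now share at most 5 connections per host.
session.get("http://oscommerce.local/index.php")
```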

  22. Results after calibration

  23. Conclusion. The behaviour and reported results of performance test tools need to be: VALIDATED & CALIBRATED.

  24. When to calibrate? • When the tool only partially supports (or does not support) client behaviour. • When performance testing with more than 50 VUs. • For the most used or performance-critical scenarios. • When no tool selection process has been done and the available tooling is used. Performance testing & calibration also works very well in an Agile development process.

  25. How do we do this? At Squerist, calibration and validation of the tools we are going to use is part of our intake and tool selection process for performance testing. We monitor developments in the market at the tooling, network and application level, and we conduct our own research.

  26. Next steps. Are we finished? NO!!! • We want to research the influence of: • Different browsers • Non-web applications • Different OSes • Different numbers of VUs • on performance results and user experience. Why? Because we are testers and our customers deserve accurate results!!!
