
Basics of Performance Testing


Presentation Transcript


  1. Basics of Performance Testing (Abu Bakr Siddiq). Topics to be discussed: • Introduction • Performance Factors • Methodology • Test Process • Tools • Conclusion

  2. Introduction • Why Performance Test? • What is Performance Test? • How to Performance Test? • When to Performance Test? • Performance Vocabulary

  3. Why, What, How & When Performance Test • Why Performance test the application? To improve the software speed, scalability, stability and confidence of the system under different loads for desired performance. Ex: USA Shopping in Dec . Indian Railway Reservation. • What is Performance testing? Testing for performance factors against the acceptable or suggestible configuration is performance testing. • How Performance test the application? Using a Tool which follows Performance Methodology & Performance Process • When Performance test the application? Available Performance Requirements, Stable Build, Resource Availability, Defined Performance Plan and methodology, Set entry and Exit Criteria.

  4. Performance Testing Vocabulary The different types of performance testing • Capacity Testing: Determining the server’s failure point. • Component Testing: Testing the architectural components of the application, such as servers, databases, networks, firewalls and storage devices. • Endurance Testing: Testing performance characteristics over an extended period with the workload models anticipated during production. • Load Testing: Subjecting the system to a statistically representative load. • Smoke Testing: An initial run to check system performance under normal load. • Spike Testing: Testing performance characteristics with workload models in which the load repeatedly spikes beyond the anticipated production level. • Stress Testing: Evaluating the application beyond peak load conditions. • Validation Testing: Testing against the expectations that have been presumed for the system. • Volume Testing: Subjecting the system to varying amounts of data and testing its performance. [Chart: throughput vs. user load, with the saturation point marking the boundary between load testing and stress testing]
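To make the distinctions between these test types concrete, the sketch below (an illustration added here, not part of the original slides) models each type as a hypothetical virtual-user profile over time; the function names and numbers are invented for the example.

```python
# Illustrative only: each performance-test type expressed as a virtual-user
# profile over time (minutes). The numbers are arbitrary examples.

def load_profile(minute: int) -> int:
    """Load test: ramp up to the expected production load and hold it."""
    return min(10 * minute, 500)           # ramp to 500 users, then hold steady

def stress_profile(minute: int) -> int:
    """Stress test: keep increasing users beyond the expected peak."""
    return 10 * minute                      # no ceiling; push past saturation

def spike_profile(minute: int) -> int:
    """Spike test: normal load with sudden, repeated bursts."""
    return 2000 if minute and minute % 15 == 0 else 200

def endurance_profile(minute: int) -> int:
    """Endurance (soak) test: production-level load held for many hours."""
    return 500

if __name__ == "__main__":
    for m in (0, 15, 30, 60):
        print(m, load_profile(m), stress_profile(m),
              spike_profile(m), endurance_profile(m))
```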

  5. Performance Factors • Throughput • Response Time • Latency • Tuning • Benchmarking • Capacity Planning

  6. Throughput, Response Time & Latency Throughput : The number of requests/business transactions processed by the system in a specified time duration. Client Web Server DB Server Internet N1 N2 A1 A2 O1 A3 N3 N4 Network Latency = N1 + N2 + N3 + N4 Product Latency = A1 + A2 + A3 Actual Response Time = Network Latency + Product Latency Latency = Actual Response Time + O1

  7. Tuning, Benchmarking & Capacity Planning • Tuning : Procedure by which the system performance is enhanced by setting different values to the parameters (variables) of the product, OS and other components. Ex: “Search” • Benchmarking: Comparing the throughput and response time of the system with that of competitive products . Ex: Comparing Open office with that of MS- Office. • Capacity Planning: The exercise to find out what resources and configurations are needed. Ex: suggesting the ideal software, hardware and other components for the system to the customer.

  8. Methodology • Collecting Requirements • Writing Test Cases • Automating Test Cases • Executing Test Cases • Analyzing Results • Tuning • Benchmarking • Capacity Planning

  9. Collecting Performance Requirements • Performance Requirements types: • Generic • Specific • Performance Requirements Characteristics: • Testable • Clear • Specific • Sources for gathering Requirements: • Performance compared to previous release of the same product • Performance compared to the competitive product(s) • Performance compared to absolute numbers derived from actual need • Performance numbers derived from architecture and design • Graceful Performance Degradation

  10. Authoring Test Cases • Performance test cases should include the following details: • List of operations or business transactions to be tested • Steps for executing those operations/transactions • List of product and OS parameters that impact performance testing, and their values • Loading pattern • Resources and their configuration (network, hardware and software configurations) • The expected results • The product version or competitive products to be compared against, with related information such as their corresponding fields
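One way to capture these details in a machine-readable form is sketched below; the field names are simply a direct translation of the list above, not a standard schema, and the example values are invented.

```python
# Illustrative sketch: one possible structure for a performance test case,
# mirroring the fields listed on this slide. Field names are not a standard.
from dataclasses import dataclass

@dataclass
class PerformanceTestCase:
    operation: str                        # business transaction under test
    steps: list[str]                      # how to execute the operation
    product_parameters: dict[str, str]    # product/OS parameters and their values
    loading_pattern: str                  # e.g. "ramp 0-500 users over 10 min"
    environment: dict[str, str]           # network/hardware/software configuration
    expected_results: dict[str, float]    # e.g. target response time, throughput
    compared_against: str = ""            # previous version or competitive product

example = PerformanceTestCase(
    operation="Search",
    steps=["Log in", "Enter query", "Submit", "Measure time to first result"],
    product_parameters={"cache_size_mb": "256", "db_pool_size": "50"},
    loading_pattern="ramp to 200 concurrent users, hold 30 minutes",
    environment={"app_server": "4 vCPU / 8 GB", "db_server": "8 vCPU / 32 GB"},
    expected_results={"p95_response_time_s": 2.0, "throughput_rps": 100.0},
    compared_against="previous release",
)
```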

  11. Automating Performance Test Cases • Performance testing naturally lends itself to automation, for several reasons: • Performance testing is repetitive • Performance test cases cannot be effective without automation • Performance results need to be accurate; calculating them manually can introduce inaccuracy • Performance testing covers too many permutations and combinations to handle manually • The extensive analysis of performance results and failures that is required is very difficult to do manually
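As a flavour of what such automation looks like, here is a minimal sketch using only the Python standard library; the target URL, user count and request count are placeholders, and a real project would normally use a dedicated tool such as JMeter or LoadRunner instead.

```python
# Minimal automated load-test sketch using only the Python standard library.
# The URL, user count and request count below are placeholders.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://localhost:8080/search?q=test"   # placeholder endpoint
VIRTUAL_USERS = 20
REQUESTS_PER_USER = 10

def one_request(url: str) -> float:
    """Issue a single request and return its response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as resp:
        resp.read()
    return time.perf_counter() - start

def user_session(url: str) -> list[float]:
    """Simulate one virtual user issuing several sequential requests."""
    return [one_request(url) for _ in range(REQUESTS_PER_USER)]

if __name__ == "__main__":
    wall_start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
        sessions = list(pool.map(user_session, [TARGET_URL] * VIRTUAL_USERS))
    wall_time = time.perf_counter() - wall_start

    times = [t for session in sessions for t in session]
    print(f"requests: {len(times)}, throughput: {len(times) / wall_time:.1f} req/s")
    print(f"mean: {statistics.mean(times):.3f}s, max: {max(times):.3f}s")
```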

  12. Executing Performance Test Cases • The following aspects need to be recorded while executing performance tests: • Start and end time of execution • Log and trace/audit files of the product and OS (for future debugging and repeatability) • Utilization of resources (CPU, memory, disk, network utilization and so on) on a periodic basis • Configuration of all environmental factors (hardware, software and other components) • The performance factors listed in the test case documentation, captured at regular intervals
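A sketch of the kind of periodic resource sampling mentioned above; it assumes the third-party psutil package is installed, and the sampling interval, duration and output file name are arbitrary choices for the example.

```python
# Sketch of periodic resource sampling during a performance test run.
# Assumes the third-party `psutil` package is installed (pip install psutil).
import csv
import time
import psutil

SAMPLE_INTERVAL_S = 5          # arbitrary sampling period
DURATION_S = 60                # arbitrary monitoring window

with open("resource_usage.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "cpu_percent", "memory_percent",
                     "disk_read_mb", "net_sent_mb"])
    end = time.time() + DURATION_S
    while time.time() < end:
        cpu = psutil.cpu_percent(interval=SAMPLE_INTERVAL_S)   # blocks for the interval
        mem = psutil.virtual_memory().percent
        disk = psutil.disk_io_counters().read_bytes / 1e6
        net = psutil.net_io_counters().bytes_sent / 1e6
        writer.writerow([time.strftime("%H:%M:%S"), cpu, mem,
                         round(disk, 1), round(net, 1)])
```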

  13. Analyzing Performance Test Results • Performance test results are analyzed to answer the following: • Whether performance of the product is consistent when tests are executed multiple times • What performance can be expected for what type of configuration • Which parameters impact performance and how they can be set for better performance • What the effect is of scenarios involving a mix of performance factors • What the effect is of product technologies such as caching • Up to what load the performance numbers remain acceptable • What the optimum throughput and response time of the product is for a given set of factors • Which performance requirements are met, and how performance compares with the previous version or with expectations
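A small sketch of the kind of summary such analysis produces, computed from a list of response-time samples; the sample data, percentile choice and acceptance threshold are invented for the example.

```python
# Sketch: summarizing response-time samples (seconds). The sample data and
# the acceptance threshold are invented for the example.
import statistics

response_times = [0.21, 0.25, 0.24, 0.31, 0.28, 0.95, 0.26, 0.30, 0.27, 1.40]
test_duration_s = 10.0
requirement_p95_s = 1.0        # hypothetical acceptance criterion

throughput = len(response_times) / test_duration_s
mean = statistics.mean(response_times)
p95 = statistics.quantiles(response_times, n=20)[18]   # 95th percentile cut point

print(f"throughput: {throughput:.1f} req/s")
print(f"mean response time: {mean:.3f}s, p95: {p95:.3f}s")
print("requirement met" if p95 <= requirement_p95_s else "requirement NOT met")
```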

  14. Performance Tuning • Two ways to get optimum mileage out of the product: • Tuning the product parameters • Tuning the operating system parameters • Product parameters: • Repeat the performance tests for different values of each parameter that has an impact • When the value of one parameter is changed, it may require changes to other parameters as well • Repeat the tests with the default values of all parameters (factory-settings tests) • Repeat the performance tests for low and high values of each parameter and their combinations (see the sketch after this list) • Operating system parameters: • File-system related parameters • Disk management parameters • Memory management parameters • Processor management parameters • Network management parameters
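The sketch below illustrates sweeping low/default/high values of several product parameters and their combinations; the parameter names and value ranges are invented, and run_performance_test() is a hypothetical placeholder for executing the actual test.

```python
# Sketch of sweeping product-parameter combinations, as described above.
# Parameter names/values are invented; run_performance_test() is a hypothetical
# placeholder for applying the settings and running the real test.
import itertools

parameter_ranges = {                       # invented tunables
    "cache_size_mb": [64, 256, 1024],      # low / default / high
    "db_pool_size": [10, 50, 200],
    "worker_threads": [4, 16, 64],
}

def run_performance_test(settings: dict) -> float:
    """Placeholder: apply the settings, run the test, return p95 response time."""
    raise NotImplementedError

def sweep():
    best = None
    for values in itertools.product(*parameter_ranges.values()):
        settings = dict(zip(parameter_ranges.keys(), values))
        p95 = run_performance_test(settings)
        if best is None or p95 < best[1]:
            best = (settings, p95)
    return best   # the settings giving the lowest p95 response time
```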

  15. Performance Benchmarking & Capacity Planning • The steps involved in Performance Benchmarking are the as follows: • Identify the transactions/ scenarios & the test configuration • Comparing the performance of different product • Tuning the parameters of the products bring compared fairly to deliver the best performance • Publishing the results of performance benchmarking • Capacity Planning is identifying the right configuration, which is of 3 types: • Minimum required configuration • Typical configuration • Special configuration

  16. Test Process • Resource Requirements • Test Lab Setup • Responsibilities • Setting up Traces & Audits • Entry & Exit Criteria

  17. Performance Test Process

  18. Resource Requirements: The resources needed are specific to performance testing, hence they shall be planned and obtained in advance. Resources are to be dedicated exclusively to the current system, without interchanging roles and responsibilities often. Test Lab Setup: A test lab with all required equipment is to be set up prior to execution. The test lab has to be configured cautiously, as a single mistake can lead to running the tests again. Responsibilities: Performance defects may cause changes to architecture, design and code. The team facing the customer communicates the requirements for performance. Multiple teams are involved in performance testing the system; hence a matrix describing the responsibilities of each team is part of the test plan.

  19. Setting up Product Traces, Audits: Performance test results need to be associated with traces and audit trails in order to analyze the results. Audits and traces are to be planned in advance, or else they may start impacting the performance results themselves. Entry and Exit Criteria: Performance tests require a stable product because of the complexity and accuracy involved. Changes to the product mean the tests have to be repeated. Hence performance testing starts only after the product meets a set entry criteria.

  20. Tools for Performance Testing • Commercial Tools: LoadRunner (HP), QA Partner (Compuware), SilkTest (Segue), MS Stress Tool (Microsoft) • Open Source Tools: WebLoad, JMeter, OpenSTA • Challenges • Conclusion
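As an illustration of driving one of these tools from a script, the sketch below invokes Apache JMeter in non-GUI mode via Python's subprocess module; it assumes the jmeter executable is on the PATH, and the test-plan and results file names are placeholders.

```python
# Sketch: running Apache JMeter in non-GUI (command-line) mode from Python.
# Assumes `jmeter` is on the PATH; the .jmx test plan and .jtl results file
# names are placeholders.
import subprocess

result = subprocess.run(
    ["jmeter",
     "-n",                            # non-GUI mode
     "-t", "search_load_test.jmx",    # placeholder test plan
     "-l", "results.jtl"],            # placeholder results log
    check=True,
)
print("JMeter exited with code", result.returncode)
```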

  21. Thank you
