
Composing a Framework to Automate Testing of Operational Web-Based Software




Presentation Transcript


  1. Composing a Framework to Automate Testing of Operational Web-Based Software Reham Alhejiali, Chris Cederstrom, Ranjitha Kashyap

  2. Motivation Web-based applications (example: e-commerce) are characterized by: 1. Short time to market. 2. Extensive use. 3. High expected reliability. 4. Demand for continuous availability. 5. A large user community.

  3. Goal - To address the need for an automated, cost-effective testing strategy for web applications. - To investigate the scalability of the approach and the evolution of test suites so that they automatically adapt as the application's operational profile changes.

  4. Outline 1. General framework for testing web-based applications. 2. State of the art. 3. The testing framework. 4. Demonstration of the framework with a case study. 5. Conclusion.

  5. Figure 1. General Framework for Testing Web-based Applications

  6. State of the Art • User-session-based testing - Test cases are created by capturing real user interactions and using the recorded sessions as representative test cases of user behavior. - Testers can use the collected user sessions during maintenance to enhance the original test suite. - User sessions provide test data that represents usage not anticipated during earlier testing stages and that evolves as operational profiles change. - WebKing and Rational Robot: generated test cases may not be adequate. - Elbaum et al. (2003): the reduction technique employed is not scalable.

  7. State of the Art (continued) • Functional testing - Link and form testers that test the functionality of the application. • Program-based testing - Test cases are based on the data flow between objects in the model. - Liu et al. (2000): WATM, with tests based on the data flow between objects in the model; ICFG. - Di Lucca et al. (2002): developed a web application model and a set of tools for the evaluation and automation of web application testing; test case generation is not completely automated.

  8. Testing Framework and Initial Prototype

  9. Test Case Generation • A web server is used to log user sessions for test case generation. • The AccessLog class of the Resin web server was modified to log user sessions in a specified format: IP address, timestamp, requested URL, cookies, name-value pairs of GET/POST requests, and referrer URL. • Each user session is a collection of user requests in the form of URLs and name-value pairs.
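The grouping of logged requests into sessions can be sketched as follows. This is a minimal illustration in Python, not the authors' Resin AccessLog modification: the field names and the choice of keying sessions by IP address plus session cookie are assumptions based on the log format described above.

```python
from collections import defaultdict

def group_sessions(log_entries):
    """Group logged requests into user sessions.

    Each entry is a dict with fields mirroring what the modified access
    log records: ip, timestamp, url, cookies, and params (the GET/POST
    name-value pairs). A session is keyed by IP plus session cookie
    (field names are illustrative assumptions).
    """
    sessions = defaultdict(list)
    for entry in log_entries:
        key = (entry["ip"], entry["cookies"].get("JSESSIONID", ""))
        sessions[key].append((entry["url"], entry["params"]))
    return dict(sessions)

log = [
    {"ip": "10.197.37.159", "timestamp": "03/Feb/2004:16:14:05",
     "url": "/apps/bookstore/Default.jsp", "cookies": {}, "params": {}},
    {"ip": "10.197.37.159", "timestamp": "03/Feb/2004:16:16:27",
     "url": "/apps/bookstore/Registration.jsp",
     "cookies": {"JSESSIONID": "a7mpavbwGTf6"},
     "params": {"member_login": "bobmason"}},
]

sessions = group_sessions(log)
```

The first request carries no cookie and therefore falls into a separate session from the cookied request, matching the behavior visible in the session log excerpt later in the slides.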

  10. Replay Tool &amp; Coverage Analysis • User sessions are replayed by issuing a wget request for each entry in the user session log. • These requests emulate user interactions and input. • Cookies are used to recognize the user and the session. • Coverage analysis evaluates the effectiveness of the generated test suite. • Coverage is calculated by Clover, which reports the statements and methods exercised.
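The replay step can be sketched as a small function that turns one parsed log entry into a wget invocation; the `--header "Cookie:..."` and `--cookies=off` flags follow the session log excerpt shown later. The base URL and field names are illustrative assumptions, and this builds the command rather than reproducing the authors' replay tool.

```python
def wget_command(base, entry):
    """Build a wget invocation that replays one logged request.

    base  -- server root (hypothetical host, for illustration)
    entry -- dict with the url, cookies, and name-value params
             recorded in the session log
    """
    cmd = ["wget", "--quiet"]
    if entry.get("cookies"):
        pairs = "; ".join(f"{k}={v}" for k, v in entry["cookies"].items())
        cmd += ["--header", f"Cookie:{pairs}"]
    else:
        # No session cookie recorded for this request.
        cmd.append("--cookies=off")
    url = base + entry["url"]
    if entry.get("params"):
        # Re-attach the logged GET/POST name-value pairs as a query string.
        query = "&".join(f"{k}={v}" for k, v in entry["params"].items())
        url += "?" + query
    cmd.append(url)
    return cmd

cmd = wget_command("http://localhost:8080",
                   {"url": "/apps/bookstore/Default.jsp",
                    "cookies": {}, "params": {}})
```

Each command would then be executed (e.g. via `subprocess.run`) against the application under test to emulate the original user's interaction.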

  11. Coverage Analysis with Clover Clover: http://www.atlassian.com/software/clover/overview

  12. Oracle and Fault Detection • The oracle generates expected results by running a properly working version of the application. • The user sessions are run on both the working application and the new version, and the results are stored in a database. • The results of the working application are compared with the output of the test runs to confirm validity and detect faults.
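The oracle comparison reduces to checking, per replayed session, whether the output of the version under test matches the stored output of the known-good version. A minimal sketch, assuming outputs are keyed by a session identifier (the storage schema is not specified in the slides):

```python
def detect_faults(expected, actual):
    """Compare oracle outputs (from the known-good version) against
    outputs from the version under test; return the IDs of sessions
    whose output differs, i.e. potential faults."""
    return sorted(sid for sid, out in expected.items()
                  if actual.get(sid) != out)

# Hypothetical stored outputs for two replayed sessions.
expected = {"s1": "<html>ok</html>", "s2": "<html>cart: 2 items</html>"}
actual   = {"s1": "<html>ok</html>", "s2": "<html>cart: 0 items</html>"}

faults = detect_faults(expected, actual)
```

In practice the comparison would likely normalize dynamic content (timestamps, session IDs) before diffing, but the slides do not detail this step.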

  13. Case Study • A medium-sized web application, an online bookstore, was used for the case study. • The front end consisted of 11 JSPs and supporting classes. • LOC: 9,748; methods: 385. • 123 user sessions were collected. [Figure: user requests flow through the web server into the access log]

  14. User Session Log
  10.197.37.159 [03/Feb/2004:16:14:05 -0500] GET /apps/bookstore/Default.jsp ] --cookies=off ] -
  10.197.37.159 [03/Feb/2004:16:16:27 -0500] GET /apps/bookstore/Registration.jsp ] --cookies=off --header "Cookie:JSESSIONID=a7mpavbwGTf6" ] http://dwalin.cis.udel.edu:8080/apps/bookstore/Default.jsp
  10.197.37.159 [03/Feb/2004:16:17:22 -0500] GET /apps/bookstore/Registration.jsp?member_login=bobmason&member_password=14921492&member_password2=14921492&first_name=bob&last_name=mason&email=bobmason%40udel.edu&address=&phone=&card_type_id=&card_number=&FormName=Reg&FormAction=insert&member_id=&PK_member_id= ] --cookies=off --header "Cookie:JSESSIONID=a7mpavbwGTf6" ] http://dwalin.cis.udel.edu:8080/apps/bookstore/Registration.jsp

  15. Concept Analysis Input
  10.197.37.159: GET.apps.bookstore.Default.jsp GET.apps.bookstore.Registration.jsp GET.apps.bookstore.Registration.jsp GET.apps.bookstore.Default.jsp GET.apps.bookstore.Login.jsp POST.apps.bookstore.Login.jsp ;
  10.82.161.133: GET.apps.bookstore.Default.jsp GET.apps.bookstore.images.icon_reg.gif GET.apps.bookstore.images.icon_home.gif GET.apps.bookstore.Registration.jsp GET.apps.bookstore.Registration.jsp GET.apps.bookstore.Default.jsp ;
  objects[001]: 10.82.161.133
  attributes[001]: GET.apps.bookstore.images.icon_home.gif GET.apps.bookstore.images.icon_reg.gif
  [Figure: test case generation and execution pipeline: Access Log, Access Log Parser, Concept Analysis Tool, Test Suite Reducer, Replay Tool]
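The reduction step feeds session-to-URL mappings like the ones above into a concept analysis tool. A heavily simplified sketch of the idea, keeping one representative session per distinct set of requested URLs; this is not the paper's concept analysis algorithm, only an illustration of why sessions with identical URL sets are redundant:

```python
def reduce_suite(sessions):
    """Keep one representative session per distinct set of requested
    URLs (a simplification of concept-analysis-based reduction)."""
    reduced, seen = [], set()
    for sid, urls in sessions.items():
        key = frozenset(urls)
        if key not in seen:
            seen.add(key)
            reduced.append(sid)
    return reduced

# Hypothetical session-to-URL mapping in the format shown above;
# the first two sessions request the same set of URLs.
suite = {
    "10.197.37.159": ["GET.apps.bookstore.Default.jsp",
                      "GET.apps.bookstore.Login.jsp"],
    "10.82.161.133": ["GET.apps.bookstore.Login.jsp",
                      "GET.apps.bookstore.Default.jsp"],
    "10.0.0.7":      ["GET.apps.bookstore.Registration.jsp"],
}

reduced = reduce_suite(suite)
```

The real technique builds a concept lattice over the objects (sessions) and attributes (requests), which preserves more structure than this set comparison.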

  16. Example Of Using Clover

  17. Coverage Report for Bookstore

  18. Seeded Faults - A technique to measure test suite effectiveness. - Known bugs are randomly added to the source code. - During testing, the percentage of seeded bugs not found indicates the proportion of real bugs likely to remain.
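The estimate behind fault seeding can be made concrete with the standard Mills-style calculation; the slides do not state this formula explicitly, so it is supplied here as the usual interpretation of the technique: if the suite finds a fraction of the seeded bugs, the real bugs it found are assumed to be the same fraction of all real bugs.

```python
def estimated_remaining_faults(seeded_total, seeded_found, real_found):
    """Mills-style fault seeding estimate.

    If the test suite found seeded_found of seeded_total seeded bugs
    and real_found real bugs, the total real-bug population is
    estimated as real_found * seeded_total / seeded_found; subtracting
    the real bugs already found gives the estimated number remaining.
    """
    total_real = real_found * seeded_total / seeded_found
    return total_real - real_found

# Example: 16 of 20 seeded bugs found, alongside 8 real bugs.
remaining = estimated_remaining_faults(seeded_total=20,
                                       seeded_found=16,
                                       real_found=8)
```

Here 80% of seeded bugs were found, so the 8 real bugs found are taken as 80% of the real population, leaving an estimated 2 undetected.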

  19. Cost and Scalability

  20. Conclusion • Presented a framework for automating the testing process for web-based software. • Focuses on usability, cost, and the scalability of evolving the test suite. • Code coverage is used as one of the measures of fault detection for future evaluation.
