User-session based Testing of Web Applications
Two Papers
  • A Scalable Approach to User-session based Testing of Web Applications through Concept Analysis
    • Uses concept analysis to reduce test suite size
  • An Empirical Comparison of Test Suite Reduction Techniques for User-session-based Testing of Web Applications
    • Compares concept analysis to other test suite reduction techniques
Talk Outline
  • Introduction
  • Background
    • User-session Testing
    • Concept Analysis
  • Applying Concept Analysis
    • Incremental Reduced Test Suite Update
    • Empirical Evaluation (Incremental vs. Batch)
  • Empirical Comparison of Concept Analysis to other Test Suite Reduction Techniques
  • Conclusions
Characteristics of Web-based Applications
  • Short time to market
  • Integration of numerous technologies
  • Dynamic generation of content
  • May contain millions of LOC
  • Extensive use
    • Need for high reliability, continuous availability
  • Significant interaction with users
  • Changing user profiles
  • Frequent small maintenance changes
User-session Testing
  • User session
    • A collection of user requests in the form of URLs and name-value pairs
  • User sessions are transformed into test cases
    • Each logged request in a user session is changed into an HTTP request that can be sent to a web server (see the replay sketch after this list)
  • Previous studies of user-session testing
    • Previous results showed fault detection capabilities and cost effectiveness
    • Will not uncover faults associated with rarely entered data
    • Effectiveness improves as the number of sessions increases (downside: cost increases as well)
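
A minimal sketch of this transformation, assuming a hypothetical log format (requested URL plus name-value pairs) and a hypothetical base URL for the deployed application; neither is prescribed by the papers:

```python
# Replay one logged user request against the application under test.
# BASE and the (url, params) log format are assumptions for illustration.
from urllib.parse import urlencode
from urllib.request import urlopen

BASE = "http://localhost:8080"  # hypothetical deployment of the app under test

def replay(logged_request):
    """Turn a logged (url, name-value pairs) entry into a live HTTP request."""
    url, params = logged_request
    query = urlencode(params)
    with urlopen(f"{BASE}{url}?{query}") as response:
        return response.read()  # page body, compared later by the oracle

# One entry of a user session: requested URL plus its name-value pairs.
body = replay(("/bookstore/search.jsp", {"title": "compilers"}))
```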
Contributions
  • View user sessions as use cases
  • Apply concept analysis for test suite reduction
  • Perform incremental test suite update
  • Automate testing framework
  • Evaluate cost effectiveness
    • Test suite size
    • Program coverage
    • Fault detection
Concept Analysis
  • Technique for clustering objects that have common discrete attributes
  • Input:
    • Set of objects O
    • Set of attributes A
    • Binary relation R
      • Relates objects to attributes
      • Implemented as a Boolean-valued table
        • A row for each object in O
        • A column for each attribute in A
      • Table entry [o, a] is true if object o has attribute a, otherwise false
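
A tiny sketch of the input structures, with invented object and attribute names; `has` encodes the binary relation R as the set of attributes each object possesses:

```python
# Objects O, attributes A, and the relation R as a Boolean table.
objects = ["o1", "o2", "o3"]
attributes = ["a1", "a2", "a3"]
has = {"o1": {"a1", "a2"}, "o2": {"a2"}, "o3": {"a2", "a3"}}

# table[o][a] is True iff object o has attribute a (the table entry [o, a]).
table = {o: {a: (a in has[o]) for a in attributes} for o in objects}
```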
Concept Analysis (2)
  • Identifies concepts given (O, A, R)
  • A concept is a pair (Oi, Aj): a set of objects Oi ⊆ O and the set of attributes Aj ⊆ A they all share, each maximal with respect to the other
    • Concepts form a partial order
  • Output:
    • Concept lattice represented by a DAG
      • Node represents concept
      • Edge denotes the partial ordering
    • Top element ⊤ = most general concept
      • Contains attributes that are shared by all objects in O
    • Bottom element ⊥ = most specific concept
      • Contains objects that have all attributes in A
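
To make the definitions concrete, here is a naive enumeration of all concepts for a small context, reusing `objects`, `attributes`, and `has` from the sketch above. It is not the algorithm used in the papers, just the textbook characterization that every concept intent is an intersection of object intents:

```python
def concepts(objects, attributes, has):
    # Candidate intents: each object's attribute set, plus the full set A
    # (the intent of the bottom element).
    intents = {frozenset(attributes)} | {frozenset(has[o]) for o in objects}
    changed = True
    while changed:  # close the set of intents under intersection
        changed = False
        for x in list(intents):
            for y in list(intents):
                z = x & y
                if z not in intents:
                    intents.add(z)
                    changed = True
    # Pair each closed intent with its extent: the objects having all its attributes.
    return [(sorted(o for o in objects if b <= has[o]), sorted(b)) for b in intents]

for extent, intent in concepts(objects, attributes, has):
    print(extent, intent)  # top has the largest extent, bottom the largest intent
```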
Concept Analysis for Web Testing
  • Binary relation table
    • User session s = object
    • URL u = attribute
    • A pair (s, u) is in the relation table if s requests u
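
A sketch of how the relation might be built from a server log; the per-line format (session id, then requested URL) is an assumption for illustration:

```python
from collections import defaultdict

# Hypothetical pre-grouped log lines: session id followed by requested URL.
log_lines = [
    "us1 /bookstore/Default.jsp",
    "us1 /bookstore/ShoppingCart.jsp",
    "us2 /bookstore/Default.jsp",
]

requests_by_session = defaultdict(set)
for line in log_lines:
    session, url = line.split()
    requests_by_session[session].add(url)  # puts (s, u) into the relation
```

Here `requests_by_session` plays the same role as `has` in the earlier sketches, with sessions as objects and URLs as attributes.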
Concept Lattice Explained
  • Top node ⊤
    • Most general concept
    • Contains URLs that are requested by all user sessions
  • Bottom node ⊥
    • Most specific concept
    • Contains user sessions that request all URLs
  • Examples:
    • Identification of common URLs requested by 2 user sessions
      • us3 and us4
    • Identification of user sessions that jointly request 2 URLs
      • PL and GS
Concept Analysis for Test Suite Reduction
  • Exploit lattice’s hierarchical use-case clustering
  • Heuristic
    • Identify smallest set of user sessions that will cover all URLs executed by original suite
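
A simplified sketch of this heuristic. Assuming no single session requests every URL, the next-to-bottom concepts correspond exactly to the session URL-sets that are maximal under inclusion, so keeping one session per maximal set covers every URL in the original suite:

```python
def reduce_suite(requests_by_session):
    """Keep one user session per next-to-bottom concept (maximal URL set)."""
    sessions = list(requests_by_session)
    reduced, kept_intents = [], []
    for s in sessions:
        urls = requests_by_session[s]
        # Skip s if another session's URL set strictly contains its own.
        if any(urls < requests_by_session[t] for t in sessions if t != s):
            continue
        if urls not in kept_intents:  # one representative per concept
            kept_intents.append(urls)
            reduced.append(s)
    return reduced
```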
Incremental Test Suite Update (2)
  • Incremental algorithm by Godin et al.
    • Create new nodes/edges
    • Modify existing nodes/edges
  • Next-to-bottom nodes may rise up in the lattice
  • Existing internal nodes never sink to the bottom
  • Test cases are not maintained for internal nodes
  • The set of next-to-bottom nodes (user sessions) forms the reduced test suite
Empirical Evaluation
  • Test suite reduction
    • Test suite size
    • Replay time
    • Oracle time
  • Cost-effectiveness of incremental vs. batch concept analysis
  • Program coverage
  • Fault detection capabilities
Experimental Setup
  • Bookstore Application
    • 9748 LOC
    • 385 methods
    • 11 classes
  • JSP front-end, MySQL backend
  • 123 user sessions
  • 40 seeded faults
Test Suite Reduction
  • Metrics
    • Test suite size
    • Replay time
    • Oracle time
Incremental vs. Batch Analysis
  • Metric
    • Space costs
    • Relative sizes of files required by incremental and batch techniques
  • Methodology
    • Batch: 123 user sessions processed
    • Incremental: 100 processed first, then 23 incrementally
Program Coverage
  • Metrics
    • Statement coverage
    • Method coverage
  • Methodology
    • Instrumented Java classes using Clover
    • Restored database state before replay
    • Replayed user sessions with wget
Fault Detection Capability
  • Metric
    • Number of faults detected
  • Methodology
    • Manually seeded 40 faults into separate copies of the application
    • Replayed user sessions through
      • Correct version to generate expected output
      • Faulty version to generate actual output
    • Diff expected and actual outputs
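
A minimal sketch of the diff-based oracle, assuming the correct and fault-seeded deployments are each wrapped in a replay function like the earlier sketch:

```python
import difflib

def detects_fault(logged_request, replay_correct, replay_faulty):
    """Return True if the faulty version's output differs from the expected output."""
    expected = replay_correct(logged_request).decode().splitlines()
    actual = replay_faulty(logged_request).decode().splitlines()
    # Any difference in the generated pages means the seeded fault was exposed.
    return any(difflib.unified_diff(expected, actual, lineterm=""))
```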
Empirical Comparison of Test Suite Reduction Techniques
  • Compared 3 variations of the concept analysis approach (Concept) with 3 requirements-based reduction techniques
    • Random
    • Greedy
    • Harrold, Gupta, and Soffa’s reduction (HGS)
  • Each requirements-based reduction technique satisfies program or URL coverage
    • Statement, method, conditional, URL
Random and Greedy Reduction
  • Random
    • Selection process continues until reduced test suite satisfies some coverage criterion
  • Greedy
    • Each subsequent test case selected provides maximum coverage of some criterion
    • Example:
      • Select us6 – maximum URL coverage
      • Then, select us2 – most marginal improvement for all-URL coverage criterion
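
A sketch of the greedy loop for the all-URL criterion, reusing the hypothetical `requests_by_session` mapping from earlier; substituting sets of covered statements or methods gives the program-based variants:

```python
def greedy_reduce(requests_by_session):
    """Repeatedly pick the session covering the most not-yet-covered URLs."""
    uncovered = set().union(*requests_by_session.values())
    reduced = []
    while uncovered:
        # Select the session with the largest marginal coverage gain.
        best = max(requests_by_session,
                   key=lambda s: len(requests_by_session[s] & uncovered))
        reduced.append(best)
        uncovered -= requests_by_session[best]
    return reduced
```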
HGS Reduction
  • Selects a representative set from the original by approximating the optimal reduced set
  • Requirement cardinality = # of test cases covering that requirement
  • Repeatedly select the test case that occurs most frequently among the test sets of the still-uncovered requirements with the lowest cardinality
  • Example:
    • Consider requirement with cardinality 1 – GM
      • Select us2
    • Consider requirement with cardinality 2 – PL and GB
    • Select test case that occurs most frequently in the union
      • us6 occurs twice, us3 and us4 once
      • Select us6
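
A simplified sketch of HGS (omitting the paper's full tie-breaking recursion), driven by a hypothetical mapping from each requirement to the tests covering it; the data reproduces the slide's example:

```python
def hgs_reduce(covering_tests):
    """covering_tests: requirement -> set of test cases covering it."""
    uncovered = set(covering_tests)
    reduced = set()
    while uncovered:
        # Lowest cardinality among the still-uncovered requirements.
        card = min(len(covering_tests[r]) for r in uncovered)
        pool = [t for r in uncovered if len(covering_tests[r]) == card
                for t in covering_tests[r]]
        chosen = max(set(pool), key=pool.count)  # most frequent test wins
        reduced.add(chosen)
        uncovered = {r for r in uncovered if chosen not in covering_tests[r]}
    return reduced

# GM has cardinality 1, so us2 is picked first; among PL and GB
# (cardinality 2), us6 occurs twice and is picked next.
print(hgs_reduce({"GM": {"us2"}, "PL": {"us3", "us6"}, "GB": {"us4", "us6"}}))
```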
Empirical Evaluation
  • Test suite size
  • Program coverage
  • Fault detection effectiveness
  • Time cost
  • Space cost
Experimental Setup
  • Bookstore application
  • Course Project Manager (CPM)
    • Create grader/group accounts
    • Assign grades, create schedules for demo time
    • Send notification emails about account creation, grade postings
Test Suite Size
  • Suite Size Hypothesis
    • Larger suites than:
      • HGS and Greedy
    • Smaller suites than:
      • Random
    • More diverse in terms of use case representation
  • Results
    • Bookstore application:
      • HGS-S, HGS-C, GRD-S, and GRD-C created larger suites than Concept
    • CPM:
      • Larger suites than HGS and Greedy
      • Smaller than Random
Program Coverage
  • Coverage Hypothesis
    • Similar coverage to:
      • Original suite
    • Less coverage than:
      • Suites that satisfy program-based requirements
    • Higher URL coverage than:
      • Greedy and HGS with URL criterion
  • Results
    • Program coverage comparable to (within 2% of) PRG_REQ techniques
    • Slightly less program coverage than original suite and Random
    • More program coverage than URL_REQ techniques, Greedy and HGS
Fault Detection Effectiveness
  • Fault Detection Hypothesis
    • Greater fault detection effectiveness than:
      • Requirements-based techniques with URL criterion
    • Similar fault detection effectiveness to:
      • Original suite
      • Requirements-based techniques with program-based criteria
  • Results
    • Random with program-based criteria (Random PRG_REQ) achieved the best fault detection but a low number of faults detected per test case
    • Similar fault detection to the best PRG_REQ techniques
    • Detected more faults than HGS-U
Time and Space Costs
  • Costs Hypothesis
    • Less space and time than:
      • HGS, Greedy, Random
    • Space for Concept Lattice vs. space for requirement mappings
  • Results
    • Costs considerably less than PRG_REQ techniques
    • Collecting coverage information for each session is the clear bottleneck of requirements-based approaches
Conclusions
  • Problems with Greedy and Random reduction
    • Non-determinism
    • Generated suites with a wide range in size, coverage, and fault detection effectiveness
  • Test suite reduction based on concept-analysis clustering of user sessions
    • Achieves large reduction in test suite size
    • Saves oracle and replay time
    • Preserves program coverage
    • Preserves fault detection effectiveness
      • Chooses test cases based on use case representation
  • Incremental test suite reduction/update
    • Scalable approach to user-session-based testing of web applications
    • Necessary for web applications that undergo constant maintenance, evolution, and usage changes
References
  • Sreedevi Sampath, Valentin Mihaylov, Amie Souter, and Lori Pollock, "A Scalable Approach to User-session based Testing of Web Applications through Concept Analysis," Automated Software Engineering Conference (ASE), September 2004.
  • Sara Sprenkle, Sreedevi Sampath, Emily Gibson, Amie Souter, and Lori Pollock, "An Empirical Comparison of Test Suite Reduction Techniques for User-session-based Testing of Web Applications," International Conference on Software Maintenance (ICSM), September 2005.