
Precisely Selecting Regression Test for Aspect-Oriented Programs



  1. Precisely Selecting Regression Test for Aspect-Oriented Programs Guoqing Xu The Ohio State University xug@cse.ohio-state.edu

  2. Outline • Big picture and background • Problem statement and motivation • Our analysis • Implementation status • Related work

  3. Big Picture • AOSD improves separation of concerns • Challenges for analysis of AO programs • Control flow in bytecode is not equivalent to the program logic in source code • Compiler-specific weaving • Dynamic/static parts of pointcut designators • No analysis framework implementation available • Challenges for testing of AO programs • Coverage criteria; test generation for exercising aspects; regression test selection

  4. Big Picture • Long-term goals • Framework that supports static analyses similar to popular analyses for OO programs • Develop AO-specific techniques for analysis and testing • Topic of this talk • Regression test selection for AO programs • Work in progress

  5. Regression Test Selection • Testing after software modifications • Re-running the entire regression test suite is expensive • Select a subset of tests to run • Safe test selection chooses every test case that may reveal a fault

  6. Related Previous Work • Rothermel and Harrold, TOSEM 97 • Graph traversal algorithms • Harrold et al., OOPSLA 01 • Java interclass graph (JIG) • [Figure: selection workflow, copied from [Harrold et al., OOPSLA 01] — execute P and record coverage to obtain P’s edge-coverage matrix; compare P and P’ to identify dangerous entities (containing edges in P); select the tests that cover dangerous edges]
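The selection step of this workflow can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the string edge ids, test names, and the `select` method are all hypothetical stand-ins for the real edge-coverage matrix.

```java
import java.util.*;

// Sketch of safe regression test selection in the [Rothermel-Harrold] style:
// a test is selected if it covers at least one dangerous edge.
public class SelectTests {
    // An edge is identified here simply by a "src->dst" string.
    public static Set<String> select(Map<String, Set<String>> coverage,
                                     Set<String> dangerousEdges) {
        Set<String> selected = new TreeSet<>();
        for (Map.Entry<String, Set<String>> test : coverage.entrySet()) {
            for (String edge : test.getValue()) {
                if (dangerousEdges.contains(edge)) {
                    selected.add(test.getKey());  // test may reveal a fault
                    break;
                }
            }
        }
        return selected;
    }

    public static void main(String[] args) {
        Map<String, Set<String>> cov = new HashMap<>();
        cov.put("t1", Set.of("entry->m", "m->exit"));
        cov.put("t2", Set.of("entry->bar", "bar->exit"));
        System.out.println(select(cov, Set.of("m->exit")));  // prints [t1]
    }
}
```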

  7. Regression Testing for AOSD • When AO features are added/modified, the program needs to be regression tested • Case 1: P is an OO program and P’ is an AO version of P • Case 2: Both P and P’ are AO programs • How should regression test selection be performed for AO software?

  8. Outline • Big picture and background • Problem statement and motivation • Our analysis • Implementation status • Related work

  9. Existing Work Applied to AO Programs • How to compare two JIGs? • The JIG of the woven code includes redundant nodes and edges and does not correspond to the logical control flow as presented in the source • Need new representations • How to recover CFG edges from the execution trace when computing the edge-coverage matrix? (when P is an AO program) • The execution trace is compiler-specific • Need instrumentation before/during weaving

  10. Existing Work Applied to AO Programs • Java version P:
    class C {
      public void m(int i) {…}
    }
    void bar(C p, int k) {
      p.m(k);
    }
  [Figure: Java Interclass Graph (JIG) for P — nodes bar(), p.m(), C.m(), return, exit, connected by CFG edges and a call edge from p.m() to C.m()]

  11. Example • AspectJ version P’:
    class C {
      public void m(int i) {…}
    }
    void bar(C p, int k) {
      p.m(k);
    }
    aspect Sample {
      void around(C c, int i):
          call(C.m(int)) && target(c) && args(i) {
        proceed(c, i);
      }
    }
  [Figure: JIG for P vs. JIG for P’ (from the woven code) — in P’, the call p.m() is routed through the woven wrapper Sample.around$0 before reaching C.m()]

  12. Some Results • When we applied the [Harrold et al. 01] algorithm to several subjects:

  13. Possible Approach • Create “clean” CFGs in which the wrapper code inserted during weaving is removed • Graph traversal and comparison corresponds to the “logical” structure of the code, not the compiler-specific woven code • New representation: AJIG • AspectJ Inter-module Graph – more later • For regression test selection, need to consider additional issues

  14. A More Complex Problem • AspectJ version P’ (the advice now only prints i before proceeding):
    class C {
      public void m(int i) {…}
    }
    void bar(C p, int k) {
      p.m(k);
    }
    aspect Sample {
      void around(C c, int i):
          call(C.m(int)) && target(c) && args(i) {
        System.out.println(i);
        proceed(c, i);
      }
    }
  [Figure: AJIG for P’ — the call in bar() flows through the advice body (System.out.println, then proceed to C.m()); one edge is marked in red]
  • Do we need to select all the tests that go through the edge marked in red?

  15. Why it needs to be addressed • This is an issue not only for AO software, but also for procedural and OO software • Advices are often free of side effects • The study in [Rinard et al. FSE 04] reported 6 “observer” advices out of ten inspected advices • Recommended for “safe” AO programming • Adding side-effect-free advices should not result in overly conservative regression test selection • Approach: use side-effect analysis

  16. Outline • Big picture and background • Problem statement and motivation • Our analysis • Implementation status • Related work

  17. Our Work • Consider both situations: • Case 1: P is an OO version, P’ is an AO version • Case 2: both P and P’ are AO versions • Analysis to select regression tests • Build a new control flow representation: AJIG • Apply existing graph-traversal algorithm on AJIG • Side-effect analysis when comparing AJIGs

  18. AJIG • AspectJ Inter-module Graph (recent work) • For the Java parts, same as JIG • Shadow node • A shadow node is associated with • a set of JIGs of advices • the precedence of these advices • an integrated shadow advice JIG • AJIG supports all static AspectJ join point types • Conservatively approximates the dynamic part of pointcut designators
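As a rough illustration of the shadow-node idea described above: a shadow node groups the advices that apply at one join point shadow in precedence order. All names and fields here are hypothetical; the real AJIG associates whole advice JIGs and an integrated shadow advice JIG, not just identifiers.

```java
import java.util.*;

// Illustrative data-structure sketch of an AJIG shadow node (hypothetical API).
public class ShadowNode {
    private final String joinPointShadow;            // e.g., "call(C.m(int))"
    private final List<String> advicesByPrecedence;  // advice ids, highest precedence first

    public ShadowNode(String shadow, List<String> advices) {
        this.joinPointShadow = shadow;
        this.advicesByPrecedence = List.copyOf(advices);
    }

    // The order in which the woven advices apply at this shadow.
    public List<String> weavingOrder() {
        return advicesByPrecedence;
    }

    public String shadow() {
        return joinPointShadow;
    }
}
```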

  19. Side Effect • Side-effect node: mutates objects that existed before the method/advice is invoked • For an advice: • All nodes on a CFG path that does not contain a proceed node • The proceed node has a side effect if its actual parameters and the parameters of the advice don’t point to the same objects • Any node that mutates the object returned by the proceed node • The return node has a side effect if the return value of the advice and that of the proceed node don’t point to the same object
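The proceed/return rules above hinge on aliasing. A minimal sketch, assuming points-to information has already been abstracted into integer object ids (the class, method names, and this abstraction are all hypothetical, not the analysis's actual interface):

```java
// Sketch of the side-effect rules for proceed and return nodes of an advice.
public class AdviceSideEffects {
    // A proceed node has a side effect if the actuals passed to proceed()
    // may point to different objects than the advice's own parameters.
    public static boolean proceedHasSideEffect(int[] actualObjs, int[] adviceParamObjs) {
        if (actualObjs.length != adviceParamObjs.length) return true;
        for (int i = 0; i < actualObjs.length; i++) {
            if (actualObjs[i] != adviceParamObjs[i]) return true;
        }
        return false;
    }

    // The return node has a side effect if the advice's return value and the
    // value returned by proceed() may point to different objects.
    public static boolean returnHasSideEffect(int adviceRetObj, int proceedRetObj) {
        return adviceRetObj != proceedRetObj;
    }
}
```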

  20. Side Effect (Cont’d) • Side-effect-related node in AJIG • Has side effects, or • Has some dependency on nodes that have side effects • Safe edge in AJIG • An edge is a safe edge if the sink node of the edge is not side-effect related

  21. New Test Selection Criterion • Dangerous set S computation: for each edge e in P and its counterpart e’ in P’, add e to S if • e is not equivalent to e’, and • neither e nor e’ is a safe edge • Select each test that executes one or more edges in S
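The dangerous-set criterion above can be sketched directly. This assumes the AJIG comparison has already paired each edge of P with its counterpart in P’ and decided equivalence and safety; the `EdgePair` record and all names are illustrative:

```java
import java.util.*;

// Sketch of the dangerous-set computation: an edge enters S when it differs
// from its counterpart and neither edge is safe (side-effect related sink).
public class DangerousSet {
    // One edge of P paired with its counterpart in P'.
    public record EdgePair(String id, boolean equivalent,
                           boolean safeInP, boolean safeInPPrime) {}

    public static Set<String> compute(List<EdgePair> pairs) {
        Set<String> s = new TreeSet<>();
        for (EdgePair p : pairs) {
            if (!p.equivalent() && !p.safeInP() && !p.safeInPPrime()) {
                s.add(p.id());
            }
        }
        return s;
    }

    public static void main(String[] args) {
        // A modified, unsafe edge is dangerous; an equivalent edge is not.
        Set<String> s = compute(List.of(
                new EdgePair("bar->around$0", false, false, false),
                new EdgePair("m->exit", true, false, false)));
        System.out.println(s);  // prints [bar->around$0]
    }
}
```

Note how the safe-edge check implements the relaxation motivated earlier: a changed edge whose sink is not side-effect related (e.g., one introduced by an observer advice) is filtered out of S, so tests through it need not be re-run.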

  22. Selection • Computing dangerous set S by comparing AJIGs. • We plan to use some form of side-effect analysis • Large body of existing work • Selecting tests

  23. Implementation progress • The implementation of the algorithms described in [Harrold et al. 01] √ • Building the AspectJ Inter-module Graph √ • An extension to the abc compiler that generates Jimple-based CFGs for aspects, between the weaving of inter-type declarations and the weaving of advices • Instrument advices at different phases • Side-effect analysis (ongoing work) • Evaluation

  24. Evaluation Plan • Benchmarks: ten to twenty moderate-size (50-100 classes) AO projects taken from tmbenches, used by the abc compiler • Experiments: collect data on • the number of tests selected over versions • the comparison between enabling and disabling side-effect analysis

  25. Related Work • Static/dynamic analysis for AO programs • abc compiler [AOSD 05] [PLDI 05] [TR 04] • Static analysis of aspects [Sereni and de Moor, AOSD 02] • Zhao’s work on the analysis and testing of AO programs [COMPSAC 03] [WPC 02] [AOSD 06] • Classification system for AO programs [Rinard et al. FSE 04]

  26. Related Work (Cont’d) • Regression Test Selection • [LW ICSM 91], [CRV ISCE 94], [RH TOSEM 97], [Ball ISSTA 98], [Harrold+ OOPSLA 01], [OSH FSE 04]… • Change Impact Analysis • [KGH+ ICSM 94], [RT PASTE 01], [OAH FSE 03], [OAL+ ICSE 04], [RST+ OOPSLA 04], …

  27. Thank you!! Questions??
