Collaborative runtime verification with tracematches

Eric Bodden, Laurie Hendren, Patrick Lam (McGill University)

Ondrej Lhotak, Nomair A. Naeem (University of Waterloo)


Problem

Ideally, runtime verification code should be included in deployed programs:

  • Allows for easier debugging

  • Actual usage vs. test case coverage

    Current runtime monitoring approaches do not scale well enough.

Here: Tracematches


A common programming problem

Collection c =
  Collections.synchronizedCollection(myC);

synchronized (c) {
  Iterator i = c.iterator();
  while (i.hasNext())
    foo(i.next());
}


Tracematch "ASyncIteration"

tracematch(Object c) {
  sym sync after returning(c):
    call(* Collections.synchr*(..));
  sym asyncIter before:
    call(* Collection+.iterator()) && target(c)
    && if(!Thread.holdsLock(c));

  sync asyncIter {
    System.err.println(
      "Iterations over "+c+" must be synchronized!"
    );
  }
}
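As a hedged illustration of what this tracematch catches (a minimal sketch; the class name, the foo method and the sample collection are made up, not taken from the talk), the following client obtains an iterator on a synchronized collection without holding its lock, so first the sync symbol and then the asyncIter symbol match and the warning is printed:

import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.Iterator;

class AsyncIterationExample {
    static void foo(Object o) { System.out.println(o); }

    public static void main(String[] args) {
        Collection<String> myC = new ArrayList<>();
        myC.add("element");

        // "sync" matches here: returning from Collections.synchronizedCollection
        Collection<String> c = Collections.synchronizedCollection(myC);

        // "asyncIter" matches here: iterator() is called while
        // Thread.holdsLock(c) is false, so the tracematch body fires.
        Iterator<String> i = c.iterator();
        while (i.hasNext())
            foo(i.next());

        // The correct idiom wraps the iteration in synchronized (c) { ... },
        // which makes Thread.holdsLock(c) true at the iterator() call.
    }
}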



Static Optimizations (ECOOP 2007)

  • Quick check:

    Eliminate incomplete tracematches

  • Pointer analysis:

    Retain “consistent sets of instrumentation points”

    Brings overhead under 10% in most cases.

    However, some overheads still exceed 150%!

    Goal: 10% overhead in all cases
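As a rough sketch of the quick check (hypothetical names, and slightly simplified relative to the ECOOP 2007 analysis): if some symbol declared by a tracematch has no shadow anywhere in the program, the pattern can never complete, so all instrumentation for that tracematch can be removed.

import java.util.List;
import java.util.Map;
import java.util.Set;

// Simplified sketch of the "quick check"; the types and names are hypothetical.
class QuickCheck {
    // shadowsPerSymbol maps each declared symbol of a tracematch
    // (e.g. "sync", "asyncIter") to the shadows found for it in the program.
    static boolean mayEverMatch(Set<String> declaredSymbols,
                                Map<String, List<String>> shadowsPerSymbol) {
        for (String symbol : declaredSymbols) {
            List<String> shadows = shadowsPerSymbol.getOrDefault(symbol, List.of());
            if (shadows.isEmpty())
                return false;  // this symbol can never occur => no complete match
        }
        return true;           // keep the instrumentation for this tracematch
    }
}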


Collaborative runtime verification with tracematches

Spatial partitioning

Collaborative runtime verification


Spatial partitioning in detail

First, identify probes:

  • A probe is a set of instrumentation points (shadows) that could together lead to a match

  • Such sets of shadows are found using a flow-insensitive points-to analysis (see the sketch below)
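A minimal sketch of the grouping step, under the assumption that shadows are grouped by whether the points-to sets of their bound variables overlap (the class and method names are made up; the actual implementation works on the compiler's points-to abstraction):

import java.util.ArrayList;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.Set;

// Hypothetical sketch: group shadows whose points-to sets overlap into probes,
// i.e. connected components of the "may bind the same object" relation.
class ProbeGrouping {

    static class Shadow {
        final String label;
        final Set<String> pointsTo;  // abstract objects the bound variable may point to
        Shadow(String label, Set<String> pointsTo) {
            this.label = label;
            this.pointsTo = pointsTo;
        }
        @Override public String toString() { return label; }
    }

    static List<List<Shadow>> groupIntoProbes(List<Shadow> shadows) {
        List<List<Shadow>> probes = new ArrayList<>();
        for (Shadow s : shadows) {
            List<Shadow> component = new ArrayList<>();
            component.add(s);
            // Merge every existing probe that shares an abstract object with s.
            Iterator<List<Shadow>> it = probes.iterator();
            while (it.hasNext()) {
                List<Shadow> probe = it.next();
                boolean overlaps = false;
                for (Shadow t : probe)
                    if (!Collections.disjoint(t.pointsTo, s.pointsTo)) { overlaps = true; break; }
                if (overlaps) { component.addAll(probe); it.remove(); }
            }
            probes.add(component);
        }
        return probes;
    }

    public static void main(String[] args) {
        // Points-to sets are invented for illustration, loosely following the
        // labels in the "Identifying probes" diagram.
        System.out.println(groupIntoProbes(List.of(
            new Shadow("sync(c=c1)",      Set.of("o1")),
            new Shadow("asyncIter(c=c2)", Set.of("o1", "o2")),
            new Shadow("asyncIter(c=c3)", Set.of("o2")))));
    }
}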


Identifying probes

[Diagram: points-to sets link the shadows sync(c=c1), asyncIter(c=c2) and asyncIter(c=c3) to the abstract objects o1 and o2; shadows whose points-to sets overlap are grouped into one probe.]



Collaborative runtime verification with tracematches

Temporal partitioning

Problem: Hot shadows


Could switching probes on and off lead to false positives?

  • No: due to the tracematch semantics, a probe can safely be enabled at any time.

    Unlike, e.g., LTL properties, tracematches always match against a suffix of the execution trace, so enabling a probe late can only miss matches, never report spurious ones.

  • A probe can also be disabled at any time.

    We just have to make sure we discard its partial bindings.

[Automaton for the tracematch: an initial state with a * self-loop, a sync edge to a middle state with a skip(aSyncIter) self-loop, and an aSyncIter edge to the final (matching) state.]


Code generation for probe switching

[Diagram: automaton state numbers 0-4 and the shadows sync(c=c1), sync(c=c5), asyncIter(c=c2), asyncIter(c=c3), asyncIter(c=c4), illustrating the code generated so that a probe's shadows can be switched on and off together.]
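As a hedged sketch of what the generated switching code could look like (the class, field and method names here are assumptions, not taken from the actual compiler): each shadow of a probe is guarded by a shared flag, and disabling the probe also discards the partial bindings it has accumulated, which is what keeps later re-enabling sound.

import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical sketch of probe switching; not the tool's actual generated code.
class Probe {
    private final AtomicBoolean enabled = new AtomicBoolean(true);
    private final TracematchState state = new TracematchState();

    // Called by the instrumentation woven at each shadow of this probe.
    void onSyncShadow(Object c)      { if (enabled.get()) state.afterSync(c); }
    void onAsyncIterShadow(Object c) { if (enabled.get()) state.beforeAsyncIter(c); }

    // Temporal partitioning: switch the probe off when it is too hot.
    void disable() {
        enabled.set(false);
        state.discardPartialBindings();  // so that re-enabling cannot create false positives
    }

    // Always safe: the tracematch then simply matches against a later suffix of the trace.
    void enable() { enabled.set(true); }
}

// Placeholder for the per-probe automaton state; details are omitted here.
class TracematchState {
    void afterSync(Object c)       { /* advance the automaton for the binding c */ }
    void beforeAsyncIter(Object c) { /* advance the automaton; report a match if final */ }
    void discardPartialBindings()  { /* drop all partial matches held by this probe */ }
}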


Benchmarks

  • ECOOP ’07 benchmarks with largest overheads

  • Ran each benchmark/tracematch combination with one probe enabled at a time

  • Measured relative runtime overhead


Overheads after spatial partitioning


Future work

  • Implement temporal partitioning

    • Requires probabilistic foundation

  • Try this out on a larger scale

    • Need Java programs with a large user base, willing to cooperate

  • Try using JVM support to find hot probes

    • Production JVMs already compute statistics

    • Would enable more efficient probe switching

  • Eliminate super-hot shadows through better static analysis


Conclusion

  • Sound collaborative RV is possible using tracematches

  • Can construct probes using a flow-insensitive points-to analysis

  • The approach works for some programs, but very hot shadows can still be bottlenecks

  • Found a heuristic to statically identify shadows with potentially high runtime impact

  • Further static optimizations are probably more promising


Thank you

Thank you for listening, and thanks to the entire AspectBench Compiler group for their enduring support!

Download our tool, examples and benchmarks at:

www.aspectbench.org