
JSMeter: Characterizing the Behavior of JavaScript Web Applications

Paruj Ratanaworabhan, Ben Livshits, David Simmons, and Ben Zorn. In collaboration with Corneliu Barsan and Allen Wirfs-Brock.


Presentation Transcript


  1. JSMeter: Characterizing the Behavior of JavaScript Web Applications Paruj Ratanaworabhan, Ben Livshits, David Simmons, and Ben Zorn In collaboration with Corneliu Barsan and Allen Wirfs-Brock

  2. Why Measure JavaScript? • Standardized, de facto language for the web • Support in every browser, much existing code • Major Microsoft investment • Web Application Companion based on Script# • Benchmarks are driving browser marketing: “browser wars” • Are benchmarks representative? • Limited understanding of its behavior in real sites

  3. (Screenshots: ZDNet, 29 May 2008; ghacks.net, Dec. 2008)

  4. Artificial Benchmarks versus Real World Sites • JSMeter compares: • 7 V8 programs: richards, deltablue, crypto, raytrace, earley-boyer, regexp, splay • 8 SunSpider programs: 3d-raytrace, access-nbody, bitops-nsieve, controlflow, crypto-aes, date-xparb, math-cordic, string-tagcloud • 11 real sites • Goals of this Talk • Show behavior of JavaScript benchmarks versus real sites • Consider how benchmarks can mislead design decisions

  5. Instrumentation Framework • Source-level instrumentation of \ie\jscript\*.cpp yields a custom jscript.dll • Website visits through the custom DLL produce custom trace files • Offline analyzers process those trace files
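The actual framework instruments the interpreter's C++ sources, but the flavor of the trace-then-analyze pipeline can be sketched in JavaScript itself. Everything below (`traceWrap`, the trace-record shape, the "offline analyzer" step) is invented for illustration, not part of JSMeter:

```javascript
// Illustrative sketch only: the real framework instruments \ie\jscript\*.cpp
// inside the interpreter. Here we approximate the idea by wrapping a
// function so each call appends a record to an in-memory "trace file".
var trace = [];   // stands in for the custom trace file

function traceWrap(name, fn) {
    return function () {
        trace.push({ event: "call", name: name, argc: arguments.length });
        return fn.apply(this, arguments);
    };
}

var add = traceWrap("add", function (a, b) { return a + b; });
add(1, 2);
add(3, 4);

// An "offline analyzer" then reduces the trace, e.g. counting calls:
var calls = trace.filter(function (r) { return r.event === "call"; }).length;
```

Real traces would also record allocations and events, and be processed after the browsing session rather than in-page.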

  6. Understanding JavaScript Behavior

  7. Code Behavior • Function size • Instructions/call • Code locality • Instruction mix

  8. Instructions Executed per Call • Real-site example (a large library function, reformatted from the slide):

    function (a, b) {
        var i = 0, elem, pos = a.length;
        if (D.browser.msie) {
            while (elem = b[i++])
                if (elem.nodeType != 8)
                    a[pos++] = elem;
        } else {
            while (elem = b[i++])
                a[pos++] = elem;
        }
        return a;
    }

  (Chart: instructions executed per call for real sites, SunSpider, and V8; values shown: 2,747,484; 9,743; 1,792; 874; 591.)

  9. Object Allocation Behavior • Allocation by types • Lifetime distribution • Live heap composition

  10. Total Memory Allocation by Type (V8 versus real applications) • Types in real applications are: • Different than in the benchmarks • Functions play a larger role • Objects play a smaller role

  11. Live Heap Over Time (gmail) • Strings are transient • Functions and objects grow steadily
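The transient-string, long-lived-object pattern can be illustrated with a toy example (assumed for illustration, not taken from the talk): each simulated keystroke below builds throwaway strings, while the cache object it updates stays live across events:

```javascript
// Toy illustration (not from the talk) of per-event string churn.
// Intermediate strings die quickly; the cache object lives on.
var cache = {};                      // long-lived: stays in the heap

function onKeystroke(buffer, ch) {
    var query = buffer + ch;         // new transient string per event
    var url = "/search?q=" + encodeURIComponent(query); // another one
    cache[query] = url;              // only the kept strings stay live
    return query;
}

var q = "";
["g", "m", "a"].forEach(function (ch) { q = onKeystroke(q, ch); });
```

In a real autocomplete handler most of these intermediates become garbage immediately, which is why the talk highlights string collection as an issue benchmarks ignore.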

  12. Live Heap over Time (ebay) • Heap contains mostly functions • Heaps repeatedly created, then discarded

  13. Event Handlers in JavaScript • Number of events • Sizes of handlers

  14. Event-driven Programming Model • Single-threaded, non-preemptive event handlers • Example of some standard event-handler attributes: onabort, onclick, and onmouseover • Very different from the batch processing of benchmarks • Handler responsiveness critical to user experience
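A minimal sketch of the model described above, with the browser's event loop and DOM element stubbed out for illustration (the queue and dispatch loop are assumptions of this sketch; only the `onclick`/`onmouseover` attribute names come from the slide):

```javascript
// Minimal sketch of the single-threaded, non-preemptive handler model.
// A handler always runs to completion; later events wait in the queue.
var queue = [];
var log = [];

var element = {                      // stub standing in for a DOM element
    onclick: function () { log.push("click handled"); },
    onmouseover: function () { log.push("mouseover handled"); }
};

function fireEvent(type) { queue.push(type); }  // events never interrupt

function runEventLoop() {            // stand-in for the browser's loop
    while (queue.length) {
        var type = queue.shift();
        var handler = element["on" + type];
        if (handler) handler();      // runs to completion, one at a time
    }
}

fireEvent("click");
fireEvent("mouseover");  // queued until the click handler finishes
runEventLoop();
```

Because handlers cannot be preempted, a single slow handler stalls every queued event, which is why the talk ties handler size directly to responsiveness.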

  15. Number of Events Handled (Chart: number of events handled by real sites versus the V8 benchmarks.)

  16. Impact of Benchmarks • What gets emphasis • Making tight loops fast • Optimizing small amounts of code • Important issues ignored • Garbage collection (especially of strings) • Managing large amounts of code • Optimizing event handling • Considering JavaScript context between page loads

  17. Conclusions • JSMeter is an instrumentation framework • Used to measure and compare JavaScript applications • Behavior in benchmarks significantly differs from real sites • Misleads designers, skews implementations • Next steps • Develop and promote better benchmarks • Use insights from our measurements to guide designers • Make measurement infrastructure broadly available to JavaScript developers

  18. Object Lifetimes and Heap Content: google (Web 1.0) versus bing (Web 2.0)

  19. Backup Slides

  20. Total Bytes Allocated (V8 versus real applications) • Many benchmarks allocate little data

  21. Sizes of Handlers • Maximum and median executed handler instructions: • bingmap: max 1,107,532; median 314 • gmail: max 594,482; median 506 • facebook: max 89,785; median 380 • crypto: max 86,000,000
