
JSMeter: Characterizing the Behavior of JavaScript Web Applications

Paruj Ratanaworabhan, Ben Livshits, and Ben Zorn

In collaboration with David Simmons, Corneliu Barsan, and Allen Wirfs-Brock


Presentation Transcript


  1. JSMeter: Characterizing the Behavior of JavaScript Web Applications Paruj Ratanaworabhan, Ben Livshits, and Ben Zorn In collaboration with David Simmons, Corneliu Barsan, and Allen Wirfs-Brock

  2. Why Measure JavaScript? • Standardized, de facto language for the web • Support in every browser, much existing code • Major Microsoft investment • Web Application Companion based on Script# • Benchmarks are driving browser marketing: “browser wars” • Are benchmarks representative? • Limited understanding of its behavior in real sites • Who cares? • Users, web site designers, JavaScript engine developers

  3. ZDNet 29 May 2008 ghacks.net Dec. 2008

  4. Artificial Benchmarks versus Real-World Sites JSMeter • 7 V8 programs: • richards • deltablue • crypto • raytrace • earley-boyer • regexp • splay • 8 SunSpider programs: • 3d-raytrace • access-nbody • bitops-nsieve • controlflow • crypto-aes • date-xparb • math-cordic • string-tagcloud • 11 real sites • Goals of this Talk • Show behavior of JavaScript benchmarks versus real sites • Consider how benchmarks can mislead design decisions

  5. Instrumentation Framework Source-level instrumentation of \ie\jscript\*.cpp produces a custom jscript.dll; website visits then generate custom trace files, which offline analyzers process.
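The offline-analyzer stage of the pipeline can be sketched as a small script that aggregates per-function bytecode counts from a trace. The record format here (one `call <functionId> <bytecodes>` line per call) is an assumption for illustration, not the actual JSMeter trace format:

```javascript
// Hypothetical offline trace analyzer: sums bytecodes executed per function.
// The "call <id> <count>" record format is invented for this sketch.
function analyzeTrace(traceText) {
  const perFunction = new Map(); // functionId -> total bytecodes executed
  let totalBytecodes = 0;
  for (const line of traceText.split("\n")) {
    const m = line.match(/^call (\S+) (\d+)$/);
    if (!m) continue; // skip records we don't understand
    const n = Number(m[2]);
    perFunction.set(m[1], (perFunction.get(m[1]) || 0) + n);
    totalBytecodes += n;
  }
  return { perFunction, totalBytecodes };
}

const trace = "call f1 100\ncall f2 40\ncall f1 60\n";
const { perFunction, totalBytecodes } = analyzeTrace(trace);
console.log(totalBytecodes);        // 200
console.log(perFunction.get("f1")); // 160
```

Aggregations like this back the per-slide metrics that follow (total bytecodes, bytecodes per call, hot-function distribution).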

  6. Visiting the Real Sites Attempted to use each site in a “normal” way.

  7. Understanding JavaScript Behavior

  8. Code Behavior • Function size • Instructions/call • Code locality • Instruction mix

  9. Code|Objects|Events Total Bytes of JavaScript Source Real Sites V8 SunSpider

  10. Code|Objects|Events Static Unique Functions Executed Real Sites V8 SunSpider

  11. Code|Objects|Events Total Bytecodes Executed Real Sites V8 SunSpider

  12. Code|Objects|Events Bytecodes / Call

    function(a, b) {
      var i = 0, elem, pos = a.length;
      if (D.browser.msie) {
        while (elem = b[i++])
          if (elem.nodeType != 8)
            a[pos++] = elem;
      } else {
        while (elem = b[i++])
          a[pos++] = elem;
      }
      return a;
    }

  Real Sites V8 SunSpider

  13. Code|Objects|Events Fraction of Code Executed Most code not executed Real Sites V8 SunSpider

  14. Code|Objects|Events Hot Function Distribution 80% of time in 100+ functions 80% of time in < 10 functions 80% 80% Real Sites V8 Benchmarks
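The contrast on this slide can be made concrete with a small sketch: given a per-function execution profile, count how many of the hottest functions cover 80% of execution. The sample profiles below are invented for illustration, not measured data:

```javascript
// Sketch: how many of the hottest functions account for a target fraction
// of total execution (counts could be bytecodes or time)?
function hotFunctionCount(profile, fraction) {
  const counts = [...profile.values()].sort((a, b) => b - a);
  const total = counts.reduce((s, c) => s + c, 0);
  let covered = 0;
  for (let i = 0; i < counts.length; i++) {
    covered += counts[i];
    if (covered >= fraction * total) return i + 1;
  }
  return counts.length;
}

// Benchmark-like profile: almost all execution in one tight loop.
const benchmark = new Map([["loop", 90], ["setup", 5], ["print", 5]]);
// Real-site-like profile: execution spread across many handlers.
const site = new Map([["a", 20], ["b", 20], ["c", 20], ["d", 20], ["e", 20]]);

console.log(hotFunctionCount(benchmark, 0.8)); // 1
console.log(hotFunctionCount(site, 0.8));      // 4
```

The benchmark-shaped profile hits 80% with a single function, while the flatter real-site profile needs most of its functions — the same qualitative gap the slide reports (fewer than 10 versus 100+).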

  15. Code|Objects|Events Opcode Distribution Outliers Real Apps Green = SunSpider | Blue = Real Web Apps | Red = V8

  16. Object Allocation Behavior • Allocation by types • Lifetime distribution • Live heap composition

  17. Code|Objects|Events Total Bytes Allocated Real Sites V8 SunSpider

  18. Code|Objects|Events Heap Data by Type Few benchmarks allocate much data. Many functions; the rest are strings. Real Sites V8 SunSpider

  19. Code|Objects|Events Object Type Distribution Real Apps economist is an outlier (arrays)

  20. Code|Objects|Events Live Heap Over Time (gmail) Functions grow steadily GC reduces size of heap Objects grow steadily too

  21. Code|Objects|Events Live Heap over Time (ebay) Heaps repeatedly created, discarded Heap contains mostly functions Heap drops to 0 on page load

  22. Code|Objects|Events Two Search Websites Bing (Web 2.0) Google (Web 1.0)

  23. Code|Objects|Events Two Mapping Websites Google Maps Bing Maps

  24. Code|Objects|Events Object Lifetimes (V8) Lives forever Lives until program ends Dies immediately splay earley

  25. Code|Objects|Events Object Lifetimes (gmail) Many objects live a major fraction of page lifetime Functions are longest lived This is as old as an object can be Strings are shortest lived
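The lifetime plots normalize each object's lifetime against the page's total lifetime; a minimal sketch of that computation, with invented allocation records (timestamps in executed bytecodes) standing in for real trace data:

```javascript
// Sketch: express each object's lifetime as a fraction of page lifetime,
// as in the gmail lifetime plot. Records are invented for illustration.
function lifetimeFractions(records, pageLifetime) {
  return records.map(r => ({
    type: r.type,
    // Objects still live at the end of the trace "die" at pageLifetime.
    fraction: ((r.death ?? pageLifetime) - r.alloc) / pageLifetime,
  }));
}

const records = [
  { type: "function", alloc: 0,   death: null }, // lives the whole page
  { type: "string",   alloc: 500, death: 600 },  // dies almost immediately
];
const out = lifetimeFractions(records, 1000);
console.log(out[0].fraction); // 1
console.log(out[1].fraction); // 0.1
```

Note that an object allocated late can never reach a lifetime fraction of 1 — "this is as old as an object can be" on the slide marks exactly that bound.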

  26. Event Handlers in JavaScript • Number of events • Sizes of handlers

  27. Code|Objects|Events Event-driven Programming Model • Single-threaded, non-preemptive event handlers • Example of some standard event-handler attributes: • onabort, onclick, and onmouseover • Very different from batch processing of benchmarks • Handler responsiveness critical to user experience
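The single-threaded, non-preemptive model described above can be simulated with a toy dispatch queue; this is a sketch of the semantics, not the browser's actual event loop:

```javascript
// Toy model of single-threaded, non-preemptive event dispatch: each handler
// runs to completion before the next queued event is processed.
const queue = [];
const log = [];

function fire(name, handler) {
  queue.push({ name, handler }); // enqueue; nothing runs yet
}

function runEventLoop() {
  while (queue.length > 0) {
    const { name, handler } = queue.shift(); // one event at a time
    handler(name); // runs to completion; nothing can preempt it
  }
}

fire("onclick", name => {
  log.push(`${name} start`);
  // A handler may enqueue further events, but they wait until it returns.
  fire("onmouseover", n => log.push(n));
  log.push(`${name} end`);
});

runEventLoop();
console.log(log); // ["onclick start", "onclick end", "onmouseover"]
```

Because nothing preempts a running handler, a slow handler stalls every later event — which is why handler responsiveness, not batch throughput, drives user experience.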

  28. Code|Objects|Events Total Events Handled Almost no events Real Sites V8

  29. Code|Objects|Events Median Bytecodes / Event Handled

  30. Code|Objects|Events Maximum Bytecodes / Event Handled

  31. Code|Objects|Events Distribution of Time in Handlers

  32. A Simple Experiment • Observation • Real web apps have lots of code (much of it cold) • Benchmarks do not • Question: What happens if the benchmarks have more code? • We added extra, unused code to 6 SunSpider benchmarks • We measured the impact on benchmark performance
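The experiment can be sketched as prepending never-called functions to a benchmark's source; the padding shape, sizes, and names here are illustrative, not the exact code the authors added:

```javascript
// Sketch of the cold-code experiment: prepend N never-invoked functions to
// a benchmark's source. The generated functions are parsed and compiled by
// the engine but never run.
function addColdCode(benchmarkSource, numFunctions) {
  let cold = "";
  for (let i = 0; i < numFunctions; i++) {
    cold += `function coldFn${i}(x) { return x * ${i} + ${i}; }\n`;
  }
  return cold + benchmarkSource;
}

const benchmark = "var r = 0; for (var i = 0; i < 10; i++) r += i;";
const padded = addColdCode(benchmark, 1000);
console.log(padded.length > benchmark.length); // true

// The padded source still runs and produces the same result:
const result = new Function(padded + "\nreturn r;")();
console.log(result); // 45
```

The benchmark's hot loop is untouched; any slowdown measured on the padded version comes purely from the engine handling code it never executes.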

  33. Performance Impact of Cold Code Without cold code, Chrome is 12x faster than IE8. With 2M of cold code, Chrome is 4.7x faster than IE8. Which comparison is more meaningful? Cold code makes SunSpider on Chrome up to 4.5x slower. Cold code has non-uniform impact on execution time.

  34. Impact of Benchmarks • What gets emphasis • Making tight loops fast • Optimizing small amounts of code • Important issues ignored • Garbage collection (especially of strings) • Managing large amounts of code • Optimizing event handling • Considering JavaScript context between page loads

  35. Conclusions • JSMeter is an instrumentation framework • Used to measure and compare JavaScript applications • High-level views of behavior promote understanding • Behavior in benchmarks significantly differs from real sites • Misleads designers, skews implementations • Next steps • Develop and promote better benchmarks • Design and evaluate better JavaScript runtimes • Promote better performance tools for JavaScript developers

  36. Additional Resources • Project • http://research.microsoft.com/en-us/projects/jsmeter/ • Papers • "JSMeter: Measuring JavaScript Behavior in the Wild", Paruj Ratanaworabhan, Benjamin Livshits, David Simmons, and Benjamin Zorn, MSR-TR-2010-8, January 2010 (17 pages) • "JSMeter: Characterizing Real-World Behavior of JavaScript Programs", Paruj Ratanaworabhan, Benjamin Livshits, David Simmons, and Benjamin Zorn, MSR-TR-2009-173, December 2009 (49 pages) • “An Analysis of the Dynamic Behavior of JavaScript Programs”, Gregor Richards, Sylvain Lebresne, Brian Burg, Jan Vitek (to appear in PLDI 2010)
