
Alfresco Benchmark Framework

Derek Hulley, Repository and Benchmark Team


Presentation Transcript


  1. Alfresco Benchmark Framework
  Derek Hulley, Repository and Benchmark Team

  2. Some History
  • 2008:
    • Simple RMI-based remote loading of FileFolderService
    • Unisys 100M-document repository-specific test
  • Then:
    • QA wrote JMeter scripts for specific scenarios
    • Customers, partners and field engineers provided tests for specific scenarios
    • Hardware shared with QA and dev, as required
  • Mid 2011:
    • RackSpace benchmark environment commissioned
  • Late 2011:
    • Failed attempt to simulate real Share-Repo interaction using JMeter (AJAX, etc.)
    • First major proof of the 4.0 architecture started (JMeter with sequential, heaviest calls); later called BM-0001
    • Client load driver was resource-intensive and results had to be collated from 3 servers
  • Early 2012:
    • Benchmark Projects Lead role created
    • Evaluation of benchmarking technology
  • Mid 2012:
    • Successful approximation of Share-Repo interaction using JMeter
    • Benchmark Projects formalized
    • BM-0002 executed (ongoing for regression testing)
    • BM-0009 started

  3. (Some of the) Benchmark Framework Requirements
  • Real browser interaction
    • JavaScript, asynchronous calls, resource caching, etc.
  • Scaling
    • Scale to thousands of active sessions
    • Elastic client load drivers
    • Shared thread pools
  • Results
    • Durable and searchable results
    • Support real-time observation and analysis of results
    • Every logical action must be recorded
    • Every result (positive or negative) must be recorded
  • Tests
    • Treated as real software (automated integration testing, major and minor versions, etc.)
    • Reusable code or components
    • Aware of server state
    • Different tests can share the same data set-up
  • Execution
    • Remote control from desktop
    • Override default test properties
    • Concurrent test execution
    • Start / Stop / Pause / Reload
    • Automated reloading
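The "Results" requirements above (every logical action recorded, every positive or negative result recorded, durable and searchable) can be sketched as a simple event record. This is an illustrative sketch only, not the framework's real API: the class and field names are assumptions, and a plain list stands in for the durable MongoDB store mentioned later.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: every logical action in a test run is captured with
// its outcome, so both successes and failures become durable, queryable data.
public class EventRecordSketch {

    static final class EventRecord {
        final String eventName;   // logical action, e.g. "share.login"
        final long startTime;     // epoch millis when the action began
        final long durationMs;    // how long the action took
        final boolean success;    // positive or negative result
        final String data;        // free-form detail (error message, payload, ...)

        EventRecord(String eventName, long startTime, long durationMs,
                    boolean success, String data) {
            this.eventName = eventName;
            this.startTime = startTime;
            this.durationMs = durationMs;
            this.success = success;
            this.data = data;
        }
    }

    // Stand-in for a durable store; the real framework persists results
    // centrally so they can be searched and analysed in real time.
    static final List<EventRecord> results = new ArrayList<>();

    static void record(String eventName, long start, long duration,
                       boolean success, String data) {
        results.add(new EventRecord(eventName, start, duration, success, data));
    }

    public static void main(String[] args) {
        record("share.login", 0L, 120L, true, "user=admin");
        record("share.createSite", 120L, 450L, false, "HTTP 500");
        long failures = results.stream().filter(r -> !r.success).count();
        System.out.println(results.size() + " events, " + failures + " failure(s)");
    }
}
```

Because failed results are recorded with the same structure as successes, "real-time observation and analysis" reduces to querying one collection rather than scraping client logs.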

  4. Benchmark Server Architecture
  • Client: configuration and reporting
  • MongoDB collections: test run event queues, test run results, data mirror collections
  • ZooKeeper: server configuration, test definitions, test run definitions
  • Benchmark Server 1 … Benchmark Server N: thread pool, common libraries (e.g. WebDriver)
  • Test target
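The driver side of this architecture — benchmark servers pulling events from a shared queue and processing them on a common thread pool — can be approximated in a few lines. This is a sketch under stated assumptions: in the real framework the event queues live in MongoDB and are shared across N servers, whereas here an in-process BlockingQueue stands in for them, and all names are illustrative.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of a benchmark server's event loop: workers from a
// shared pool drain a queue of pending events until it is empty.
public class EventQueueSketch {

    public static int drain(BlockingQueue<String> eventQueue, int threads)
            throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        AtomicInteger processed = new AtomicInteger();
        for (int i = 0; i < threads; i++) {
            pool.submit(() -> {
                String event;
                // poll() returns null once the queue is empty, ending the worker
                while ((event = eventQueue.poll()) != null) {
                    processed.incrementAndGet(); // a real worker would drive the test target here, e.g. via WebDriver
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        return processed.get();
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        for (int i = 0; i < 100; i++) {
            queue.add("event-" + i);
        }
        System.out.println(drain(queue, 4) + " events processed");
    }
}
```

Keeping the queue external to any one server is what makes the client load drivers elastic: adding "Benchmark Server N+1" just means one more consumer on the same queues.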

  5. Demo

  6. Modifying Test Parameters During a Run
  • HTTP connection pool refreshing
  • Paused test / continued test
  • Doubled workflow rate / halved workflow rate
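One simple way a parameter change can take effect mid-run, as in the doubled and halved workflow rates above, is for the event producer to re-read the current rate on every iteration instead of caching it, so an external update changes the inter-event delay immediately. The following is an illustrative sketch; the names are assumptions, not the framework's API.

```java
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical sketch: the workflow rate is a live, atomically-updated value,
// and the delay between events is derived from it fresh each time.
public class LiveRateSketch {

    // workflows per minute; updated remotely while the test runs
    static final AtomicLong workflowsPerMinute = new AtomicLong(60);

    // Delay the producer should wait between events, recomputed on each call
    // so that rate changes apply to the very next event.
    static long currentDelayMillis() {
        return 60_000L / workflowsPerMinute.get();
    }

    public static void main(String[] args) {
        System.out.println("delay at 60/min:  " + currentDelayMillis() + " ms");
        workflowsPerMinute.set(120);  // "doubled workflow rate"
        System.out.println("delay at 120/min: " + currentDelayMillis() + " ms");
        workflowsPerMinute.set(30);   // "halved workflow rate" (relative to the original 60)
        System.out.println("delay at 30/min:  " + currentDelayMillis() + " ms");
    }
}
```

Pausing and resuming a test fits the same pattern: a shared flag checked per iteration, rather than a configuration value baked in at start-up.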
