
Easy & Valuable, yet Rarely Implemented Aspects of Performance Testing




Presentation Transcript


  1. Easy & Valuable, yet Rarely Implemented Aspects of Performance Testing Created for: By: Scott Barber, Chief Technologist, PerfTestPlus, Inc.

  2. Easy & Valuable, yet Rarely Implemented Aspects of Performance Testing Scott Barber, Chief Technologist, PerfTestPlus, Inc. sbarber@perftestplus.com www.perftestplus.com @sbarber Co-Founder: Workshop On Performance and Reliability www.performance-workshop.org Co-Author, Contributing Author, and Author of books: www.perftestplus.com/pubs About me: about.me/scott.barber

  3. Top Performance Testing Pathologies • Belief it is done at the “end” & only under load • Low priority/visibility (until prod has issues) • Dependency on pre-defined requirements • No explicit goals for the testing effort • Ignoring the simple or obvious • “Unionizing” the testing • Tool dependency • Dismissal of “Common Sense” • Belief that only “Experts” can add value

  4. “Let’s face the truth, performance testing *IS* rocket science.” --Dawn Haynes … but even rocket science involves *SOME* easy stuff. --Addendum added by: Scott Barber

  5. What is Performance? • System or application characteristics related to: • Speed: responsiveness, user experience • Scalability: capacity, load, volume • Stability: consistency, reliability, stress

  6. What is Performance Testing? • What mom tells people: • I help people make websites go fast. • What I tell people: • I help and/or teach individuals and organizations to optimize software systems by balancing: • Cost • Time to market • Capacity • while remaining focused on the quality of service to system users.

  7. The Performance Lifecycle is: • Conception to Headstone • Not • Cradle to Grave

  8. Who is Responsible for Performance? Everyone

  9. “With an order of magnitude fewer variables, performance testing could be a science, but for now, performance testing is at best a scientific art.” --Scott Barber

  10. So then, it’s hopeless?

  11. Fact: One does not need to be a performance testing rock star to have a significant positive impact on performance… …and thus add significant business value… …quickly and simply.

  12. Value Begins with Clear Objectives • The things we actually hope to gain by testing performance: • Are sometimes completely unrelated to stated requirements, goals, thresholds, or constraints • Should be the main drivers behind performance test design and planning • Usually indicate the performance-related priorities of project stakeholders • Will frequently override good/bad in “go-live” decisions

  13. In Other Words… Start with these questions: “What’ve we got? What do we want? How do we get there…?” --Bob Barber (Scott’s dad) … as quickly, simply, and cheaply as possible? --Addendum added by: Scott Barber

  14. Some Ways to Get Started • Talk about how focusing on performance adds value and mitigates risk, from “bar napkin to delete key”. • Get performance in the dev, test, & delivery plans. • Don’t let performance fall off the plate. • Be the advocate, even if it makes you “annoying”. • Ask during dev, test, & management meetings – even if you weren’t invited: How is the performance today? How will this [change] affect performance? • Break out the puzzle!

  15. • Development/Sprint: Goals/Budgets, Unit Tests/Trends, Story Acceptance • Build/Minor Release: Build Validation/Regression, By Feature • Major Release Candidate: Load (Comprehensive), Peak Load (Comprehensive), Weakness Identification • Production: Deployments, Maintenance, Rapid Response, Data Forward, External Integrations, Infrastructure/Tuning, Break Point Identification, Future Planning

  16. Super Simple Stuff • Market Research • Unit-Level Tests • Acceptance Criteria • Integration-Level Tests • Field Studies • Snapshots • T4APM • Jessica’s Story

  17. Conduct Market Research • Market Research: • http://www.keynote.com/keynote_competitive_research/ • http://gtmetrix.com/top1000.html • http://www.vertain.com/?peti • http://www.gomez.com/benchmarks/

  18. Unit-Level Testing • Tools: • FireBenchmarks; a performance-testing add-in for NUnit • JUnitPerf; a collection of JUnit test decorators for performance • Firefox Performance Tester's Pack • HTTPerf
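The tools on this slide are JVM-, .NET-, and browser-specific, but the underlying idea — give a unit test a time budget and fail the test when the budget is blown, in the style of JUnitPerf's timed decorators — fits in a few lines of any language. Here is a minimal sketch in Python; the 50 ms budget, test name, and workload are illustrative, not from the slides:

```python
import time
import unittest


def assert_within(budget_seconds):
    """Fail a test method if it exceeds its time budget
    (a rough analogue of a JUnitPerf TimedTest decorator)."""
    def wrap(test_func):
        def timed(self, *args, **kwargs):
            start = time.perf_counter()
            result = test_func(self, *args, **kwargs)
            elapsed = time.perf_counter() - start
            self.assertLessEqual(
                elapsed, budget_seconds,
                f"{test_func.__name__} took {elapsed:.3f}s, "
                f"budget {budget_seconds}s")
            return result
        return timed
    return wrap


class LookupPerformanceTest(unittest.TestCase):
    @assert_within(0.05)  # illustrative 50 ms budget for this unit of work
    def test_lookup_is_fast(self):
        table = {i: i * i for i in range(10_000)}
        self.assertEqual(table[9_999], 9_999 * 9_999)
```

Because the budget lives next to the test, a performance regression in that unit fails the build the same way a functional regression would — no load generator required.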

  19. Field Studies & Acceptance Tests Ask: Am I Annoyed? Is Anyone Else?

  20. Thoughts on Annoyance Why am I annoyed? How annoyed am I? Does this annoy me all the time, or just sometimes? What impact is this likely to have on product value? Advocate something better.

  21. Integration-Level Testing • Concepts: • Stick the pain chart in your Manual Test Scripts/Charters • Jot down response times from browser plugin • Build a “relationship table” associating times to annoyance, page/interaction by page/interaction • Put timers in BVTs & Automated Regression Suites: • Keep results in a “trend” file • “Flag” results that are “worse” than goal or “best” previous result • Don’t be afraid of stubs, mocks & virtual services… • … but don’t confuse them with the real deal!
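The "timers in BVTs" idea above — keep results in a trend file, and flag anything worse than the goal or the best previous result — can be sketched directly. This is a minimal illustration, not a prescribed implementation; the trend-file name, the 2-second goal, and the test name are all assumptions:

```python
import json
import time
from pathlib import Path

TREND_FILE = Path("perf_trend.json")  # illustrative trend-file location
GOAL_SECONDS = 2.0                    # illustrative response-time goal


def record_and_flag(test_name, elapsed):
    """Append a timing to the trend file and return flags for results
    that are over the goal or worse than the best previous result."""
    trend = json.loads(TREND_FILE.read_text()) if TREND_FILE.exists() else {}
    history = trend.setdefault(test_name, [])
    best = min(history) if history else None
    flags = []
    if elapsed > GOAL_SECONDS:
        flags.append("over goal")
    if best is not None and elapsed > best:
        flags.append("worse than best")
    history.append(elapsed)
    TREND_FILE.write_text(json.dumps(trend, indent=2))
    return flags


def timed(func, *args):
    """Wrap any BVT step with a timer; returns elapsed seconds."""
    start = time.perf_counter()
    func(*args)
    return time.perf_counter() - start
```

A flagged result does not have to fail the build; it only has to make someone look, which is the point of the slide.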

  22. Field Studies • Field Study: • https://docs.google.com/spreadsheet/viewform?formkey=dDMzTUFQWnRsSW5sVDJsU0N1Z0tqMmc6MA

  23. Snapshots • Tools: • http://www.webpagetest.org/ • https://developers.google.com/speed/pagespeed/insights • http://webwait.com/ • http://www.websitepulse.com/help/tools.php • http://www.websiteoptimization.com/services/analyze/ • http://mobitest.akamai.com/m/index.cgi • http://www.gtmetrix.com
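The tools listed are hosted services, but a bare-bones snapshot in the spirit of webwait.com — fetch a page a few times and report the median response time — is a few lines of scripting. This sketch makes the fetch step injectable so it can be exercised without a network; the sample count and URL in the usage line are illustrative:

```python
import statistics
import time
from urllib.request import urlopen


def snapshot(url, samples=3, fetch=None):
    """Fetch a URL several times and return the median elapsed seconds.

    `fetch` defaults to a plain HTTP GET but can be replaced, e.g. for
    offline testing of the timing logic itself.
    """
    fetch = fetch or (lambda u: urlopen(u).read())
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        fetch(url)
        timings.append(time.perf_counter() - start)
    return statistics.median(timings)
```

Usage might look like `snapshot("http://www.webpagetest.org/")`. A script like this measures only transfer-plus-server time, not browser rendering, so treat it as a coarse trend signal, not a substitute for the listed tools.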

  24. Proactive • Micro & Macro • Establish Goals / Update Targets: Units, Stories, Tiers, Resources • Goals for Dev & Prod: Times, Resources, Sizes, Frequencies • Dashboard! Assess, Compare, Investigate, Accept, Answer

  25. Trends are trendy! If you don’t have speed targets, don’t fret... This particular data comes from automated BVTs!
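The point that you can trend BVT timings even without speed targets suggests a simple check: treat the earliest timings as the baseline and flag when recent runs are noticeably slower. This is a sketch of one way to do that; the window size and 20% tolerance are illustrative choices, not values from the slides:

```python
def regressing(timings, window=3, tolerance=1.2):
    """Return True if the average of the most recent `window` timings is
    more than `tolerance` times the average of the first `window` timings.

    No explicit speed target needed; the trend itself is the baseline.
    """
    if len(timings) < 2 * window:
        return False  # not enough history to say anything yet
    baseline = sum(timings[:window]) / window
    recent = sum(timings[-window:]) / window
    return recent > baseline * tolerance
```

Run against the trend data the BVTs already produce, this turns "trends are trendy" into an actionable flag: stable noise stays quiet, while a genuine slowdown trips the check.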

  26. So, what is Performance Testing? In effect: Performance testing helps stakeholders make decisions regarding product value and project risk; specifically, value and risk related to the speed, scalability, and stability attributes of a system and its components throughout the product lifecycle.

  27. The Bottom Line For each goal, determine what information will answer: • Has this goal been achieved? • To what degree? • What needs to be done to achieve this goal? Decide what data must be collected to provide that information Figure out how to collect that data

  28. Review & Questions Did we learn anything?

  29. Contact Info Scott Barber Chief Technologist PerfTestPlus, Inc E-mail: sbarber@perftestplus.com Blog: scott-barber.blogspot.com Web Site: www.PerfTestPlus.com Twitter: @sbarber
