
Presentation Transcript


  1. BYU CS 340
  Testing – Microsoft Office
  David Hansen, TWC Test Manager

  2. Quick Intro – David Hansen
  • 13 years (+2 internships) at Microsoft
  • Started in Outlook (email, attachments, setup, automation)
  • Now on the shared Office TWC team (security, performance, reliability, feedback)
  • Education
    • BS in Mechanical Engineering from BYU in 1992
    • MBA from the University of Washington in 2006
  • Personal
    • Grew up in Reno, NV
    • Married for 15 years to Wendy
    • 3 children (Hayley 13, Dillon 10, Mali 8)
  • Contact info: david.hansen@microsoft.com

  3. Point of my talk
  • Testing’s role in shipping commercial software
  • The kinds of challenges testers face
  • What makes a great tester
  Not (intentionally):
  • Recruiting for Microsoft

  4. Testing’s Role

  5. “Triad”

  6. Triad responsibilities
  • Typical Office/Microsoft PM : Dev : Test team ratio is 0.5 : 1 : 1
  • More precise definitions at http://www.microsoft.com/college

  7. Office Organization

  8. Feature Crews
  • A micro-version of the product lifecycle
  • Designed for a quick spin of information between the triad
  • Goal is ‘dogfoodable’ feature quality before checking into ‘main’
  • Phases (from the slide’s timeline): Planning, Implementation (feature development), Verification
  • Deliverables: feature specification, dev design doc, test design spec, detailed feature schedule, feature code, test automation
  • Gates: spec inspection, check-in test, feature crew signoff, build verification test
  • Before check-in: perform a code review, run check-in tests (see the sketch below), test the code on a private release build, then check the code into the main branch
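The “check-in tests” and “build verification test” gates above are essentially fast smoke suites that run before code reaches main. As a rough, hypothetical sketch only (the slide does not show Office’s actual tests), here is what a minimal check-in gate could look like in Python; the checks and file contents are invented for illustration.

```python
# check_in_gate.py - hypothetical illustration of a check-in test gate.
# The checks below are stand-ins for real product smoke tests.
import sys
import tempfile
import unittest


class CheckInTests(unittest.TestCase):
    """Fast smoke checks that must pass before code goes into main."""

    def test_save_and_reload_roundtrip(self):
        # Stand-in for "save a document, reopen it, contents match".
        with tempfile.NamedTemporaryFile("w+") as f:
            f.write("hello, feature crew")
            f.flush()
            f.seek(0)
            self.assertEqual(f.read(), "hello, feature crew")

    def test_empty_input_is_handled(self):
        # Boundary case: an empty document should not blow anything up.
        self.assertEqual("".strip(), "")


if __name__ == "__main__":
    result = unittest.main(exit=False).result
    # A non-zero exit code is what would block the check-in.
    sys.exit(0 if result.wasSuccessful() else 1)
```

The point of the gate is speed: it has to be cheap enough that every feature crew can run it on every private build before touching main.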

  9. Office TWC Challenges: Performance

  10. SharePoint Server Data Population
  • Data has a significant impact on performance results
  • Scale and performance testing is about pushing limits
  • How do you get all that data in there?

  11. SharePoint Server Data Population
  • Automation? (a sketch of varied data generation follows below)
  • Disadvantages
    • Homogeneous data sets: lists all have similar fields
    • Stress and functionality tests end up being used to cram data in
    • A memory leak could bust you
    • Changes in functionality or code paths break the tests
    • Generating lots of data takes a long time, and time is your most valuable resource as a tester
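One way to blunt the homogeneity problem called out above is to randomize both the schema and the values while populating. The sketch below assumes a hypothetical `client` object with `create_list` and `add_item` methods standing in for whatever population mechanism the product exposes; these are not real SharePoint APIs, and only the data-shaping logic is the point.

```python
# populate.py - sketch of varied test-data generation for a list server.
# The client and its create_list/add_item methods are hypothetical.
import random
import string

FIELD_TYPES = ["text", "number", "date", "choice", "yes_no"]


def random_word(n=8):
    return "".join(random.choices(string.ascii_lowercase, k=n))


def random_schema():
    """Vary the number and types of fields so lists are not all alike."""
    count = random.randint(2, 12)
    return {f"field_{i}": random.choice(FIELD_TYPES) for i in range(count)}


def random_value(field_type):
    if field_type == "number":
        return random.randint(-10_000, 10_000)
    if field_type == "date":
        return f"200{random.randint(0, 9)}-0{random.randint(1, 9)}-15"
    if field_type == "choice":
        return random.choice(["red", "green", "blue"])
    if field_type == "yes_no":
        return random.random() < 0.5
    # Vary text length a lot; large values stress storage and rendering.
    return random_word(random.choice([3, 40, 255, 4000]))


def populate(client, lists=50, items_per_list=200):
    for _ in range(lists):
        schema = random_schema()
        list_id = client.create_list(random_word(), schema)  # hypothetical call
        for _ in range(items_per_list):
            item = {name: random_value(t) for name, t in schema.items()}
            client.add_item(list_id, item)                   # hypothetical call
```

Randomizing field counts, types, and value sizes does not remove the other drawbacks (it is still slow, and still breaks when the product changes), but it keeps the data from all looking the same.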

  12. SharePoint Server Data Population
  • Attach an existing database?
  • Disadvantages
    • Upgrade is the toughest problem in developing a server product
    • Constant product changes break the solution
    • You end up writing lots of build-x-to-build-y migration code instead of new features

  13. Tester’s Role
  • The tester owns the solution
  • Drive any product changes with Dev/PM
  • Author tools as necessary
  • Validate the process
  • Generate performance results that are consistent, reliable, and accurate (see the sketch below)
  • Then you can start entering bugs…
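“Consistent, reliable, and accurate” performance results generally imply warm-up runs, repeated measurements, and a dispersion check before a number is trusted. A minimal, self-contained sketch of that discipline; the timed scenario is a placeholder for a real product workload such as rendering a page on a populated site.

```python
# perf_run.py - sketch of a repeatable performance measurement.
import statistics
import time


def scenario():
    # Placeholder workload standing in for the real product scenario.
    sum(i * i for i in range(200_000))


def measure(runs=10, warmup=2, max_cv=0.05):
    for _ in range(warmup):          # warm caches before measuring
        scenario()

    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        scenario()
        samples.append(time.perf_counter() - start)

    median = statistics.median(samples)
    cv = statistics.stdev(samples) / statistics.mean(samples)
    # Only trust (and only report) results whose run-to-run variation is
    # small; otherwise an apparent "regression" may just be noise.
    return {"median_s": median, "cv": cv, "stable": cv <= max_cv}


if __name__ == "__main__":
    print(measure())
```

An unstable result (high coefficient of variation) is a signal to fix the environment or the test before filing a regression bug.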

  14. Office TWC Challenges: Security

  15. Fuzzing
  • Targeted attacks are more popular (going after assets instead of fame)
  • Office is installed on a vast number of PCs (especially in government and business)
  • Fuzzing (http://en.wikipedia.org/wiki/Fuzzing) has risen as a popular tool for finding buffer overruns

  16. Fuzzing
  • Smart vs. dumb fuzzing (a dumb-fuzzing sketch follows below)
  • You are competing against lots of motivated people with more time; how do you scale?
    • Distribute the problem
    • Automate to deal with the data explosion
  • Move the battlefront: assume soldiers are already inside the walls
    • Sanitize (Wringer or MOICE)
    • Source analysis (catch it in the code)
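In its simplest (“dumb”) form, fuzzing is just random byte flips applied to a valid seed file, which is then fed to the parser under test while watching for crashes. The sketch below is a hedged illustration: `TARGET_CMD` and `seed.docx` are placeholders, and treating an abnormal exit or a hang as a “possible crash” is a simplification of what a real harness (debugger attach, crash dumps) would do.

```python
# dumb_fuzz.py - sketch of "dumb" mutation fuzzing on a seed document.
import random
import subprocess

TARGET_CMD = ["./parse_document"]   # hypothetical consumer of the file format
SEED_FILE = "seed.docx"             # any valid input file
ITERATIONS = 1000


def mutate(data: bytes, flips: int = 16) -> bytes:
    """Flip a handful of random bytes in the seed."""
    buf = bytearray(data)
    for _ in range(flips):
        buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)


def main():
    seed = open(SEED_FILE, "rb").read()
    for i in range(ITERATIONS):
        case = f"case_{i}.docx"
        with open(case, "wb") as f:
            f.write(mutate(seed))
        try:
            proc = subprocess.run(TARGET_CMD + [case], timeout=30,
                                  capture_output=True)
            code = proc.returncode
        except subprocess.TimeoutExpired:
            code = None                      # hangs are interesting too
        if code is None or code < 0:         # negative = killed by a signal
            print(f"possible crash/hang: {case} (exit={code})")


if __name__ == "__main__":
    main()
```

Scaling out, as the slide suggests, mostly means running many loops like this in parallel with different seeds and mutation strategies, and collecting the interesting cases centrally for triage.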

  17. Tester’s Role
  • Stay aware of, and constantly learn about, changing attack vectors
  • Build infrastructure and tools that allow scaling out
  • Validate the data coming from the tools
  • Provide ideas on moving the battlefront
  • Validate the features
    • Try to break out of the box
    • Verify the box doesn’t break workflow, functionality, performance, etc.

  18. Office TWC Challenges: Reliability

  19. Watson

  20. Watson
  • The feature only works when Office is in a bad state
    • How do you reproduce the various states?
    • Are you giving someone a broken fire extinguisher?
  • Validity of the data
    • Can developers fix the bug?
  • Minimal impact on the user experience; they’re already mad
  • Categorize crashes for decision making (see the sketch below)
  • No personally identifiable information
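Watson’s leverage comes from collapsing huge numbers of individual crash reports into a much smaller set of buckets keyed by a crash signature, so decisions can be made on counts rather than raw reports, and no document contents need to travel with the report. A small sketch of that bucketing idea; the report fields here are illustrative, not Watson’s actual schema.

```python
# buckets.py - sketch of Watson-style crash bucketing.
# The report fields (app, module, offset, ...) are illustrative only and
# deliberately carry no user content.
from collections import Counter


def signature(report: dict) -> tuple:
    """Collapse a crash report to the fields that identify the bug."""
    return (report["app"], report["app_version"],
            report["module"], report["offset"])


def bucketize(reports):
    """Count reports per signature, most-hit buckets first."""
    counts = Counter(signature(r) for r in reports)
    return counts.most_common()


if __name__ == "__main__":
    sample = [
        {"app": "winword", "app_version": "12.0", "module": "mso.dll", "offset": "0x4a2f"},
        {"app": "winword", "app_version": "12.0", "module": "mso.dll", "offset": "0x4a2f"},
        {"app": "outlook", "app_version": "12.0", "module": "olmapi.dll", "offset": "0x113c"},
    ]
    for sig, hits in bucketize(sample):
        print(hits, sig)
```

Sorting by hit count is what lets a relatively small number of fixes address the majority of crashes users actually experience.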

  21. Tester’s Role
  • Write the test plan
  • Automation
    • Validate data for developers and managers
    • Scale of the service
  • Tools
    • Simulate crash and document recovery
    • Scale test the service
    • Service health monitoring
  • Process
    • Set goals for the organization
    • Drive investigation and action on the data
    • Help other testers across Office use the data

  22. Are you fixing the right bugs?
  • Fixing bugs creates bugs
  • Beta feedback:
    • Verbose: lots of data
    • Squeaky-wheel effect with customers; not democratic
  • How can you tell if you’re making it better or worse? (see the sketch below)
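One way to answer “better or worse” more democratically than beta mail is to normalize crash-bucket counts by usage (sessions) and compare the same buckets across two builds. The sketch below uses made-up bucket names and counts purely for illustration.

```python
# compare_builds.py - sketch of comparing crash rates across two builds.
# Normalizing by sessions keeps a noisier beta population from skewing
# the comparison.

def crash_rate(bucket_hits: dict, sessions: int) -> dict:
    return {bucket: hits / sessions for bucket, hits in bucket_hits.items()}


def compare(old_hits, old_sessions, new_hits, new_sessions):
    old_rate = crash_rate(old_hits, old_sessions)
    new_rate = crash_rate(new_hits, new_sessions)
    report = {}
    for bucket in set(old_rate) | set(new_rate):
        before = old_rate.get(bucket, 0.0)
        after = new_rate.get(bucket, 0.0)
        report[bucket] = after - before      # negative means improvement
    return dict(sorted(report.items(), key=lambda kv: kv[1]))


if __name__ == "__main__":
    old = {"mso.dll!0x4a2f": 900, "olmapi.dll!0x113c": 300}
    new = {"mso.dll!0x4a2f": 200, "olmapi.dll!0x113c": 450,
           "wwlib.dll!0x77aa": 50}
    print(compare(old, 100_000, new, 120_000))
```

Buckets with negative deltas got better; buckets that are new or got worse are exactly the “fixing bugs creates bugs” cases the slide warns about.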

  23. Watson Impact – WinXP SP1 example

  24. Great testers

  25. Bug Stories
  • SharePoint survey and update feature
  • Offline store to local store migration
  • Null in a filename
  • Office Min/Max functionality
  • Assembly code for an exploit

  26. What are Key Tester Competencies?
  • Passion for software and technology
  • Curious about how things work
  • Raw smarts and quick learners
  • CS background and coding skill
  • Test mentality (a malicious mindset)
  • Communication & collaboration skills
  • Deal with ambiguity & drive for results
  • Customer focus
  • Detail oriented and persistent
  • Broad and deep technical knowledge

  27. Trophy Case

  28. What are Key Tester Competencies?
  • Passion for software and technology
  • Curious about how things work
  • Raw smarts and quick learners
  • CS background and coding skill
  • Test mentality (a malicious mindset)
  • Communication & collaboration skills
  • Deal with ambiguity & drive for results
  • Customer focus
  • Detail oriented and persistent
  • Broad and deep technical knowledge

  29. How Mike got there
  “A genuine passion for the work we do. From the day I started in Testing I loved it. I didn’t just like it, I loved it. I love finding bugs. I love learning about how things are implemented and using that knowledge to break the software. I love coding tools that help make it easier to find bugs faster and more efficiently (wEatmem, WinWhere).
  I knew only Pascal, Fortran, and a little BASIC when I came to MS. I immediately set out teaching myself to program in C and to write Windows apps after that. I loved learning to code, and I applied what I learned to writing simple games and tools (Pegged, Dr. BlackJack, WinWhere, wEatmem, etc.). I loved all of this. I had an innate, genuine passion for testing and coding and applying it to making software better.”

  30. Questions?
