
Cross feature testing in database systems



Presentation Transcript


  1. Cross feature testing in database systems Sung Hsueh, Arvind Ranasaria Microsoft SQL Server, Microsoft Corp SIGMOD DBTest 2008

  2. Overview • Well known problem • Not well solved • Large problem space • Open challenge (not fully defined)

  3. Definition • Software testing that focuses on exposing issues that occur only when two or more features are used together

  4. Challenge • How many combinations need to be tested? • For a database with just 50 features, testing every combination of two or more features yields 2^50 - 51 = 1,125,899,906,842,573 combinations!
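The count on this slide follows from taking every subset of two or more features out of n: the sum of C(n, k) for k ≥ 2 is 2^n − n − 1. A throwaway sanity-check sketch (not part of the talk):

```python
from math import comb

# Subsets of size >= 2 drawn from n features:
# sum_{k=2}^{n} C(n, k) = 2^n - n - 1
def cross_feature_combinations(n: int) -> int:
    return 2 ** n - n - 1

# The closed form agrees with summing the binomial coefficients directly,
# and matches the figure on the slide for 50 features.
assert cross_feature_combinations(50) == sum(comb(50, k) for k in range(2, 51))
print(cross_feature_combinations(50))  # 1125899906842573
```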

  5. Challenge (cont’d) • Easy to miss combinations • Consider KB# 933265 • Error 1203 on INSERT • Requires: • Table with identity column • INSERT statement • Parallel INSERT statement • Multi-proc machine

  6. Traditional Strategies/Techniques • Big Bang • Top-Down/Bottom-Up • Pairwise • Code analysis • Related Areas: • Interaction Testing • Combinatorial Testing
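The pairwise technique listed above can be illustrated with a naive greedy selection over on/off feature settings: keep only the configurations that cover some not-yet-seen pair of feature values. This is a toy sketch, not the tooling the authors used:

```python
from itertools import combinations, product

# Greedy all-pairs selection for n boolean (on/off) features: pick
# configurations until every (feature_i, feature_j, setting_a, setting_b)
# pair appears together in at least one chosen test.
def pairwise_tests(n_features):
    needed = {(i, j, a, b)
              for i, j in combinations(range(n_features), 2)
              for a, b in product([0, 1], repeat=2)}
    chosen = []
    for test in product([0, 1], repeat=n_features):
        covered = {(i, j, test[i], test[j])
                   for i, j in combinations(range(n_features), 2)}
        if covered & needed:          # test contributes a new pair
            chosen.append(test)
            needed -= covered
        if not needed:
            break
    return chosen

tests = pairwise_tests(5)
print(len(tests), "tests instead of", 2 ** 5)  # far fewer than exhaustive
```

A real pairwise tool (and the covering-array literature) produces much tighter test sets; the point here is only that all-pairs coverage needs far fewer runs than the exhaustive product.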

  7. So… why are we here? • So what’s wrong with these strategies? • Well, nothing really • Good at reducing problem space, reducing complexity • BUT • Not cross feature specific • Not database specific

  8. Cross feature specific • Take into account relationships between features • should define how they are tested • Correlate relationship type with issue type • should define severity and priority

  9. Database specific • Database is a platform • Defined external APIs • Component layer boundaries • Shared object space

  10. Interaction Types • Independent • Orthogonal • Non-orthogonal • Exclusive

  11. Independent • Isolated in code • Isolated in behavior • Does not even use the same objects • Must test assumptions • Example: • Aggregate function and version number

  12. Orthogonal • Isolated in behavior • Can have code overlap • Can use shared objects • Easy to make the wrong assumptions, must be validated • Example: • NVARCHAR and VARCHAR

  13. Non-Orthogonal • Identified behavior change • Using one feature will affect how another feature works • Example: • Indexing and query optimization
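The indexing example can be modeled in a few lines (a hypothetical planner, not SQL Server's optimizer): creating an index silently changes the plan the optimizer picks, so testing either feature in isolation can pass while the combination behaves differently.

```python
# Toy non-orthogonal interaction (hypothetical planner, not SQL Server's):
# the mere presence of an index changes which access path is chosen.
def choose_plan(indexed_columns, predicate_column):
    if predicate_column in indexed_columns:
        return "index_seek"
    return "table_scan"

# Each feature in isolation looks fine...
assert choose_plan(set(), "id") == "table_scan"
# ...but enabling indexing changes the other feature's observable behavior.
assert choose_plan({"id"}, "id") == "index_seek"
```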

  14. Exclusive • Features cannot work together at all • Using one feature requires the other feature to be disabled • Must verify that the feature is appropriately disabled or blocked from being enabled • Example: • Single user mode and client connections
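The verification obligation on this slide — that the excluded feature really is blocked — can be sketched as a test against a toy server model (hypothetical names and semantics, not SQL Server's actual single-user implementation):

```python
# Toy model of an exclusive feature pair: single-user mode must block
# additional client connections, and the test asserts the block happens.
class Server:
    def __init__(self):
        self.single_user = False
        self.connections = 0

    def set_single_user(self, on):
        self.single_user = on

    def connect(self):
        # In single-user mode only one connection is permitted.
        if self.single_user and self.connections >= 1:
            raise RuntimeError("connection refused: server is in single-user mode")
        self.connections += 1

srv = Server()
srv.set_single_user(True)
srv.connect()              # the single allowed connection succeeds
try:
    srv.connect()          # a second client must be rejected
    blocked = False
except RuntimeError:
    blocked = True
assert blocked, "exclusive features were allowed to coexist"
```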

  15. Failure Types • Error • Unexpected Behavior • Performance

  16. Error • Easy to identify: will be an error message, crash, access violation, assertion, etc. • May mask other issues (i.e. this is the “obvious” symptom) • Severity based on type of error • High priority as this is highly visible

  17. Unexpected Behavior • May be harder to diagnose as the issue is less obvious • Might be missed if the test misses some aspect of validation • Can result in security issues • Severity based on behavior change • Priority based on behavior change

  18. Performance • Can be related to behavior changes • Should be relatively easy to spot with the right performance metrics • Not necessarily speed (i.e. compression) • Severity can be identified by perf regression • Priority depends on the feature involved

  19. What Next? • How can we use these interaction categories to come up with new strategies and techniques in test design? • Can we use these failure types to better validate our tests / automate test verification? • How can we architect the database to be more agile to cross feature issues?

  20. Questions? • Contact: • Sung Hsueh: sung.hsueh@microsoft.com • Arvind Ranasaria: arvind.ranasaria@microsoft.com
