
Shared Secrets of WDF – Part 2 Testing WDF


Presentation Transcript


  1. Shared Secrets of WDF – Part 2 Testing WDF • Robert Kjelgaard • Shyamal Varma • Wei Mao

  2. Agenda Bob Kjelgaard • Introduction • DDI Testing • Stress / Fault Injection / Concurrency • White-box Techniques Shyamal Varma • State Machine Testing Wei Mao • Performance Testing • Versioning Testing Bob, Shyamal, and Wei • Summary

  3. What does WDF QA Do? • Participate in product design and monitor implementation • Assess product quality • Advocate for the customer • Develop, use, and maintain test and reporting systems • We develop test software and tools • We like to break stuff, but • We are not software testers

  4. We Test WDF on Many Platforms (versions 1.9, 1.7, 1.5)

  5. Product Testing Is Essential • When do we test? • Constantly • How do we test? • Many ways, automated and manual • How do we automate tests? • By design, plus our own ingenuity and creativity, reusing where we can • Is automation expensive? • Yes, but worth it • How do we develop and maintain test code? • Just like any piece of software

  6. Some of our problems are unique (so the same is true of our solutions) • We are testing generalized frameworks for drivers • Not a driver or even a driver class • We cannot write all possible drivers • But we must ensure to the highest level possible that our product can be used for many kinds of drivers • Our product includes complicated internal state machines • We have to validate our packaging and versioning mechanisms • We must keep tabs on performance • We do not have unlimited resources • So we have to use our resources wisely and effectively

  7. Testing the DDI: Design Goals • The DDIs are the primary developer interface to WDF • It is imperative they work exactly as they should, or WDF's value is weakened • Broad coverage • It’s more than just parameters, it’s also when and from where you call • Automated effectively, run every day • Easily modifiable test harness • ~400 KMDF DDIs • ~200 UMDF DDIs • More with each new version • Reliability, reproducibility, and diagnosability are paramount

  8. Design Secrets of Our DDI Tests • Reducing design and implementation overhead of control • KMDF—most drivers are scriptable ActiveX controls • KMDF—initialization-time DDIs use property bags for control and reporting • UMDF—use named pipes for control • Effectively handle stops, bugchecks, and breaks • KMDF—we use IAT hooks to convert to exceptions • UMDF—special hooks built into the core so stops do not occur • Leverage device simulation framework (DSF) to reduce hardware dependencies and increase reach • Example: USB device with 255 interfaces, or 15 pipes or no pipes (boundary cases)

  9. Typical DDI Test Flow Using ActiveX-Style COM Automation (flow diagram; a sketch of the proxy's IOCTL marshalling follows below) • Script: creates the ActiveX object, calls a method of the object to perform a test, then determines, logs, and reports pass/fail status • User-mode proxy and marshalling core: the proxy issues the object-creation IOCTL, marshals parameters into a private IOCTL, then unpacks and returns the results • Kernel-mode server: the KM server uses the test driver's object factory to create the object, the server stub unpacks the IOCTL and issues the call, and the server marshals the results into the IOCTL return buffer • Test driver: initializes its interface with the KM server, its factory creates the object, and it performs the requested call • Execution flow crosses from user mode to kernel mode and back
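A minimal sketch of what the user-mode proxy's private-IOCTL marshalling could look like, assuming a hypothetical control device name (\\.\WdfTestServer), IOCTL code, and packet layout; the real proxy, its marshalling format, and the kernel-mode server are not published.

```cpp
#include <windows.h>
#include <winioctl.h>

// Hypothetical private IOCTL understood by the kernel-mode test server.
#define IOCTL_TESTSRV_INVOKE_DDI \
    CTL_CODE(FILE_DEVICE_UNKNOWN, 0x800, METHOD_BUFFERED, FILE_ANY_ACCESS)

struct DdiCallPacket {          // flat, fixed-size packet: easy to marshal
    ULONG     MethodId;         // which DDI wrapper the test driver should invoke
    ULONG     ParamCount;
    ULONG_PTR Params[8];        // scalar parameters only, for this sketch
};

bool InvokeDdi(ULONG methodId, ULONG_PTR p0)
{
    HANDLE h = CreateFileW(L"\\\\.\\WdfTestServer",    // hypothetical device name
                           GENERIC_READ | GENERIC_WRITE, 0, nullptr,
                           OPEN_EXISTING, 0, nullptr);
    if (h == INVALID_HANDLE_VALUE) return false;

    DdiCallPacket in = { methodId, 1, { p0 } };
    ULONG status = 0, bytes = 0;                       // kernel returns an NTSTATUS
    BOOL ok = DeviceIoControl(h, IOCTL_TESTSRV_INVOKE_DDI,
                              &in, sizeof(in), &status, sizeof(status),
                              &bytes, nullptr);
    CloseHandle(h);
    return ok && status == 0;                          // 0 == STATUS_SUCCESS
}
```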

  10. Concurrency, Fault Injection, and Stress • Driver Verifier low-resources simulation • Our own adaptive fault injection tools • This approach is available to you in the WdfTester tool • Random testing built into drivers • Self-testing stress drivers • Software bus drivers instead of root enumeration • Multithreaded apps with overlapped I/O and multiple handles

  11. White-box Techniques for That Last Bit of Coverage • PDB-based techniques to directly access internals • The PDB can give the RVA of any symbol (data or code) in the binary • It can also return correct offsets for fields within a structure • We get the module base address and size using a documented DDI • The test app can pass RVAs for a binary to the KM driver, which resolves them • IAT hooking to manipulate the OS side of the runtime • Similar to the user-mode Detours package from MS Research (no trampolines) • High maintenance and risky—used sparingly • But it beats giving up
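A sketch of the PDB-to-RVA step, using the public DbgHelp API as one possible way to do the lookup (the deck does not say which tooling the team uses). The binary path and symbol name are placeholders, and the matching PDB must be reachable on the symbol path.

```cpp
#include <windows.h>
#include <dbghelp.h>
#pragma comment(lib, "dbghelp.lib")

// Resolve a symbol's RVA from a binary's PDB. The test driver would add this
// RVA to the module base it obtains in kernel mode to get a live address.
ULONG GetSymbolRva(const char* binaryPath, const char* symbolName)
{
    HANDLE proc = GetCurrentProcess();
    if (!SymInitialize(proc, nullptr, FALSE)) return 0;

    DWORD64 base = SymLoadModuleEx(proc, nullptr, binaryPath,
                                   nullptr, 0, 0, nullptr, 0);
    ULONG rva = 0;
    if (base != 0) {
        char buf[sizeof(SYMBOL_INFO) + MAX_SYM_NAME] = {};
        PSYMBOL_INFO sym = reinterpret_cast<PSYMBOL_INFO>(buf);
        sym->SizeOfStruct = sizeof(SYMBOL_INFO);
        sym->MaxNameLen   = MAX_SYM_NAME;
        if (SymFromName(proc, symbolName, sym))
            rva = static_cast<ULONG>(sym->Address - sym->ModBase);
        SymUnloadModule64(proc, base);
    }
    SymCleanup(proc);
    return rva;   // 0 means "not found" in this sketch
}
```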

  12. WDF State Machine Testing • Shyamal Varma

  13. PnP/Power State Machine Testing • The framework manages device PnP and power-related events with state machines • PnP state machine (about 55 states and 112 transitions) • Power state machine (about 82 states and 168 transitions) • Power policy state machine (about 124 states and 246 transitions) • The state machines are complex and contain a large number of states

  14. A Peek at the State Machines

  15. PnP/Power State Machine Test : Goals • Generate state machine model diagrams programmatically from state machine source code • Allows comparison with specification • Easier to verify changes to code • Perform basic validation of the state machine at build time • Look for dead states • Obtain state machine state and transition coverage information • Obtained while running tests • Ensure that we have coverage as close to 100% as possible • Add new tests to cover test holes

  16. Generating State Machine Diagrams • A graph visualization tool is used that makes images (JPEG) out of a description in a simple text language • A tool parses the state machine-related code from the WDF source tree and generates the text file for the graph visualization tool
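A sketch of the emission step, assuming the Graphviz DOT language as the "simple text language" (the deck does not name the tool). The transitions shown here are invented placeholders, not real WDF states; the real entries come from the source-tree parser.

```cpp
#include <cstdio>
#include <string>
#include <vector>

struct Transition { std::string from, event, to; };

// Write a Graphviz-style digraph: one labeled edge per state transition.
void EmitDot(const std::vector<Transition>& t, const char* path)
{
    FILE* f = fopen(path, "w");
    if (!f) return;
    fprintf(f, "digraph PnpStateMachine {\n  rankdir=LR;\n");
    for (const auto& tr : t)
        fprintf(f, "  \"%s\" -> \"%s\" [label=\"%s\"];\n",
                tr.from.c_str(), tr.to.c_str(), tr.event.c_str());
    fprintf(f, "}\n");
    fclose(f);
}

int main()
{
    std::vector<Transition> t = {              // illustrative entries only
        { "Init",    "AddDevice",   "Started" },
        { "Started", "QueryRemove", "QueryRemoved" },
    };
    EmitDot(t, "pnp_states.dot");              // render with: dot -Tjpeg pnp_states.dot -o pnp_states.jpg
    return 0;
}
```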

  17. Checking for Dead States • "Dead state" = no state transition leading out of the state • Check for dead states while building the framework code • We use the Spec Explorer tool from Microsoft Research

  18. Checking for Dead States • Spec Explorer • A tool for model-based testing • Encode a system’s intended behavior (specification) in machine executable form (a model program) • Model program written in Spec# • We use a small subset of this tool’s functionality to check for dead states.

  19. Checking for Dead States • A tool parses the state machine code from the WDF source tree and generates Spec# code • This Spec# code is passed to the Spec Explorer engine to check for dead states • Dead states will prevent the framework code from building (compiling)
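The team does this through Spec Explorer and Spec#; purely as an illustration of the underlying check, here is a small C++ sketch that flags states with no outgoing transition in a transition table. Real state machines may have intentionally terminal states that the actual check would treat differently.

```cpp
#include <cstdio>
#include <set>
#include <string>
#include <vector>

struct Transition { std::string from, to; };

// A "dead state" here is any state that appears somewhere in the table but
// never appears as the source of a transition (i.e., no way out of it).
std::set<std::string> FindDeadStates(const std::vector<Transition>& t)
{
    std::set<std::string> all, hasExit;
    for (const auto& tr : t) {
        all.insert(tr.from);
        all.insert(tr.to);
        hasExit.insert(tr.from);
    }
    std::set<std::string> dead;
    for (const auto& s : all)
        if (hasExit.find(s) == hasExit.end())
            dead.insert(s);
    return dead;
}

int main()
{
    std::vector<Transition> t = { { "A", "B" }, { "B", "C" } };  // toy data: C is dead
    for (const auto& s : FindDeadStates(t))
        fprintf(stderr, "dead state: %s\n", s.c_str());          // would fail the build
    return 0;
}
```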

  20. Obtaining Coverage Information • WDF state machine transition information is logged while running various tests • A tool reads this data and generates a text file containing coverage statistics • State and state-transition coverage information is obtained • The graph visualization tool can be used to visualize coverage data
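A sketch of the coverage computation itself, under the assumption that both the logged transitions and the full transition table are available as (from, to) pairs; the real tool's log format and output are not shown in the deck.

```cpp
#include <cstdio>
#include <set>
#include <string>
#include <utility>

using Edge = std::pair<std::string, std::string>;       // (from, to)

// Report how many known transitions (and states) the test runs actually hit.
void ReportCoverage(const std::set<Edge>& all, const std::set<Edge>& hit)
{
    std::set<std::string> allStates, hitStates;
    size_t hitEdges = 0;
    for (const auto& e : all) { allStates.insert(e.first); allStates.insert(e.second); }
    for (const auto& e : hit)
        if (all.count(e)) { ++hitEdges; hitStates.insert(e.first); hitStates.insert(e.second); }

    printf("transition coverage: %zu / %zu\n", hitEdges, all.size());
    printf("state coverage:      %zu / %zu\n", hitStates.size(), allStates.size());
}
```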

  21. Performance Testing and Versioning Testing • Wei Mao

  22. Performance Test - Goal • Measure and save performance data • Data transfer throughput • Latency • Build-to-build comparison • Flag any performance regression • Capture other kernel events for future analysis • Fully automated, runs regularly • Unified reporting among multiple test applications / drivers

  23. Performance Test - Goal • Implementation considerations • Compare results on the same machine and same OS • Ensure consistent results across multiple runs and calculate the average • Extra metrics: CPU utilization %, memory set variation • Cover both high-end and low-end machines

  24. Performance Test – Test Application • Open the test device, write a pattern to it, read it back, and verify it • Log the results using Event Tracing for Windows (ETW) • Configuration options • Thread counts: 1, 2, 4, 10 • Chunk sizes: 10, 256, 4096 bytes
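A minimal sketch of the measurement loop, assuming a hypothetical test device name and ETW provider GUID; the real harness varies thread count and chunk size as listed above and uses a richer ETW schema than a single string event.

```cpp
#include <windows.h>
#include <evntprov.h>
#include <cstdio>
#include <cstring>
#include <vector>
#pragma comment(lib, "advapi32.lib")

// Hypothetical provider GUID for the test's ETW events.
static const GUID kPerfProvider =
    { 0x12345678, 0x1234, 0x1234, { 0, 1, 2, 3, 4, 5, 6, 7 } };

int main()
{
    HANDLE dev = CreateFileW(L"\\\\.\\WdfPerfTestDevice",        // hypothetical name
                             GENERIC_READ | GENERIC_WRITE, 0, nullptr,
                             OPEN_EXISTING, 0, nullptr);
    if (dev == INVALID_HANDLE_VALUE) return 1;

    REGHANDLE etw = 0;
    EventRegister(&kPerfProvider, nullptr, nullptr, &etw);

    const DWORD chunk = 4096;                                    // one of the chunk sizes
    std::vector<BYTE> out(chunk, 0xA5), in(chunk);               // known pattern

    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&t0);

    DWORD written = 0, read = 0;
    for (int i = 0; i < 10000; ++i) {
        WriteFile(dev, out.data(), chunk, &written, nullptr);    // write the pattern
        ReadFile(dev, in.data(), chunk, &read, nullptr);         // read it back
        if (memcmp(out.data(), in.data(), chunk) != 0) return 2; // verify
    }
    QueryPerformanceCounter(&t1);

    double secs = double(t1.QuadPart - t0.QuadPart) / double(freq.QuadPart);
    double mbps = (10000.0 * 2 * chunk) / (1024.0 * 1024.0) / secs;

    wchar_t msg[128];
    swprintf_s(msg, L"throughput=%.2f MB/s", mbps);
    EventWriteString(etw, /*Level*/4, /*Keyword*/0, msg);        // simple ETW log
    EventUnregister(etw);
    CloseHandle(dev);
    printf("%.2f MB/s\n", mbps);
    return 0;
}
```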

  25. Performance Test - KMDF Test Driver • Configuration options • I/O type: buffered, direct, neither • Execution level: passive, dispatch • Synchronization scope: device, queue, none • Register preprocess callback: escape to WDM • Optional upper filter driver • CreateDefaultQueue or not • Forward-and-forget vs. reformat and forward with a completion routine • WDM test driver • Same configuration options as KMDF
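A sketch, not the team's actual test driver, of how these configuration axes map onto standard KMDF DDIs in an EvtDriverDeviceAdd callback; the chosen values are hard-coded for illustration, and the default I/O handler below only completes requests.

```cpp
#include <ntddk.h>
#include <wdf.h>

// Minimal default I/O handler so the sketch is self-contained; the real test
// driver implements read/write handlers that move the data pattern.
VOID TestEvtIoDefault(WDFQUEUE Queue, WDFREQUEST Request)
{
    UNREFERENCED_PARAMETER(Queue);
    WdfRequestComplete(Request, STATUS_SUCCESS);
}

NTSTATUS TestEvtDeviceAdd(WDFDRIVER Driver, PWDFDEVICE_INIT DeviceInit)
{
    UNREFERENCED_PARAMETER(Driver);

    // I/O type axis: buffered, direct, or neither.
    WdfDeviceInitSetIoType(DeviceInit, WdfDeviceIoDirect);

    // Execution level and synchronization scope axes are object attributes.
    WDF_OBJECT_ATTRIBUTES attrs;
    WDF_OBJECT_ATTRIBUTES_INIT(&attrs);
    attrs.ExecutionLevel       = WdfExecutionLevelPassive;
    attrs.SynchronizationScope = WdfSynchronizationScopeQueue;

    WDFDEVICE device;
    NTSTATUS status = WdfDeviceCreate(&DeviceInit, &attrs, &device);
    if (!NT_SUCCESS(status)) {
        return status;
    }

    // Default queue axis: the "no default queue" filter variant skips this.
    WDF_IO_QUEUE_CONFIG queueConfig;
    WDF_IO_QUEUE_CONFIG_INIT_DEFAULT_QUEUE(&queueConfig, WdfIoQueueDispatchParallel);
    queueConfig.EvtIoDefault = TestEvtIoDefault;

    WDFQUEUE queue;
    return WdfIoQueueCreate(device, &queueConfig, WDF_NO_OBJECT_ATTRIBUTES, &queue);
}
```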

  26. Performance Test - UMDF Test Drivers • Various test drivers • Memory copy • Forward to Win32 file I/O • UMDF filter above a KMDF stack • The KMDF stack can be a memory-copy or fixed-transfer-rate device • Various configurations • Parallel / sequential queue • Multiple queues • Locking mode • I/O type: sync and async, etc.

  27. Performance Test - Reporting • The same ETW schema is used among different event providers • The trace is captured with Xperf • The ETL is converted to XML with Tracerpt • Compare with historical data within a given allowed margin • Generate an HTML report • Flag performance regressions
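A trivial sketch of the comparison rule, assuming throughput in MB/s and a percentage margin; the actual margins, metrics, and report format are the team's own, and the real pipeline reads its numbers from the Tracerpt-generated XML.

```cpp
#include <cstdio>

// Flag a regression when the current result falls below the historical
// baseline by more than the allowed margin (e.g. 5%).
bool IsRegression(double baselineMBps, double currentMBps, double marginPct)
{
    double floorMBps = baselineMBps * (1.0 - marginPct / 100.0);
    return currentMBps < floorMBps;
}

int main()
{
    if (IsRegression(480.0, 450.0, 5.0))       // 450 < 456, so this is flagged
        printf("PERF REGRESSION: 480.0 -> 450.0 MB/s exceeds the 5%% margin\n");
    return 0;
}
```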

  28. Versioning Test • Goal • Verify both the WDF co-installer and WDF hotfix install correctly on all supported platforms • Cover broad test matrix • Non-goal • API/DDI compatibility – which is verified separately

  29. Versioning – Test Matrix Execution • We have many installation scenarios to cover • Primarily done manually through WDF 1.7. • Each installation scenario must begin with a clean machine • For example: Install 1.9, reimage, then upgrade 1.7 to 1.9 • System Restore is used to minimize reimage time • For effective control, we wrote our own test scenario execution scheduler • Input: scenario description file in XML • Save and resume execution context upon reboot • Benefit: the scenario is self-documented
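The deck does not say how "save and resume execution context upon reboot" is implemented; as one assumed approach, this sketch registers the scheduler under the RunOnce registry key so it restarts after the post-install reboot and can reload its saved context. The value name and command line are illustrative.

```cpp
#include <windows.h>
#include <cwchar>

// Register the scheduler to run once after the next logon following a reboot.
bool RegisterResumeAfterReboot(const wchar_t* schedulerCmdLine)
{
    HKEY key = nullptr;
    if (RegOpenKeyExW(HKEY_LOCAL_MACHINE,
                      L"Software\\Microsoft\\Windows\\CurrentVersion\\RunOnce",
                      0, KEY_SET_VALUE, &key) != ERROR_SUCCESS)
        return false;

    LSTATUS st = RegSetValueExW(key, L"WdfVersioningScheduler", 0, REG_SZ,
                                reinterpret_cast<const BYTE*>(schedulerCmdLine),
                                (DWORD)((wcslen(schedulerCmdLine) + 1) * sizeof(wchar_t)));
    RegCloseKey(key);
    return st == ERROR_SUCCESS;
}

// Usage sketch: save the execution context to disk (the XML scenario file plus
// a "current step" marker), call
// RegisterResumeAfterReboot(L"scheduler.exe /resume"), then initiate the
// reboot required by the installation step.
```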

  30. Summaries

  31. Our summary • WDF is heavily tested and more testing is added regularly • The primary tools and techniques we use in test development and execution are available to you in the WDK • Others, such as Driver Verifier and App Verifier, are readily available • [KMDF 1.9] If you test with Driver Verifier, you get KMDF verifier for free • We have WDF-specific tools in the WDK • WdfTester lets you do some of the same things we do • Call logging and tracing • Adaptive (or other targeted forms of) fault injection • WdfVerifier is there to make your testing easier

  32. Call to Action • Use PREfast for Drivers (PFD), Static Driver Verifier (SDV), and compile at warning level 4 (/W4) on your driver code • Use Driver Verifier and App Verifier in testing • Use WDF! • Use WdfTester and WdfVerifier on your WDF drivers • Feedback on WDF-specific tools (and any WDF quality concerns): wdfinfo@microsoft.com • We can’t act effectively on your behalf without your input

  33. Additional Resources • Web Resources • Hardware and Driver Developer Community http://www.microsoft.com/whdc/resources/default.mspx • Blogs http://www.microsoft.com/whdc/resources/blogs.mspx • Spec Explorer http://research.microsoft.com/projects/specexplorer/ • AppVerifier http://technet.microsoft.com/en-us/library/bb457063.aspx • WDK Documentation • WdfVerifier Tool http://msdn.microsoft.com/en-us/library/cc264238.aspx • WdfTester Tool http://msdn.microsoft.com/en-us/library/cc264231.aspx • Driver Verifier http://msdn.microsoft.com/en-us/library/ms792872.aspx • DSF http://msdn.microsoft.com/en-us/library/aa972916.aspx • Debugging UMDF Drivers http://msdn.microsoft.com/en-us/library/aa510991.aspx

  34. WDF DDC 2008 Sessions • Ask the Experts Table • Panel Discussion

  35. Backup slides

  36. Testing the DDI: Design Approach • Black-box testing approach to ensure all DDIs behave as per the specification • Outline of our approach • Exercise boundary values • Error guessing (guessing at effects on the internal state of the framework) • Valid test cases (valid combinations of parameters and state) • Invalid test cases (invalid parameters or calling at inappropriate times) • Equivalence partitioning: identify combinations that don’t add test value
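Purely as an illustration of the shape such tests can take (not the team's harness), a table-driven sketch where valid, invalid, and boundary rows share one loop; InvokeDdi is the hypothetical proxy call from the earlier DDI-flow sketch, and the rows are made up.

```cpp
#include <windows.h>
#include <cstdio>

struct DdiCase {
    const char* name;
    ULONG       methodId;      // which DDI wrapper the test driver should call
    ULONG_PTR   param;         // single scalar parameter, for this sketch
    bool        expectSuccess; // valid cases expect success, invalid ones failure
};

// Provided by the hypothetical user-mode proxy from the earlier sketch.
bool InvokeDdi(ULONG methodId, ULONG_PTR p0);

int RunCases(const DdiCase* cases, size_t count)
{
    int failures = 0;
    for (size_t i = 0; i < count; ++i) {
        bool ok = InvokeDdi(cases[i].methodId, cases[i].param);
        if (ok != cases[i].expectSuccess) {        // unexpected outcome either way
            printf("FAIL: %s\n", cases[i].name);
            ++failures;
        }
    }
    return failures;
}

static const DdiCase kCases[] = {                  // illustrative boundary/invalid rows
    { "max length",      1, 0xFFFFFFFF, true  },
    { "zero length",     1, 0,          false },
    { "null-ish handle", 2, 0,          false },
};
```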

  37. UMDF Handling of Invalid DDI Test Cases • When a driver does something illegal, WudfHost.exe breaks into the debugger if one is present. • Want to break into the debugger only when something unexpected occurs. • Test hook to invoke a callback in the driver instead of a debugger break. • Test driver enables the callback before invoking the invalid DDI and then disables it.

  38. Utilizing Import Address Table hooks to keep things running and more… • We have a legacy (NT4 style) driver that modifies a kernel module’s import address table, either when loaded or on-the-fly. • We use this on the KMDF runtime (and in other cases the loader) for • Call logging • Fault injection • Converting bugchecks, breakpoints, asserts, etc. into exceptions. • Test driver uses SEH and knowledge of test to handle or continue. • If continued, we report the bugcheck, breakpoint [if unexpected], or assert. • Allows extensive invalid case testing without having to coordinate two machines or script attached debuggers, etc. • Not a safe approach for product drivers, but acceptable for controlled environment test usages [there are still risks]
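The deck describes a kernel-mode legacy driver patching a kernel module's import address table; as a sketch of the same mechanics, here is the user-mode equivalent walk over the PE import descriptors. Module, DLL, and function names are placeholders, and most size/error checks are omitted.

```cpp
#include <windows.h>
#include <cstring>

// Replace the IAT entry of `importedFunc` (imported by `module` from `importDll`)
// with `hook`; returns the original pointer, or nullptr if not found.
void* PatchIatEntry(HMODULE module, const char* importDll,
                    const char* importedFunc, void* hook)
{
    auto base = reinterpret_cast<BYTE*>(module);
    auto dos  = reinterpret_cast<PIMAGE_DOS_HEADER>(base);
    auto nt   = reinterpret_cast<PIMAGE_NT_HEADERS>(base + dos->e_lfanew);
    auto dir  = nt->OptionalHeader.DataDirectory[IMAGE_DIRECTORY_ENTRY_IMPORT];
    if (dir.VirtualAddress == 0) return nullptr;
    auto imp  = reinterpret_cast<PIMAGE_IMPORT_DESCRIPTOR>(base + dir.VirtualAddress);

    for (; imp->Name != 0; ++imp) {
        if (_stricmp(reinterpret_cast<char*>(base + imp->Name), importDll) != 0)
            continue;
        auto thunk     = reinterpret_cast<PIMAGE_THUNK_DATA>(base + imp->FirstThunk);
        auto origThunk = reinterpret_cast<PIMAGE_THUNK_DATA>(base + imp->OriginalFirstThunk);
        for (; origThunk->u1.AddressOfData != 0; ++thunk, ++origThunk) {
            if (IMAGE_SNAP_BY_ORDINAL(origThunk->u1.Ordinal)) continue;
            auto byName = reinterpret_cast<PIMAGE_IMPORT_BY_NAME>(
                              base + origThunk->u1.AddressOfData);
            if (strcmp(byName->Name, importedFunc) != 0) continue;

            DWORD old;                       // the IAT page is usually read-only
            VirtualProtect(&thunk->u1.Function, sizeof(void*), PAGE_READWRITE, &old);
            void* original = reinterpret_cast<void*>(thunk->u1.Function);
            thunk->u1.Function = reinterpret_cast<ULONG_PTR>(hook);
            VirtualProtect(&thunk->u1.Function, sizeof(void*), old, &old);
            return original;                 // caller keeps this for call-through/unhook
        }
    }
    return nullptr;
}
```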

  39. IoTarget Test Suite: A Design Example • The initial goal was a scenario test to exhaustively cover various aspects of R/W request processing when remote I/O targets are used • These aspects were analyzed and enumerated so a large cross product could be run to cover all of these cases • A stress test was to be used as well, so the architecture also needed stress features (such as the ability to remove devices at will mid-test) • Three SW-only drivers (all KMDF, utilizing ActiveX for test programming as in the DDI tests) • Bus driver with minimal simulated wake/idle support (a COM interface specifies the HW ID and bus address to add/remove) • Target (looks like a raw memory block device) • Hunter (accepts requests and processes them using a target)

  40. IoTarget Test Suite: Analyzing the Test Space • Some aspects we wanted to vary had to be set at AddDevice time (e.g., passive synchronization, default queue model, buffered/direct/neither I/O); these were "fixed attributes" • Others could be changed on the fly (e.g., complete the request at passive or dispatch level, use asynchronous or synchronous sends); these are "live attributes" • Fixed attributes map to the device bus address, so a PDO PnP capabilities query is an unambiguous way to know them • Live attributes are set via a single scripted call to the hunter or target • The total number of combinations is (Target Fixed * Target Live) * (Hunter Fixed * Hunter Live) • The "scenario test" runs all combos: 130K+ variations currently
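A sketch of the cross-product enumeration described above, with invented attribute counts and the per-variation steps reduced to comments; the real suite drives these steps through the bus driver's COM interface and the hunters'/targets' ActiveX interfaces.

```cpp
#include <cstdio>

struct AttributeAxes { int targetFixed, targetLive, hunterFixed, hunterLive; };

// Every (target fixed, target live, hunter fixed, hunter live) tuple is one variation.
void RunAllVariations(const AttributeAxes& a)
{
    long long total = 1LL * a.targetFixed * a.targetLive * a.hunterFixed * a.hunterLive;
    printf("running %lld variations\n", total);

    for (int tf = 0; tf < a.targetFixed; ++tf)        // chosen at AddDevice time
        for (int tl = 0; tl < a.targetLive; ++tl)     // set via a scripted call
            for (int hf = 0; hf < a.hunterFixed; ++hf)
                for (int hl = 0; hl < a.hunterLive; ++hl) {
                    // 1. plug in a target/hunter pair whose bus addresses encode tf and hf
                    // 2. program live attributes tl and hl through the ActiveX interfaces
                    // 3. issue R/W requests through the hunter and verify the results
                }
}
```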

  41. IoTarget Test Suite: Adding Stress and Fault Injection • The hunter and target drivers have additional programmable behaviors • Capability to fail calls, change delays, start/stop targets, etc. • Either fixed, or at random via a programmable distribution of each behavior • Hunters can arbitrarily pick their own targets at random • All via the ActiveX interfaces [thus independent of any I/O paths] • The bus driver (as noted) can add or remove a hunter or target at any time • The stress app allows selection of behaviors and distributions, issues multithreaded I/O to hunters, and adds/removes devices at random • Run in conjunction with Driver Verifier low-resources simulation for additional stress

  42. Versioning – Test Matrix (examples) • Minor version update • Install a higher minor version of the coinstaller • No WDF device was present on the system, e.g., Windows XP • An inbox WDF device is running, e.g., Vista; may require a reboot • Lower or same minor version • Install the v1.7 package on a v1.9 system, and it should still work • Coinstaller is different from the one used by the driver at build time • The IHV driver was developed with v1.7, and the INF referenced 1.7 too • A critical issue is fixed in v1.9 • Without recompiling, the IHV modifies the INF to use v1.9 and resubmits for logo

  43. Versioning – Coinstaller Limitation • We will publicly release only one coinstaller per major.minor version • PnP requires that any update to the coinstaller have a different name • I.e., we cannot patch the coinstaller once it is released • We can release multiple hotfixes for the current major.minor version • You may have to direct customers to the latest hotfix if needed by your driver • Remember: a hotfix will not introduce any DDI changes (additions/removals, etc.) • New versions are our fix of record for previous versions
