
Power Debugging

FT54: Power Debugging. Sandeep Karanth, RSDE, Advanced Development and Prototyping, Microsoft Research India; Kapil Vaswani, Researcher, Rigorous Software Engineering, Microsoft Research India; Sriram Rajamani, Aditya Nori (Rigorous Software Engineering, MSRI).


Presentation Transcript


  1. FT54: Power Debugging
  Sandeep Karanth, RSDE, Advanced Development and Prototyping, Microsoft Research India
  Kapil Vaswani, Researcher, Rigorous Software Engineering, Microsoft Research India
  Sriram Rajamani, Aditya Nori (Rigorous Software Engineering, MSRI)
  Joseph Joy, B. Ashok, Gopal Srinivasa (Advanced Development and Prototyping, MSRI)
  Hongkang Liang, Vipindeep Vangala (Windows Sustained Engineering)
  Trishul Chilimbi (Runtime Analysis and Design, MSR)
  Abhik Roychoudhury (National University of Singapore)
  Ben Liblit (University of Wisconsin)

  2. Debugging is hard! [Figure: the symptom of a bug often appears far from its root cause.]

  3. Debugging Yesterday, Today and Tomorrow. How else can we help diagnose failures? Visual Studio IntelliTrace™ + Visual Studio Test Elements + Visual Studio Test Impact Analysis

  4. Power Debugging • Holmes: statistical debugging tool; uses large test suites to diagnose failures • Darwin: tool for debugging regressions; uses a previous, stable version of an application to diagnose failures • DebugAdvisor: recommendation system for bugs; mines software repositories for information related to a bug

  5. Holmes: Statistical Debugging. Kapil Vaswani, Aditya Nori (Rigorous Software Engineering, MSRI); Sandeep Karanth (Advanced Development and Prototyping, MSRI); Trishul Chilimbi (Runtime Analysis and Design, MSR); Ben Liblit (University of Wisconsin)

  6. Holmes: Where testing meets debugging • Programs are often put through rigorous testing • Large test suites • Many passing tests, some failing tests • Can test suites help us find the cause of failures?

  7. Statistical Debugging with Holmes • Collect profiles/coverage data from a large number of (successful and failing) test cases • Look for code paths that strongly correlate with failure

  8. Debugging with Holmes [Pipeline diagram: a test suite (automated/manual, Visual Studio unit tests, Visual Studio Test Elements) produces code coverage (Holmes path coverage) and test results (pass/fail); Holmes's statistical analysis combines them, together with historical debugging data, to report potential root causes.]

  9. Holmes: Visual Studio Integration (demo). Sandeep Karanth, RSDE, Advanced Development and Prototyping

  10. Announcing: Holmes available for download today! http://research.microsoft.com/holmes Try it and give us feedback!

  11. Debugging with Holmes [Pipeline diagram repeated from slide 8: the test suite's code coverage and pass/fail results feed Holmes's statistical analysis.]

  12. Code coverage for Holmes • Statement/block/arc coverage is insufficient • Path coverage: track acyclic, intra-procedural path fragments • Why paths? • Paths represent scenarios • Bugs often occur in complex scenarios • Profiling overhead is low (roughly 10–30%) [Figure: example control-flow graph with nodes a–f.]
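The idea of numbering acyclic intra-procedural paths can be sketched with a toy instrumentation scheme: each branch outcome shifts a bit into a per-execution path ID, so every acyclic path through the function gets its own counter. This is a minimal illustration in the spirit of path profiling, not Holmes's actual instrumentation; all names here are hypothetical.

```c
#include <stdint.h>

/* Toy path profiler: each acyclic path through classify() maps to a
 * distinct ID built from branch outcomes, counted in a histogram. */
#define MAX_PATHS 16
static unsigned path_counts[MAX_PATHS];

/* Instrumented toy function: two branches => up to 4 acyclic paths. */
int classify(int x, int y) {
    uint32_t path = 0;
    int r = 0;
    if (x > 0) { path |= 1; r += 1; }   /* record outcome of branch 1 */
    if (y > 0) { path |= 2; r += 2; }   /* record outcome of branch 2 */
    path_counts[path]++;                /* bump this path's counter   */
    return r;
}

unsigned path_count(uint32_t path_id) { return path_counts[path_id]; }
```

A statistical debugger can then correlate each path ID's histogram, collected separately over passing and failing runs, with test outcomes.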

  13. Statistical Analysis • Measuring correlation is straightforward • But there is a pitfall! • Cum hoc ergo propter hoc, a logical fallacy: correlation does not imply causation • Examples: exception handling code, error recovery routines

  14. Cause and correlation • An analysis that distinguishes cause from correlation • Context of a path: the method/loop/try-catch block containing the path • Look for paths that strongly correlate with failure, but whose context does not • Very effective in practice!

  15. Statistical Analysis • Inputs to analysis • Path coverage for each test case • Outcome of each test case • Compute four statistics for each path

  16. Statistical Analysis • Context: how strongly is the context of a path correlated with failure? • Increase: how much more strongly is the path itself correlated with failure? • Recall: what fraction of all failures occur when this path is covered? • Confidence: an overall measure that combines increase and recall
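The four statistics above can be sketched as follows, given pass/fail counts for a path and for its enclosing context. This is an illustrative reconstruction, not Holmes's exact formulas; in particular, combining increase and recall by harmonic mean is an assumption on my part.

```c
/* Hedged sketch of the four per-path statistics, computed from
 * pass/fail run counts. All names are illustrative. */
typedef struct {
    double context;     /* P(fail | context of path observed)     */
    double increase;    /* P(fail | path) - P(fail | context)     */
    double recall;      /* fraction of all failures covering path */
    double confidence;  /* combines increase and recall           */
} path_stats;

path_stats score_path(int path_fail, int path_pass,
                      int ctx_fail,  int ctx_pass,
                      int total_fail) {
    path_stats s;
    s.context  = (double)ctx_fail / (ctx_fail + ctx_pass);
    double crash = (double)path_fail / (path_fail + path_pass);
    s.increase = crash - s.context;               /* cause vs. correlation */
    s.recall   = (double)path_fail / total_fail;
    /* Harmonic mean (an assumption): rewards paths that are both
     * strongly failure-correlated and cover many failures. */
    s.confidence = (s.increase > 0 && s.recall > 0)
                 ? 2.0 * s.increase * s.recall / (s.increase + s.recall)
                 : 0.0;
    return s;
}
```

For example, a path covered in 9 of 10 failing runs and 1 passing run, inside a context seen in 10 failing and 90 passing runs, scores high on all four measures.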

  17. Giving Holmes a hand • Write more test cases! • Use automated test generation tools like Pex • How many? • Typically, 10-20 related failing tests with ~100 passing tests suffice

  18. Summary • Holmes available on http://research.microsoft.com/holmes • Ships with a tool for measuring path coverage • Integrates with Visual Studio and Test Elements • Supports managed code • Supports automated and manual tests • Ongoing work • Support for unmanaged code • Support for historical debugging traces • Try it and give us feedback!

  19. Darwin: Automatically Root-causing Regressions. Kapil Vaswani (Rigorous Software Engineering, MSRI); Abhik Roychoudhury (National University of Singapore)

  20. Regressions • Changes that break functionality • Often uncovered by regression testing • Debug by comparing the buggy version with the previous, correct version • Doesn't work when there are too many changes • Doesn't work for unmasking regressions

  21. Debugging by comparing test cases • Compare the trace of the failing test case with that of a similar, passing test case • Problem: such test cases usually don't exist! [Figure: two traces diverging at the root cause.]

  22. Darwin: Key ideas • Define notions of similarity between test cases • Automatically generate a similar, passing test given a failing test case

  23. Test Similarity • Given: two versions of an application, P and P', and a test T that passes on P and fails on P' • Similarity: a test T' is similar to T if T' and T follow the same control-flow path in P but different paths in P' [Figure: executions through the old and new versions diverging at the root cause.]

  24. Test Generation in Darwin • Formulates the search for similar tests as a constraint-solving problem • Uses symbolic techniques similar to Pex
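A rough sketch of the formulation, following the similarity definition on slide 23 (the notation is mine): let t be the failing input, and let f and f' be the path conditions of t's execution through the old version P and the new version P', respectively. Any solution of the combined constraint is an input that follows t's path in P but a different path in P', i.e., a candidate similar, passing test:

```latex
f = \mathrm{pc}_{P}(t), \qquad
f' = \mathrm{pc}_{P'}(t), \qquad
\text{solve}\; f \wedge \lnot f'
```

Each satisfying assignment is then run against both versions to check that it actually passes on P' (or on P, depending on which version is correct).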

  25. Darwin at work
  Old version:
  void dodash(char delim, char* src, int* i, char* dest, int* j, int maxset)
  {
    int k;
    bool junk;
    char escjunk;
    while ((src[*i] != delim) && (src[*i] != ENDSTR)) {
      if (src[*i] == ESCAPE) {
        escjunk = esc(src, i);
        junk = addsrt(escjunk, dest, j, maxset);
      } else {
        …
      }
    }
  }
  New version: identical except the condition reads
      if (src[*i - 1] == ESCAPE) {
  Failing test case: %[0-9][^9-B][@t][^a-c]
  Passing test case generated by Darwin: %[0-9][^9-B][00][^a-c]

  26. Current status • Prototype based on Pex • Automatically root caused regressions in large applications • Web servers (HTML pages) • Image processing applications (jpeg images) • Working on VS integration, supporting multi-threading, … • Watch this space! http://research.microsoft.com/darwin

  27. DebugAdvisor: A recommendation system for bugs. Sriram Rajamani (Rigorous Software Engineering, MSRI); Joseph Joy, B. Ashok, Gopal Srinivasa (Advanced Development and Prototyping, MSRI); Hongkang Liang, Vipindeep Vangala (Windows Sustained Engineering)

  28. A Common Scenario. A tester/developer receives a bug report and asks: Has this or a similar bug been looked at or fixed before? What do we know about this kind of bug? Who should I ask for help? Where should I start looking?

  29. What You Know
  Textual description of the bug: The customer experiences some deadlocks on a server. The problem is random and may occur from several times a week to once a month. The system looks hung because the global resource 'ObpInitKillMutant' is held by a thread which tries to close a file forever. So all the processes having a thread waiting on 'ObpInitKillMutant' stop working. Drivers such as TCP/IP continue to respond normally, but it's impossible to connect to any share.
  Stack trace:
  0: kd> !thread 82807020
  ChildEBP RetAddr Args to Child
  80c7a028 00000000 00000000 ntkrnlmp!IopAcquireFileObjectLock+0x58
  82a6d7a0 80c7a028 00120089 ntkrnlmp!IopCloseFile+0x79
  82a6d7a0 80c7a010 80f6da40 ntkrnlmp!ObpDecrementHandleCount+0x112
  00000324 7ffdef01 00000000 ntkrnlmp!NtClose+0x170
  00000324 7ffdef01 00000000 ntkrnlmp!KiSystemService+0xc9
  00000324 80159796 000000c9 ntkrnlmp!ZwClose+0xb
  000000c9 e185f648 00000000 ntkrnlmp!ObDestroyHandleProcedure+0xd
  809e3008 801388e4 82a6d926 ntkrnlmp!ExDestroyHandleTable+0x48
  00000001 82a6d7a0 7ffde000 ntkrnlmp!ObKillProcess+0x44
  00000001 82a6d7a0 82a6d7f0 ntkrnlmp!PspExitProcess+0x54
  00000000 f0941f04 0012fa70 ntkrnlmp!PspExitThread+0x447
  ffffffff 00000000 00002a60 ntkrnlmp!NtTerminateProcess+0x13c
  ffffffff 00000000 00002a60 ntkrnlmp!KiSystemService+0xc9
  00000000 00000000 00000000 NTDLL!NtTerminateProcess+0xb
  Processor state:
  REGISTERS: eax=00000005 ebx=e3185488 ecx=0000083c edx=e2dddc68

  30. Debug Advisor Search

  31. Similar Bugs

  32. Related Information

  33. Debug Logs

  34. Query processing [Architecture diagram: a query (stack trace, code snippets, emails) is run through feature parsers (stack trace parser, register information parser) and matched against the bug repository to find similar bugs. Separately, a relationship builder mines the repositories (version control, bug repository, debug logs) into a relationship graph; link analysis over this graph produces the DebugAdvisor report.]
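A feature parser of the kind shown in the pipeline might, for instance, extract `module!function` symbols from debugger stack-frame lines (like those on slide 29) so they can be used as search features. This is a hypothetical sketch, not DebugAdvisor's actual parser; the frame format assumed is the WinDbg-style `addr addr addr module!function+offset` line.

```c
#include <ctype.h>
#include <string.h>

/* Copies the module!function token (offset stripped) from a debugger
 * stack-frame line into out. Returns 1 on success, 0 if no symbol found
 * or the buffer is too small. Illustrative only. */
int parse_frame_symbol(const char *line, char *out, size_t outsz) {
    const char *bang = strchr(line, '!');   /* symbols contain a '!' */
    if (!bang) return 0;
    /* Walk back to the start of the module name... */
    const char *start = bang;
    while (start > line && !isspace((unsigned char)start[-1])) start--;
    /* ...and forward to the end of the function name (drop "+0x..."). */
    const char *end = bang;
    while (*end && *end != '+' && !isspace((unsigned char)*end)) end++;
    size_t n = (size_t)(end - start);
    if (n + 1 > outsz) return 0;
    memcpy(out, start, n);
    out[n] = '\0';
    return 1;
}
```

The extracted symbols form a bag of features per bug report, which a retrieval engine can match against symbols mined from the bug repository.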

  35. Deployment feedback • Deployed internally for over 6 months • Used by several developers

  36. Summary and Status • Much better precision and recall compared to full text search • Effectiveness depends on access to large repositories • Evaluating effectiveness with smaller, more representative repositories • Exploring potential integration with Visual Studio

  37. Summary • Debugging is getting harder! • Debugging tools need to evolve • Actively help diagnose failures • Three tools that assist/automate debugging • Exploit by-products of a typical software lifecycle (tests, versions, repositories) • Holmes available for download, others will follow

  38. Related talks

  39. Resources • Holmes • Download: http://research.microsoft.com/holmes • Forum: http://blogs.msdn.com/holmes • Technical papers • Holmes: Effective Statistical Debugging via Efficient Path Profiling. Trishul Chilimbi, Ben Liblit, Krishna Mehra, Aditya Nori and Kapil Vaswani. ICSE 2009. • Darwin: An Approach for Debugging Evolving Programs. Dawei Qi, Abhik Roychoudhury, Zengkai Liang and Kapil Vaswani. FSE 2009. • DebugAdvisor: A Recommender System for Debugging. B. Ashok, Joseph Joy, Hongkang Liang, Sriram Rajamani, Gopal Srinivasa, and Vipindeep Vangala. FSE 2009.

  40. YOUR FEEDBACK IS IMPORTANT TO US! Please fill out session evaluation forms online at MicrosoftPDC.com

  41. Learn More On Channel 9 • Expand your PDC experience through Channel 9 • Explore videos, hands-on labs, sample code and demos through the new Channel 9 training courses channel9.msdn.com/learn Built by Developers for Developers….
