
Software Security Without The Source Code



Presentation Transcript


  1. By Matt Hargett <matt @ use.net> Software Security Without The Source Code

  2. Introduction • Matt Hargett • Security QA Engineer for 7 years • NAI/McAfee, TurboLinux, Cenzic • Discovered many critical vulnerabilities • Created binary code analysis product • BugScan

  3. Overview • Why we need to measure software security • Kinds of security policy testing • Whitebox approaches • Blackbox approaches • Effectiveness against real-world exploits • What you can do with this information

  4. Why we need to measure • Brand and reputation damage • Proprietary information leakage • Unplanned disruption • Violation of privacy policy • Espionage • Terrorism

  5. How do we measure • Whitebox • Manual code inspection • Source code static analysis • Binary static analysis • Runtime analysis • Blackbox • Web Applications • Network protocols • APIs

  6. Blackbox Network Testing • Sniff, fuzz, replay • Sniff network traffic • Systematically fuzz the relevant data • Remove delimiters, bitwalk, ... • {admin.command\0::\43\89\42} • Replay fuzzed packets to server • Fuzzing via Proxy • Route traffic through a proxy • Proxy fuzzes data systematically • Fuzzed data gets passed on • Repeat client operation • Protocol-specific fuzzing • Make a special client for specific protocol(s)
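The sniff/fuzz/replay loop above can be sketched in a few lines. This is a minimal illustration, not a production fuzzer: `mutations` and `replay` are hypothetical names, and the delimiter list and bit-walk are the simplest possible forms of the mutations described on the slide.

```python
import socket

def mutations(data: bytes):
    """Yield systematic mutations of a captured packet: first strip
    common delimiter bytes, then flip each bit in turn ("bitwalking")."""
    for delim in (b"\x00", b"::", b"|"):
        yield data.replace(delim, b"")
    for i in range(len(data) * 8):
        flipped = bytearray(data)
        flipped[i // 8] ^= 1 << (i % 8)  # flip one bit per variant
        yield bytes(flipped)

def replay(host: str, port: int, packets):
    """Replay each mutated packet against the target server."""
    for payload in packets:
        with socket.create_connection((host, port), timeout=2) as s:
            s.sendall(payload)
```

For a 2-byte capture this yields 3 delimiter-stripped variants plus 16 bit-flip variants; real fuzzers layer many more mutation strategies on the same skeleton.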

  7. Blackbox Network Testing: In the Real World • Sniff, fuzz, replay • Not stateful • Doesn't work with encryption • Only fuzzes client-side data • Fuzzing via Proxy • Lets real client/server handle state • Doesn't work with encryption • Fuzzes client and server data • Protocol-specific fuzzing • Handles real state of client/server • Does encryption itself • Can get great code coverage • Fuzzes both client and server data

  8. Blackbox Network Testing: General Jeers • Detecting when you've triggered a problem • Measuring code coverage • Slow process • Expensive to scale
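The first jeer, detecting that a fuzz case actually triggered a problem, is often reduced in practice to a crude liveness probe between test cases. A minimal sketch (the function name `server_alive` is hypothetical, and a TCP connect check misses hangs and silent corruption):

```python
import socket

def server_alive(host: str, port: int, timeout: float = 2.0) -> bool:
    """Crude crash detection: after each fuzz case, check whether the
    target still accepts TCP connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Real harnesses supplement this with process monitors, debugger attach, or log scraping, since a service can be exploitably corrupted while still accepting connections.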

  9. Blackbox Web Testing • Sniff, fuzz, replay • Auto-crawl or manual clicks • Sniff browser requests • Systematically fuzz the relevant data • Insert SQL Injection, Command Injection, XSS, ... • POST /foo.cgi?name=bob&pass=... • Fuzzing via Proxy • Optionally crawl to generate requests • Send requests through a proxy • Proxy fuzzes requests systematically • Repeat browser operation • GUI Automation tools • Automate real browser interaction • Put bad data into form and cookie fields
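The payload-insertion step above, substituting SQL injection, command injection, and XSS strings into request fields like the `name`/`pass` example, can be sketched as follows. `fuzz_query` and the `PAYLOADS` list are illustrative placeholders, not a real scanner's corpus:

```python
from urllib.parse import parse_qsl, urlencode

# Tiny illustrative payload set: SQLi, command injection, XSS
PAYLOADS = ["' OR '1'='1", "; cat /etc/passwd", "<script>alert(1)</script>"]

def fuzz_query(query: str):
    """For a captured query string, yield one variant per (field, payload)
    pair, replacing that field's value with an attack string."""
    params = parse_qsl(query)
    for i, (name, _) in enumerate(params):
        for payload in PAYLOADS:
            mutated = list(params)
            mutated[i] = (name, payload)
            yield urlencode(mutated)
```

Each variant mutates exactly one field while keeping the others valid, so the server-side handler still reaches the code path that consumes the fuzzed value.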

  10. Blackbox Web Testing: In the Real World • Sniff, fuzz, replay • Server-side state mishaps • Little to no JavaScript support • No Flash, Java, or web service support • Only finds the lowest-hanging fruit • Fuzzing via Proxy • Server-side state mishaps • Must have browser automation anyway • Operations must be self-contained • GUI Automation tools • Maintenance of stored tests • Tests must be self-contained • Can be part of standard QA

  11. Blackbox Web Testing: General Jeers • Detecting when you've triggered a problem • Measuring code coverage • Slow process • Expensive to scale • Getting past CAPTCHAs

  12. Whitebox Testing Without Source • Manual review • Call-based static analysis • Pointer and control/data flow static analysis

  13. Whitebox Testing Without Source: In the Real World • Manual review • Going through instruction by instruction • Tedious, time-consuming, error-prone, rare skill • Pair reverse engineering, unit testing • Rare skill, unit-test exploits may not be real-world • Call-based analysis • 42,897 strcpy calls detected!!!!#%* • HttpResponse.GetValue() before Statement.Execute() • Dispose called inside Dispose • Pointer and control/data flow analysis • Must be inter-function to be useful • Must track global/static data • Inter-module tracking also important
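The "42,897 strcpy calls detected" jeer refers to call-based analysis at its crudest: flagging every call site of a known-unsafe function with no data-flow context. A sketch of that level of analysis, assuming `objdump -d`-style disassembly text as input (the function name `flag_calls` and the exact line format are assumptions):

```python
import re

DANGEROUS = {"strcpy", "strcat", "sprintf", "gets"}

def flag_calls(disassembly: str):
    """Grep a disassembly listing for call/callq instructions targeting
    known-unsafe libc functions. Every hit still needs data-flow triage
    to decide whether it is actually exploitable."""
    hits = []
    for line in disassembly.splitlines():
        m = re.search(r"\bcall\w*\s+.*<(\w+)(@plt)?>", line)
        if m and m.group(1) in DANGEROUS:
            hits.append((line.split(":")[0].strip(), m.group(1)))
    return hits
```

This is exactly why the slide calls for pointer and control/data flow analysis on top: a raw call list tells you where to look, not whether attacker-controlled data ever reaches the call.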

  14. Whitebox Testing Without Source: General Jeers • People • Hire manual reviewers with a proven track record of real-world exploitable bugs patched by a vendor • Tools • Difficult to use • Poor quality • False positives

  15. Whitebox Testing With Source: General Jeers • Worthless when source not available • People • Hire manual reviewers with a proven track record of real-world exploitable bugs patched by a vendor • Tools • There is no free unit test lunch • Demand vendors demonstrate novel real-world exploitable bugs their tool finds out of the box (OOTB) • Demand vendors demonstrate previously known real-world exploitable bugs their tool finds OOTB • Demand third-party, vendor-neutral benchmarks • No good visualization/exploration tools for manual reviewers

  16. Whitebox Testing: General Jeers, Source or Not • Code is code, period • Most tools are of embarrassingly poor quality • False positive rates above 10% on large (100+ KLOC) codebases make a tool useless • High-priority report items should be real-world exploitable 95% of the time • Custom signatures shouldn't require extra cost, permission, or license • Tools • Demand vendors demonstrate novel real-world exploitable bugs their tool finds OOTB • Demand vendors demonstrate previously known real-world exploitable bugs their tool finds OOTB • Demand third-party, vendor-neutral benchmarks • No good visualization tools for manual reviewers • Zealotry over a single approach

  17. Recommendations • Take a holistic approach • Blackbox and whitebox • Use multiple vendor tools to cross-check • Source and binary • Runtime and static • Use protocol-specific fuzzers • Ask the vendor for code coverage on open source implementation(s) of said protocol(s) • Use UI automation tools for web apps • Any good tool will require tuning

  18. Taking Action • If open source, fix the problem yourself • Otherwise, contact the vendor • If the vendor cannot supply a fix in 30 days • Escalate the issue • Find a new vendor • Vendors will string you along

  19. Thank You matt @ use . net Questions?
