
Evaluation of DART


Presentation Transcript


  1. Evaluation of DART • DART: Directed Automated Random Testing • Godefroid, Klarlund, Sen

  2. Experimental Goals • Efficiency of DART • directed search approach vs purely random search • AC-controller program • Needham-Schroeder Protocol • Effectiveness with a large program • Open-source oSIP library, 30K LOC of C code

  3. Efficiency Experiment • AC-Controller Program: • DART: • Explores all execution paths up to depth 1 in 6 iterations and in less than 1 second • At depth 2, finds the assertion violation in 7 iterations, < 1 sec • Random: • Does not find the assertion violation after hours of search • Probability of randomly generating inputs that reach the assertion is about 1 in 2**64 • Gets stuck in the input-filtering code
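
To make concrete why random search fails here, below is a minimal C sketch in the spirit of the AC-controller experiment; the function name, the constant, and the driver are invented for illustration and are not the paper's actual code. A single equality check against one 64-bit value defeats random input generation, while DART's directed search negates the observed branch constraint and solves for a passing input.

    #include <assert.h>
    #include <stdint.h>
    #include <stdlib.h>

    /* Input filter: only one of the 2**64 possible values passes, so
     * purely random testing succeeds with probability ~1 in 2**64 per
     * attempt and in practice never reaches the assertion. */
    void controller(uint64_t cmd) {
        if (cmd == 0xDEADBEEF12345678ULL) {
            /* DART reaches this in a handful of iterations: it records
             * the branch constraint cmd != 0xDEADBEEF12345678 on a
             * failing run, negates it, and solves for cmd. */
            assert(0);
        }
    }

    int main(void) {
        /* One purely random attempt, as a random tester would make. */
        uint64_t cmd = ((uint64_t)rand() << 32) | (uint64_t)(uint32_t)rand();
        controller(cmd);
        return 0;
    }
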

  4. Another Efficiency Point • Needham-Schroeder security protocol program • 406 lines of C code • DART: took < 26 minutes on a 2 GHz machine to detect the man-in-the-middle attack • VeriSoft (a model checker): hours to detect the same attack
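
As a hedged illustration of what "detecting the attack" means in a C implementation, the sketch below (invented names and structure, not the 406-line program the paper tested) encodes the authentication property that Lowe's man-in-the-middle attack violates: the responder B ends up committed to a session "with A" even though A never ran the protocol with B, so the assertion fails on the attack trace.

    #include <assert.h>

    /* Protocol state that the (hypothetical) initiator A and
     * responder B code would set on completing a run. */
    static int initiator_committed_with_B = 0;
    static int responder_committed_with_A = 0;

    void check_authentication(void) {
        /* Authentication property: if B believes it talked to A, then
         * A actually ran the protocol with B. Lowe's man-in-the-middle
         * attack violates exactly this. */
        assert(!responder_committed_with_A || initiator_committed_with_B);
    }

    int main(void) {
        /* End state of the attack trace: B committed "with A", but A
         * only ever talked to the intruder. */
        responder_committed_with_A = 1;
        check_authentication();  /* assertion fails here */
        return 0;
    }
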

  5. Effectiveness with Large App • oSIP (open-source) 30K LOC, 600 externally visible functions • DART: • Found a way to crash 65% of oSIP's functions within 1000 attempts per function • Most crashes were caused by dereferencing a NULL pointer passed as a function argument
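
The bug class behind most of these crashes is easy to sketch. The function below is hypothetical (not actual oSIP code) but shows the pattern: an externally visible API dereferences its pointer arguments without NULL checks, so a generated NULL input segfaults immediately.

    #include <stddef.h>

    /* Hypothetical oSIP-style entry point: dereferences both pointer
     * arguments with no NULL checks. */
    int parse_header(const char *buf, size_t *out_len) {
        *out_len = 0;           /* crashes when out_len == NULL */
        return buf[0] == 'S';   /* crashes when buf == NULL     */
    }

    int main(void) {
        /* DART, testing the function in isolation, soon generates a
         * NULL pointer argument and triggers a segmentation fault. */
        size_t n;
        return parse_header(NULL, &n);  /* buf[0] dereferences NULL */
    }
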

  6. Putting this Work into Context • Colby, Godefroid, Jagadeesan 1998: automatically close an open program to make it self-executable, then systematically explore all of its behaviors • The closed program is a simplified version of the original • Considerable prior work on test-vector generation with symbolic execution • Relies on imprecise static analysis • Dynamic test generation • Only generates tests for specific paths • Does not handle function calls or library functions
