Automatic Evaluation of Intrusion Detection Systems



Presentation Transcript


  1. Automatic Evaluation of Intrusion Detection Systems F. Massicotte, F. Gagnon, Y. Labiche, L. Briand, Computer Security Applications Conference, ACSAC ’06, pp. 361-370, 2006. Presented by: Lei WEI

  2. Summary • Proposed a strategy to evaluate Intrusion Detection Systems (IDSes) automatically and systematically. • Evaluated two well-known IDS programs, Snort 2.3.2 and Bro 0.9a9, using the proposed strategy. • Proposed a 15-class taxonomy for test results.

  3. Appreciative Comments: Automation This is an automatic IDS evaluation system. Because of automation, it is possible to efficiently and systematically create a large amount of sample data. • “We use 124 VEP (covering a total of 92 vulnerabilities) and 108 different target system configurations” (Automatic Evaluation of Intrusion Detection Systems) • “38 different attacks were launched against victim UNIX hosts in seven weeks of training data and two weeks of test data.” (Evaluating Intrusion Detection Systems: The 1998 DARPA Off-line Intrusion Detection Evaluation)

  4. Critical Comment: 1. Complicated classification Each of the collected traffic traces belongs to one of four types: true positive (TP), true negative (TN), false positive (FP), and false negative (FN). Based on the types of all traces collected from the IDS evaluation tests, the authors suggest a 15-class taxonomy for IDSes, with classes such as alarmist, quiet, quiet and complete detection, and complete evasion. • This makes the evaluation complicated and confusing. • It is hard to remember all the class names. • Is quiet and complete detection a subclass of quiet? No! • I would prefer a statistical approach based on two ratios, the detection rate TP / (TP + FN) and the false-alarm rate FP / (FP + TN), which tell us the percentage of attacks being detected and the percentage of wrong alarms.
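
To make the suggestion concrete, here is a minimal sketch of the two ratios in Python; the function names and the example counts are hypothetical, not taken from the paper.

    def detection_rate(tp: int, fn: int) -> float:
        # Fraction of actual attacks that raised an alarm: TP / (TP + FN).
        return tp / (tp + fn)

    def false_alarm_rate(fp: int, tn: int) -> float:
        # Fraction of benign traces that wrongly raised an alarm: FP / (FP + TN).
        return fp / (fp + tn)

    # Hypothetical counts: 90 attacks detected, 10 missed; 5 false alarms on 95 benign traces.
    print(detection_rate(90, 10))    # 0.9
    print(false_alarm_rate(5, 95))   # 0.05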

  5. Critical Comment: 2. Confusing diagrams In this paper, the two diagrams, Figure 5 and Figure 1, and the accompanying descriptions used to represent the working process of the whole system are not clear enough. (a) A title should be “… an effective guide for scientists rapidly scanning lists of titles for information relevant to their interests.” (Scientific Writing for Graduate Students: A Manual on the Teaching of Scientific Writing, edited by F. Peter Woodford. New York: Rockefeller University Press, 1968.) However, neither the title nor the content clearly explains the meaning of the numbers in Figure 5.

  6. Critical Comment: 2. Confusing diagrams (continued) (b) Although the article describes the steps listed in Figure 1, the provided diagram makes it hard to understand the structure and working process of the system. The figure is titled “Virtual network infrastructure,” but it actually covers more than that: it represents not only the virtual network infrastructure but also the working process of the subsystem.

  7. Working process of the Automatic IDS Evaluation system The system can be divided into two subsystems: • The attack simulation and data collection system • The IDS Evaluation Framework

  8. 1. Attack simulation and data collection system • Script Generation: choose a Vulnerability Exploitation Program (VEP) and a configuration of the target system (e.g., the IDS). • Set up Virtual Network. • Set up Attack Script: provide the virtual attacking machine with the proper attack configuration (e.g., whether to apply IDS evasion techniques). • Execute Attack: capture the attack traffic traces and document them. • Data Set: save the traffic traces and IDS alarms on the shared hard drive. • Restore: restore the virtual attacker and target machines to their initial state. A sketch of this loop follows.
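
A minimal sketch of the simulation loop, with stub functions standing in for the real virtual-machine and packet-capture tooling; every name below is hypothetical, not the paper's API.

    from itertools import product

    # Stand-ins for the real steps, which drive virtual machines and traffic capture.
    def set_up_virtual_network(target):      return {"target": target}
    def generate_attack_script(vep, cfg):    return {"vep": vep, "cfg": cfg}
    def execute_attack(network, script):     return (b"raw-packets", ["alarm-1"])
    def save_to_shared_drive(trace, alarms): pass
    def restore_initial_state(network):      pass

    def run_simulations(veps, target_configs, attack_configs):
        data_set = []
        # One test case per (VEP, target configuration, attack configuration) triple.
        for vep, target, cfg in product(veps, target_configs, attack_configs):
            network = set_up_virtual_network(target)
            script = generate_attack_script(vep, cfg)       # cfg may select IDS evasion
            trace, alarms = execute_attack(network, script)  # capture traffic traces
            data_set.append((vep, target, cfg, trace))       # document the traffic trace
            save_to_shared_drive(trace, alarms)
            restore_initial_state(network)                   # reset the VMs for the next run
        return data_set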

  9. 2. IDS Evaluation Framework • Data Set: the IDS Evaluator takes the documented traffic traces from the data set. • IDS Evaluator: provides the traffic traces to each tested IDS. • IDS: the collected IDS alarms are fetched by the IDS Result Analyzer. • IDS Result Analyzer: compares the two groups of data (documented traces and collected alarms) and determines whether IDS detection succeeded. • Report: the evaluation report is generated.
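
A simplified sketch of what the IDS Result Analyzer has to do, reduced to set membership; in the paper the matching is done per attack and per signature, so treat this only as an illustration with hypothetical names.

    def analyze(data_set, alarmed_trace_ids):
        # data_set: iterable of (trace_id, is_attack) pairs from the documented traces.
        # alarmed_trace_ids: ids of traces for which the tested IDS raised an alarm.
        counts = {"TP": 0, "FN": 0, "FP": 0, "TN": 0}
        for trace_id, is_attack in data_set:
            alarmed = trace_id in alarmed_trace_ids
            if is_attack:
                counts["TP" if alarmed else "FN"] += 1
            else:
                counts["FP" if alarmed else "TN"] += 1
        return counts  # feeds the evaluation report

    print(analyze([(1, True), (2, True), (3, False)], {1, 3}))
    # {'TP': 1, 'FN': 1, 'FP': 1, 'TN': 0}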

  10. Question This paper evaluated two open-source IDSes using the new strategy. However, many IDSes are protected by patents or copyright, and their creators would never reveal the weak points of their products. Is it ethical, or even legal, to publish evaluations of IDS programs so that others can know the truth?

  11. The End

  12. The 15-class taxonomy (Supplement)

  13. Document traffic traces (Supplement) Each traffic trace is documented by four characteristics: • Target system configuration • VEP configuration • Whether or not the VEP exploited the vulnerability of the target system • Whether or not the attack was successful
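
One way to picture this documentation is as a small record type; the four characteristics come from the paper, but the type and field names below are mine.

    from dataclasses import dataclass

    @dataclass
    class TrafficTrace:
        target_config: str             # target system configuration
        vep_config: str                # VEP configuration
        vulnerability_exploited: bool  # did the VEP exploit the target's vulnerability?
        attack_successful: bool        # was the attack successful?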
