
Automatic Evaluation of Intrusion Detection Systems

F. Massicotte, F. Gagnon, Y. Labiche, L. Briand,

Computer Security Applications Conference,

ACSAC ’06, pp 361-370, 2006.

Presented by: Lei WEI


Summary

  • Proposed a strategy to evaluate Intrusion Detection Systems (IDSes) automatically and systematically.

  • Evaluated two well-known IDS programs, Snort 2.3.2 and Bro 0.9a9, using the newly proposed strategy.

  • Proposed a 15-class taxonomy for test results.


Appreciative Comments: Automatization

This is an automatic IDS evaluation system. Thanks to automation, it can efficiently and systematically create a large amount of sample data.

  • “ We use 124 VEP (covering a total of 92 vulnerabilities) and 108 different target system configurations” (Automatic Evaluation of Intrusion Detection Systems)

  • “38 different attacks were launched against victim UNIX hosts in seven weeks of training data and two weeks of test data.” (Evaluating Intrusion Detection Systems: The 1998 DARPA Off-line Intrusion Detection Evaluation)


Critical Comment 1: Complicated Classification

Each collected traffic trace belongs to one of four types: true positive (TP), true negative (TN), false positive (FP), or false negative (FN).

Based on the types of all traces collected from the IDS evaluation tests, the authors proposed a 15-class taxonomy for IDSes, with classes such as alarmist, quiet, quiet and complete detection, and complete evasion.

  • This makes the evaluation complicated and confusing.

    • Hard to remember all the class names

    • Is quiet and complete detection a subclass of quiet? No!

  • I prefer a statistical approach that calculates the following two ratios,

    detection rate = TP / (TP + FN),  false alarm rate = FP / (FP + TN),

    from which we know the percentage of attacks detected and the percentage of wrong alarms.
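A minimal Python sketch of these two ratios, with made-up counts for illustration (the paper does not provide this code):

    def detection_rate(tp: int, fn: int) -> float:
        # Fraction of actual attacks that raised an alarm: TP / (TP + FN).
        return tp / (tp + fn)

    def false_alarm_rate(fp: int, tn: int) -> float:
        # Fraction of benign traces that raised an alarm: FP / (FP + TN).
        return fp / (fp + tn)

    print(detection_rate(tp=90, fn=10))   # 0.90 -> 90% of attacks detected
    print(false_alarm_rate(fp=5, tn=95))  # 0.05 -> 5% wrong alarms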


Critical Comment 2: Confusing Diagrams

In this paper, the two diagrams (Figure 5 and Figure 1) and the accompanying descriptions of the system's working process are not clear enough.

(a) A title should be “… an effective guide for scientists rapidly scanning lists of titles for information relevant to their interests.” (Scientific Writing for Graduate Students: A Manual on the Teaching of Scientific Writing, edited by F. Peter Woodford. New York: Rockefeller University Press, 1968.)

However, neither the title nor the surrounding text clearly explains the meaning of the numbers in Figure 5.


Critical Comment 2: Confusing Diagrams (continued)

(b) Although the article describes the steps listed in Figure 1, the diagram itself makes it harder to understand the structure and working process of the system. Its title is “Virtual network infrastructure”, but the figure covers more than that: it shows not only the virtual network infrastructure but also the working process of the subsystem.


Working Process of the Automatic IDS Evaluation System

The system can be divided into two subsystems:

  • The attack simulation and data collection system

  • The IDS Evaluation Framework


1. Attack Simulation and Data Collection System

This subsystem runs each test case through the following stages:

  • Script Generation: choose a Vulnerability Exploitation Program (VEP) and a configuration of the target system (e.g., the IDS).

  • Set up Virtual Network / Set up Attack Script: provide the virtual attacking machine with the proper attack configuration (e.g., whether to apply IDS evasion techniques).

  • Execute Attack: capture and document the attack traffic traces.

  • Data Set: save the traffic traces and IDS alarms on the shared hard drive.

  • Restore: restore the virtual attacker and target machines to their initial state.
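A hedged Python sketch of this loop, using placeholder functions in place of the real virtualization and capture tooling (all names here are hypothetical, not the paper's implementation):

    from itertools import product

    def generate_script(vep, config):
        # Script Generation: pair a VEP with a target system configuration.
        return {"vep": vep, "target": config}

    def execute_attack(script):
        # Execute Attack: a real system would drive the virtual network and
        # capture traffic here; we return a placeholder trace file name.
        return f"trace_{script['vep']}_vs_{script['target']}.pcap"

    def restore_snapshots():
        # Restore: roll the attacker and target VMs back to their initial state.
        pass

    def build_dataset(veps, target_configs):
        dataset = []
        for vep, config in product(veps, target_configs):
            script = generate_script(vep, config)  # Set up Attack Script
            trace = execute_attack(script)         # capture + document the trace
            dataset.append(trace)                  # Data Set (shared hard drive)
            restore_snapshots()                    # Restore
        return dataset

With 124 VEPs and 108 target configurations, this loop would cover up to 13,392 combinations, which is why automation matters here.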


2. IDS Evaluation Framework

The framework works as follows:

  • The IDS Evaluator takes documented traffic traces from the Data Set.

  • The IDS Evaluator provides the traffic traces to each tested IDS.

  • The IDS Result Analyzer fetches the IDS alarms collected from each run.

  • The analyzer compares the alarms with the trace documentation to determine whether the IDS detection succeeded.

  • Finally, the analyzer generates the evaluation report.
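A hedged Python sketch of this framework, replaying each documented trace through an IDS and classifying the outcome. Reading a recorded pcap offline (e.g., snort -r <file>) is a real Snort capability; the field names and the naive alarm check are assumptions for illustration:

    import subprocess
    from collections import namedtuple

    Trace = namedtuple("Trace", ["pcap_path", "attack_successful"])

    def evaluate(ids_command, traces):
        report = []
        for trace in traces:
            # IDS Evaluator: feed the recorded traffic trace to the IDS.
            result = subprocess.run(ids_command + ["-r", trace.pcap_path],
                                    capture_output=True, text=True)
            detected = "alert" in result.stdout.lower()  # naive alarm check
            # IDS Result Analyzer: compare alarms with the documentation.
            if trace.attack_successful:
                report.append((trace.pcap_path, "TP" if detected else "FN"))
            else:
                report.append((trace.pcap_path, "FP" if detected else "TN"))
        return report  # feeds into the final evaluation report

    # Example: evaluate(["snort"], [Trace("attack01.pcap", True)])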


Question

This paper evaluated two open-source IDSes using the new strategy. However, many IDSes are protected by patents or copyright, and their creators would never reveal the weak points of their products.

Is it ethical, or even legal, to publish evaluations of such IDS products so that others can know the truth?




Documenting Traffic Traces (Supplement)

Each traffic trace is documented by four characteristics:

  • Target system configuration

  • VEP configuration

  • Whether or not the VEP exploited the vulnerability of the target system

  • Whether or not the attack is successful
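A minimal sketch of this documentation as a Python dataclass; the field names are assumptions, not the paper's schema:

    from dataclasses import dataclass

    @dataclass
    class TrafficTrace:
        target_configuration: str      # e.g., OS and services of the target
        vep_configuration: str         # VEP options, e.g., evasion technique used
        vulnerability_exploited: bool  # did the VEP exploit the vulnerability?
        attack_successful: bool        # was the attack successful?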

