Marmagna Desai [592 Presentation]
Survey – IDS Testing

Contents
Introduction
Paper I – A Methodology for Testing IDS
Paper II – Intrusion Detection Testing and Benchmarking Methodology
Summary – Paper I
Summary – Paper II
Conclusion
References

Introduction
IDS development and the problems it faces.
Realistic Traffic Generation.
Need for a Generalized Testing Methodology.
Paper I – an individual attempt to solve the above problems.
Paper II – a commentary on such past attempts and the future need for development.
This survey summarizes both papers with concluding remarks.
One of many early attempts, made in the 1990s.
Can be viewed as one methodology for testing network-based IDS.
Based on software engineering test concepts.
Identifies a set of general IDS performance objectives.
UNIX tool Expect used and enhanced for traffic generation.
Experimental IDS: NSM (Network Security Monitor).
Commentary on major attempts to design an evaluation environment for IDS testing.
Existing Tools and Methodologies.
DARPA and LARIAT [Environments]
TCPReplay, IDSWakeup, WebAvalanche, HPING2 etc. [Tools]
Issues in developing such an environment
Database for attacks
Testing limited by case-by-case scenarios.
High Costs and Security problems.
Examples of Evaluation Environments
Environment based on DARPA
Custom Software [Reference: Paper I]
Vendor Independent LAB
Comments on the shortcomings of all such attempts and proposes the need for a very general approach to building such an environment.
Custom Software approach to build evaluation environment – w.r.t. Paper II
One test-bed for one set of related attacks.
IDS affected by system conditions – Stress.
NOT general environment – w.r.t. IDS performance Objectives.
Simulation of User-Behaviours
Software Engineering approach.
Unix tool EXPECT:
Simulation of “normal” and “intruder” behaviour (a minimal session sketch follows this list).
Extends the Tcl interpreter to provide simulation scripts.
The authors have extended Expect to include:
Synchronized and communicative scripts
Interleaving of commands executed by multiple users.
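Paper I drives its simulations with Expect (Tcl-based) scripts. Below is a minimal sketch of the same idea using the Python pexpect library instead; the host name, account, credentials, and commands are hypothetical placeholders, not values from the paper.

```python
# Minimal sketch of a scripted "normal user" session, analogous to the
# Expect scripts described in Paper I (which are written in Expect/Tcl).
# Host name, account, password, and commands are hypothetical placeholders.
import pexpect

def normal_user_session(host="testhost.example.org", user="alice", password="secret"):
    # Spawn an interactive telnet session and drive it the way a human user would.
    child = pexpect.spawn(f"telnet {host}", timeout=30)
    child.expect("login:")
    child.sendline(user)
    child.expect("Password:")
    child.sendline(password)
    child.expect(r"\$")                       # wait for the shell prompt
    for cmd in ["ls -l", "ps -ef", "who"]:    # benign background activity
        child.sendline(cmd)
        child.expect(r"\$")
    child.sendline("exit")
    child.close()

if __name__ == "__main__":
    normal_user_session()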
IDS Objectives – Necessary but not sufficient.
Broad Detection Range
Economy in Resource Usage
Resilience to Stress
Test-Case Selection
Based on “equivalence partitioning” of the set of intrusions. [Software engineering approach]
Based on a taxonomy of vulnerabilities – the IDS might or might not detect intrusions within a class.
Based on Signatures – Very small classes.
Ideal test case:
Combine all three approaches to meet the needs of the particular site on which the IDS is deployed! (A selection sketch follows.)
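A minimal sketch of how equivalence partitioning might be used to pick representative intrusions per class; the class names and attack labels below are illustrative assumptions, not the taxonomy used in either paper.

```python
# Illustrative sketch of test-case selection by equivalence partitioning.
# The class names and attack instances below are hypothetical examples.
import random

intrusion_classes = {
    "password_guessing":    ["dictionary_telnet", "dictionary_ftp", "brute_force_ssh"],
    "privilege_escalation": ["setuid_race", "buffer_overflow_local"],
    "denial_of_service":    ["syn_flood", "ping_of_death"],
}

def select_test_cases(classes, per_class=1, seed=0):
    # Pick a few representatives from each equivalence class; the working
    # assumption is that an IDS behaves similarly on members of one class.
    rng = random.Random(seed)
    return {name: rng.sample(members, min(per_class, len(members)))
            for name, members in classes.items()}

print(select_test_cases(intrusion_classes))
```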
Create and select test scripts [normal/intrusion scripts]
Establish desired conditions – performance objectives.
Run Test Scripts
Analyse the IDS's output (a skeleton of these four steps follows below).
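A skeleton of the four-step procedure above, assuming the test scripts are standalone executables and the IDS writes alerts to a plain-text log; every path and function here is a placeholder for site-specific tooling.

```python
# Skeleton of the four-step test procedure from Paper I; script paths and
# the alert-log location are placeholders.
import subprocess

def run_script(path):
    # Each test script is assumed to be a standalone executable that drives
    # a simulated user session (normal or intrusive) against the monitored host.
    return subprocess.run([path], capture_output=True, text=True)

def run_ids_test(normal_scripts, intrusion_scripts, ids_alert_log):
    # Steps 1-2: establish the desired conditions by running the normal-user
    # scripts, giving the IDS realistic background activity to observe.
    for script in normal_scripts:
        run_script(script)
    # Step 3: run the selected intrusion scripts.
    for script in intrusion_scripts:
        run_script(script)
    # Step 4: read back the IDS output so the analyst can compare the alerts
    # against the list of intrusions that were actually injected.
    with open(ids_alert_log) as fh:
        return fh.read()
```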
Intrusion Identification – Basic IDS test
Resource Usage – how many resources the IDS consumes.
Load – testing the IDS as a low CPU-priority task [nice].
Intensity – a lot of activity generated in a short time.
Always created by “NORMAL” users.
e.g. telnet sessions associated with the IDS host (a sketch of the load and intensity tests follows).
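A sketch of the load and intensity stress tests, assuming the IDS (e.g. NSM) can be launched from the command line and that a normal-user session script exists; both names are placeholders.

```python
# Sketch of the "load" and "intensity" stress tests described in Paper I.
# The IDS command line and the session script path are placeholders.
import subprocess
import concurrent.futures

def start_ids_low_priority(ids_cmd=("./nsm",), niceness=19):
    # Load test: start the IDS as a low CPU-priority task (i.e. under the
    # Unix `nice` command) so normal activity competes with it for CPU time.
    return subprocess.Popen(["nice", "-n", str(niceness), *ids_cmd])

def intensity_test(session_script="./normal_user_session.sh", sessions=50):
    # Intensity test: generate a burst of ordinary user sessions in a short
    # time window; all of the activity is "normal", only its rate is extreme.
    with concurrent.futures.ThreadPoolExecutor(max_workers=sessions) as pool:
        futures = [pool.submit(subprocess.run, [session_script])
                   for _ in range(sessions)]
        for f in futures:
            f.result()
```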
Scripts cannot simulate users in a GUI environment.
Designed to test systems that perform “misuse detection” - Anomaly detection is not considered.
Not generalized for all possible attacks [??]
Limited in Performance Objectives
Replaying can be more Realistic
Government undertaking – private and secure
Generate background traffic interlaced with intrusions.
Traffic can be generated by...
Collect real data and attack the actual organization.
Sanitize the data and introduce attacks into the data itself.
Synthesize non-sensitive traffic from scratch
This approach had many shortcomings..
No effort to measure false positives.
Data rates and variation with time never considered. [stress]
Attacks were evenly distributed.
Size of training data may be insufficient.
Yet, DARPA was a major effort to build such a generalized evaluation environment for IDS testing.
Emulates the network traffic of a small organization connected to the Internet.
This was another attempt to build an evaluation methodology.
High Throughput capabilities.
Various attack scenarios
Takes Windows traffic into account.
More Realistic and fully Automated
TCPReplay: Provides background traffic by replaying pre-recorded traffic from network links.
IDSWakeup: Generates false attacks, in order to determine if IDS produces alerts.
WebAvalanche: Stress-Testing appliance for web applications and servers.
HPING2: Command line packet assembler and analyser.
Fragrouter: Routes network traffic such that it eludes most NIDS. (A possible orchestration of these tools is sketched below.)
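One possible way to combine the tools above into a test run, sketched with Python's subprocess module; the interface, pcap file, and target address are placeholders, and only commonly documented tcpreplay/hping2 options are used.

```python
# Hypothetical orchestration of the tools listed above; interface name, pcap
# file, and target host are placeholders.
import subprocess

def replay_background(pcap="background.pcap", iface="eth0"):
    # tcpreplay replays pre-recorded, non-malicious traffic onto the test link.
    return subprocess.Popen(["tcpreplay", "--intf1", iface, pcap])

def inject_syn_flood(target="192.0.2.10", port=80):
    # hping2 crafts raw packets; here a stream of TCP SYNs aimed at the victim,
    # interleaved with the replayed background traffic.
    return subprocess.Popen(["hping2", "-S", "-p", str(port), "--flood", target])

if __name__ == "__main__":
    bg = replay_background()
    attack = inject_syn_flood()
    # ... let the IDS under test observe the mixed traffic, then stop both.
    attack.terminate()
    bg.terminate()
```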
Background Traffic: contains non-malicious data.
Attack traffic: actual testing data for IDSs.
Attack intensity can vary in real time (a ramp-up sketch follows below).
Databases need to be maintained and updated.
Effects of networking elements – Security Issue
Firewalls, proxy servers, ACLs, etc.
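A hypothetical sketch of varying attack intensity in real time: each phase re-launches hping2 with a shorter inter-packet interval (its documented -i uN option waits N microseconds between packets); the target address and phase durations are placeholders.

```python
# Ramp up attack intensity in phases by shrinking the inter-packet delay.
import subprocess
import time

PHASES_US = [100000, 10000, 1000]   # microseconds between packets: low -> high rate

def run_phase(interval_us, duration_s, target="192.0.2.10", port=80):
    # Launch a SYN stream at the given rate, let it run, then stop it.
    proc = subprocess.Popen(
        ["hping2", "-S", "-p", str(port), "-i", f"u{interval_us}", target])
    time.sleep(duration_s)
    proc.terminate()

for interval in PHASES_US:
    run_phase(interval, duration_s=60)
```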
DARPA – Environment
Attack injection programs used to place attacks.
Traffic generation was similar to the earlier effort.
The victim computer was an anonymous FTP server.
The environment focused on DoS attacks.
Same as Paper I approach.
Vendor Independent Testing Lab.
Created by NSS group
Builds a specialized lab to perform attacks on the IDS.
Provides reports covering a large range of attacks.
Focuses on user-interface, forensics and log management.
Evaluation Environment – NOT just a Tool.
No single methodology can test an IDS against every attack.
The BEST way: evaluate the IDS using live or recorded real, site-specific traffic.
DARPA experiment was significant
Provides realistic evaluation environment
Requires a lot of rework and is not generalized.
Development of an IDS testing methodology is still in progress.
A general, open-source, and realistic evaluation environment is needed – NOT just a tool.
Unless a general methodology is developed, IDS design and implementation will face problems:
False positives and misses
Failure under stress conditions.
IDS – Only a Part of Security!!