
Survey – IDS Testing




  1. Marmagna Desai [592 Presentation] Survey – IDS Testing

  2. Contents Introduction. Paper I – A Methodology for Testing IDS. Paper II – Intrusion Detection Testing and Benchmarking Methodologies. Summary – Paper I. Summary – Paper II. Conclusion. References.

  3. Introduction IDS development and the PROBLEMS: false positives, misses, realistic traffic generation, and the need for a generalized testing methodology. Paper I – an individual attempt to solve the above problems. Paper II – a commentary on such past attempts and the future need for development. This survey summarizes both papers with concluding remarks.

  4. Introduction... A Methodology for Testing IDS One of the many early attempts, made in the 1990s [1996]. Can be viewed as one methodology for testing network-based IDSs. Based on software engineering test concepts. Identifies a set of general IDS performance objectives. UNIX tool Expect used and enhanced for traffic generation. Experimental IDS: NSM (Network Security Monitor).

  5. Introduction... ID Testing and Benchmarking Methodologies A commentary on major attempts to design an evaluation environment for ID testing. Existing tools and methodologies: DARPA and LARIAT [environments]; TCPReplay, IDSWakeup, WebAvalanche, HPING2, etc. [tools]. Issues in developing such an environment: background traffic, a database of attacks, testing limited to case-by-case scenarios, high costs and security problems.

  6. Introduction... ID Testing and Benchmarking Methodologies Examples of evaluation environments: an environment based on DARPA, custom software [reference: Paper I], a vendor-independent lab. The paper comments on the shortcomings of all such attempts and proposes the need for a very general approach to building such an environment.

  7. Summary – Paper I A custom-software approach to building an evaluation environment – w.r.t. Paper II. Facts: one test-bed for one set of related attacks; IDS affected by system conditions – stress; NOT a general environment – w.r.t. IDS performance objectives; simulation of user behaviours; software engineering approach.

  8. Software Platform – Paper I UNIX tool Expect: simulation of “normal” and “intruder” behaviour. Extends the Tcl interpreter to provide simulation scripts. The authors extended Expect to include: concurrent scripts, synchronized and communicating scripts, interleaving of commands executed by users, and replaying.
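
  A minimal sketch of such a simulation script, showing Expect's spawn/expect/send model (the host name, account, and commands below are hypothetical placeholders, not taken from the paper):

      #!/usr/bin/expect -f
      # Simulate one "normal" user telnet session (all values are placeholders).
      spawn telnet victim.example.com
      expect "login:"
      send "alice\r"
      expect "Password:"
      send "secret\r"
      expect "$ "           ;# wait for the shell prompt
      send "ls -l /tmp\r"   ;# one benign user command
      expect "$ "
      send "exit\r"
      expect eof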

  9. Performance Objectives – Paper I IDS objectives – necessary but not sufficient: broad detection range, economy in resource usage, resilience to stress. Test-case selection: based on “equivalence partitioning” of the set of intrusions [software engineering approach]; based on a taxonomy of vulnerabilities – the IDS might or might not detect intrusions within a class; based on signatures – very small classes.

  10. Test-Case Selection Ideal test case: combine all three approaches to meet the needs of the particular site on which the IDS is deployed!!
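
  To make the equivalence-partitioning idea concrete, here is a small Tcl sketch (not from the paper; the class and script names are invented for illustration) that selects one representative intrusion script per class:

      # Group intrusion scripts into equivalence classes and pick one
      # representative per class (class and script names are illustrative).
      array set classes {
          password_guessing {guess_telnet.exp guess_ftp.exp}
          race_condition    {tmp_race.exp}
          buffer_overflow   {fingerd_ovf.exp lpr_ovf.exp}
      }
      foreach cls [array names classes] {
          set rep [lindex $classes($cls) 0]   ;# one test stands in for the class
          puts "class $cls -> selected test: $rep"
      }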

  11. Testing Methodology – Paper I General methodology: create and select test scripts [normal/intrusion scripts]; establish desired conditions – performance objectives; start the IDS; run the test scripts; analyse the IDS's output.
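
  A hypothetical Tcl driver for these five steps (the IDS command, script names, and log file are assumptions, not details from the paper):

      # Step 1: create/select test scripts (names are placeholders).
      set scripts {normal_telnet.exp intrusion_guess.exp}
      # Steps 2-3: establish conditions and start the IDS; here the stress
      # condition is simulated by running the IDS at low CPU priority.
      set ids_pid [exec nice -n 19 ./nsm_monitor >& ids.log &]
      # Step 4: run the test scripts against the monitored host.
      foreach s $scripts { exec expect $s }
      # Step 5: analyse the IDS's output (ids.log) against expected alerts.
      exec kill $ids_pid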

  12. Testing Methodology... (Paper I) Conditions Intrusion identification – the basic IDS test. Resource usage – how many resources the IDS uses. Stress load – testing the IDS as a low-CPU-priority task [nice]. Intensity – many activities generated in a short time. Background noise – always created by “normal” users, e.g. telnet sessions associated with the IDS host.
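
  For the intensity and background-noise conditions, a sketch along the same lines (the session count and script name are arbitrary):

      # Generate intensity/background noise: many concurrent "normal"
      # user sessions, each spawned as its own background process.
      for {set i 0} {$i < 10} {incr i} {
          exec expect normal_telnet.exp &
      }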

  13. Limitations – Paper I Scripts cannot simulate users in a GUI environment. Designed to test systems that perform “misuse detection” – anomaly detection is not considered. Not generalized for all possible attacks [??]. Limited in performance objectives. Replaying could be more realistic.

  14. Summary – Paper II The DARPA approach: a government undertaking – private and secure. Generate background traffic interlaced with intrusions. Traffic can be generated by: collecting real data and attacking an actual organization; sanitizing real data and introducing attacks into the data itself; synthesizing non-sensitive traffic from scratch.

  15. DARPA... This approach had many shortcomings: no effort to detect false positives; data rates and their variation over time never considered [stress]; attacks were evenly distributed; the size of the training data may be insufficient. Yet DARPA was a major effort to build such a generalized evaluation environment for IDS testing.

  16. LARIAT – Lincoln Adaptable Real-time Information Assurance Test-bed Emulates the network traffic of a small organization connected to the Internet. This was another attempt to build an evaluation methodology. Features: high-throughput capabilities; various attack scenarios; takes Windows traffic into account; more realistic and fully automated.

  17. Tools TCPReplay: provides background traffic by replaying pre-recorded traffic from network links. IDSWakeup: generates false attacks, in order to determine whether the IDS produces alerts. WebAvalanche: a stress-testing appliance for web applications and servers. HPING2: a command-line packet assembler and analyser. Fragrouter: routes network traffic such that it eludes most NIDSs.
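
  Illustrative invocations of two of these tools, wrapped in the same Tcl harness style as above (the interface, capture file, and target address are placeholders):

      # Replay a pre-recorded capture onto eth0 as background traffic.
      exec tcpreplay -i eth0 background.pcap
      # Send three crafted TCP SYN probes to port 80 of a test victim.
      exec hping2 -S -p 80 -c 3 10.0.0.5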

  18. Issues Traffic generation: background traffic contains non-malicious data; attack traffic is the actual testing data for IDSs. Databases: attack intensity can vary in real time; databases need to be maintained and updated; high cost. Effects of networking elements – a security issue: firewalls, proxy servers, ACLs, etc.

  19. Present Evaluation Environments DARPA environment: attack-injection programs used to place attacks; traffic generation was similar to the early effort; the victim computer was an anonymous FTP server; the environment focused on DoS attacks.

  20. Environments... Custom software: same as the Paper I approach. Vendor-independent testing lab: created by the NSS Group; built a specialized lab to perform attacks on IDSs; provides reports covering a large range of attacks; focuses on user interface, forensics and log management.

  21. Conclusion An evaluation environment is NOT just a tool. No single methodology tests an IDS against every attack. The BEST way: evaluate the IDS using live or recorded real, site-specific traffic. The DARPA experiment was significant: it provides a realistic evaluation environment, but requires a lot of rework and is not generalized.

  22. Survey Comments Development of an IDS testing methodology is in progress. A general, open-source and realistic evaluation environment is needed – NOT just a tool. Unless a general methodology is developed, IDS design and implementation will face problems: false positives and misses; failure under stress conditions. An IDS is only a part of security!!

  23. References • Puketza, Nicholas J.; Zhang, Kui; Chung, Mandy; Olsson, Ronald A. and Mukherjee, Biswanath. “A Methodology for Testing Intrusion Detection Systems”, IEEE Transactions on Software Engineering, 22(10), 1996, pp. 719-729. • Athanasiades, Nicholas; Abler, Randal; Levine, John; Owen, Henry; Riley, George. “Intrusion Detection Testing and Benchmarking Methodologies”, IEEE International Information Assurance Workshop, 2003.

  24. Thank You!! Questions?
