Intrusion Detection Systems



Presentation Transcript


  1. Intrusion Detection Systems By Ali Hushyar

  2. What is an intrusion?
  • Intrusion: “any action or set of actions that attempt to compromise the integrity, confidentiality or availability of a resource” (Heady et al., quoted in [Ku95])
  • Intrusion types:
    • External penetrations
    • Internal penetrations
    • Misfeasance

  3. Preventing Intrusion
  • Authentication
  • Access Control
  • Firewalls
  • Vulnerability Patching
  • Restricting physical access
  • Intrusion Detection Systems

  4. Principles
  • Assumptions about computer systems [D86]:
    • Actions of processes follow specifications describing what the processes are allowed to do
    • Actions of users and processes have statistically predictable patterns
    • Actions of users and processes do not include command sequences aimed at compromising system security policies
  • Exploiting vulnerabilities therefore requires an abnormal use of normal commands or instructions

  5. Principles
  • Intrusion detection: determine whether a user has gained or is trying to gain unauthorized access to the system by looking for abnormalities in the system
  • IDS analysis approaches:
    • Anomaly detection: distinguish anomalous behavior from normal behavior
    • Misuse detection: detect intrusions based on well-known attack techniques

  6. Static Anomaly Detection
  • File integrity checkers
    • Part of the system is expected to remain constant (e.g. system code and static data)
    • Detect an anomaly by comparing the current system state to the original system state
  • Representation of system state:
    • Actual bit strings
    • Signatures of bit strings (hash functions)
    • Meta-data “selection masks” on file or inode fields such as size, access permissions, modification timestamp, access timestamp, user id, group id, etc.

  7. Tripwire
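A minimal sketch of a Tripwire-style integrity check in Python (the watch list and baseline filename are hypothetical; only the standard library is used):

    import hashlib
    import json
    import os
    import stat

    def snapshot(paths):
        """Record a content hash plus selected inode meta-data for each file."""
        state = {}
        for p in paths:
            st = os.stat(p)
            with open(p, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            state[p] = {
                "sha256": digest,           # signature of the bit string
                "size": st.st_size,         # meta-data "selection mask" fields
                "mode": stat.S_IMODE(st.st_mode),
                "mtime": st.st_mtime,
                "uid": st.st_uid,
                "gid": st.st_gid,
            }
        return state

    def compare(baseline, current):
        """Report files whose current state deviates from the baseline."""
        for path, old in baseline.items():
            new = current.get(path)
            if new is None:
                print("MISSING:", path)     # unlike Self-Nonself, deletions are caught
            elif new != old:
                changed = [k for k in old if old[k] != new[k]]
                print("CHANGED:", path, "(" + ", ".join(changed) + ")")

    # Usage (hypothetical watch list):
    # watch = ["/bin/ls", "/etc/passwd"]
    # json.dump(snapshot(watch), open("baseline.json", "w"))
    # ...later:
    # compare(json.load(open("baseline.json")), snapshot(watch))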

  8. Static Anomaly Detection
  • Virus checkers
    • Look for virus signatures in system files or memory
    • Actual virus bit strings are stored in a database
  • Self-Nonself
    • Like Tripwire, part of the system is static
    • Like virus checkers, a set of unwanted signatures must be maintained
    • Inspired by the human immune system

  9. Static Anomaly Detection
  • Create Self (example from [F94])
    • Represent system state as a single static string:
      00101000100100000100001010010011
    • Split the string into substrings of size k (here k = 4):
      0010 1000 1001 0000 0100 0010 1001 0011
  • Create Nonself
    • Generate random substrings of size k:
      0111 1000 0101 1001
    • Censor by discarding candidates that also appear in Self (here 1000 and 1001), leaving the detector set:
      0111 0101
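A minimal sketch of the split-and-censor steps in Python (unlike the slide, it keeps drawing random candidates until the requested number of detectors survive censoring):

    import random

    def split_self(state, k=4):
        """Split the system-state string into the set Self of k-bit substrings."""
        return {state[i:i + k] for i in range(0, len(state), k)}

    def generate_nonself(self_set, k=4, count=2):
        """Generate random k-bit detectors, censoring any that match Self."""
        detectors = set()
        while len(detectors) < count:
            candidate = "".join(random.choice("01") for _ in range(k))
            if candidate not in self_set:        # the censoring step
                detectors.add(candidate)
        return detectors

    self_set = split_self("00101000100100000100001010010011")
    # With the slide's candidates 0111 1000 0101 1001, the substrings 1000 and
    # 1001 are censored (they occur in Self), leaving 0111 and 0101 as Nonself.
    print(sorted(generate_nonself(self_set)))

An anomaly is then flagged whenever a substring of the current system state matches one of the Nonself detectors.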

  10. Static Anomaly Detection
  • The size of Nonself affects the probability of detecting anomalies and the computational load
  • The probability of detection can be configured
  • Generating Nonself is expensive, but monitoring the system afterwards is cheap
  • Comparison with Tripwire:
    • Does not depend on meta-data
    • Will not detect deletion of files

  11. Dynamic Anomaly Detection
  • Real-world examples: logins, credit-card use
  • System behavior is defined as sequences of events recorded by OS logs and audit records, application logs, network monitors and other probes
  • Base profiles are created for each entity to be monitored, characterizing normal behavior for that entity
  • Current profiles are built by monitoring system events, and deviations from the base profile are measured

  12. Statistical Models
  • Each profile consists of a set of measures
  • Measures capture activity intensity, audit record distribution, and categorical and ordinal quantities
  • Measures can be treated as random variables
  • Profiles evolve over time, so aging of measures or updated statistical rules must take this into account

  13. Statistical Models
  • Operational/Threshold Model
    • A measure is deemed abnormal if it surpasses fixed limits imposed on the measure
  • Mean and Standard Deviation Model
    • The mean and standard deviation of the previous n values are known, so a confidence value for each new measurement can be determined
  • Multivariate Model
    • Better conclusions can be drawn by taking into consideration the correlations of related measures
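A minimal sketch of the mean and standard deviation model (the 3-sigma limit and the login-rate history are assumptions chosen for illustration):

    from statistics import mean, stdev

    def is_abnormal(history, value, n_sigma=3.0):
        """Flag the new value if it falls outside mean +/- n_sigma * stddev
        of the previous n observations (the base profile)."""
        m, s = mean(history), stdev(history)
        return abs(value - m) > n_sigma * s

    logins_per_hour = [4, 6, 5, 7, 5, 6, 4, 5]   # hypothetical measure history
    print(is_abnormal(logins_per_hour, 5))       # False: within the confidence band
    print(is_abnormal(logins_per_hour, 40))      # True: deviates from the base profile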

  14. Statistical Models
  • The Clustering Model is an example of a nonparametric statistical technique
  • Data is grouped into clusters; observations far from every cluster are anomalous
  • Example from [B03]
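Since the [B03] figure is not reproduced here, a minimal sketch of the idea: group one-dimensional observations into clusters, then flag points far from every cluster center (the cluster count, distance threshold, and CPU-usage data are assumptions):

    def kmeans_1d(data, k=2, iters=20):
        """Tiny k-means for 1-D data: returns the cluster centers."""
        centers = sorted(data)[::max(1, len(data) // k)][:k]
        for _ in range(iters):
            groups = [[] for _ in centers]
            for x in data:
                idx = min(range(len(centers)), key=lambda i: abs(x - centers[i]))
                groups[idx].append(x)            # assign x to the nearest center
            centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
        return centers

    def is_outlier(x, centers, threshold=10.0):
        """Anomalous if far from every cluster center."""
        return min(abs(x - c) for c in centers) > threshold

    cpu_seconds = [1, 2, 2, 3, 50, 52, 55]       # hypothetical per-session CPU use
    centers = kmeans_1d(cpu_seconds)
    print(centers)                               # roughly [2.0, 52.3]
    print(is_outlier(300, centers))              # True: fits neither cluster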

  15. Statistical Models
  • Combining individual measurement values determines an overall abnormality value for the current profile
  • Let Si be the recorded value of each measure Mi. The combining function [Ku95] can then be a weighted sum of squares:
      a1*S1^2 + a2*S2^2 + ... + an*Sn^2,  with each ai > 0
    where ai reflects the weight assigned to measure Mi

  16. Statistical Models
  • If the individual measures Mi are not mutually independent, then more complex combining functions are needed
  • Bayesian statistics: let Ai be 0 or 1 depending on whether measure Mi is normal or anomalous, respectively [Ku95]. The probability of an intrusion I given the measurements follows from Bayes' theorem:
      P(I | A1, ..., An) = P(A1, ..., An | I) * P(I) / P(A1, ..., An)
    and, if the Ai are assumed conditionally independent given I, P(A1, ..., An | I) reduces to the product of the individual P(Ai | I)

  17. Models based on Sequences of Events
  • Markov Process Model
    • Given the present state, past states of the system have no influence on future states: the next state depends only on the present state
    • Because the system is non-deterministic, there are transition probabilities between states
    • Given an initial state, an event that transitions the system to a state of low probability is taken to be anomalous
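A minimal sketch of the Markov process model: estimate transition probabilities from a training sequence, then flag transitions whose probability falls below a threshold (here using the event sequence from the next slide; the zero-probability cutoff is an assumption):

    from collections import Counter, defaultdict

    def train(sequence):
        """Estimate P(next | current) from an observed event sequence."""
        counts = defaultdict(Counter)
        for cur, nxt in zip(sequence, sequence[1:]):
            counts[cur][nxt] += 1
        return {s: {t: c / sum(ctr.values()) for t, c in ctr.items()}
                for s, ctr in counts.items()}

    def transition_prob(model, cur, nxt):
        """Probability of the observed transition; 0.0 if never seen."""
        return model.get(cur, {}).get(nxt, 0.0)

    model = train("abcdedeabcabc")
    print(transition_prob(model, "a", "b"))   # 1.0: a is always followed by b
    print(transition_prob(model, "e", "c"))   # 0.0: never observed -> anomalous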

  18. Time-based Inductive Learning
  • Sequence of events: abcdedeabcabc
  • Predict the events:
    R1: ab → c (1.0)
    R2: c → d (0.5)
    R3: c → a (0.5)
    R4: d → e (1.0)
    R5: e → a (0.5)
    R6: e → d (0.5)
  • Single out rules that are good indicators of behavior: R1 and R4
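A minimal sketch that derives such rules from the sequence, limited to one- and two-event left-hand sides as above (it will also print rules the slide omits, e.g. b → c):

    from collections import Counter, defaultdict

    def rule_confidences(seq, lhs_len):
        """For each left-hand side of length lhs_len, count what event follows."""
        follow = defaultdict(Counter)
        for i in range(len(seq) - lhs_len):
            follow[seq[i:i + lhs_len]][seq[i + lhs_len]] += 1
        return {(lhs, nxt): c / sum(ctr.values())
                for lhs, ctr in follow.items() for nxt, c in ctr.items()}

    seq = "abcdedeabcabc"
    rules = {**rule_confidences(seq, 1), **rule_confidences(seq, 2)}
    for (lhs, nxt), conf in sorted(rules.items()):
        print(lhs, "->", nxt, "(%.2f)" % conf)
    # Rules with confidence 1.0 (among them ab -> c and d -> e, the slide's R1
    # and R4) are the strong behavioral indicators; an event that violates one
    # of them suggests an anomaly.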

  19. UNM Pattern Matching
  • System behavior is defined as a sequence of OS routine calls
  • The entities monitored are processes that run with elevated privileges
  • A profile consists of legitimate traces, which are sequences of OS calls of length k

  20. UNM Pattern Matching
  • Example from [J00]. Recorded trace of system calls:
    open read write open mmap write fchmod close
  • Profile traces with max length 4 (a sliding window over the trace):
    open read write open
    read write open mmap
    write open mmap write
    open mmap write fchmod
    mmap write fchmod close
    write fchmod close
    fchmod close
    close
  • Later sequence of calls recorded:
    open read read open mmap write fchmod close
    (windows such as “open read read open” match no profile trace, signaling an anomaly)
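A minimal sketch of the profile build and the mismatch check (window size k = 4 as above):

    def windows(trace, k=4):
        """All contiguous subsequences of length up to k starting at each call."""
        return {tuple(trace[i:i + k]) for i in range(len(trace))}

    normal = ["open", "read", "write", "open", "mmap", "write", "fchmod", "close"]
    profile = windows(normal)

    def mismatches(trace, profile, k=4):
        """Windows of the new trace that never occurred in the profile."""
        return [tuple(trace[i:i + k]) for i in range(len(trace))
                if tuple(trace[i:i + k]) not in profile]

    later = ["open", "read", "read", "open", "mmap", "write", "fchmod", "close"]
    for w in mismatches(later, profile):
        print("anomalous window:", " ".join(w))
    # Prints three windows, starting with: open read read open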

  21. Neural Networks
  • An information processing model based on biological nervous systems like the brain
  • Different from expert systems in that they have the ability to learn
  • Given a data vector, they can either apply what they have learned to determine an output, or “recognize” similarity between the input vector and previous inputs to determine an output

  22. Neural Networks (http://www.doc.ic.ac.uk)

  23. Neural Network Intrusion Detector
  • Goal: identify the legitimate user on a system
  • Obtain logs indicating how often a user executed each command on the system during different time intervals over a period of several days
  • Each command becomes a frequency entry, so 100 commands yield a 100-dimensional input vector of command frequencies
  • Train the neural net to recognize the specific user
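A minimal sketch of the training setup; for brevity it uses a single sigmoid neuron (logistic regression, the simplest neural unit) over a 5-command frequency vector instead of a full network over 100 commands, and the command counts are made up:

    import math

    COMMANDS = ["ls", "cat", "gcc", "ssh", "vi"]   # stand-in for the 100 commands

    def frequency_vector(counts):
        """Normalize raw per-command counts into a frequency vector."""
        total = sum(counts.get(c, 0) for c in COMMANDS) or 1
        return [counts.get(c, 0) / total for c in COMMANDS]

    def train(samples, epochs=2000, lr=0.5):
        """Train one sigmoid neuron; samples are (vector, 1 if target user else 0)."""
        w, b = [0.0] * len(COMMANDS), 0.0
        for _ in range(epochs):
            for x, y in samples:
                z = sum(wi * xi for wi, xi in zip(w, x)) + b
                p = 1.0 / (1.0 + math.exp(-z))
                g = p - y                           # gradient of the log-loss
                w = [wi - lr * g * xi for wi, xi in zip(w, x)]
                b -= lr * g
        return w, b

    def score(w, b, x):
        """Probability that the observed behavior belongs to the target user."""
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        return 1.0 / (1.0 + math.exp(-z))

    alice = frequency_vector({"gcc": 40, "vi": 30, "ls": 20})
    mallory = frequency_vector({"ssh": 50, "cat": 30, "ls": 5})
    w, b = train([(alice, 1), (mallory, 0)])
    print(score(w, b, alice), score(w, b, mallory))  # close to 1.0 and 0.0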

  24. Misuse Detection
  • Anomaly detectors can be gradually trained to accept intrusive behavior as normal, and vulnerabilities exploited by well-known attacks often remain unpatched
  • Misuse detection identifies intrusions based on known techniques or sequences of actions
  • The intrusion scenario or signature must be formally defined

  25. Rule-based Misuse Systems
  • Intrusion scenarios are defined as sets of rules
  • The system maintains a rule base of intrusion scenarios and a fact base of event sequences drawn from audit logs
  • When a fact pattern matches the antecedent of a rule, a rule binding is established and the rest of the rule is evaluated

  26. Rule-based Misuse Systems
  • MIDAS rule example [J00]:
    (defrule illegal_privileged_account
      states if there exists a failed_login_item
        such that name is (“root”)
        and time is ?time_stamp
        and channel is ?channel
      then (print “Alert: Attempted login to root”)
        and remember a breakin_attempt with certainty *high*
          such that attack_time is ?time_stamp
          and login_channel is ?channel)
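A minimal sketch of the pattern-binding mechanism in Python (the fact and rule shapes are assumptions modeled on the MIDAS example, not MIDAS's actual representation):

    def match(pattern, fact):
        """Bind pattern variables (strings starting with '?') against a fact."""
        bindings = {}
        for field, want in pattern.items():
            have = fact.get(field)
            if isinstance(want, str) and want.startswith("?"):
                bindings[want] = have            # variable: bind to the fact's value
            elif have != want:
                return None                      # constant: must match exactly
        return bindings

    rule = {
        "if": {"type": "failed_login", "name": "root",
               "time": "?time_stamp", "channel": "?channel"},
        "then": "Alert: Attempted login to root",
    }

    fact_base = [
        {"type": "failed_login", "name": "root", "time": 1710000000, "channel": "tty3"},
        {"type": "failed_login", "name": "guest", "time": 1710000005, "channel": "tty1"},
    ]

    for fact in fact_base:
        bindings = match(rule["if"], fact)
        if bindings is not None:                 # rule binding established
            print(rule["then"], bindings)        # fires only for the root attempt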

  27. State-based Misuse Detection
  • Intrusion scenarios are modeled as a number of different states and the transitions between them
  • The actions of a would-be intruder lead to a compromised state
  • Two subclasses: state transition analysis and Petri nets
  • State transition analysis:
    • States form a simple chain traversed from beginning to end
    • A table row is kept for each possible intrusion in progress
    • For each event processed, if the event causes a transition, a row with the next state is added to the table
    • An event that causes a transition to a final state indicates an intrusion
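A minimal sketch of the state-transition table: each partially matched scenario is a row holding its current state, and reaching the final state signals the intrusion (the three-step login scenario is hypothetical):

    # Hypothetical scenario as a chain of expected events:
    # START --failed_login--> S1 --failed_login--> S2 --success_login--> COMPROMISED
    CHAIN = ["failed_login", "failed_login", "success_login"]

    def process(events):
        rows = []                        # table of in-progress matches (state = index)
        for event in events:
            rows.append(0)               # a fresh row: the scenario may start here
            next_rows = []
            for state in rows:
                if event == CHAIN[state]:
                    state += 1           # transition fires: advance to the next state
                if state == len(CHAIN):
                    print("intrusion detected after:", event)
                else:
                    next_rows.append(state)
            rows = next_rows
        return rows

    process(["failed_login", "failed_login", "success_login"])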

  28. Petri Networks
  • Intrusion states form a Petri net, which follows a more general tree structure
  • Many branches may exist, denoting alternative initial states of the intrusion
  • Example: the Unix Version 7 mkdir command [B03] runs as root and performs two separate steps:
    mknod(“xxx”, directory)
    chown(“xxx”, user, group)
  • Between the two steps an attacker can replace “xxx” with a link to /etc/passwd, so the chown hands the attacker ownership of the password file

  29. Petri Networks
  [Petri net diagram from [B03]: states S1–S6 with final state F. The transitions and their guards are:
    mknod  — this[uid] == 0 && File1 == true_name(this[obj])
    unlink — this[uid] != 0 && File1 == this[obj]
    link   — true_name(this[obj]) == true_name(“/etc/passwd”) && File2 = this[obj]
    chown  — this[uid] == 0 && File2 == this[obj]
  The root process’s mknod/chown branch and the attacker’s unlink/link branch merge in the final state F, signaling that the mkdir race has been exploited.]
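A simplified sketch that checks this pattern against an audit-event stream; it flattens the two Petri net branches into a single chain, treats objects as plain path strings rather than resolving true_name at each event, and assumes a (syscall, uid, path) event format — a real Petri net matcher is considerably richer:

    def detect_mkdir_race(events):
        """Match the four-step mkdir race; each event is (syscall, uid, path)."""
        target = None            # File1: the node created by root's mknod
        stage = 0                # 0 = start; 4 would be the final state F
        for syscall, uid, path in events:
            if stage == 0 and syscall == "mknod" and uid == 0:
                target, stage = path, 1              # root creates the directory node
            elif stage == 1 and syscall == "unlink" and uid != 0 and path == target:
                stage = 2                            # attacker removes it
            elif stage == 2 and syscall == "link" and uid != 0 and path == target:
                stage = 3                            # ...and relinks it to /etc/passwd
            elif stage == 3 and syscall == "chown" and uid == 0 and path == target:
                return True                          # root chowns the passwd file: F
        return False

    audit = [("mknod", 0, "xxx"), ("unlink", 1000, "xxx"),
             ("link", 1000, "xxx"), ("chown", 0, "xxx")]
    print(detect_mkdir_race(audit))                  # True: the race was exploited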

  30. Other Misuse Techniques
  • Simple string matching, e.g. Knuth-Morris-Pratt (KMP)
  • Protocol Analysis
    • Detect attack signatures by taking advantage of the structure of network data packets
    • Identify packets by protocol and interpret the payload data accordingly
    • Fragmented packets can be reassembled before intrusion analysis
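A minimal sketch of KMP signature matching over a packet payload (the payload and signature bytes are hypothetical, loosely modeled on the classic phf CGI attack):

    def kmp_search(text, pattern):
        """Return the index of every occurrence of pattern in text (KMP)."""
        if not pattern:
            return []
        # Failure function: longest proper prefix of pattern that is also a suffix.
        fail = [0] * len(pattern)
        k = 0
        for i in range(1, len(pattern)):
            while k and pattern[i] != pattern[k]:
                k = fail[k - 1]
            if pattern[i] == pattern[k]:
                k += 1
            fail[i] = k
        hits, k = [], 0
        for i, ch in enumerate(text):
            while k and ch != pattern[k]:
                k = fail[k - 1]                    # fall back without rescanning text
            if ch == pattern[k]:
                k += 1
            if k == len(pattern):                  # full signature matched
                hits.append(i - len(pattern) + 1)
                k = fail[k - 1]
        return hits

    payload = b"GET /cgi-bin/phf?Qalias=x%0a/bin/cat%20/etc/passwd"
    signature = b"/bin/cat%20/etc/passwd"          # hypothetical attack signature
    print(kmp_search(payload, signature))          # [28]: signature found in payload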

  31. References
  • [B03] Bishop, M. (2003). Computer Security: Art and Science.
  • [Kr03] Krishna, S. (2003). Intrusion Detection Techniques: Pattern Matching and Protocol Analysis.
  • [J00] Jones, A. (2000). Computer System Intrusion Detection: A Survey.
  • [Ku95] Kumar, S. (1995). Classification and Detection of Computer Intrusions.
  • [F94] Forrest, S. (1994). Self-Nonself Discrimination in a Computer.
  • [D86] Denning, D. (1986). An Intrusion Detection Model.
