
A Novel Technique for Learning Rare Events

A Novel Technique for Learning Rare Events. Margaret H. Dunham, Yu Meng, Jie Huang. CSE Department, Southern Methodist University, Dallas, Texas 75275. mhd@engr.smu.edu. This material is based upon work supported by the National Science Foundation under Grant No. IIS-0208741.


Presentation Transcript


  1. A Novel Technique for Learning Rare Events. Margaret H. Dunham, Yu Meng, Jie Huang. CSE Department, Southern Methodist University, Dallas, Texas 75275. mhd@engr.smu.edu. This material is based upon work supported by the National Science Foundation under Grant No. IIS-0208741.

  2. Objectives/Outline Develop modeling techniques which can “learn/forget” past behavior of spatiotemporal events. Apply to prediction of rare events. • Introduction • EMM Overview • EMM Applications to Rare Event Detection • Future Work

  3. Objectives/Outline Develop modeling techniques which can “learn/forget” past behavior of spatiotemporal events. Apply to prediction of rare events. • Introduction • EMM Overview • EMM Applications to Rare Event Detection • Future Work

  4. Spatiotemporal Environment • Events arriving in a stream • Cannot look at a static snapshot of the data. • At any time, t, we can view the state of the problem at a site as represented by a vector of n numeric values: Vt = <S1t, S2t, ..., Snt>

  5. Spatiotemporal Modeling • Example Applications: • Flood Prediction • Rare Event Detection – Network traffic, automobile traffic • Requirements • Capture Time • Capture Space • Dynamic • Scalable • Quasi-Real Time

  6. Technique • Spatiotemporal modeling technique based on Markov models. • However – • The size of the MM depends on the size of the dataset. • The required structure of the MM is not known at model construction time. • As the real world being modeled by the MM changes, so should the structure of the MM. Thus not only should transition probabilities change, but the number of states should change to model the changing world more accurately.

  7. MM A first-order Markov Chain is a finite or countably infinite sequence of events {E1, E2, …} over discrete time points, where Pij = P(Ej | Ei), and at any time the future behavior of the process depends solely on the current state. A Markov Model (MM) is a graph with m vertices or states, S, and directed arcs, A, such that: • S = {N1, N2, …, Nm}, and • A = {Lij | i ∈ {1, 2, …, m}, j ∈ {1, 2, …, m}}, and each arc, Lij = <Ni, Nj>, is labeled with a transition probability Pij = P(Nj | Ni).
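As a concrete illustration, a first-order MM can be stored as a labeled directed graph. The sketch below (in Python, with illustrative state names and probabilities, not taken from the slides) shows the Markov property: the next-state distribution depends only on the current state.

```python
class MarkovModel:
    """A first-order Markov Model as a labeled directed graph."""

    def __init__(self, states):
        self.states = list(states)
        # trans[Ni][Nj] = P(Nj | Ni); each row should sum to 1
        self.trans = {s: {} for s in states}

    def set_prob(self, src, dst, p):
        self.trans[src][dst] = p

    def next_state_dist(self, current):
        # Markov property: the future depends only on the current state
        return self.trans[current]

mm = MarkovModel(["N1", "N2", "N3"])
mm.set_prob("N1", "N2", 0.5)
mm.set_prob("N1", "N3", 0.5)
mm.set_prob("N2", "N1", 1.0)
mm.set_prob("N3", "N3", 1.0)
print(mm.next_state_dist("N1"))  # {'N2': 0.5, 'N3': 0.5}
```
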

  8. Problem with Markov Chains • The required structure of the MC may not be known at model construction time. • As the real world being modeled by the MC changes, so should the structure of the MC. • Not scalable – grows linearly with the number of events. • Markov Property • Our solution: • Extensible Markov Model (EMM) • Cluster real-world events • Allow the Markov chain to grow and shrink dynamically

  9. Objectives/Outline Develop modeling techniques which can “learn/forget” past behavior of spatiotemporal events. Apply to prediction of rare events. • Introduction • EMM Overview • EMM Applications to Rare Event Detection • Future Work

  10. Extensible Markov Model (EMM) • Time Varying Discrete First Order Markov Model • Nodes are clusters of real world states. • Learning continues during application phase. • Learning: • Transition probabilities between nodes • Node labels (centroid/medoid of cluster) • Nodes are added and removed as data arrives

  11. Related Work • Splitting Nodes in HMMs • Create new states by splitting an existing state • M. J. Black and Y. Yacoob, "Recognizing facial expressions in image sequences using local parameterized models of image motion," International Journal of Computer Vision, 25(1), 1997, 23-48. • Dynamic Markov Modeling • States and transitions are cloned • G. V. Cormack and R. N. S. Horspool, "Data compression using dynamic Markov modeling," The Computer Journal, Vol. 30, No. 6, 1987. • Augmented Markov Model (AMM) • Creates new states if the input data has never been seen in the model, and transition probabilities are adjusted • Dani Goldberg and Maja J. Mataric, "Coordinating mobile robot group behavior using a model of interaction dynamics," Proceedings of the Third International Conference on Autonomous Agents (Agents '99), Seattle, Washington.

  12. EMM vs AMM Our proposed EMM model is similar to AMM, but is more flexible: • EMM continues to learn during the application (prediction, etc.) phase. • The EMM is a generic incremental model whose nodes can have any kind of representatives. • State matching is determined using a clustering technique. • EMM not only allows the creation of new nodes, but deletion (or merging) of existing nodes. This allows the EMM model to “forget” old information which may not be relevant in the future. It also allows the EMM to adapt to any main memory constraints for large scale datasets. • EMM performs one scan of data and therefore is suitable for online data processing.

  13. EMM Extensible Markov Model (EMM): at any time t, the EMM consists of an MM and algorithms to modify it, where the algorithms include: • EMMSim, which defines a technique for matching between the input data at time t + 1 and the existing states in the MM at time t. • EMMBuild, which updates the MM at time t + 1 given the MM at time t and the classification measure result at time t + 1. Additional algorithms are used to modify the model or for applications.

  14. EMMBuild
  Input:
     Vt = <S1, S2, …, Sn>: observed values at n different locations at time t
     G: EMM with m states at time t-1
     Nc: current state at time t-1
  Output:
     G: EMM graph at time t
     Nc: current state at time t

  if G = empty then                    // initialize G; the first input vector is the first state
     N1 = Vt; CN1 = 0; Nc = N1;
  else                                 // update G as new input comes in
     foreach Ni in G determine EMMSim(Vt, Ni);
     let Nn be the node with the largest similarity value, sim;
     if sim >= threshold then          // update matching state information
        CNc = CNc + 1;
        if Lcn exists then
           CLcn = CLcn + 1;
        else
           create new transition Lcn = <Nc, Nn>; CLcn = 1;
        Nc = Nn;
     else                              // create a new state Nm+1 represented by Vt
        create new node Nm+1; Nm+1 = Vt; CNm+1 = 0;
        create new transition Lc(m+1) = <Nc, Nm+1>; CLc(m+1) = 1;
        CNc = CNc + 1;
        Nc = Nm+1;
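The EMMBuild pseudocode above can be sketched as runnable Python. The similarity function here (an inverse-distance measure) and the threshold value are illustrative stand-ins for EMMSim, not the measure actually used in the slides; node labels are kept as the first matching vector rather than a maintained centroid.

```python
import math

def similarity(v, w):
    # Hypothetical EMMSim stand-in: inverse of Euclidean distance.
    return 1.0 / (1.0 + math.dist(v, w))

class EMM:
    def __init__(self, threshold):
        self.threshold = threshold
        self.nodes = []          # cluster representatives (node labels)
        self.node_count = []     # CN_i: times node i was the current state
        self.link_count = {}     # CL_(i,j): times transition i -> j was taken
        self.current = None      # N_c

    def insert(self, v):
        if not self.nodes:                        # first input becomes first state
            self.nodes.append(list(v))
            self.node_count.append(0)
            self.current = 0
            return
        sims = [similarity(v, n) for n in self.nodes]
        best = max(range(len(sims)), key=lambda i: sims[i])
        if sims[best] >= self.threshold:          # match an existing state
            target = best
        else:                                     # create a new state for v
            self.nodes.append(list(v))
            self.node_count.append(0)
            target = len(self.nodes) - 1
        self.node_count[self.current] += 1        # CNc = CNc + 1
        key = (self.current, target)
        self.link_count[key] = self.link_count.get(key, 0) + 1
        self.current = target

emm = EMM(threshold=0.5)
for v in ([0.0, 0.0], [0.0, 0.1], [10.0, 10.0]):
    emm.insert(v)
print(len(emm.nodes), emm.current)  # 2 1
```

The second vector is close enough to the first state to match it; the third is far away, so a new node is created and the transition to it recorded.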

  15. EMMSim • Find the closest node to the incoming event. • If none is "close", create a new node. • The label of a cluster is the centroid/medoid of its members. • Problem: a linear scan of nodes is O(n). • BIRCH achieves O(lg n), but requires a second phase to recluster the initial clustering.

  16. EMMBuild example (figure): as the input vectors <14,8,2,3,0,0,0>, <14,8,2,3,1,0,0>, <16,9,2,3,1,0,0>, <17,10,2,3,1,0,0>, <18,10,3,3,1,0,0>, and <18,10,3,3,1,1,0> arrive, the EMM grows from a single node N1 to nodes N1, N2, and N3, with node and transition counts updated at each step.

  17. EMMDecrement example (figure): deleting node N2 removes the node and its incident arcs; transition counts among the remaining nodes (N1, N3, N5, N6) are adjusted accordingly.
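A minimal sketch of the deletion step, assuming (as the slide's figure suggests) that the victim node's incident arcs are simply dropped; the index shifting and any renormalization of the surviving counts are assumptions, not taken from the slides.

```python
def emm_delete(nodes, node_count, link_count, victim):
    """Delete state `victim` and every arc into or out of it.

    Mutates `nodes` and `node_count` in place and returns the surviving
    transition-count dictionary with indices shifted down past the victim.
    """
    nodes.pop(victim)
    node_count.pop(victim)
    survivors = {}
    for (i, j), c in link_count.items():
        if victim in (i, j):
            continue                      # drop arcs touching the victim
        i2 = i - 1 if i > victim else i   # re-index nodes above the victim
        j2 = j - 1 if j > victim else j
        survivors[(i2, j2)] = c
    return survivors

nodes = ["N1", "N2", "N3"]
counts = [1, 2, 3]
links = {(0, 1): 1, (1, 2): 2, (0, 2): 3}
links = emm_delete(nodes, counts, links, victim=1)
print(nodes, links)  # ['N1', 'N3'] {(0, 1): 3}
```
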

  18. EMM Advantages • Dynamic • Adaptable • Use of clustering • Learns rare events • Scalable: • Growth of EMM is not linear in the size of the data. • Hierarchical feature of EMM • Creation/evaluation in quasi-real time • Distributed/hierarchical extensions

  19. Growth of EMM – Servent Data

  20. EMM Performance – Growth Rate

  21. EMM Performance – Growth Rate Minnesota Traffic Data

  22. Error Rates • Normalized Absolute Ratio Error (NARE): NARE = ( Σt |at − pt| ) / Σt at, where at is the actual and pt the predicted value at time t. • Root Mean Square (RMS) error: RMS = sqrt( (1/n) Σt (at − pt)² )
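A common reading of these two measures (the slide leaves the formulas blank) is NARE = Σ|at − pt| / Σat and RMS = sqrt((1/n) Σ(at − pt)²); under that assumption they can be computed as:

```python
import math

def nare(actual, predicted):
    # Normalized Absolute Ratio Error: total absolute error over total actual.
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / sum(actual)

def rms(actual, predicted):
    # Root mean square error over n observations.
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

a = [10.0, 12.0, 8.0]   # actual values (illustrative data)
p = [9.0, 13.0, 8.0]    # predicted values
print(round(nare(a, p), 4))  # 0.0667
print(round(rms(a, p), 4))   # 0.8165
```
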

  23. EMM Performance - Prediction

  24. EMM Water Level Prediction – Ouse Data

  25. Objectives/Outline Develop modeling techniques which can “learn/forget” past behavior of spatiotemporal events. Apply to prediction of rare events. • Introduction • EMM Overview • EMM Applications to Rare Event Detection • Future Work

  26. Rare Event • Rare - Anomalous – Surprising • Out of the ordinary • Not outlier detection • No knowledge of data distribution • Data is not static • Must take temporal and spatial values into account • May be interested in sequence of events • Ex: Snow in upstate New York is not rare • Snow in upstate New York in June is rare • Rare events may change over time

  27. Rare Event Examples • The amount of traffic through a site in a particular time interval is extremely high or low. • The type of traffic (i.e., source IP addresses or destination addresses) is unusual. • Current traffic behavior is unusual based on recent previous traffic behavior. • Unusual behavior at several sites.

  28. What is a Rare Event? • Not an outlier • We don't know anything about the distribution of the data. Even if we did, the data continues to change; a model created from a static view may not fit tomorrow's data. • We view a rare event as: • An unusual state of the network (or a subset thereof). • A transition between network states which does not frequently occur. • Base rare event detection on determining events, or transitions between events, that do not frequently occur.

  29. Rare Event Examples – VoIP Traffic • The amount of traffic through a site in a particular time interval is extremely high or low. • The type of traffic (i.e., source IP addresses or destination addresses) is unusual. • Current traffic behavior is unusual based on recent previous traffic behavior. • Unusual behavior at several sites.

  30. Rare Event Detection Applications • Intrusion Detection • Fraud • Flooding • Unusual automobile/network traffic

  31. Rare Event Detection Techniques • Signature Based • Create signatures for normal behavior • Rule based • Pattern Matching • State Transition Analysis • Statistical Based • Profiles of normal behavior • Data Mining Based • Classification • Clustering

  32. EMM Rare Event Prediction – VoIP Traffic • Predict rare events at a specific site (switch) representing an area of the network. • Use: • Identify when a rare transition occurs • Identify a rare event by the creation of a new node • Hierarchical EMM: collect rare event information at a higher level by constructing an EMM of more global events from several sites.

  33. Our Approach • By learning what is normal, the model can predict what is not. • Normal is based on likelihood of occurrence. • Use EMM to build a model of behavior. • We view a rare event as: • An unusual event • A transition between event states which does not frequently occur. • Base rare event detection on determining events, or transitions between events, that do not frequently occur. • Continue learning

  34. EMMRare • EMMRare algorithm indicates if the current input event is rare. Using a threshold occurrence percentage, the input event is determined to be rare if either of the following occurs: • The frequency of the node at time t+1 is below this threshold • The updated transition probability of the MC transition from node at time t to the node at t+1 is below the threshold

  35. Determining Rare • Occurrence Frequency (OFc) of a node Nc, defined as: OFc = CNc / Σi CNi, i.e., the fraction of all processed events that matched Nc. • Likewise, when determining what is meant by small for a transition probability, we should look at a normalized rather than an actual value. We thus define the Normalized Transition Probability (NTPmn), from one state, Nm, to another, Nn, as: NTPmn = CLmn / Σi,j CLij.
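Using assumed definitions OFc = CNc / Σi CNi and NTPmn = CLmn / Σi,j CLij (the slide leaves the formulas blank), the EMMRare check from slide 34 reduces to two threshold comparisons. Everything below is a sketch over the node and transition counters CN and CL, not the authors' exact code.

```python
def is_rare(node_count, link_count, node, prev, threshold):
    # Flag the event as rare when either the node's occurrence frequency
    # or the normalized transition probability falls below `threshold`.
    total_events = sum(node_count)
    total_links = sum(link_count.values())
    of = node_count[node] / total_events if total_events else 0.0
    ntp = link_count.get((prev, node), 0) / total_links if total_links else 0.0
    return of < threshold or ntp < threshold

# Illustrative counters: node 1 has been seen once out of eleven matches.
node_count = [10, 1]
link_count = {(0, 0): 9, (0, 1): 1}
print(is_rare(node_count, link_count, node=1, prev=0, threshold=0.2))  # True
print(is_rare(node_count, link_count, node=0, prev=0, threshold=0.2))  # False
```
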

  36. Ongoing/Future Work • Extend to Emerging Patterns • Incorporate techniques to reduce False Alarms • Extend to Hierarchical/Distributed

  37. Conclusion We welcome feedback
