
Spatiotemporal Stream Mining Using EMM



Presentation Transcript


  1. Spatiotemporal Stream Mining Using EMM Margaret H. Dunham Southern Methodist University Dallas, Texas 75275 mhd@engr.smu.edu This material is based in part upon work supported by the National Science Foundation under Grant No. 9820841

  2. Completely Data Driven Model WARNING • No assumptions about data • We only know the general format of the data • THE DATA WILL TELL US WHAT THE MODEL SHOULD LOOK LIKE!

  3. Motivation 4/24/09 - KSU • A growing number of applications generate streams of data. • Computer network monitoring data • Call detail records in telecommunications (Cisco VoIP 2003) • Highway transportation traffic data (MnDot 2005) • Online web purchase log records (JCPenney 2003, Travelocity 2005) • Sensor network data (Ouse, Derwent 2002) • Stock exchange, transactions in retail chains, ATM operations in banks, credit card transactions.

  4. EMM Build • Figure: as the observation vectors <18,10,3,3,1,0,0>, <17,10,2,3,1,0,0>, <16,9,2,3,1,0,0>, <14,8,2,3,1,0,0>, <14,8,2,3,0,0,0>, <18,10,3,3,1,1,0>, … arrive, nodes N1, N2, N3 are created and the transition counts/probabilities on the arcs (1/1, 1/2, 1/3, 2/3, …) are updated.

  5. Spatiotemporal Stream Mining Using EMM • Spatiotemporal Stream Data • EMM vs MM vs other dynamic MM techniques • EMM Overview • EMM Applications

  6. Spatiotemporal Environment • Observations arriving in a stream • At any time, t, we can view the state of the problem as represented by a vector of n numeric values: Vt = <S1t, S2t, ..., Snt>

  7. Data Stream Modeling Requirements • Single pass: Each record is examined at most once • Bounded storage: Limited memory for storing the synopsis • Real-time: Per-record processing time must be low • Summarization (synopsis) of data • Use the data, NOT a SAMPLE • Temporal and Spatial • Dynamic • Continuous (infinite stream) • Learn • Forget • Sublinear growth rate – Clustering

  8. MM A first-order Markov Chain is a finite or countably infinite sequence of events {E1, E2, …} over discrete time points, where Pij = P(Ej | Ei), and at any time the future behavior of the process is based solely on the current state. A Markov Model (MM) is a graph with m vertices or states, S, and directed arcs, A, such that: • S = {N1, N2, …, Nm}, and • A = {Lij | i ∈ {1, 2, …, m}, j ∈ {1, 2, …, m}}, where each arc, Lij = <Ni, Nj>, is labeled with a transition probability Pij = P(Nj | Ni).
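The definition above can be sketched as a small count-based Markov chain; the class and variable names here are illustrative, not from the deck.

```python
# Minimal first-order Markov chain sketch: Pij = P(Nj | Ni) is estimated
# from counts of observed transitions between the m states.

class MarkovChain:
    def __init__(self, m):
        self.m = m
        self.counts = [[0] * m for _ in range(m)]  # counts[i][j] = times Ni -> Nj seen

    def observe(self, i, j):
        """Record one observed transition Ni -> Nj."""
        self.counts[i][j] += 1

    def p(self, i, j):
        """Estimated transition probability P(Nj | Ni)."""
        total = sum(self.counts[i])
        return self.counts[i][j] / total if total else 0.0

mc = MarkovChain(3)
for i, j in [(0, 1), (0, 1), (0, 2), (1, 2)]:
    mc.observe(i, j)
print(mc.p(0, 1))  # 2 of the 3 transitions out of N0 go to N1
```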

  9. Problem with Markov Chains • The required structure of the MC may not be certain at model construction time. • As the real world being modeled by the MC changes, so should the structure of the MC. • Not scalable – grows linearly with the number of events. • Our solution: • Extensible Markov Model (EMM) • Cluster real world events • Allow the Markov chain to grow and shrink dynamically

  10. Extensible Markov Model (EMM) • Time Varying Discrete First Order Markov Model • Nodes (Vertices) are clusters of real world observations. • Learning continues during application phase. • Learning: • Transition probabilities between nodes • Node labels (centroid/medoid of cluster) • Nodes are added and removed as data arrives
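One EMM Build step (slide 13 names the operation) can be sketched as follows. This is an assumption-laden illustration, not the authors' code: it uses Euclidean distance and a fixed clustering threshold, matches an incoming vector to the nearest node within the threshold (updating that node's centroid) or creates a new node, and increments the transition count from the previously matched node.

```python
import math

# Sketch of EMM Build: nodes are clusters labeled by centroids; `threshold`
# and the Euclidean metric are illustrative choices.

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class EMM:
    def __init__(self, threshold):
        self.threshold = threshold
        self.centroids = []     # node labels (cluster centroids)
        self.sizes = []         # observations absorbed per node
        self.trans = {}         # (i, j) -> transition count
        self.current = None     # node matched by the previous observation

    def build(self, v):
        # Find the nearest existing node.
        best, best_d = None, None
        for i, c in enumerate(self.centroids):
            d = dist(v, c)
            if best_d is None or d < best_d:
                best, best_d = i, d
        if best is None or best_d > self.threshold:
            # No node close enough: add a new one.
            self.centroids.append(list(v))
            self.sizes.append(1)
            best = len(self.centroids) - 1
        else:
            # Fold v into the matched node's centroid (incremental mean).
            n = self.sizes[best]
            self.centroids[best] = [(c * n + x) / (n + 1)
                                    for c, x in zip(self.centroids[best], v)]
            self.sizes[best] = n + 1
        if self.current is not None:
            key = (self.current, best)
            self.trans[key] = self.trans.get(key, 0) + 1
        self.current = best
        return best

# Feeding three of the slide-4 vectors: the first two cluster together,
# the third is far enough away to open a new node.
emm = EMM(threshold=3.0)
path = [emm.build(v) for v in ([18, 10, 3, 3, 1, 0, 0],
                               [17, 10, 2, 3, 1, 0, 0],
                               [14, 8, 2, 3, 0, 0, 0])]
print(path)  # [0, 0, 1]
```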

  11. Related Work • Splitting Nodes in HMMs • Create new states by splitting an existing state • M.J. Black and Y. Yacoob, “Recognizing facial expressions in image sequences using local parameterized models of image motion,” Int. Journal of Computer Vision, 25(1), 1997, 23-48. • Dynamic Markov Modeling • States and transitions are cloned • G. V. Cormack, R. N. S. Horspool, “Data compression using dynamic Markov Modeling,” The Computer Journal, Vol. 30, No. 6, 1987. • Augmented Markov Model (AMM) • Creates new states if the input data has never been seen in the model, and transition probabilities are adjusted • Dani Goldberg, Maja J Mataric, “Coordinating mobile robot group behavior using a model of interaction dynamics,” Proceedings of the Third International Conference on Autonomous Agents (Agents ’99), Seattle, Washington.

  12. EMM vs AMM Our proposed EMM model is similar to AMM, but is more flexible: • EMM continues to learn during the application phase. • The EMM is a generic incremental model whose nodes can have any kind of representatives. • State matching is determined using a clustering technique. • EMM not only allows the creation of new nodes, but deletion (or merging) of existing nodes. This allows the EMM model to “forget” old information which may not be relevant in the future. It also allows the EMM to adapt to any main memory constraints for large scale datasets. • EMM performs one scan of data and therefore is suitable for online data processing.

  13. EMM Operations • Input: EMM • Output: EMM’ • EMM Build – Modify/add nodes/arcs based on input observations • EMM Prune – Removes nodes/arcs • EMM Merge – Combine multiple EMM nodes • EMM Split – Split a node into multiple nodes • EMM Age – Modify relative weights of old versus new observations • EMM Combine – Merge multiple EMMs by merging specific states and transitions.
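Of the operations above, EMM Prune is the simplest to sketch: deleting a node removes every arc into or out of it. The dict-of-counts representation below is illustrative, not from the deck.

```python
# Sketch of EMM Prune on a count-based representation, where `trans`
# maps (i, j) -> number of observed Ni -> Nj transitions.

def prune(trans, node):
    """Return a new transition-count dict with `node` and all its arcs removed."""
    return {(i, j): c for (i, j), c in trans.items()
            if i != node and j != node}

trans = {(1, 2): 2, (2, 3): 1, (1, 3): 1, (3, 1): 2}
print(prune(trans, 2))  # every arc touching N2 is gone
```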

  14. Example from rEMM (R Package Available) Courtesy Mike Hahsler

  15. EMM Prune • Example: Delete N2 – the figure shows N2 and all of its incident arcs removed from an EMM over nodes N1, N2, N3, N5, N6, with the remaining arc labels (1/3, 1/6, 1/2, 2/2, …) shown before and after the deletion.

  16. Artificial Data • (Scatter plot of the artificial data set; x ranges from −0.2 to 1.0.)

  17. EMM Advantages • Dynamic • Adaptable • Use of clustering • Learns rare events • Sublinear growth rate • Creation/evaluation in quasi-real time • Distributed / Hierarchical extensions • Overlap learning and testing

  18. EMM Applications Predict – Forecast future state values. Evaluate (Score) – Assess degree of model compliance. Find the probability that a new observation belongs to the same class of data modeled by the given EMM. Analyze – Report model characteristics concerning EMM. Visualize – Draw graph Probe – Report specific detailed information about a state (if available)

  19. EMM Results • Predicting Flooding Ouse and Derwent – River flow data from England http://www.nercwallingford.ac.uk/ih/nrfa/index.html • Rare Event Detection VoIP Traffic Data obtained at Cisco Systems Minnesota Traffic Data • Classification DNA/RNA Sequence Analysis

  20. Derwent River (UK)

  21. Sublinear Growth Rate

  22. Prediction Error Rates • Normalized Absolute Ratio Error (NARE): NARE = Σt |At − Pt| / Σt At • Root Mean Square (RMS) error: RMS = √( (1/n) Σt (At − Pt)² ), where At is the actual and Pt the predicted value at time t.
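A quick sketch of the two metrics, assuming the common definitions (NARE as total absolute error normalized by total actual value; RMS as root-mean-square error); the variable names are illustrative.

```python
import math

# Prediction-error metrics from the slide, under the assumed definitions above.

def nare(actual, predicted):
    """Total absolute error divided by total actual value."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / sum(actual)

def rms(actual, predicted):
    """Root-mean-square error."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

actual = [2.0, 4.0, 4.0]
predicted = [1.0, 5.0, 4.0]
print(nare(actual, predicted))  # 0.2
print(rms(actual, predicted))   # ~0.816
```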

  23. EMM Performance – Prediction (Ouse)

  24. EMM Water Level Prediction – Ouse Data

  25. Rare Event • Rare - Anomalous – Surprising • Out of the ordinary • Not outlier detection • No knowledge of data distribution • Data is not static • Must take temporal and spatial values into account • May be interested in sequence of events • Ex: Snow in upstate New York is not rare • Snow in upstate New York in June is rare • Rare events may change over time

  26. Rare Event Examples • The amount of traffic through a site in a particular time interval is extremely high or low. • The type of traffic (i.e., source IP addresses or destination addresses) is unusual. • Current traffic behavior is unusual based on recent previous traffic behavior. • Unusual behavior at several sites.

  27. Rare Event Detection Applications • Intrusion Detection • Fraud • Flooding • Unusual automobile/network traffic

  28. Our Approach • By learning what is normal, the model can predict what is not • Normal is based on likelihood of occurrence • Use EMM to build a model of behavior • We view a rare event as: • An unusual event • A transition between event states which does not frequently occur • Base rare event detection on determining events, or transitions between events, that do not frequently occur • Continue learning

  29. EMMRare • EMMRare algorithm indicates if the current input event is rare. Using a threshold occurrence percentage, the input event is determined to be rare if either of the following occurs: • The frequency of the node at time t+1 is below this threshold • The updated transition probability of the MC transition from node at time t to the node at t+1 is below the threshold

  30. Determining Rare • Occurrence Frequency (OFc) of a node Nc: OFc = CNc / n, where CNc is the count of observations clustered into Nc and n is the total number of observations • Normalized Transition Probability (NTPmn), from one state, Nm, to another, Nn: NTPmn = CLmn / CNm, where CLmn is the count of the transition Lmn.

  31. EMMRare • Given: • Rule#1: CNi <= thCN • Rule#2: CLij <= thCL • Rule#3: OFc <= thOF • Rule#4: NTPmn <= thNTP • Input: Gt: EMM at time t; i: current state at time t; R = {R1, R2, …, RN}: a set of rules • Output: At: Boolean alarm at time t • Algorithm: At = 1 if a rule Ri evaluates to True, 0 otherwise
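The frequency- and probability-based rules can be sketched as below. The threshold values and function names are illustrative assumptions; the deck only specifies the rule forms.

```python
# Sketch of an EMMRare-style check: alarm when the matched node or the
# traversed transition is too infrequent. Thresholds (0.05) are made up.

def occurrence_frequency(cn_c, total_observations):
    """OFc: fraction of all observations that fell into node Nc."""
    return cn_c / total_observations

def normalized_transition_probability(cl_mn, cn_m):
    """NTPmn: fraction of departures from Nm that went to Nn."""
    return cl_mn / cn_m

def emm_rare(cn_c, cl_mn, cn_m, total, th_of=0.05, th_ntp=0.05):
    """Boolean alarm: True if either frequency rule fires."""
    return (occurrence_frequency(cn_c, total) <= th_of or
            normalized_transition_probability(cl_mn, cn_m) <= th_ntp)

# A node matched only 2 times in 1000 observations trips the OF rule.
print(emm_rare(cn_c=2, cl_mn=50, cn_m=100, total=1000))  # True
```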

  32. VoIP Traffic Data

  33. Rare Event in Cisco Data

  34. Temporal Heat Map • Also called Temporal Chaos Game Representation (TCGR) • Temporal Heat Map (THM) is a visualization technique for streaming data derived from multiple sensors. • It is a two-dimensional structure similar to an infinite table. • Each row of the table is associated with one sensor value. • Each column of the table is associated with a point in time. • Each cell within the THM is a color representation of the sensor value • Colors normalized (in our examples) • 0 – White • 0.5 – Blue • 1.0 – Red
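The deck anchors only three colors (0 → white, 0.5 → blue, 1.0 → red); linearly interpolating between those anchors, as sketched below, is an assumption, not something the slides state.

```python
# THM cell coloring sketch: piecewise-linear interpolation between the
# three anchor colors given on the slide. The interpolation scheme is assumed.

def thm_color(v):
    """Map a normalized sensor value in [0, 1] to an (R, G, B) tuple."""
    v = min(max(v, 0.0), 1.0)
    if v <= 0.5:
        t = v / 0.5                      # white (255,255,255) -> blue (0,0,255)
        return (round(255 * (1 - t)), round(255 * (1 - t)), 255)
    t = (v - 0.5) / 0.5                  # blue (0,0,255) -> red (255,0,0)
    return (round(255 * t), 0, round(255 * (1 - t)))

print(thm_color(0.0), thm_color(0.5), thm_color(1.0))
# (255, 255, 255) (0, 0, 255) (255, 0, 0)
```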

  35. Cisco – Internal VoIP Traffic Data • VoIP traffic data was provided by Cisco Systems and represents logged VoIP traffic in their Richardson, Texas facility from Mon Sep 22 12:17:32 2003 to Mon Nov 17 11:29:11 2003. • (Heat map of the complete stream, CiscoEMM.png: sensor values on the vertical axis, time on the horizontal axis.)

  36. Rare Event Detection • Minnesota DOT Traffic Data • Detected an unusual weekend traffic pattern (figure contrasts weekday and weekend panels).

  37. Moving Window TCGR Example • Sequence: acgtgcacgtaactgattccggaaccaaatgtgcccacgtcga • Counts (window length 9): Pos 0-8: A=2 C=3 G=3 T=1; Pos 1-9: A=1 C=3 G=3 T=2; …; Pos 34-42: A=2 C=4 G=2 T=1 • Normalized: Pos 0-8: 0.4 0.6 0.6 0.2; Pos 1-9: 0.2 0.6 0.6 0.4; …; Pos 34-42: 0.4 0.8 0.4 0.2
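The moving-window counts above can be reproduced with a few lines; the slide's normalized values appear to be the counts divided by 5, which is assumed here.

```python
# Moving-window nucleotide counts for the TCGR example (window length 9),
# reproducing the slide's count table for the given sequence.

def window_counts(seq, w=9):
    out = []
    for start in range(len(seq) - w + 1):
        win = seq[start:start + w]
        out.append({b: win.count(b) for b in "acgt"})
    return out

seq = "acgtgcacgtaactgattccggaaccaaatgtgcccacgtcga"
counts = window_counts(seq)
norm = [{b: c[b] / 5 for b in "acgt"} for c in counts]  # divisor 5 assumed
print(counts[0])  # {'a': 2, 'c': 3, 'g': 3, 't': 1}
print(norm[0])    # {'a': 0.4, 'c': 0.6, 'g': 0.6, 't': 0.2}
```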

  38. TCGR Example (cont’d) • TCGRs for Sub-patterns of length 1, 2, and 3

  39. TCGR Example (cont’d) • Window 0 (Pos 0-8): acgtgcacg • Window 1 (Pos 1-9): cgtgcacgt • Window 17 (Pos 17-25): tccggaacc • Window 18 (Pos 18-26): ccggaacca • Window 34 (Pos 34-42): ccacgtcga

  40. TCGR – Mature miRNA (Window=5; Pattern=3) • Panels: C. elegans, Homo sapiens, Mus musculus, All Mature • Highlighted length-3 patterns include ACG, CGC, GCG, UCG

  41. Research Approach • Represent potential miRNA sequence with TCGR sequence of count vectors • Create EMM using count vectors for known miRNA (miRNA stem loops, miRNA targets) • Predict unknown sequence to be miRNA (miRNA stem loop, miRNA target) based on normalized product of transition probabilities along clustering path in EMM

  42. Related Work 1 • Predicted occurrence of pre-miRNA segments from a set of hairpin sequences • No assumptions about biological function or conservation across species • Used SVMs to differentiate the structure of hairpin segments that contained pre-miRNAs from those that did not • Sensitivity of 93.3% • Specificity of 88.1% 1 C. Xue, F. Li, T. He, G. Liu, Y. Li, and X. Zhang, “Classification of Real and Pseudo MicroRNA Precursors using Local Structure-Sequence Features and Support Vector Machine,” BMC Bioinformatics, vol 6, no 310.

  43. Preliminary Test Data1 • Positive Training: This dataset consists of 163 human pre-miRNAs with lengths of 62-119. • Negative Training: This dataset was obtained from protein coding regions of human RefSeq genes. As these are from coding regions it is likely that there are no true pre-miRNAs in this data. This dataset contains 168 sequences with lengths between 63 and 110 characters. • Positive Test: This dataset contains 30 pre-miRNAs. • Negative Test: This dataset contains 1000 randomly chosen sequences from coding regions. 1 C. Xue, F. Li, T. He, G. Liu, Y. Li, and X. Zhang, “Classification of Real and Pseudo MicroRNA Precursors using Local Structure-Sequence Features and Support Vector Machine,” BMC Bioinformatics, vol 6, no 310.

  44. TCGRs for Xue Training Data

  45. TCGRs for Xue Test Data

  46. Thanks!

  47. References
Margaret H. Dunham, Nathaniel Ayewah, Zhigang Li, Kathryn Bean, and Jie Huang, “Spatiotemporal Prediction Using Data Mining Tools,” Chapter XI in Spatial Databases: Technologies, Techniques and Trends, Yannis Manolopoulos, Apostolos N. Papadopoulos, and Michael Gr. Vassilakopoulos, Editors, Idea Group Publishing, 2005, pp 251-271.
Margaret H. Dunham, Yu Meng, and Jie Huang, “Extensible Markov Model,” Proceedings IEEE ICDM Conference, November 2004, pp 371-374.
Yu Meng, Margaret Dunham, Marco Marchetti, and Jie Huang, “Rare Event Detection in a Spatiotemporal Environment,” Proceedings of the IEEE Conference on Granular Computing, May 2006, pp 629-634.
Yu Meng and Margaret H. Dunham, “Online Mining of Risk Level of Traffic Anomalies with User's Feedbacks,” Proceedings of the IEEE Conference on Granular Computing, May 2006, pp 176-181.
Yu Meng and Margaret H. Dunham, “Mining Developing Trends of Dynamic Spatiotemporal Data Streams,” Journal of Computers, Vol 1, No 3, June 2006, pp 43-50.
Charlie Isaksson, Yu Meng, and Margaret H. Dunham, “Risk Leveling of Network Traffic Anomalies,” International Journal of Computer Science and Network Security, Vol 6, No 6, June 2006, pp 258-265.
Margaret H. Dunham, Donya Quick, Yuhang Wang, Monnie McGee, and Jim Waddle, “Visualization of DNA/RNA Structure using Temporal CGRs,” Proceedings of the IEEE 6th Symposium on Bioinformatics & Bioengineering (BIBE06), October 16-18, 2006, Washington D.C., pp 171-178.
Charlie Isaksson and Margaret H. Dunham, “A Comparative Study of Outlier Detection,” accepted to appear, LDM conference, 2009.
