
Learning Markov-chain embedded recurrence relations for chaotic time series analysis








  1. Learning Markov-chain embedded recurrence relations for chaotic time series analysis Jiann-Ming Wu, Ya-Ting Zhou, Chun-Chang Wu National Dong Hwa University Department of Applied Mathematics Hualien, Taiwan

  2. Outline • Introduction • High-order Markov processes for stochastic modeling • Nonlinear recurrence relations for deterministic modeling • Recurrence relation approximation by supervised learning of radial or projective basis functions • Markov-chain embedded recurrence relations • Numerical Simulations • Conclusions

  3. High-order Markov assumption • Let Z[t] denote a time series, where t ranges over the positive integers • High-order Markov assumption: the given chaotic time series originates from a generative source well characterized by a high-order Markov process • A high-order Markov process obeys a memory-less property: the current event depends only on the most recent events, not on the entire history
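The memory-less assumption above can be sketched in code: under a Markov process of a given order, each value is paired with only its most recent predecessors. A minimal delay-embedding helper (the function name `embed` and the toy series are illustrative, not from the slides):

```python
import numpy as np

def embed(z, order):
    """Form (predictor, target) pairs under a high-order Markov assumption:
    each target z[t] is paired only with its `order` most recent predecessors."""
    X = np.array([z[t - order : t] for t in range(order, len(z))])
    y = np.array(z[order:])
    return X, y

# Toy series: each predictor row holds the 3 values preceding its target.
z = np.arange(10, dtype=float)
X, y = embed(z, order=3)
```

For `order=3` the first predictor row is `[0, 1, 2]` and its target is `3`: the target depends on the three most recent values only.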

  4. Recurrence relation • The conditional expectation of an upcoming event given the most recent events is expressed by a recurrence relation

  5. Recurrence relation for time series modeling: the upcoming value serves as the target and the most recent values serve as the predictor

  6. Chaotic time series • Mackey-Glass (delay 30) chaotic time series data • Laser data (10,000 points) from the SFI competition
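A Mackey-Glass series like the one named above can be generated numerically. The sketch below uses the standard parameter choices a = 0.2, b = 0.1, exponent 10, and a coarse Euler step of 1 time unit — these are common defaults, assumed here rather than taken from the slides:

```python
import numpy as np

def mackey_glass(n, tau=30, a=0.2, b=0.1, dt=1.0, x0=1.2):
    """Euler integration of the Mackey-Glass delay equation
    dx/dt = a*x(t-tau)/(1 + x(t-tau)**10) - b*x(t),
    with a constant history x0 for t <= 0."""
    hist = [x0] * (tau + 1)
    for _ in range(n):
        x, x_tau = hist[-1], hist[-(tau + 1)]   # current and delayed values
        hist.append(x + dt * (a * x_tau / (1.0 + x_tau**10) - b * x))
    return np.array(hist[tau + 1:])

series = mackey_glass(500, tau=30)
```

Delay 30 yields the chaotic regime referenced on the slide; delay 17 (slide 15) is generated the same way with `tau=17`.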

  7. RECURRENCE RELATION APPROXIMATION • Learning neural networks to approximate the underlying recurrence relation • F denotes a mapping realized by radial or projective basis functions with adaptive network parameters

  8. Recurrence relation approximation • Form paired predictor and target data • Define the mean square error of the approximation • Apply Levenberg-Marquardt learning to resolve the unconstrained optimization • Apply the proposed pair-data generative model to formulate F
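The steps above can be sketched end to end. As a simplification, the fit below solves only for the output weights of Gaussian radial basis functions by linear least squares with fixed centers — the slides' Levenberg-Marquardt learning would also adapt centers and widths; the sine series, center choice, and width are illustrative assumptions:

```python
import numpy as np

def fit_rbf(X, y, centers, width):
    """Least-squares fit of output weights for Gaussian radial basis
    functions with fixed centers (a stand-in for full LM training)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    Phi = np.exp(-d2 / (2.0 * width**2))        # design matrix of basis responses
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # minimizes mean square error
    return w

def predict_rbf(X, centers, width, w):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width**2)) @ w

# Step 1: form paired predictors (3 recent values) and targets from a series.
z = np.sin(0.3 * np.arange(200))
X = np.array([z[t - 3 : t] for t in range(3, len(z))])
y = z[3:]

# Steps 2-3: fit by least squares and measure the mean square error.
centers = X[::20]                    # a subset of training points as centers
w = fit_rbf(X, y, centers, width=1.0)
mse = np.mean((predict_rbf(X, centers, 1.0, w) - y) ** 2)
```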

  9. Pair-data generative model (PGM) with K sub-models

  10. Mixtures of paired Gaussians • A stochastic model for emulating the formation of the given paired data • At each step, one of the joined Gaussian pairs is selected according to a set of prior probabilities • The selected paired Gaussians then generate a data pair
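The selection-then-generation process just described can be sketched directly: draw a sub-model index from the priors, then sample the pair from that sub-model's Gaussians. The two-component parameter values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_pgm(n, priors, mx, my, sx, sy):
    """Sample n (x, y) pairs from a mixture of paired Gaussians: a sub-model
    k is drawn from the priors, then x ~ N(mx[k], sx[k]^2), y ~ N(my[k], sy[k]^2)."""
    ks = rng.choice(len(priors), size=n, p=priors)   # prior-driven selection
    x = rng.normal(np.take(mx, ks), np.take(sx, ks))
    y = rng.normal(np.take(my, ks), np.take(sy, ks))
    return x, y, ks

x, y, ks = sample_pgm(1000, priors=[0.3, 0.7],
                      mx=[-2.0, 2.0], my=[1.0, -1.0],
                      sx=[0.3, 0.3], sy=[0.3, 0.3])
```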

  11. Exclusive memberships • Each pair is generated by exactly one sub-model • The exclusive membership of a pair is denoted by a unitary vector with the i-th bit active • Under exclusive membership, the conditional expectation of y given x is defined by r, the local mean of the target variable

  12. Overlapping memberships • A Potts random variable is applied to encode overlapping membership • The probability of being in the k-th state depends on the distance to the local mean of the predictor, with a parameter that modulates the degree of overlap

  13. Normalized radial basis functions (NRBF) • The conditional expectation exactly sketches a mapping realized by normalized radial basis functions
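An NRBF mapping of this kind can be sketched as a distance-weighted average of local target means, with the weights normalized to sum to one. The centers, means, and sharpness parameter `beta` below are illustrative assumptions:

```python
import numpy as np

def nrbf(x, centers, local_means, beta):
    """Normalized radial basis functions: the conditional expectation of the
    target is a normalized Gaussian-weighted average of local target means,
    weighted by squared distance to the predictor-side centers."""
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-beta * d2)
    w /= w.sum(axis=1, keepdims=True)   # normalization step of NRBF
    return w @ local_means

centers = np.array([[0.0], [1.0]])
means = np.array([10.0, 20.0])          # local means of the target variable
yhat = nrbf(np.array([[0.0], [1.0], [0.5]]), centers, means, beta=4.0)
```

A query equidistant from both centers (0.5 here) receives equal weights, so its conditional expectation is the plain average of the two local means.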

  14. Figure 4

  15. Mackey-Glass (delay 17) chaotic time series data (Figure 9)

  16. Multiple recurrence relations • Multiple recurrence relations for modeling more complex chaotic time series • Chaotic time series: laser data (10,000 points) from the SFI competition

  17. Markov-chain embedded recurrence relations • A Markov chain of PGMs (pair-data generative models) • Each entry of the transition matrix denotes the probability of a transition from model i to model j

  18. Data generation • Emulate data generation by a stochastic Markov chain of PGMs
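The Markov-chain generation step can be sketched as follows: a hidden state follows the transition matrix, and the active state's recurrence relation produces the next value. The two example regimes (a logistic map and a contracting linear map) and the transition probabilities are illustrative assumptions, not the slides' models:

```python
import numpy as np

rng = np.random.default_rng(1)

def generate(n, P, relations, z0):
    """Emulate data generation by a Markov chain of recurrence relations:
    the hidden state follows transition matrix P, and the active state's
    relation maps the previous value to the next one."""
    z, state, states = [z0], 0, [0]
    for _ in range(n - 1):
        state = rng.choice(len(P), p=P[state])   # transition between sub-models
        z.append(relations[state](z[-1]))         # active recurrence relation
        states.append(state)
    return np.array(z), np.array(states)

P = np.array([[0.95, 0.05],
              [0.05, 0.95]])                      # mostly stays in one regime
relations = [lambda x: 3.8 * x * (1 - x),         # chaotic logistic-map regime
             lambda x: 0.5 * x + 0.25]            # contracting linear regime
z, states = generate(2000, P, relations, z0=0.4)
```

The high diagonal of `P` produces long same-regime segments — the phase structure that the segmentation step (slide 20) is designed to detect.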

  19. Inverse problem of Markov chain embedded PGMs

  20. Segmentation for phase change • A time tag is regarded as a switching point if its moving-average error is greater than a threshold value
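The switching-point rule can be sketched directly: smooth the prediction errors with a moving average and flag the time tags where it exceeds the threshold. The window size, threshold, and synthetic error burst below are illustrative assumptions:

```python
import numpy as np

def switching_points(errors, window, threshold):
    """Flag time tags whose moving-average prediction error exceeds a
    threshold, marking candidate phase-change (switching) points."""
    kernel = np.ones(window) / window
    ma = np.convolve(errors, kernel, mode="valid")   # moving-average error
    # Report the last index of each offending window as the time tag.
    return np.nonzero(ma > threshold)[0] + window - 1

# Small errors, then a burst where the active recurrence relation changes.
errors = np.concatenate([np.full(50, 0.01), np.full(10, 0.5), np.full(50, 0.01)])
pts = switching_points(errors, window=5, threshold=0.1)
```

On this synthetic error trace the flagged tags cluster around the burst starting at index 50, where the regime changes.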

  21. A simple rule for merging two PGMs • The goodness of fitting the i-th PGM to the paired data in Sj is defined by Ei,j • Si and Sj are regarded as coming from the same PGM, and the two PGMs are merged, if (Ei,j + Ej,i)/2 is less than a threshold value
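The symmetric merge criterion can be sketched as follows, taking each PGM's conditional-expectation map as a callable and the mean square error as the goodness-of-fit measure; the linear example segments are illustrative assumptions:

```python
import numpy as np

def cross_fit_error(pgm, S):
    """Mean square error of one PGM's conditional-expectation map on another
    segment's (x, y) pairs -- the goodness-of-fit E_ij."""
    x, y = S
    return float(np.mean((pgm(x) - y) ** 2))

def should_merge(pgm_i, pgm_j, S_i, S_j, threshold):
    """Slide's rule: S_i and S_j come from the same PGM if the symmetric
    average (E_ij + E_ji) / 2 falls below the threshold."""
    e_ij = cross_fit_error(pgm_i, S_j)
    e_ji = cross_fit_error(pgm_j, S_i)
    return (e_ij + e_ji) / 2.0 < threshold

x = np.linspace(0, 1, 50)
S_a = (x, 2 * x)                 # both segments follow roughly y = 2x
S_b = (x, 2 * x + 0.01)
same_map = lambda v: 2 * v
```

Two segments governed by the same map merge; a segment fit by a very different map (e.g. `lambda v: -2 * v`) does not.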

  22. NUMERICAL SIMULATIONS – Synthetic data

  23. Temporal sequence generated by MC-embedded PGMs

  24. Numerical results – original and reconstructed MC-embedded PGMs

  25. Learning with M = 60 and [K, , , N0] = [10, 10, 0.001, 500] • Chaotic time series: laser data (10,000 points) from the SFI competition • Markov-chain embedded recurrence relations • Generated chaotic time series

  26. Conclusions • This work has presented learning Markov-chain embedded recurrence relations for complex time series analysis • Levenberg-Marquardt supervised learning of neural networks has been shown to be effective for extracting the essential recurrence relation underlying a given time series • Markov-chain embedded recurrence relations are shown to be applicable for characterizing complex chaotic time series • The proposed systematic approach integrates pattern segmentation, hidden-state absorption, and transition-probability estimation based on supervised learning of neural networks
