
Neural Computation and Applications in Time Series and Signal Processing


Presentation Transcript


  1. Neural Computation and Applications in Time Series and Signal Processing
  Georg Dorffner, Dept. of Medical Cybernetics and Artificial Intelligence, University of Vienna, and Austrian Research Institute for Artificial Intelligence

  2. Neural Computation
  • Originally biologically motivated (information processing in the brain)
  • Simple mathematical model of the neuron → neural network
  • Large number of simple "units"
  • Massively parallel (in theory)
  • Complexity through the interplay of many simple elements
  • Strong relationship to methods from statistics
  • Suitable for pattern recognition

  3. A Unit (Neuron)
  [Figure: a unit with inputs x_i, weights w_i, net input, transfer function f, and activation/output y_j]
  • Propagation rule (net input):
  • Weighted sum: net_j = Σ_i w_ij x_i
  • Euclidean distance: net_j = ||x − w_j||
  • Transfer function f:
  • Threshold function (McCulloch & Pitts)
  • Linear function
  • Sigmoid function
  • Gaussian function
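A minimal sketch of such a unit, assuming Python/NumPy (the function names are illustrative, not from the slides): a weighted-sum propagation rule followed by a sigmoid transfer function.

```python
import numpy as np

def sigmoid(net):
    """Sigmoid transfer function."""
    return 1.0 / (1.0 + np.exp(-net))

def unit_output(x, w, b, transfer=sigmoid):
    """One unit: weighted-sum net input, then a transfer function."""
    net = np.dot(w, x) + b          # propagation rule: weighted sum (plus bias)
    return transfer(net)            # activation / output y_j

# Example: a unit with 3 inputs and arbitrary weights
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.1, 0.4, -0.3])
print(unit_output(x, w, b=0.2))
```

Swapping the weighted sum for a Euclidean distance and the sigmoid for a Gaussian gives the RBF-style unit listed above.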

  4. Multilayer Perceptron (MLP), Radial Basis Function Network (RBFN)
  • 2 (or more) layers (= sets of connections): input units, hidden units (typically nonlinear), output units (typically linear)
  • MLP: hidden units apply a sigmoid to a weighted sum of the inputs
  • RBFN: hidden units apply a Gaussian to the distance between the input and a centre
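A sketch of the two forward passes, assuming NumPy and illustrative parameter names (weight matrices, biases, centres, and widths are assumptions, not from the slides):

```python
import numpy as np

def mlp_forward(x, W_hid, b_hid, W_out, b_out):
    """MLP: sigmoid hidden layer (weighted sums), linear output layer."""
    hidden = 1.0 / (1.0 + np.exp(-(W_hid @ x + b_hid)))
    return W_out @ hidden + b_out

def rbfn_forward(x, centres, widths, W_out, b_out):
    """RBFN: Gaussian hidden layer (distances to centres), linear output layer."""
    dists = np.linalg.norm(centres - x, axis=1)
    hidden = np.exp(-(dists ** 2) / (2.0 * widths ** 2))
    return W_out @ hidden + b_out
```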

  5. MLP as Universal Function Approximator
  • E.g.: 1 input, 1 output, 5 hidden units
  • An MLP can approximate arbitrary functions (Hornik et al. 1990)
  • through superposition of weighted sigmoids: the bias moves each sigmoid, the weights stretch and mirror it (see the sketch below)
  • A similar result holds for RBFNs
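A small illustration of that superposition idea, assuming NumPy (the particular weight values are arbitrary, chosen only to show the move/stretch/mirror roles):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 5 hidden sigmoids: input weight stretches/mirrors, bias moves, output weight scales
hidden_w = np.array([ 4.0, -3.0,  6.0,  2.0, -5.0])   # stretch / mirror
hidden_b = np.array([-2.0,  1.0,  0.0,  3.0, -1.0])   # move along the x-axis
out_w    = np.array([ 1.5, -0.8,  0.5,  1.0,  0.7])   # weight each sigmoid

x = np.linspace(-3, 3, 200)
y = sum(v * sigmoid(w * x + b) for v, w, b in zip(out_w, hidden_w, hidden_b))
# y is a flexible 1-D function built purely from shifted, scaled sigmoids
```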

  6. Training (Model Estimation)
  • Iterative optimisation based on the gradient (gradient descent, conjugate gradient, quasi-Newton)
  • Typical error function: the summed squared error E = ½ Σ_p Σ_j (t_pj − y_pj)², summed over all patterns p and all outputs j, with targets t
  • "Backpropagation" (application of the chain rule): the gradient of E w.r.t. each weight factors into the contribution of the error function and the contribution of the network
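A minimal gradient-descent/backpropagation sketch for a 1-hidden-layer MLP, assuming NumPy; the summed-squared-error objective matches the slide, but the function name, learning rate, and initialisation are illustrative assumptions.

```python
import numpy as np

def train_mlp(X, T, n_hidden=5, lr=0.01, epochs=1000, seed=0):
    """Gradient descent with backpropagation for an MLP with sigmoid hidden
    units and linear outputs, minimising the summed squared error."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], T.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, n_out)); b2 = np.zeros(n_out)
    for _ in range(epochs):
        H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))   # hidden activations
        Y = H @ W2 + b2                            # linear outputs
        dY = Y - T                                 # error-function contribution
        dH = (dY @ W2.T) * H * (1 - H)             # chain rule through the sigmoid
        W2 -= lr * H.T @ dY;  b2 -= lr * dY.sum(0)
        W1 -= lr * X.T @ dH;  b1 -= lr * dH.sum(0)
    return W1, b1, W2, b2
```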

  7. Recurrent Perceptrons
  • Recurrent connection = feedback loop
  • From the hidden layer ("Elman") or the output layer ("Jordan"), copied back into a state or context layer at the input
  • Learning: "backpropagation through time"
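A forward-pass sketch of the Elman variant, assuming NumPy (weight names and the tanh transfer function are illustrative assumptions):

```python
import numpy as np

def elman_forward(x_seq, W_in, W_ctx, b_hid, W_out, b_out):
    """Elman network: hidden activations are copied into a context layer
    and fed back as additional input at the next time step."""
    context = np.zeros(W_ctx.shape[0])
    outputs = []
    for x in x_seq:
        hidden = np.tanh(W_in.T @ x + W_ctx.T @ context + b_hid)
        outputs.append(W_out.T @ hidden + b_out)
        context = hidden.copy()      # the "copy" step: state for the next input
    return np.array(outputs)
```

A Jordan network would instead copy the output layer back into the context layer.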

  8. Time Series Processing
  • Given: time-dependent observables
  • Scalar: univariate; vector: multivariate
  • Signals (milliseconds to seconds), time series (minutes to days)
  • Typical tasks: pattern recognition, modeling, forecasting, noise modeling, filtering, source separation

  9. Examples
  [Figures: Standard & Poor's index, preprocessed as returns; sunspot numbers, preprocessed (de-seasoned)]

  10. Autoregressive Models
  • Forecasting: making use of past information to predict (estimate) the future
  • AR: past information = past observations: x_t = f(x_{t-1}, …, x_{t-p}) + ε_t, where ε_t is noise ("random shock")
  • Best forecast: the expected value x̂_t = E[x_t | x_{t-1}, …, x_{t-p}]

  11. Linear AR Models
  • Most common case: x_t = a_0 + Σ_i a_i x_{t-i} + ε_t
  • Simplest form: the random walk, x_t = x_{t-1} + ε_t
  • For a random walk, a nontrivial forecast is impossible
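A least-squares fit of a linear AR(p) model, as a NumPy sketch (the function name and the random-walk example are assumptions added for illustration):

```python
import numpy as np

def fit_linear_ar(x, p):
    """Least-squares fit of x_t = a_0 + a_1 x_{t-1} + ... + a_p x_{t-p} + noise."""
    rows = [x[t - p:t][::-1] for t in range(p, len(x))]    # lagged observations
    X = np.hstack([np.ones((len(rows), 1)), np.array(rows)])
    y = x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs                                          # [a_0, a_1, ..., a_p]

# Example: a random walk should give a_1 close to 1 and the other coefficients near 0
rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=2000))
print(fit_linear_ar(walk, p=2))
```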

  12. MLP as NAR
  • A neural network can approximate a nonlinear AR (NAR) model: x̂_t = MLP(x_{t-1}, …, x_{t-p})
  • The past p observations are presented as a "time window" or "time delay" input
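Building the time-window inputs is a simple reshaping step; a NumPy sketch (helper name assumed):

```python
import numpy as np

def make_time_windows(x, p):
    """Turn a univariate series into (input, target) pairs for an MLP-as-NAR:
    inputs are windows of the p past values, targets are the next value."""
    X = np.array([x[t - p:t] for t in range(p, len(x))])
    y = np.array(x[p:])
    return X, y

# X and y can then be fed to any MLP trainer, e.g. the gradient-descent sketch above.
```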

  13. Noise Modeling
  • Regression is density estimation of p(x_t | x_{t-1}, …, x_{t-p}), a distribution whose expected value is F(x_{t-1}, …, x_{t-p}) (Bishop 1995)
  • Target = the future value, conditioned on the past
  • Likelihood: L = Π_t p(x_t | x_{t-1}, …, x_{t-p})

  14. Gaussian Noise
  • Likelihood: L = Π_t (2πσ²)^(-1/2) exp(−(x_t − F(x_{t-1}, …, x_{t-p}))² / 2σ²)
  • Maximization = minimization of −log L (constant terms can be dropped, incl. p(x))
  • This corresponds to the summed squared error (typical backpropagation)
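A short NumPy sketch making that correspondence explicit (function name assumed): with fixed variance, the Gaussian negative log-likelihood is the summed squared error up to constants.

```python
import numpy as np

def gaussian_neg_log_likelihood(targets, predictions, sigma=1.0):
    """Negative log-likelihood under Gaussian noise with fixed variance:
    up to additive constants, this is the summed squared error / (2*sigma^2)."""
    resid = targets - predictions
    n = len(targets)
    return (0.5 * np.sum(resid ** 2) / sigma ** 2
            + 0.5 * n * np.log(2 * np.pi * sigma ** 2))
```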

  15. Complex Noise Models
  • Assumption: an arbitrary distribution D with probability density p_D
  • Its parameters θ_t are time dependent (dependent on the past): θ_t = F(x_{t-1}, …, x_{t-p})
  • Likelihood: L = Π_t p_D(x_t; θ_t)

  16. Heteroskedastic Time Series
  • Assumption: noise is Gaussian with time-dependent variance σ_t²
  • ARCH model: σ_t² = a_0 + Σ_i a_i ε_{t-i}²
  • An MLP predicting σ_t² is a nonlinear ARCH model (when applied to returns/residuals)

  17. Non-Gaussian Noise
  • Other parametric pdfs (e.g. the t-distribution)
  • Mixture of Gaussians (mixture density network, Bishop 1994)
  • Network with 3k outputs for k mixture components (means, variances, mixing coefficients), or separate networks (see the sketch below)
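A sketch of how 3k raw network outputs can be mapped to valid mixture parameters, assuming NumPy (the transformations shown — softmax for the mixing coefficients, exponential for the widths — follow Bishop's mixture density network, but the helper names are assumptions):

```python
import numpy as np

def mdn_parameters(raw_outputs):
    """Turn 3k raw network outputs into mixture-of-Gaussians parameters:
    k mixing coefficients (softmax), k means (unchanged), k std devs (exp)."""
    a, mu, s = np.split(raw_outputs, 3)
    mix = np.exp(a - a.max()); mix /= mix.sum()   # softmax: positive, sums to 1
    sigma = np.exp(s)                             # positive standard deviations
    return mix, mu, sigma

def mdn_density(x, mix, mu, sigma):
    """Predictive density p(x) of the resulting Gaussian mixture."""
    comp = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return np.sum(mix * comp)
```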

  18. Identifiability Problem
  • Mixture models (like neural networks) are not identifiable (their parameters cannot be interpreted)
  • There is no clear distinction between model and noise, e.g. for the sunspot data
  • Models therefore have to be treated with care

  19. Recurrent Networks: Moving Average
  • Second model class: moving average (MA) models, x_t = ε_t + Σ_i b_i ε_{t-i}
  • Past information: the random shocks ε_{t-i}
  • A recurrent (Jordan) network realises a nonlinear MA model
  • However, convergence is not guaranteed

  20. GARCH
  • Extension of ARCH: σ_t² = a_0 + Σ_i a_i ε_{t-i}² + Σ_j b_j σ_{t-j}²
  • Explains "volatility clustering"
  • A neural network can again be a nonlinear version
  • Using the past estimates σ_{t-j}² requires a recurrent network (see the sketch below)
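A GARCH(1,1) variance recursion as a NumPy sketch (the initialisation with the sample variance and the function name are assumptions):

```python
import numpy as np

def garch_11_variance(residuals, a0, a1, b1):
    """Conditional variance of a GARCH(1,1) model:
    sigma_t^2 = a0 + a1 * eps_{t-1}^2 + b1 * sigma_{t-1}^2."""
    sigma2 = np.empty_like(residuals)
    sigma2[0] = np.var(residuals)               # initialise with the sample variance
    for t in range(1, len(residuals)):
        sigma2[t] = a0 + a1 * residuals[t - 1] ** 2 + b1 * sigma2[t - 1]
    return sigma2
```

A neural network version would replace the linear recursion by a (recurrent) network fed with past squared residuals and past variance estimates.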

  21. State Space Models
  • Observables depend on a (hidden) time-variant state: s_t = g(s_{t-1}) + ε_t, x_t = h(s_t) + ν_t
  • Strong relationship to recurrent (Elman) networks
  • A nonlinear version requires additional hidden layers

  22. Symbolic Time Series
  • Observables are symbols from a finite alphabet
  • Examples: DNA, text, quantised time series (e.g. "up" and "down")
  • Past information: the past p symbols → a probability distribution over the next symbol
  • Markov chains
  • Problem: long substrings are rare (see the sketch below)
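A sketch of estimating such a Markov chain from counts, in plain Python (function name and the toy "up"/"down" sequence are assumptions); with growing order p, most contexts occur rarely or never, which is exactly the sparsity problem the slide mentions.

```python
from collections import Counter, defaultdict

def markov_chain(symbols, p=1):
    """Estimate an order-p Markov chain: conditional probabilities of the
    next symbol given the past p symbols, from relative frequencies."""
    counts = defaultdict(Counter)
    for t in range(p, len(symbols)):
        context = tuple(symbols[t - p:t])
        counts[context][symbols[t]] += 1
    return {ctx: {s: n / sum(c.values()) for s, n in c.items()}
            for ctx, c in counts.items()}

# Example: a quantised series of "u" (up) and "d" (down)
print(markov_chain(list("uudduuduudd"), p=2))
```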

  23. Fractal Prediction Machines
  • Similar subsequences are mapped to points that lie close together in space
  • Clustering = extraction of a stochastic automaton

  24. Relationship to Recurrent Networks
  • Corresponds to a network of 2nd order

  25. Other Topics
  • Filtering: corresponds to ARMA models; neural networks as nonlinear filters
  • Source separation: independent component analysis
  • Relationship to stochastic automata

  26. Practical Considerations
  • Stationarity is an important issue
  • Preprocessing (trends, seasonalities)
  • N-fold cross-validation done time-wise (each validation set must come after its training set, with the test set held out at the end; see the sketch below)
  • Mean and standard deviation across folds → model selection
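A sketch of such time-wise splits in plain Python (function name, block sizes, and the test fraction are assumptions made for illustration):

```python
def timewise_folds(n, n_folds, test_fraction=0.2):
    """Time-wise cross-validation: each validation block follows its training
    block, and a final test block is held out at the end of the series."""
    n_test = int(n * test_fraction)
    usable = n - n_test
    block = usable // (n_folds + 1)
    folds = []
    for k in range(1, n_folds + 1):
        train_idx = range(0, k * block)              # everything up to the fold
        valid_idx = range(k * block, (k + 1) * block)  # the block right after it
        folds.append((train_idx, valid_idx))
    test_idx = range(usable, n)
    return folds, test_idx
```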

  27. Summary
  • Neural networks are powerful semi-parametric models for nonlinear dependencies
  • They can be considered nonlinear extensions of classical time series and signal processing techniques
  • Applying semi-parametric models to noise modeling adds another interesting facet
  • Models must be treated with care, and much data is necessary
