
  1. HUMAN AND SYSTEMS ENGINEERING: Gentle Introduction to Particle Filtering. Sanjay Patil (1) and Ryan Irwin (2); Graduate research assistant (1), REU undergrad (2); Human and Systems Engineering. URL: www.isip.msstate.edu/publications/seminars/msstate/2005/particle/

  2. Abstract • Particle Filtering: • Most conventional techniques for speech analysis model signals as Gaussian mixture models within hidden Markov model (HMM) based systems. • To cope with mismatched channel conditions and/or to significantly reduce model complexity, nonlinear approaches are expected to perform better than the conventional techniques. • Particle filters, based on sequential Monte Carlo methods, are one such nonlinear method. • Particle filtering allows a complete representation of the posterior distribution of the states, so statistical estimates can be computed easily even in the presence of nonlinearities.

  3. Outline of Presentation • Nonlinear methods – why they are necessary • Drawing samples from a probability distribution (introducing the 'particle') • Sequential Monte Carlo methods – necessity and different names: bootstrap filter, condensation algorithm, survival of the fittest • Steps in particle filtering (explaining the algorithm with a block schematic) • Worked example (with all the steps) • Brief review and applications for tracking • Application to speaker verification • Demo

  4. Drawing samples from a probability distribution function • Concept of samples and their weights • Take p(x) = Gamma(4,1) • Generate some random samples • Plot a basic approximation to the pdf from those samples • Each sample is called a 'particle' • [Figure: histogram approximations of the pdf with 200, 500, and 5000 samples; a code sketch of the same demonstration follows below]
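The slide's figures did not survive extraction; here is a minimal Python sketch of the same demonstration (sample from Gamma(4,1), then approximate the pdf with a normalized histogram). The sample sizes follow the slide; everything else is illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gamma

rng = np.random.default_rng(0)

fig, axes = plt.subplots(1, 3, figsize=(12, 3))
x = np.linspace(0, 15, 300)
for ax, n in zip(axes, (200, 500, 5000)):
    samples = rng.gamma(shape=4.0, scale=1.0, size=n)   # each sample is a 'particle'
    ax.hist(samples, bins=30, density=True, alpha=0.6)  # histogram approximation to the pdf
    ax.plot(x, gamma.pdf(x, a=4.0, scale=1.0), 'r')     # true Gamma(4,1) density
    ax.set_title(f"{n} samples")
plt.tight_layout()
plt.show()
```

As the slide's figures showed, the histogram approximation tightens around the true density as the number of particles grows.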

  5. Particle filtering • Different names for essentially the same idea: sequential Monte Carlo filters, bootstrap filters, condensation algorithm, survival of the fittest • General problem statement – filtering is estimation of the states • Tracking the state (parameters or hidden variables) as it evolves over time • Observations arrive sequentially and may be noisy and non-Gaussian • The idea is to maintain the best possible estimate of the hidden variables • The state-space form of this problem is written out below.
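The slide does not spell out the model, but the standard state-space formulation used throughout the particle-filtering literature (e.g. the Arulampalam et al. tutorial in the references) is:

```latex
\begin{aligned}
x_k &= f_k(x_{k-1}, v_{k-1}) \qquad &&\text{(state / process equation)} \\
y_k &= h_k(x_k, n_k)         \qquad &&\text{(measurement equation)}
\end{aligned}
```

where x_k is the hidden state, y_k the observation, v_{k-1} and n_k the process and measurement noise, and the goal of filtering is to estimate p(x_k | y_{1:k}) as each new observation arrives.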

  6. Particle filtering algorithm, continued… General two-stage framework (prediction and update stages) • Assume that the pdf p(x_{k-1} | y_{1:k-1}) is available at time k-1. • Prediction stage: this yields the prior of the state at time k (before the new measurement is seen), i.e. the probability of the state given only the previous measurements. • Update stage: this yields the posterior pdf from the predicted prior pdf and the newly available measurement. • The two equations are written out below.
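The slide's equations were lost in extraction; the standard forms of the two stages (Chapman-Kolmogorov prediction and Bayes update) are:

```latex
\text{Prediction:}\quad
p(x_k \mid y_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid y_{1:k-1})\, dx_{k-1}

\text{Update:}\quad
p(x_k \mid y_{1:k}) = \frac{p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})}
                           {\int p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})\, dx_k}
```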

  7. Particle filtering algorithm step-by-step (1)

  8. Particle filtering step-by-step (2)

  9. Particle filtering step-by-step (3)

  10. Particle filtering step-by-step (4)

  11. Particle filtering step-by-step (5)

  12. Particle filtering step-by-step (6)

  13. Particle filtering – visualization • Drawing samples • Predicting the next state • Updating that state • What is the missing step? • Resampling… (a code sketch of the full loop follows below)
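The slides contain no code, so here is a minimal Python sketch of a bootstrap (SIR) particle filter that follows the steps listed above. The scalar model, noise variances, and particle count are illustrative assumptions, not taken from the presentation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative scalar state-space model (an assumption, not from the slides):
#   x_k = 0.5*x_{k-1} + 25*x_{k-1}/(1 + x_{k-1}^2) + v_k,  v_k ~ N(0, 10)
#   y_k = x_k^2 / 20 + n_k,                                n_k ~ N(0, 1)
def f(x):
    return 0.5 * x + 25.0 * x / (1.0 + x**2)

def h(x):
    return x**2 / 20.0

def bootstrap_particle_filter(ys, n_particles=500, q=10.0, r=1.0):
    particles = rng.normal(0.0, np.sqrt(q), n_particles)       # initial particle set
    estimates = []
    for y in ys:
        # 1. Draw samples / predict: propagate particles through the state equation
        particles = f(particles) + rng.normal(0.0, np.sqrt(q), n_particles)
        # 2. Update: weight each particle by the measurement likelihood p(y | x)
        weights = np.exp(-0.5 * (y - h(particles))**2 / r) + 1e-300
        weights /= weights.sum()
        # 3. Estimate: posterior mean is the weighted average of the particles
        estimates.append(np.sum(weights * particles))
        # 4. Resample: draw particles with replacement according to their weights
        particles = particles[rng.choice(n_particles, n_particles, p=weights)]
    return np.array(estimates)

# Simulate the model and track it with the filter
xs, ys, x = [], [], 0.0
for _ in range(50):
    x = f(x) + rng.normal(0.0, np.sqrt(10.0))
    xs.append(x)
    ys.append(h(x) + rng.normal(0.0, 1.0))

x_hat = bootstrap_particle_filter(ys)
print("RMSE:", np.sqrt(np.mean((np.array(xs) - x_hat)**2)))
```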

  14. Sampling Importance Resampling (SIR) algorithm – why it is necessary (see the sketch below)
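The necessity of resampling comes from weight degeneracy: after a few iterations most particles carry negligible weight. This motivation is standard (see the Arulampalam et al. tutorial in the references). A common remedy, sketched here in Python with illustrative names, is to monitor the effective sample size and resample when it drops below a threshold:

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2); small values indicate weight degeneracy."""
    return 1.0 / np.sum(weights**2)

def systematic_resample(particles, weights, rng):
    """Systematic resampling: one uniform draw, then evenly spaced pointers."""
    n = len(particles)
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                      # guard against round-off
    positions = (rng.random() + np.arange(n)) / n
    return particles[np.searchsorted(cumulative, positions)]

# Typical usage inside the filtering loop (the N/2 threshold is a common
# heuristic, not something the slides specify):
#   if effective_sample_size(weights) < n_particles / 2:
#       particles = systematic_resample(particles, weights, rng)
#       weights = np.full(n_particles, 1.0 / n_particles)
```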

  15. Applications: • Most applications are tracking problems in one form or another… • Visual tracking – e.g. human motion (body parts) • Prediction of (financial) time series – e.g. gold prices, stocks • Quality control in the semiconductor industry • Military applications • Target recognition from single or multiple images • Guidance of missiles • Applications in the IES NSF-funded project: • Time-series estimation for speech signals (Java demo) • Speaker verification (details on the next slide)

  16. Pattern Recognition Applet • Java applet that gives a visual demonstration of algorithms implemented at IES • Classification of signals: • PCA – Principal Component Analysis • LDA – Linear Discriminant Analysis • SVM – Support Vector Machines • RVM – Relevance Vector Machines • Tracking of signals: • LP – Linear Prediction • KF – Kalman Filtering • PF – Particle Filtering

  17. Pattern Classification • Different data sets need to be told apart without inspecting every data sample • Classification distinguishes between sets of data without enumerating the samples • The algorithms separate the data sets with a line of discrimination • For zero error, the line of discrimination must completely separate the classes • These patterns are easy to classify (see the sketch below)
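As a concrete illustration of a "line of discrimination", here is a generic Fisher linear discriminant on made-up Gaussian classes; this is not the applet's implementation or data.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two illustrative, well-separated Gaussian classes
a = rng.normal([0.0, 0.0], 1.0, (100, 2))
b = rng.normal([4.0, 3.0], 1.0, (100, 2))

# Fisher linear discriminant: w = Sw^{-1} (mu_b - mu_a)
sw = np.cov(a.T) + np.cov(b.T)                     # within-class scatter
w = np.linalg.solve(sw, b.mean(0) - a.mean(0))     # normal to the line of discrimination
threshold = w @ (a.mean(0) + b.mean(0)) / 2.0      # decision threshold at the midpoint

errors = np.sum(a @ w > threshold) + np.sum(b @ w < threshold)
print("misclassified points:", errors)             # ~0 for well-separated classes
```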

  18. Pattern Classification • Toroidal (ring-shaped) data sets are not classified successfully by a straight line • The error is around 50% because any straight line cuts off roughly half of each class • A proper discriminator for a toroidal data set is a circle enclosing only the inner set

  19. Signal Tracking • The input signals are now time-based, with the x-axis representing time • All the signal-tracking algorithms are run on interpolated data • Interpolation ensures that the input samples lie at regular intervals, since sampling is always done at regular intervals • Linear prediction estimates each sample as a linear combination of previous samples and assumes a noise-free signal (a sketch follows below)
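A minimal sketch of linear prediction, with coefficients fit by least squares on a made-up noiseless sinusoid; the order, signal, and function names are illustrative assumptions, not the applet's code.

```python
import numpy as np

def lp_coefficients(signal, order):
    """Fit linear-prediction coefficients a by least squares so that
    s[n] ~= a[0]*s[n-1] + a[1]*s[n-2] + ... + a[order-1]*s[n-order]."""
    rows = np.array([signal[i:i + order][::-1] for i in range(len(signal) - order)])
    targets = signal[order:]
    a, *_ = np.linalg.lstsq(rows, targets, rcond=None)
    return a

# Illustrative noiseless sinusoid sampled at regular intervals
n = np.arange(200)
s = np.sin(0.07 * n)

a = lp_coefficients(s, order=2)        # order-2 LP is exact for a single sinusoid
predicted = a @ s[-2:][::-1]           # predict the next sample, at n = 200
print("predicted:", predicted, " true:", np.sin(0.07 * 200))
```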

  20. Signal Tracking • The Kalman filter and the particle filter are based on predicting the states of the signal • The states evolve according to the state equation and are related to the observations through the measurement equation • The particle-filtering algorithm introduces process and measurement noise • At each iteration the possible states are shown as black points • The average of the black points is the predicted overall state (a minimal Kalman-filter sketch follows below)
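For comparison with the particle-filter sketch above, here is a minimal scalar Kalman filter under an assumed linear-Gaussian model; all parameters and the example signal are illustrative, not the applet's.

```python
import numpy as np

def scalar_kalman_filter(ys, a=1.0, h=1.0, q=0.1, r=1.0, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter for x_k = a*x_{k-1} + v_k, y_k = h*x_k + n_k,
    with process-noise variance q and measurement-noise variance r."""
    x, p = x0, p0
    estimates = []
    for y in ys:
        # Predict the state and its variance
        x = a * x
        p = a * p * a + q
        # Update with the new measurement
        k = p * h / (h * p * h + r)    # Kalman gain
        x = x + k * (y - h * x)
        p = (1.0 - k * h) * p
        estimates.append(x)
    return np.array(estimates)

# Example: track a slowly drifting value observed in noise
rng = np.random.default_rng(2)
truth = np.cumsum(rng.normal(0.0, np.sqrt(0.1), 100))
obs = truth + rng.normal(0.0, 1.0, 100)
print("RMSE:", np.sqrt(np.mean((truth - scalar_kalman_filter(obs))**2)))
```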

  21. References: • S. Haykin and E. Moulines, "From Kalman to Particle Filters," IEEE International Conference on Acoustics, Speech, and Signal Processing, Philadelphia, Pennsylvania, USA, March 2005. • M.W. Andrews, "Learning and Inference in Nonlinear State-Space Models," Gatsby Unit for Computational Neuroscience, University College London, U.K., December 2004. • P.M. Djuric, J.H. Kotecha, J. Zhang, Y. Huang, T. Ghirmai, M. Bugallo, and J. Miguez, "Particle Filtering," IEEE Signal Processing Magazine, vol. 20, no. 5, pp. 19-38, September 2003. • M.S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174-188, February 2002. • R. van der Merwe, N. de Freitas, A. Doucet, and E. Wan, "The Unscented Particle Filter," Technical Report CUED/F-INFENG/TR 380, Cambridge University Engineering Department, Cambridge, U.K., August 2000. • S. Gannot and M. Moonen, "On the Application of the Unscented Kalman Filter to Speech Processing," International Workshop on Acoustic Echo and Noise Control, Kyoto, Japan, pp. 27-30, September 2003. • J.P. Norton and G.V. Veres, "Improvement of the Particle Filter by Better Choice of the Predicted Sample Set," 15th IFAC Triennial World Congress, Barcelona, Spain, July 2002. • J. Vermaak, C. Andrieu, A. Doucet, and S.J. Godsill, "Particle Methods for Bayesian Modeling and Enhancement of Speech Signals," IEEE Transactions on Speech and Audio Processing, vol. 10, no. 3, pp. 173-185, March 2002.
