
Convergence of Sequential Monte Carlo Methods



Presentation Transcript


  1. Convergence of Sequential Monte Carlo Methods. Dan Crisan, Arnaud Doucet

  2. Problem Statement • X: signal process, Y: observation process • X evolves according to a state (transition) equation • Y is linked to X through an observation equation (a standard concrete form is sketched below)
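The equations themselves appeared as images on the original slide. A common discrete-time form of such a signal/observation model, written here only as an illustration (the functions f, h, the initial law pi_0, and the noise sequences V_t, W_t are assumed notation, not taken from the slide), is:

    \[
    X_t = f(X_{t-1}, V_t), \qquad Y_t = h(X_t, W_t), \qquad t \ge 1,
    \]

where X_0 is drawn from pi_0 and (V_t), (W_t) are mutually independent noise sequences.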

  3. Bayes’ recursion • Prediction: propagate the filtering distribution through the signal transition kernel • Updating: correct the predicted distribution with the new observation via Bayes’ rule
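In the illustrative notation above, writing p(x_t | x_{t-1}) for the transition density and g(y_t | x_t) for the observation likelihood (both assumed here, since the slide's formulas were images), the two steps of the recursion read:

    \[
    \text{Prediction:}\quad \pi_{t\mid t-1}(x_t) = \int p(x_t \mid x_{t-1})\, \pi_{t-1\mid t-1}(x_{t-1})\, dx_{t-1},
    \]
    \[
    \text{Updating:}\quad \pi_{t\mid t}(x_t) = \frac{g(y_t \mid x_t)\, \pi_{t\mid t-1}(x_t)}{\int g(y_t \mid x)\, \pi_{t\mid t-1}(x)\, dx}.
    \]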

  4. A Sequential Monte Carlo Method • Empirical measure of the particle system • Transition kernel of the signal • Importance distribution, assumed absolutely continuous with respect to the reference measure, with a strictly positive Radon–Nikodym derivative • Consequently, absolute continuity also holds between the target and the importance distribution, so the importance weights are well defined
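For concreteness, the empirical measure associated with particles x_t^(1), ..., x_t^(N) (particle notation assumed here) is the usual average of Dirac masses:

    \[
    \pi^{N}_{t\mid t}(dx) = \frac{1}{N} \sum_{i=1}^{N} \delta_{x_t^{(i)}}(dx).
    \]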

  5. Algorithm • Step 1: Sequential importance sampling • Sample each new particle from the importance distribution • Evaluate the normalized importance weights and form the weighted empirical measure
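A standard form of this step, assuming an importance density q, transition density p, and likelihood g (notation not taken from the slide), is:

    \[
    \tilde{x}_t^{(i)} \sim q\big(\cdot \mid x_{t-1}^{(i)}, y_t\big),
    \qquad
    w_t^{(i)} \propto \frac{g\big(y_t \mid \tilde{x}_t^{(i)}\big)\, p\big(\tilde{x}_t^{(i)} \mid x_{t-1}^{(i)}\big)}{q\big(\tilde{x}_t^{(i)} \mid x_{t-1}^{(i)}, y_t\big)},
    \qquad
    \sum_{i=1}^{N} w_t^{(i)} = 1.
    \]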

  6. Step 2: Selection step • Multiply/discard particles with high/low importance weights to obtain N equally weighted particles, and form the associated empirical measure • Step 3: MCMC step • Move each selected particle by sampling from a Markov kernel K whose invariant distribution is the current target, and form the final empirical measure from the moved particles
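The three steps fit together as in the minimal runnable sketch below. The toy 1-D linear-Gaussian model, the bootstrap importance distribution, multinomial resampling, and the random-walk Metropolis move are all illustrative choices assumed here; the slides do not specify them.

    """Sketch of the SMC algorithm of slides 5-6 (sequential importance
    sampling + selection + MCMC move) on a toy 1-D linear-Gaussian model."""
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy state-space model: X_t = a X_{t-1} + V_t,  Y_t = X_t + W_t
    a, sig_v, sig_w = 0.9, 1.0, 0.5

    def log_trans(x_new, x_old):   # log p(x_t | x_{t-1}), up to a constant
        return -0.5 * ((x_new - a * x_old) / sig_v) ** 2

    def log_lik(y, x):             # log g(y_t | x_t), up to a constant
        return -0.5 * ((y - x) / sig_w) ** 2

    def smc_step(x_prev, y, n_mcmc=1, step=0.5):
        N = len(x_prev)
        # Step 1: sequential importance sampling (bootstrap proposal q = p,
        # so the normalized weights reduce to the likelihood g(y_t | x_t)).
        x = a * x_prev + sig_v * rng.standard_normal(N)
        logw = log_lik(y, x)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Step 2: selection (multinomial resampling of particle indices).
        idx = rng.choice(N, size=N, p=w)
        x, parents = x[idx], x_prev[idx]
        # Step 3: MCMC step.  Random-walk Metropolis on x_t with the
        # resampled parent held fixed; the target proportional to
        # g(y_t | x_t) p(x_t | parent) leaves the filter approximation invariant.
        for _ in range(n_mcmc):
            prop = x + step * rng.standard_normal(N)
            log_ratio = (log_lik(y, prop) + log_trans(prop, parents)
                         - log_lik(y, x) - log_trans(x, parents))
            accept = np.log(rng.random(N)) < log_ratio
            x = np.where(accept, prop, x)
        return x

    # Usage: filter a simulated trajectory with N = 1000 particles.
    T, N = 50, 1000
    x_true, ys = 0.0, []
    for _ in range(T):
        x_true = a * x_true + sig_v * rng.standard_normal()
        ys.append(x_true + sig_w * rng.standard_normal())

    particles = rng.standard_normal(N)     # draw from the prior pi_0
    for y in ys:
        particles = smc_step(particles, y)

    print("filtering mean at final time:", particles.mean())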

  7. Convergence Study • Notation: the empirical measures produced by the algorithm are compared with the true posterior distributions • First, prove convergence to 0 of the average mean square error under quite general conditions • Then prove (almost sure) convergence of the empirical measures toward the true posterior under more restrictive conditions
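In symbols, writing pi^N_{t|t} for the empirical measure and pi_{t|t} for the true filtering distribution (notation assumed here), the two results take the form

    \[
    \lim_{N\to\infty} \mathbb{E}\Big[\big(\pi^{N}_{t\mid t}(f) - \pi_{t\mid t}(f)\big)^{2}\Big] = 0
    \qquad\text{and}\qquad
    \pi^{N}_{t\mid t}(f) \xrightarrow[N\to\infty]{} \pi_{t\mid t}(f) \quad \text{a.s.}
    \]

for bounded test functions f.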

  8. Bounds for mean square errors • Assumptions • 1.-A Importance distribution and weights • The importance distribution is assumed absolutely continuous with respect to the reference measure for all values of its arguments, and the importance weight is assumed to be a bounded function of its arguments; these quantities define the constants used in the bounds

  9. • There exists a constant such that the importance weights are bounded above, uniformly over all admissible measures and arguments • There also exist constants under which the sampling kernel and the importance weights depend continuously on the measure variable

  10. 2.-A Resampling/Selection scheme
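The condition itself was displayed as an equation on the original slide. A standard way to state this kind of requirement, with hat-pi^N the weighted measure before selection and bar-pi^N the resampled measure (notation assumed here), is:

    \[
    \mathbb{E}\Big[\big(\bar{\pi}^{N}_t(f) - \hat{\pi}^{N}_t(f)\big)^{2} \,\Big|\, \hat{\pi}^{N}_t\Big]
    \le \frac{C\,\|f\|^{2}}{N}
    \]

for every bounded test function f, with C independent of N.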

  11. The first assumption ensures that • the importance function is chosen so that the corresponding importance weights are bounded above, and • the sampling kernel and importance weights depend “continuously” on the measure variable. • The second assumption ensures that • the selection scheme does not introduce too strong a “discrepancy”.

  12. Lemma 1 • Assume that, for any bounded test function, the mean square error bound holds for the particle approximation at the previous time; then, after Step 1, the corresponding bound holds for any bounded test function. • Lemma 2 • Assume that the bound holds for any bounded test function; then it continues to hold, for any bounded test function, at the next stage of the recursion.

  13. Lemma 3 • Assume that the bound holds for any bounded test function; then, after Step 2, it holds for any bounded test function. • Lemma 4 • Assume that the bound holds for any bounded test function; then it continues to hold, for any bounded test function, at the final stage of the time step.

  14. Theorem 1 • For all times t, there exists a constant independent of the number of particles N such that, for any bounded test function, the mean square error between the particle approximation and the true posterior is bounded by that constant times the squared norm of the test function, divided by N.
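A conventional way to write this conclusion, with c_t the time-dependent constant, pi^N_{t|t} the empirical measure after the full time step, and pi_{t|t} the true filtering distribution (notation assumed here), is:

    \[
    \mathbb{E}\Big[\big(\pi^{N}_{t\mid t}(f) - \pi_{t\mid t}(f)\big)^{2}\Big] \le \frac{c_t\,\|f\|^{2}}{N}
    \]

for every bounded test function f.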
