
Introduction to Markov Chain Monte Carlo



Presentation Transcript


  1. Introduction to Markov Chain Monte Carlo • Jeongkyun Lee

  2. Contents • Usage • Why MCMC is called MCMC • MCMC methods • Appendix • Reference

  3. Usage • Goal: 1) estimate an unknown target distribution (or posterior) of a complex function, or 2) draw samples from that distribution. • Simulation • Draw samples from a probability distribution governed by a system. • Integration / computing • Integrate or compute a high-dimensional function. • Optimization / Bayesian inference • Ex. simulated annealing, MCMC-based particle filter • Learning • MLE learning, unsupervised learning

  4. Why MCMC is called MCMC • Markov Chain • Markov process • For a random variable X_t at time t, the transition probabilities between states depend only on the random variable’s current state: P(X_{t+1} = y | X_t = x, X_{t-1}, …, X_0) = P(X_{t+1} = y | X_t = x)
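The Markov property above can be sketched numerically: each transition is drawn using only the current state and a fixed transition matrix. The two-state chain and its probabilities below are illustrative assumptions, not from the slides.

```python
import numpy as np

# Minimal sketch of a Markov chain: the next state depends only on the
# current state, via a fixed transition matrix P (each row sums to 1).
P = np.array([[0.9, 0.1],   # transition probabilities from state 0
              [0.5, 0.5]])  # transition probabilities from state 1

rng = np.random.default_rng(0)
state = 0
visits = np.zeros(2)
for _ in range(100_000):
    state = rng.choice(2, p=P[state])  # transition uses only the current state
    visits[state] += 1

empirical = visits / visits.sum()
# The stationary distribution pi solves pi = pi @ P; for this P it is
# (5/6, 1/6), and the empirical visit frequencies approach it.
print(empirical)
```

Running the chain long enough makes the visit frequencies converge to the stationary distribution, which is the property MCMC methods exploit.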

  5. Why MCMC is called MCMC • Monte Carlo integration • To compute a complex integral, use random number generation (sampling) to approximate it. • Ex. estimate the value of π.
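The π example can be sketched as follows: sample points uniformly in the unit square; the fraction landing inside the quarter circle approximates π/4. Sample size and seed are arbitrary choices.

```python
import numpy as np

# Monte Carlo estimate of pi: the area of the quarter circle of radius 1
# is pi/4, so the hit fraction times 4 approximates pi.
rng = np.random.default_rng(42)
n = 1_000_000
x, y = rng.random(n), rng.random(n)
inside = (x**2 + y**2) <= 1.0
pi_estimate = 4.0 * inside.mean()
print(pi_estimate)  # close to 3.14159 for large n
```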

  6. Why MCMC is called MCMC • Markov Chain Monte Carlo • Construct a Markov Chain representing a target distribution. • http://www.kev-smith.com/tutorial/flash/markov_chain.swf …

  7. MCMC Methods • Metropolis / Metropolis-Hastings algorithms • Draw samples from a distribution p(θ) = f(θ)/K, where K is a (possibly unknown) normalizing constant. • http://www.kev-smith.com/tutorial/flash/MH.swf • 1) Start with an initial value θ_0 satisfying f(θ_0) > 0. • 2) Repeat N times: sample a candidate value θ* from a proposal distribution q(θ* | θ_{t-1}). • 3) Given the candidate θ*, calculate the acceptance probability α = min( f(θ*) q(θ_{t-1} | θ*) / [ f(θ_{t-1}) q(θ* | θ_{t-1}) ], 1 ). • 4) With the probability α, accept the move (θ_t = θ*) or reject it (θ_t = θ_{t-1}). • Metropolis: q is symmetric, so α = min( f(θ*) / f(θ_{t-1}), 1 ); Metropolis-Hastings: q is not necessarily symmetric. • α : the probability of a move
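The steps above can be sketched as a minimal Metropolis sampler (symmetric Gaussian proposal, so the q-ratio cancels). The unnormalized target f(x) = exp(-x²/2), proposal width, and chain length are illustrative assumptions.

```python
import numpy as np

def f(x):
    return np.exp(-0.5 * x * x)  # unnormalized standard normal target

rng = np.random.default_rng(1)
n_samples, x = 50_000, 0.0       # initial value with f(x) > 0
chain = np.empty(n_samples)
for t in range(n_samples):
    candidate = x + rng.normal(scale=1.0)   # sample from symmetric q(.|x)
    alpha = min(f(candidate) / f(x), 1.0)   # Metropolis acceptance probability
    if rng.random() < alpha:                # accept the move with prob alpha,
        x = candidate                       # otherwise keep the current state
    chain[t] = x

samples = chain[5_000:]  # discard an initial burn-in portion
print(samples.mean(), samples.std())  # near 0 and 1 for a standard normal
```

Note that only ratios of f appear, so the normalizing constant K never needs to be computed.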

  8. MCMC Methods • Metropolis / Metropolis-Hastings algorithms • Iterate N times. • Burn-in period: the period during which the chain approaches its stationary distribution. • Use only the samples drawn after the burn-in period, avoiding an approximation biased by the starting position. • http://www.kev-smith.com/tutorial/flash/burnin.swf
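The burn-in effect can be illustrated by starting a Metropolis chain far from the mode of the target; the deliberately bad start value and chain length below are assumptions for demonstration. Log densities are used to avoid numerical underflow far from the mode.

```python
import numpy as np

def logf(x):
    return -0.5 * x * x  # log of the unnormalized target exp(-x^2/2)

rng = np.random.default_rng(2)
x, chain = 100.0, []     # deliberately poor starting position
for _ in range(20_000):
    candidate = x + rng.normal()
    # accept if log(u) < logf(candidate) - logf(x), i.e. u < f(cand)/f(x)
    if np.log(rng.random()) < logf(candidate) - logf(x):
        x = candidate
    chain.append(x)
chain = np.array(chain)

print(chain.mean())          # biased toward the starting value
print(chain[1_000:].mean())  # after discarding burn-in, near the true mean 0
```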

  9. MCMC Methods • Gibbs Sampling • A special case of the MH algorithm (every proposed move is accepted, α = 1). • Draw samples for the random variables sequentially from univariate conditional distributions, i.e. the value of the i-th variable x_i is drawn from the distribution p(x_i | x_{-i}), where x_{-i} represents the values of all variables except the i-th variable.
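A minimal Gibbs-sampling sketch for a bivariate normal with correlation ρ, where each variable is drawn in turn from its univariate conditional x | y ~ N(ρy, 1-ρ²) and y | x ~ N(ρx, 1-ρ²). The value of ρ and the chain length are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
rho, n = 0.8, 50_000
cond_sd = np.sqrt(1.0 - rho**2)  # std. dev. of each conditional

x = y = 0.0
samples = np.empty((n, 2))
for t in range(n):
    x = rng.normal(rho * y, cond_sd)  # draw x from p(x | y)
    y = rng.normal(rho * x, cond_sd)  # draw y from p(y | x)
    samples[t] = x, y

print(np.corrcoef(samples.T)[0, 1])  # close to rho = 0.8
```

No acceptance step appears: each conditional draw is accepted by construction, which is the sense in which Gibbs is an MH special case.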

  10. MCMC Methods • Reversible Jump (or trans-dimensional) MCMC • Used when the dimension of the state is changed between moves. • Additionally consider a move type when proposing and accepting.

  11. Appendix • Markov Chain properties • Stationary distribution: satisfies detailed balance, π(x) P(x → y) = π(y) P(y → x) • Irreducible: every state is reachable from every other (all π_i > 0) • Aperiodic: the chain does not cycle with a fixed period

  12. Appendix • MH sampling as a Markov Chain • The transition probability kernel in the MH algorithm is K(θ_{t-1}, θ_t) = q(θ_t | θ_{t-1}) α(θ_{t-1}, θ_t) for θ_t ≠ θ_{t-1}. Thus, if the MH kernel satisfies detailed balance, p(θ_{t-1}) K(θ_{t-1}, θ_t) = p(θ_t) K(θ_t, θ_{t-1}), then the stationary distribution of this kernel corresponds to draws from the target distribution.
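This claim can be checked numerically on a small discrete state space: build the Metropolis kernel K for an unnormalized target and verify both detailed balance p_i K_ij = p_j K_ji and stationarity p K = p. The 3-state target and uniform proposal are illustrative assumptions.

```python
import numpy as np

f = np.array([2.0, 3.0, 5.0])   # unnormalized target weights
p = f / f.sum()                 # normalized target distribution
n = len(p)
q = np.full((n, n), 1.0 / n)    # symmetric (uniform) proposal

K = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            # propose j from state i, then accept with prob min(f_j/f_i, 1)
            K[i, j] = q[i, j] * min(f[j] / f[i], 1.0)
    K[i, i] = 1.0 - K[i].sum()  # rejected proposals stay at the current state

balance = p[:, None] * K        # matrix of p_i * K_ij
print(np.allclose(balance, balance.T))  # detailed balance holds -> True
print(np.allclose(p @ K, p))            # hence p is stationary -> True
```

Detailed balance plus rows of K summing to 1 gives (pK)_j = Σ_i p_i K_ij = Σ_i p_j K_ji = p_j, which is the argument the slide sketches.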

  13. Appendix • MH sampling as a Markov Chain

  14. Reference • http://vcla.stat.ucla.edu/old/MCMC/MCMC_tutorial.htm • http://www.kev-smith.com/tutorial/rjmcmc.php • http://www.cs.bris.ac.uk/~damen/MCMCTutorial.htm • B. Walsh, “Markov Chain Monte Carlo and Gibbs Sampling”, Lecture Notes, MIT, 2004

  15. Thank you!
