
Exploring the connection between sampling problems in Bayesian inference and statistical mechanics




  1. Exploring the connection between sampling problems in Bayesian inference and statistical mechanics. Andrew Pohorille, NASA-Ames Research Center

  2. Outline • Enhanced sampling of pdfs: flat histograms (multicanonical method, Wang-Landau, transition probability method), parallel tempering • Dynamical systems • Stochastic kinetics

  3. Enhanced sampling techniques

  4. Preliminaries. Define variables x and N, a function U(x, N) (an energy), and a probability in which energies are Boltzmann-distributed: p(x, N) ∝ exp[−βU(x, N)], with β = 1/kT. Marginalizing x gives the partition function Q(β, N) = Σx exp[−βU(x, N)] and the "free energy" or thermodynamic potential F(N) = −(1/β) ln Q(β, N).
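These definitions are easy to check on a toy discrete system; a minimal sketch, with a made-up two-level spectrum (the energies and β below are illustrative, not from the talk):

```python
import math

def partition_function(energies, beta):
    """Q(beta) = sum over states of exp(-beta * U)."""
    return sum(math.exp(-beta * u) for u in energies)

def free_energy(energies, beta):
    """The "free energy" or thermodynamic potential: F = -(1/beta) * ln Q."""
    return -math.log(partition_function(energies, beta)) / beta

# Hypothetical two-level system with energies 0 and 1 (arbitrary units), beta = 1
Q = partition_function([0.0, 1.0], beta=1.0)
F = free_energy([0.0, 1.0], beta=1.0)
```

For continuous x the sum over states becomes an integral, but the structure of the definitions is unchanged.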

  5. The problem: what to do if P(N) is difficult to estimate because we can't get sufficient statistics for all N of interest.

  6. Flat histogram approach: introduce a weight η(N) so that the weighted pdf Pw(N) ∝ P(N) exp[η(N)] is sampled uniformly for all N.

  7. Example: from the original pdf to the weighted pdf; marginalization gives the "canonical" partition function. From the flat histogram one gets η, and from η one gets Q.

  8. General MC sampling scheme: insertion and deletion moves, with the weights (equivalently, the free energy) adjusted as sampling proceeds.

  9. Multicanonical method: the weights η(N) are built from bin counts and are defined only up to an overall shift (normalization). Berg and Neuhaus, Phys. Rev. Lett. 68, 9 (1992)

  10. The algorithm • Start with arbitrary weights (e.g. η1(N) = 0) • Perform a short simulation and measure P(N; η1) as a histogram • Update the weights according to ηi+1(N) = ηi(N) − ln P(N; ηi) • Iterate until P(N; ηi) is flat, or nearly so
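The update rule ηi+1(N) = ηi(N) − ln P(N; ηi) can be sketched on an idealized toy problem in which the weighted pdf is computed exactly instead of being measured as a noisy histogram from a short simulation; the target distribution over N below is made up for illustration:

```python
import math

def weighted_pdf(p0, eta):
    """Normalized pdf after applying weights: P(N; eta) ∝ P0(N) * exp(eta(N))."""
    w = [p * math.exp(e) for p, e in zip(p0, eta)]
    z = sum(w)
    return [x / z for x in w]

def update_weights(eta, p_measured, floor=1e-12):
    """Multicanonical update: eta_{i+1}(N) = eta_i(N) - ln P(N; eta_i).
    The floor guards against taking the log of an empty bin."""
    return [e - math.log(max(p, floor)) for e, p in zip(eta, p_measured)]

# Hypothetical target distribution over N = 0..4, strongly peaked at N = 0
p0 = [0.70, 0.15, 0.10, 0.04, 0.01]
eta = [0.0] * len(p0)            # start with all weights zero
for _ in range(5):               # in practice each P is measured by a short MC run
    p = weighted_pdf(p0, eta)
    eta = update_weights(eta, p)
flat = weighted_pdf(p0, eta)     # should now be (nearly) uniform
```

With exactly measured probabilities the recursion flattens the weighted pdf in one pass; with real histograms it takes several iterations, as the slide says.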

  11. Typical example

  12. Wang-Landau sampling. Example: estimate entropies S for (discrete) states. Acceptance criterion: min{1, exp[S(old) − S(new)]}; after each move the entropy of the visited state is incremented by an update constant g. Wang and Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E 64, 056101 (2001)

  13. The algorithm • Set the entropies of all states to zero; set the initial update constant g • Accept/reject moves according to the criterion min{1, exp[S(old) − S(new)]} • Always update the entropy estimate of the end state: S ← S + g • When the histogram (pdf) is flat, reduce g (e.g. halve it)
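A minimal sketch of this loop for a toy system, assuming a random walk over discrete configurations binned by energy level; the configuration energies, batch size, and flatness threshold below are illustrative choices, not values from the talk:

```python
import math, random

def wang_landau(config_energy, g0=1.0, g_min=1e-5, flatness=0.8, batch=500, seed=2):
    """Wang-Landau walk over configurations; returns the energy levels and
    entropy estimates S(E) ~ ln g(E), defined up to an additive constant."""
    random.seed(seed)
    levels = sorted(set(config_energy))
    idx = {e: i for i, e in enumerate(levels)}
    S = [0.0] * len(levels)              # entropies of all states start at zero
    g = g0                               # update constant
    x = 0                                # current configuration
    while g > g_min:
        hist = [0] * len(levels)
        for _ in range(batch):           # a short batch of moves
            y = random.randrange(len(config_energy))  # propose a configuration
            a, b = idx[config_energy[x]], idx[config_energy[y]]
            # acceptance criterion: min{1, exp[S(old) - S(new)]}
            if random.random() < math.exp(min(0.0, S[a] - S[b])):
                x = y
            e = idx[config_energy[x]]
            S[e] += g                    # always update the end state's entropy
            hist[e] += 1
        if min(hist) > flatness * sum(hist) / len(hist):
            g /= 2.0                     # histogram is flat: reduce g
    return levels, S

# Toy system: 4 configurations, one with energy 0 and three with energy 1,
# so S(1) - S(0) should converge to ln 3
levels, S = wang_landau([0, 1, 1, 1])
```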

  14. Transition probability method Wang, Tay and Swendsen, Phys. Rev. Lett. 82, 476 (1999); Fitzgerald et al., J. Stat. Phys. 98, 321 (1999)

  15. Detailed balance: P(i) T(i → j) = P(j) T(j → i) for microstates i, j. Summing over the microstates belonging to macrostates I and J gives macroscopic detailed balance: P(I) T(I → J) = P(J) T(J → I).

  16. Parallel tempering
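The slide gives no formulas, but the standard scheme runs replicas of the system at several temperatures and periodically attempts to swap configurations between neighboring replicas, accepting a swap between replicas i and j with probability min{1, exp[(βi − βj)(Ui − Uj)]}. A minimal sketch (the two-replica ladder in the usage example below is made up):

```python
import math, random

def swap_probability(beta_i, beta_j, u_i, u_j):
    """Acceptance probability for exchanging the configurations of replicas
    i and j: min(1, exp[(beta_i - beta_j) * (u_i - u_j)])."""
    return min(1.0, math.exp((beta_i - beta_j) * (u_i - u_j)))

def attempt_swaps(betas, energies, rng=random.random):
    """One sweep of neighbor-swap attempts over a ladder of replicas.
    'energies' holds the current potential energy of each replica's
    configuration; swapping the configurations swaps the energies too."""
    for i in range(len(betas) - 1):
        if rng() < swap_probability(betas[i], betas[i + 1],
                                    energies[i], energies[i + 1]):
            energies[i], energies[i + 1] = energies[i + 1], energies[i]
    return energies

# Example: a cold replica (beta = 1) holding a high-energy configuration
# happily trades with a hot replica (beta = 0.5) holding a low-energy one
after = attempt_swaps([1.0, 0.5], [2.0, 1.0], rng=lambda: 0.0)
```

Between swap attempts each replica samples independently at its own temperature; the hot replicas cross barriers easily and feed decorrelated configurations down to the cold ones.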

  17. Dynamical systems

  18. The idea: the system evolves according to equations of motion (possibly Hamiltonian); to do this we need to assign masses to the variables. Assumption: ergodicity.

  19. Advantages • No need to design sampling techniques • Specialized methods for efficient sampling are available (Laio-Parrinello, Adaptive Biasing Force) • No probabilistic sampling Disadvantages • Possible complications with the assignment of masses

  20. Two formulations: Hamiltonian and Lagrangian. Numerical, iterative solution of the equations of motion yields a trajectory.
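As a sketch of such a numerical, iterative solution, here is the velocity Verlet scheme, one common integrator for Newton's equations; the harmonic potential at the end is a made-up test case, not from the talk:

```python
def velocity_verlet(x, v, force, mass, dt, n_steps):
    """Iterate x'' = force(x)/mass with the velocity Verlet scheme,
    returning the final phase-space point and the position trajectory."""
    a = force(x) / mass
    traj = [x]
    for _ in range(n_steps):
        x = x + v * dt + 0.5 * a * dt * dt   # position update
        a_new = force(x) / mass
        v = v + 0.5 * (a + a_new) * dt       # velocity update (averaged forces)
        a = a_new
        traj.append(x)
    return x, v, traj

# Hypothetical example: harmonic oscillator U = x^2/2, so force = -x;
# total energy 0.5*v^2 + 0.5*x^2 should stay close to its initial value 0.5
xf, vf, traj = velocity_verlet(1.0, 0.0, lambda x: -x, mass=1.0, dt=0.01, n_steps=1000)
```

The near-conservation of energy over long trajectories (a symplectic property) is what makes this family of integrators the standard choice for the dynamical-systems approach.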

  21. Assignment of masses. Energy equipartition needs to be addressed • Masses too large - slow motions • Masses too small - difficult integration of the equations of motion • Large separation of masses - adiabatic separation Thermostats are available: Lagrangian - e.g. Nosé-Hoover; Hamiltonian - Leimkuhler

  22. Adaptive Biasing Force: ΔA = A(ξb) − A(ξa) = ∫ab ⟨∂H(ξ)/∂ξ⟩ξ* dξ*, i.e. the free energy difference along ξ is the integral of the mean force, which is estimated and subtracted on the fly. Darve and Pohorille, J. Chem. Phys. 115, 9169-9183 (2001)

  23. Summary • A variety of techniques are available to efficiently sample rarely visited states. • Adaptive methods modify the sampling while building the solution. • One can construct dynamical systems to seek the solution, and efficient adaptive techniques are available, but one needs to do it carefully.

  24. Stochastic kinetics

  25. The problem • {Xi} objects, i = 1,…,N • ni copies of each object • undergoing r transformations with rates {kα}, α = 1,…,r Assumptions • {kα} are constant • The process is Markovian (well-stirred reactor)

  26. Example: 7 objects, 5 transformations

  27. Deterministic solution: kinetics (differential equations for the concentrations); steady state (algebraic equations). Works well for large {ni} (fluctuations suppressed).

  28. A statistical alternative • generate trajectories • which reaction occurs next? • when does it occur? The joint probability factorizes: P(next reaction is μ at time τ) = P(next reaction is μ at any time) × P(any reaction at time τ)

  29. Direct method - Algorithm • Initialization • Calculate the propensities {ai} • Choose τ (r.n.) from p(τ) = a0 exp(−a0τ), where a0 = Σi ai • Choose μ (r.n.) with probability aμ/a0 • Update the numbers of molecules and reset t → t + τ • Go to step 2 Gillespie, J. Phys. Chem. 81, 2340 (1977)
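The steps above can be sketched in Python. The (propensity, state-change) encoding of reactions and the decay example A → 0 with k = 1 are illustrative choices, not from the talk; τ is drawn from the exponential distribution with rate a0 and μ with probability aμ/a0:

```python
import math, random

def gillespie_direct(x, reactions, t_end, seed=3):
    """Gillespie's direct method: at each step draw the waiting time tau and
    the index mu of the next reaction from the propensities {a_i}.

    x         -- list of molecule counts
    reactions -- list of (propensity_fn, state_change) pairs
    """
    random.seed(seed)
    t = 0.0
    while t < t_end:
        a = [prop(x) for prop, _ in reactions]     # step 2: propensities
        a0 = sum(a)
        if a0 == 0.0:
            break                                  # no reaction can fire
        t += -math.log(random.random()) / a0       # step 3: tau ~ Exp(a0)
        r, acc = random.random() * a0, 0.0
        for mu, ai in enumerate(a):                # step 4: pick mu with prob a_mu/a0
            acc += ai
            if r < acc:
                break
        for i, d in enumerate(reactions[mu][1]):   # step 5: update molecule counts
            x[i] += d
    return t, x

# Illustrative example: irreversible decay A -> 0 with rate k = 1
t, x = gillespie_direct([100], [(lambda s: 1.0 * s[0], [-1])], t_end=100.0)
```

Each step consumes two random numbers, one for τ and one for μ, regardless of the number of reactions r.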

  30. First reaction method - Algorithm • Initialization • Calculate the propensities {ai} • For each μ generate τμ (r.n.) according to pμ(τ) = aμ exp(−aμτ) • Choose the reaction μ for which τμ is the shortest • Set τ = τμ • Update the numbers of molecules and reset t → t + τ • Go to step 2 Gillespie, J. Phys. Chem. 81, 2340 (1977)
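A sketch of this variant, again using a hypothetical (propensity, state-change) encoding of the reactions: per step it draws one tentative τμ for every reaction and fires the earliest one.

```python
import math, random

def first_reaction(x, reactions, t_end, seed=4):
    """Gillespie's first reaction method: draw a tentative time tau_mu for
    every reaction and fire the one with the smallest tau."""
    random.seed(seed)
    t = 0.0
    while t < t_end:
        taus = []
        for mu, (prop, _) in enumerate(reactions):
            a = prop(x)
            if a > 0.0:
                taus.append((-math.log(random.random()) / a, mu))  # tau ~ Exp(a_mu)
        if not taus:
            break                        # no reaction can fire
        tau, mu = min(taus)              # the reaction with the shortest tau fires
        t += tau
        for i, d in enumerate(reactions[mu][1]):
            x[i] += d
    return t, x

# Illustrative example: irreversible decay A -> 0 with rate k = 1
t, x = first_reaction([50], [(lambda s: 1.0 * s[0], [-1])], t_end=100.0)
```

Note the cost: r random numbers per step versus two for the direct method, which is what the next reaction method improves on by reusing the unfired tentative times.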

  31. Next reaction method. Complexity: O(log r). Gibson and Bruck, J. Phys. Chem. A 104, 1876 (2000)

  32. Extensions • kα = kα(t) (Gibson and Bruck) • Non-Markovian processes (Gibson and Bruck) • Stiff reactions (Eric Vanden-Eijnden) • Enzymatic reactions (A.P.)
