
Importance Sampling



  1. Importance Sampling ICS 276 Fall 2007 Rina Dechter

  2. Outline • Gibbs Sampling • Advances in Gibbs sampling • Blocking • Cutset sampling (Rao-Blackwellisation) • Importance Sampling • Advances in Importance Sampling • Particle Filtering

  3. Importance Sampling Theory

  4. Importance Sampling Theory • Given a distribution Q, called the proposal distribution, such that P(Z=z,e) > 0 implies Q(Z=z) > 0 • The ratio w(Z=z) = P(Z=z,e) / Q(Z=z) is called the importance weight
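
Spelled out (a restatement of the identity behind these weights, in the slide's own notation):

```latex
P(e) \;=\; \sum_{z} P(Z=z,\, e)
     \;=\; \sum_{z} Q(Z=z)\,\frac{P(Z=z,\, e)}{Q(Z=z)}
     \;=\; \mathbb{E}_{Q}\!\left[\, w(Z) \,\right],
\qquad
\hat{P}(e) \;=\; \frac{1}{N}\sum_{k=1}^{N} w(z^{k}),
\quad z^{k} \sim Q .
```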

  5. Importance Sampling Theory • Underlying principle: approximate the average over a set of numbers by the average over a sampled subset of those numbers

  6. Importance Sampling (Informally) • Express the problem as computing the average over a set of real numbers • Sample a subset of those numbers • Approximate the true average by the sample average • True average: average of (0.11, 0.24, 0.55, 0.77, 0.88, 0.99) = 0.59 • Sample average over 2 samples: average of (0.24, 0.77) = 0.505
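
The same arithmetic as a runnable check (Python, numbers taken directly from the slide):

```python
numbers = [0.11, 0.24, 0.55, 0.77, 0.88, 0.99]
true_avg = sum(numbers) / len(numbers)   # 0.59
sample_avg = (0.24 + 0.77) / 2           # 0.505, from the 2 sampled numbers
print(true_avg, sample_avg)
```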

  7. How to generate samples from Q • Express Q in product form: Q(Z) = Q(Z1) Q(Z2|Z1) … Q(Zn|Z1,…,Zn-1) • Sample along the order Z1,…,Zn • Example (all variables binary; each table lists the child's distribution for each parent assignment): • Q(Z1) = (0.2, 0.8) • Q(Z2|Z1) = (0.2, 0.8, 0.1, 0.9) • Q(Z3|Z1,Z2) = Q(Z3|Z1) = (0.5, 0.5, 0.3, 0.7)

  8. How to sample from Q • Q(Z1) = (0.2, 0.8), Q(Z2|Z1) = (0.2, 0.8, 0.1, 0.9), Q(Z3|Z1,Z2) = Q(Z3|Z1) = (0.5, 0.5, 0.3, 0.7) • The domain of each variable is {0,1} • Generate a random number r uniformly between 0 and 1 [Figure: the unit interval split at 0.2] • Which value to select for Z1? If r < 0.2, select Z1 = 0; otherwise select Z1 = 1

  9. How to sample from Q? • For each sample Z = z: • Sample Z1=z1 from Q(Z1) • Sample Z2=z2 from Q(Z2|Z1=z1) • Sample Z3=z3 from Q(Z3|Z1=z1) • Generate N such samples
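
A minimal runnable sketch of slides 7-9 in Python (the dictionary encoding of the tables is my own; the probabilities are the ones on the slides):

```python
import random

# Proposal Q from slides 7-8.  Each table maps the value of Z1 to the
# probability that the child variable equals 0.
Q_Z1 = 0.2                      # Q(Z1=0) = 0.2, so Q(Z1=1) = 0.8
Q_Z2 = {0: 0.2, 1: 0.1}         # Q(Z2=0 | Z1)
Q_Z3 = {0: 0.5, 1: 0.3}         # Q(Z3=0 | Z1); Z3 is independent of Z2

def sample_binary(p0):
    """Slide 8's number line: draw r in [0,1); return 0 iff r < p0."""
    return 0 if random.random() < p0 else 1

def sample_from_Q():
    """One full sample, drawn along the order Z1, Z2, Z3 (slide 9)."""
    z1 = sample_binary(Q_Z1)
    z2 = sample_binary(Q_Z2[z1])
    z3 = sample_binary(Q_Z3[z1])
    return (z1, z2, z3)

samples = [sample_from_Q() for _ in range(1000)]  # generate N such samples
```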

  10. Likelihood weighting • Q = the prior distribution = the product of the CPTs of the Bayesian network

  11. Likelihood weighting example [Figure: Bayesian network with nodes Smoking (S), Lung Cancer (C), Bronchitis (B), X-ray (X), and Dyspnoea (D), with CPTs P(S), P(C|S), P(B|S), P(X|C,S), P(D|C,B)] P(S,C,B,X,D) = P(S) P(C|S) P(B|S) P(X|C,S) P(D|C,B)

  12. Likelihood weighting example [Figure: the same network, with B = 0 and X observed as evidence] Q = Prior: Q(S,C,D) = Q(S) Q(C|S) Q(D|C,B=0) = P(S) P(C|S) P(D|C,B=0) • Sample S=s from P(S) • Sample C=c from P(C|S=s) • Sample D=d from P(D|C=c,B=0)

  13. The Algorithm
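
The body of this slide did not survive the transcript; as a stand-in, here is a minimal Python sketch of likelihood weighting on the network of slides 11-12, treating B = 0 and X = 1 as evidence. All CPT values are made-up placeholders, not numbers from the lecture:

```python
import random

# Hypothetical CPTs for the network of slide 11 (placeholder values).
# Each entry is the probability that the variable equals 1.
P_S = 0.3                                                     # P(S=1)
P_C = {0: 0.1, 1: 0.4}                                        # P(C=1 | S)
P_B = {0: 0.2, 1: 0.5}                                        # P(B=1 | S)
P_X = {(0, 0): 0.05, (0, 1): 0.8, (1, 0): 0.1, (1, 1): 0.9}   # P(X=1 | C, S)
P_D = {(0, 0): 0.1, (0, 1): 0.7, (1, 0): 0.6, (1, 1): 0.9}    # P(D=1 | C, B)

def bern(p1):
    """Return 1 with probability p1, else 0."""
    return 1 if random.random() < p1 else 0

def likelihood_weighting(n_samples, b_evid=0, x_evid=1):
    """Estimate P(B=b_evid, X=x_evid): sample the non-evidence variables
    S, C, D from the prior (as on slide 12) and weight each sample by
    the CPT entries of the evidence variables."""
    total_weight = 0.0
    for _ in range(n_samples):
        s = bern(P_S)                   # sample S from P(S)
        c = bern(P_C[s])                # sample C from P(C|S=s)
        d = bern(P_D[(c, b_evid)])      # sample D from P(D|C=c, B=b_evid);
                                        # d is part of the sample but does
                                        # not affect the weight
        # Importance weight w = P(B=b_evid|S=s) * P(X=x_evid|C=c,S=s)
        p_b = P_B[s] if b_evid == 1 else 1 - P_B[s]
        p_x = P_X[(c, s)] if x_evid == 1 else 1 - P_X[(c, s)]
        total_weight += p_b * p_x
    return total_weight / n_samples     # estimate of P(B=b_evid, X=x_evid)

print(likelihood_weighting(100000))
```

Dividing the accumulated weight by N estimates P(e); keeping per-sample tallies of, say, S and dividing by the same total would give the ratio estimate of P(S|e) discussed on the next two slides.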

  14. How to solve belief updating?

  15. Difference between estimating P(E=e) and P(Xi=xi|E=e) • The sample-mean estimate of P(E=e) is unbiased • The ratio estimate of P(Xi=xi|E=e) is only asymptotically unbiased
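
Concretely (notation mine, matching the weights of slide 4): P(e) is a plain sample mean, while P(Xi=xi|E=e) is a ratio of two such means:

```latex
\hat{P}(e) \;=\; \frac{1}{N}\sum_{k=1}^{N} w(z^{k})
\qquad\text{vs.}\qquad
\hat{P}(X_i = x_i \mid e) \;=\;
\frac{\sum_{k=1}^{N} w(z^{k})\,\mathbf{1}\{z^{k}_{i} = x_i\}}
     {\sum_{k=1}^{N} w(z^{k})} .
```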

  16. Proposal Distribution: Which is better? The closer Q(Z) is to the posterior P(Z|e), the lower the variance of the estimates; the ideal choice Q(z) ∝ P(z,e) would make every importance weight equal to P(e).

  17. Outline • Gibbs Sampling • Advances in Gibbs sampling • Blocking • Cutset sampling (Rao-Blackwellisation) • Importance Sampling • Advances in Importance Sampling • Particle Filtering

  18. Research Issues in Importance Sampling • Better Proposal Distribution • Likelihood weighting • Fung and Chang, 1990; Shachter and Peot, 1990 • AIS-BN • Cheng and Druzdzel, 2000 • Iterative Belief Propagation • Yuan and Druzdzel, 2003 • Iterative Join Graph Propagation and variable ordering • Gogate and Dechter, 2005

  19. Research Issues in Importance Sampling (Cheng and Druzdzel, 2000) • Adaptive Importance Sampling

  20. Adaptive Importance Sampling • General case • Given k proposal distributions • Take N samples out of each distribution • Approximate P(e)
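
One natural way to write the combined estimate (my notation; the slide's own formula is not in the transcript): with samples z^{j,m} drawn from each proposal Q_j,

```latex
\hat{P}(e) \;=\; \frac{1}{k}\sum_{j=1}^{k}\,\frac{1}{N}\sum_{m=1}^{N}
\frac{P\!\left(z^{j,m},\, e\right)}{Q_j\!\left(z^{j,m}\right)},
\qquad z^{j,m} \sim Q_j .
```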

  21. Estimating Q'(z)

  22. Cutset importance sampling (Gogate and Dechter, 2005; Bidyuk and Dechter, 2006) • Divide the set of variables into two parts: the cutset (C) and the remaining variables (R)
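
A sketch of the resulting Rao-Blackwellised estimator (notation mine): sample only the cutset from a proposal Q(C) and handle R exactly,

```latex
\hat{P}(e) \;=\; \frac{1}{N}\sum_{k=1}^{N} \frac{P\!\left(c^{k},\, e\right)}{Q\!\left(c^{k}\right)},
\qquad c^{k} \sim Q(C),
\qquad
P\!\left(c^{k},\, e\right) \;=\; \sum_{r} P\!\left(c^{k},\, r,\, e\right),
```

where the inner sum over R is computed by exact inference; summing out R rather than sampling it is what reduces the variance relative to sampling all variables.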

  23. Outline • Gibbs Sampling • Advances in Gibbs sampling • Blocking • Cutset sampling (Rao-Blackwellisation) • Importance Sampling • Advances in Importance Sampling • Particle Filtering

  24. Dynamic Belief Networks (DBNs) [Figure: a two-slice DBN with state variables Xt → Xt+1 connected by transition arcs and observations Yt, Yt+1 (the Bayesian network at time t and at time t+1); below it, the DBN unrolled for t = 0 to t = 10, with states X0, X1, X2, …, X10 and observations Y0, Y1, Y2, …, Y10]

  25. Query • Compute P(X0:t|Y0:t) or P(Xt|Y0:t) • Example: P(X0:10|Y0:10) or P(X10|Y0:10) • Hard over a long time period! • Approximate! Sample!
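
For the filtering query P(Xt|Y0:t), the standard forward recursion that the sampling methods below approximate is:

```latex
P\!\left(x_t \mid y_{0:t}\right) \;\propto\; P\!\left(y_t \mid x_t\right)
\sum_{x_{t-1}} P\!\left(x_t \mid x_{t-1}\right)\, P\!\left(x_{t-1} \mid y_{0:t-1}\right).
```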

  26. Particle Filtering (PF) • = “condensation” • = “sequential Monte Carlo” • = “survival of the fittest” • PF can handle any type of probability distribution, non-linearity, and non-stationarity • PFs are powerful sampling-based inference/learning algorithms for DBNs

  27. Particle Filtering (worked out on the whiteboard; a code sketch follows)
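
Since the derivation was done on the whiteboard, here is a minimal bootstrap particle filter sketch to make the propagate/weight/resample loop concrete. The random-walk transition and Gaussian observation model are illustrative assumptions, not the lecture's example:

```python
import math
import random

def transition(x):
    """Hypothetical transition model P(Xt | Xt-1): a Gaussian random walk."""
    return x + random.gauss(0.0, 1.0)

def obs_likelihood(y, x):
    """Hypothetical observation model P(Yt | Xt): Gaussian centred at x."""
    return math.exp(-0.5 * (y - x) ** 2) / math.sqrt(2.0 * math.pi)

def particle_filter(observations, n_particles=1000):
    """Bootstrap PF: propagate, weight by the observation, resample."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # 1. Propagate each particle through the transition model.
        particles = [transition(x) for x in particles]
        # 2. Weight each particle by the likelihood of the observation.
        weights = [obs_likelihood(y, x) for x in particles]
        # 3. Resample ("survival of the fittest"): high-weight particles
        #    are duplicated, low-weight particles die out.
        particles = random.choices(particles, weights=weights, k=n_particles)
        # Posterior-mean estimate of Xt given Y0:t.
        estimates.append(sum(particles) / n_particles)
    return estimates

print(particle_filter([0.5, 1.2, 0.9, 1.8]))
```

Resampling at every step, as above, is the plain bootstrap filter; practical variants resample only when the effective sample size drops, to limit the variance the resampling step itself introduces.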
