Bayes Factor

Based on Han and Carlin (2001, JASA)

Model Selection
  • Data: y
  • M = {M1, M2, ..., MK}: finite set of competing models
  • θj: a distinct unknown parameter vector of dimension nj corresponding to the jth model
  • Prior: πj(θj) = p(θj | Mj) for the parameters of model j
  • Θj: all possible values for θj
  • θ = (θ1, ..., θK): collection of all model-specific parameter vectors

Model Selection
  • Posterior probability p(Mj | y) can be used for
    • selecting a single “best” model, or
    • model averaging
  • Bayes factor: the basis for a choice between two models (formulas below)
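
Written out with the notation above, the posterior model probability and the Bayes factor are the standard Bayesian model-choice quantities:

\[
p(M_j \mid y) = \frac{p(y \mid M_j)\, p(M_j)}{\sum_{k=1}^{K} p(y \mid M_k)\, p(M_k)},
\qquad
BF_{12} = \frac{p(y \mid M_1)}{p(y \mid M_2)}
        = \frac{p(M_1 \mid y) / p(M_2 \mid y)}{p(M_1) / p(M_2)} .
\]

The Bayes factor is thus the ratio of posterior odds to prior odds for Model 1 versus Model 2, and it depends on the data only through the marginal likelihoods p(y | Mj).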

Estimating Marginal Likelihood
  • Marginal likelihood: p(y | Mj) = ∫ f(y | θj, Mj) πj(θj) dθj
  • Estimation
    • Ordinary Monte Carlo sampling (a minimal sketch follows this list)
      • Difficult to implement for high-dimensional models
      • MCMC does not provide an estimate of the marginal likelihood directly
    • Include the model indicator as a parameter in the sampling
      • Product space search by Gibbs sampling
      • Metropolis-Hastings
      • Reversible jump MCMC
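
As a concrete illustration of the ordinary Monte Carlo approach, here is a minimal sketch for a hypothetical toy model (normal likelihood with known unit variance, normal prior); the model and all numbers are invented for illustration and are not from the presentation. The marginal likelihood is estimated by averaging the likelihood over draws from the prior:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: y_i ~ N(theta, 1) with prior theta ~ N(0, 10).
y = rng.normal(1.0, 1.0, size=20)

# Ordinary Monte Carlo: p(y) = E_prior[ f(y | theta) ], so draw theta from the
# prior and average the likelihood (done on the log scale for stability).
theta = rng.normal(0.0, np.sqrt(10.0), size=100_000)
log_lik = -0.5 * np.sum((y[None, :] - theta[:, None]) ** 2 + np.log(2 * np.pi), axis=1)
log_marginal = np.log(np.mean(np.exp(log_lik - log_lik.max()))) + log_lik.max()
print("estimated log p(y | M):", log_marginal)

Because the draws come from the prior rather than the posterior, most of them land where the likelihood is negligible, which is exactly why the slide notes that this approach is difficult for high-dimensional models.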

Product space search
  • Carlin and Chib (1995, JRSSB)
  • Data likelihood of model j: f(y | θj, Mj)
  • Prior of model j: π(θj | Mj); for k ≠ j, the “pseudo-priors” π(θk | Mj) must also be specified
  • Assumptions:
    • M is merely an indicator of which θj is relevant to y
    • y is independent of {θk : k ≠ j} given the model indicator M = Mj
    • Proper priors are required
    • Prior independence among the θj given M

Product space search
  • The sampler operates over the product space Θ1 × Θ2 × ... × ΘK × {M1, ..., MK}
  • Marginal likelihood: p(y | Mj) = ∫ f(y | θj, Mj) π(θj | Mj) dθj
  • Remark:
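
To make the product-space sampler concrete, here is a minimal sketch of the Carlin-Chib Gibbs sampler for two hypothetical normal-mean models with known unit variance, with the pseudo-priors simply set equal to the priors; the data, models, and hyperparameters are invented for illustration and are not the example analysed by Han and Carlin:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Hypothetical two-model setup: y_i ~ N(theta_j, 1) under model j,
# with priors theta_1 ~ N(0, 1) and theta_2 ~ N(2, 1).
y = rng.normal(1.5, 1.0, size=30)
n = len(y)
prior_mean = np.array([0.0, 2.0])
prior_var = np.array([1.0, 1.0])
prior_model = np.array([0.5, 0.5])                 # p(M = j)
pseudo_mean, pseudo_var = prior_mean, prior_var    # pseudo-priors set equal to the priors

def log_lik(theta):
    return norm.logpdf(y, loc=theta, scale=1.0).sum()

M, theta = 0, prior_mean.copy()
draws_M = np.empty(5_000, dtype=int)
for t in range(draws_M.size):
    # 1) Update each parameter block: full conditional posterior for the
    #    current model, pseudo-prior for the other one.
    for j in range(2):
        if j == M:
            post_var = 1.0 / (n + 1.0 / prior_var[j])
            post_mean = post_var * (y.sum() + prior_mean[j] / prior_var[j])
            theta[j] = rng.normal(post_mean, np.sqrt(post_var))
        else:
            theta[j] = rng.normal(pseudo_mean[j], np.sqrt(pseudo_var[j]))
    # 2) Update the model indicator from its full conditional over the two models.
    logw = np.array([
        log_lik(theta[j])
        + norm.logpdf(theta[j], prior_mean[j], np.sqrt(prior_var[j]))
        + norm.logpdf(theta[1 - j], pseudo_mean[1 - j], np.sqrt(pseudo_var[1 - j]))
        + np.log(prior_model[j])
        for j in range(2)
    ])
    w = np.exp(logw - logw.max())
    M = rng.choice(2, p=w / w.sum())
    draws_M[t] = M

print("model visit counts (model 1, model 2):", np.bincount(draws_M, minlength=2))

The next slide turns the visit frequencies of the model indicator into estimates of the posterior model probabilities and the Bayes factor.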

Bayes Factor
  • Provided the sampling chain for the model indicator mixes well, the posterior probability of model j can be estimated by the fraction of iterations in which the chain visits model j, i.e. (1/T) Σt 1{M(t) = Mj}
  • The Bayes factor BF12 is then estimated by the ratio of estimated posterior odds to prior odds, [p(M1 | y) / p(M2 | y)] × [p(M2) / p(M1)], with the posterior probabilities replaced by the visit frequencies above (see the short sketch after this slide)
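
A minimal sketch of these two estimators, assuming we already have a well-mixing chain of model indicators M(1), ..., M(T) (here simply simulated as a stand-in) and equal prior model probabilities:

import numpy as np

rng = np.random.default_rng(2)

# Stand-in for the output of a product-space sampler: a chain of model indicators.
M_chain = rng.choice([1, 2], size=10_000, p=[0.2, 0.8])
prior_model = {1: 0.5, 2: 0.5}

# Posterior model probabilities = visit frequencies of the chain.
p_hat = {j: np.mean(M_chain == j) for j in (1, 2)}

# Bayes factor of model 1 versus model 2: posterior odds divided by prior odds.
bf_12 = (p_hat[1] / p_hat[2]) * (prior_model[2] / prior_model[1])
print("p_hat:", p_hat, " BF_12:", bf_12)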

Choice of prior probability
  • In general, the prior model probabilities p(Mj) can be chosen arbitrarily
    • Their effect is divided out in the estimate of the Bayes factor
  • Often they are chosen so that the algorithm visits each model in roughly equal proportion
    • Allows a more accurate estimate of the Bayes factor
    • Preliminary runs are needed to select computationally efficient values

More remarks
  • Performance of this method is optimized when the pseudo-priors match the corresponding model-specific posteriors as nearly as possible
  • Drawbacks of the method
    • A draw must be made from each pseudo-prior at each iteration to produce acceptably accurate results
    • If a large number of models is considered, the method becomes impractical

Metropolized product space search
  • Dellaportas, P., Forster, J.J., and Ntzoufras, I. (2002). On Bayesian Model and Variable Selection Using MCMC. Statistics and Computing, 12, 27-36.
  • A hybrid Gibbs-Metropolis strategy
  • The model selection step is based on a proposal that moves between models

Advantage
  • At each iteration, a draw is needed only from the pseudo-prior of the proposed model, not from every pseudo-prior; the acceptance probability of this model-move step is written out below
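
One standard way to write the acceptance probability of this model-move step; the notation below (pseudo-priors written as π̃, model proposal probabilities h) is assumed here rather than taken from the slides. From the current state (Mj, θj), propose model Mk with probability h(j, k), draw θk from its pseudo-prior π̃k, and accept with probability

\[
\alpha = \min\left\{ 1,\;
\frac{f(y \mid \theta_k, M_k)\,\pi(\theta_k \mid M_k)\,\tilde{\pi}_j(\theta_j)\,p(M_k)\,h(k, j)}
     {f(y \mid \theta_j, M_j)\,\pi(\theta_j \mid M_j)\,\tilde{\pi}_k(\theta_k)\,p(M_j)\,h(j, k)}
\right\}.
\]

Only π̃k, the pseudo-prior of the proposed model, has to be sampled from, which is the advantage stated above.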

Reversible jump MCMC
  • Green (1995, Biometrika)
  • This method operates on the union space ∪j ({Mj} × Θj)
  • It generates a Markov chain that can jump between models with parameter spaces of different dimensions
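
For reference, Green's acceptance probability for a proposed jump from (Mj, θj) to (Mk, θk) can be written in its standard form; the notation below (dimension-matching variables u, u′ with proposal densities q, and model-move probabilities h) is assumed here rather than taken from the slides:

\[
\alpha = \min\left\{ 1,\;
\frac{f(y \mid \theta_k, M_k)\,\pi_k(\theta_k)\,p(M_k)\,h(M_k, M_j)\,q_{k\to j}(u')}
     {f(y \mid \theta_j, M_j)\,\pi_j(\theta_j)\,p(M_j)\,h(M_j, M_k)\,q_{j\to k}(u)}
\left| \frac{\partial(\theta_k, u')}{\partial(\theta_j, u)} \right|
\right\},
\]

where θk = g_{j→k}(θj, u) with u ~ q_{j→k}, the reverse move uses u′ ~ q_{k→j}, and the final factor is the Jacobian of the deterministic dimension-matching transformation g_{j→k}.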

Using Partial Analytic Structure (PAS)
  • Godsill (2001, JCGS)
  • Similar setup to the Carlin-Chib (CC) method, but allows parameters to be shared between different models
  • Avoids dimension matching

Marginal likelihood estimation (Chib 1995)
  • Let θ* be a fixed point of the parameter space, typically one of high posterior density. When all full conditional distributions for the parameters are in closed form, the marginal likelihood can be evaluated from the basic marginal likelihood identity (written out after the next slide).

Chib (1995)
  • The first three terms on the right-hand side are available in closed form
  • The last term on the right-hand side can be estimated from the Gibbs sampling output
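
The slides reference these quantities without displaying them. In the standard form of Chib (1995), with the parameters split into two blocks θ = (θ1, θ2) as on the later "Chib's method" slide and one common choice of block ordering, the basic marginal likelihood identity and its log decomposition are

\[
m(y) = \frac{f(y \mid \theta^*)\,\pi(\theta^*)}{\pi(\theta^* \mid y)},
\qquad
\log \hat{m}(y) = \log f(y \mid \theta_1^*, \theta_2^*) + \log \pi(\theta_1^*, \theta_2^*)
 - \log \pi(\theta_2^* \mid \theta_1^*, y) - \log \hat{\pi}(\theta_1^* \mid y).
\]

The first three terms on the right of the decomposition (likelihood, prior, and a full conditional) are available in closed form; the last term is the Rao-Blackwell estimate obtained from the Gibbs draws θ2(1), ..., θ2(G):

\[
\hat{\pi}(\theta_1^* \mid y) = \frac{1}{G} \sum_{g=1}^{G} \pi\bigl(\theta_1^* \mid \theta_2^{(g)}, y\bigr).
\]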

Chib and Jeliazkov (2001)
  • Estimating the last term as above requires the normalizing constant of the full conditional to be known
    • Hence not applicable to Metropolis-Hastings output
  • Let the acceptance probability of a move from θ to θ′ be α(θ, θ′) = min{1, [f(y | θ′) π(θ′) q(θ′, θ)] / [f(y | θ) π(θ) q(θ, θ′)]}
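
The slide stops at the definition of α; Chib and Jeliazkov (2001) then express the posterior ordinate at θ* as a ratio of two expectations, each of which can be estimated by simulation:

\[
\pi(\theta^* \mid y) \;=\;
\frac{E_{\pi(\theta \mid y)}\bigl[\alpha(\theta, \theta^*)\, q(\theta, \theta^*)\bigr]}
     {E_{q(\theta^*, \theta)}\bigl[\alpha(\theta^*, \theta)\bigr]}.
\]

The numerator is averaged over draws from the posterior (the main Metropolis-Hastings run); the denominator is averaged over draws from the proposal density q(θ*, ·), so no normalizing constant of the full conditional is needed.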

For MCC (Metropolized Carlin and Chib)
  • The model proposal probabilities are set to h(1,1) = h(1,2) = h(2,1) = h(2,2) = 0.5, i.e. each model is proposed with equal probability regardless of the current model

For RJMCMC
  • Dimension matching is automatically satisfied
  • This is due to the similarity between the two models

Chib’s method
  • Two-block Gibbs sampler, with blocks
    • (regression coefficients, variance)
  • A sketch of such a sampler and the resulting marginal likelihood estimate follows this slide
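
The slides do not give the model details, so the sketch below uses a hypothetical normal linear regression with independent normal and inverse-gamma priors (data, priors, and hyperparameters are all invented for illustration) to show the two-block Gibbs sampler and the resulting Chib (1995) marginal likelihood estimate:

import numpy as np
from scipy.stats import invgamma, multivariate_normal as mvn

rng = np.random.default_rng(3)

# Hypothetical regression: y = X beta + eps, eps ~ N(0, sigma2 I),
# priors beta ~ N(b0, B0) and sigma2 ~ InvGamma(a0, d0).
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=1.0, size=n)
b0, B0 = np.zeros(p), 100.0 * np.eye(p)
a0, d0 = 3.0, 3.0
B0_inv = np.linalg.inv(B0)

def beta_full_conditional(sigma2):
    # Closed-form full conditional of beta given sigma2 and y.
    Bn = np.linalg.inv(B0_inv + X.T @ X / sigma2)
    bn = Bn @ (B0_inv @ b0 + X.T @ y / sigma2)
    return bn, Bn

# --- Two-block Gibbs sampler: (beta | sigma2, y) and (sigma2 | beta, y) ---
G, beta, sigma2 = 5_000, np.zeros(p), 1.0
sigma2_draws = np.empty(G)
for g in range(G):
    bn, Bn = beta_full_conditional(sigma2)
    beta = rng.multivariate_normal(bn, Bn)
    sigma2 = invgamma.rvs(a0 + n / 2, scale=d0 + 0.5 * np.sum((y - X @ beta) ** 2),
                          random_state=rng)
    sigma2_draws[g] = sigma2
sigma2_draws = sigma2_draws[500:]          # discard burn-in

# --- Chib (1995) estimate at a high-density point (beta*, sigma2*) ---
beta_star = np.linalg.solve(X.T @ X, X.T @ y)      # OLS as a convenient point
sigma2_star = np.median(sigma2_draws)

log_lik = mvn.logpdf(y, mean=X @ beta_star, cov=sigma2_star * np.eye(n))
log_prior = (mvn.logpdf(beta_star, mean=b0, cov=B0)
             + invgamma.logpdf(sigma2_star, a0, scale=d0))

# pi(beta* | y): Rao-Blackwell average of the beta full conditional over Gibbs draws.
ords = np.array([mvn.logpdf(beta_star, *beta_full_conditional(s2)) for s2 in sigma2_draws])
log_post_beta = np.log(np.mean(np.exp(ords - ords.max()))) + ords.max()

# pi(sigma2* | beta*, y): available in closed form.
log_post_sigma2 = invgamma.logpdf(sigma2_star, a0 + n / 2,
                                  scale=d0 + 0.5 * np.sum((y - X @ beta_star) ** 2))

log_marginal = log_lik + log_prior - log_post_beta - log_post_sigma2
print("Chib estimate of log m(y):", log_marginal)

Running this once per competing model and exponentiating the difference of the two log marginal likelihoods gives the Bayes factor estimate compared in the results below.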
Results
  • By numerical integration, the true Bayes factor should be 4862 in favor of Model 2