
Presentation Transcript


  1. J. Daunizeau, Brain and Spine Institute, Paris, France; Wellcome Trust Centre for Neuroimaging, London, UK. Bayesian inference

  2. Overview of the talk
     1 Probabilistic modelling and representation of uncertainty
       1.1 Bayesian paradigm
       1.2 Hierarchical models
       1.3 Frequentist versus Bayesian inference
     2 Numerical Bayesian inference methods
       2.1 Sampling methods
       2.2 Variational methods (ReML, EM, VB)
     3 SPM applications
       3.1 aMRI segmentation
       3.2 Decoding of brain images
       3.3 Model-based fMRI analysis (with spatial priors)
       3.4 Dynamic causal modelling

  3. Overview of the talk (outline repeated from slide 2; next: 1 Probabilistic modelling and representation of uncertainty)

  4. Bayesian paradigm: probability theory basics
     Desiderata for a degree of plausibility:
     - it should be represented using real numbers (D1)
     - it should conform with intuition (D2)
     - it should be consistent (D3)
     Basic rules of probability theory:
     - normalization: sum_a p(a) = 1
     - marginalization: p(a) = sum_b p(a, b)
     - conditioning (Bayes rule): p(a | b) = p(a, b) / p(b)
     [Figure: Venn diagram illustrating the rules with example events a and b.]
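
A minimal numerical sketch of the three rules on this slide, using a made-up discrete joint distribution over two binary variables (the numbers are illustrative, not from the talk):

```python
import numpy as np

# Hypothetical joint distribution p(a, b) over two binary variables.
p_ab = np.array([[0.30, 0.10],   # rows: a in {0, 1}
                 [0.20, 0.40]])  # cols: b in {0, 1}

# Normalization: the probabilities sum to one.
assert np.isclose(p_ab.sum(), 1.0)

# Marginalization: p(a) = sum_b p(a, b).
p_a = p_ab.sum(axis=1)

# Conditioning (Bayes rule): p(a | b) = p(a, b) / p(b).
p_b = p_ab.sum(axis=0)
p_a_given_b = p_ab / p_b          # broadcasts the division over columns

print(p_a)                        # marginal p(a)
print(p_a_given_b[:, 1])          # conditional p(a | b = 1)
```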

  5. Bayesian paradigm: deriving the likelihood function
     - Model of the data with unknown parameters theta, e.g. the GLM: y = X·theta
     - But the data are noisy: y = X·theta + epsilon
     - Assume the noise/residuals are 'small', e.g. Gaussian: epsilon ~ N(0, sigma^2·I)
     → Distribution of the data, given fixed parameters, i.e. the likelihood: p(y | theta, m) = N(X·theta, sigma^2·I)
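
A hedged sketch of this derivation in code: a GLM with Gaussian noise, whose likelihood is evaluated directly from the noise assumption. The design matrix, parameter values, and noise variance below are all illustrative assumptions:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.standard_normal(n)])  # toy design matrix
theta_true, sigma2 = np.array([1.0, -0.5]), 0.25           # assumed values
y = X @ theta_true + np.sqrt(sigma2) * rng.standard_normal(n)

def log_likelihood(theta):
    """ln p(y | theta, m) under the Gaussian noise assumption:
    y | theta ~ N(X @ theta, sigma2 * I)."""
    return multivariate_normal.logpdf(y, mean=X @ theta, cov=sigma2 * np.eye(n))

print(log_likelihood(theta_true))
print(log_likelihood(np.zeros(2)))   # a poor parameter value scores lower
```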

  6. Bayesian paradigm: likelihood, priors and the model evidence
     Likelihood: p(y | theta, m)
     Prior: p(theta | m)
     Bayes rule: p(theta | y, m) = p(y | theta, m) · p(theta | m) / p(y | m)
     Together, likelihood and prior specify the generative model m.
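
For the Gaussian GLM above, Bayes rule has a closed form. This is a minimal sketch of that conjugate case; the function name `glm_posterior` and the default variances are assumptions for illustration:

```python
import numpy as np

def glm_posterior(X, y, sigma2=0.25, tau2=10.0):
    """Posterior p(theta | y, m) for y = X @ theta + eps, eps ~ N(0, sigma2*I),
    with prior theta ~ N(0, tau2*I): Gaussian with closed-form moments."""
    p = X.shape[1]
    post_prec = X.T @ X / sigma2 + np.eye(p) / tau2   # posterior precision
    post_cov = np.linalg.inv(post_prec)
    post_mean = post_cov @ X.T @ y / sigma2
    return post_mean, post_cov

# Toy usage with made-up data:
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(20), rng.standard_normal(20)])
y = X @ np.array([1.0, -0.5]) + 0.5 * rng.standard_normal(20)
m, C = glm_posterior(X, y)
print(m, np.sqrt(np.diag(C)))   # posterior means and standard deviations
```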

  7. Bayesian paradigm: forward and inverse problems
     Forward problem: the likelihood p(y | theta, m), from parameters to data.
     Inverse problem: the posterior distribution p(theta | y, m), from data back to parameters.

  8. Bayesian paradigm: model comparison
     Model evidence: p(y | m) = ∫ p(y | theta, m) p(theta | m) dtheta
     Principle of parsimony: « plurality should not be assumed without necessity » ("Occam's razor").
     [Figure: model evidence p(y | m) plotted over the space of all data sets; a model flexible enough to fit anything spreads its evidence thinly and so scores low on any particular data set y.]
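
In the Gaussian GLM, the integral over theta is analytic, so model comparison can be sketched directly. The setup below (designs, variances, data) is made up; the point is that the evidence automatically penalizes the needlessly flexible model, as the slide's Occam's razor argument says:

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_evidence(X, y, sigma2=0.25, tau2=10.0):
    """ln p(y | m) for y = X @ theta + eps with theta ~ N(0, tau2*I):
    integrating theta out gives y ~ N(0, sigma2*I + tau2 * X @ X.T)."""
    n = len(y)
    cov = sigma2 * np.eye(n) + tau2 * X @ X.T
    return multivariate_normal.logpdf(y, mean=np.zeros(n), cov=cov)

# Toy comparison: intercept-only model versus intercept + regressor.
rng = np.random.default_rng(2)
x = rng.standard_normal(30)
X1 = np.ones((30, 1))                     # m1: intercept only
X2 = np.column_stack([np.ones(30), x])    # m2: intercept + regressor
y = 1.0 + 0.8 * x + 0.5 * rng.standard_normal(30)
print(log_evidence(X1, y), log_evidence(X2, y))   # evidence favours m2
```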

  9. Hierarchical models: principle
     [Figure: a multi-level hierarchy of hidden causes generating the data; causality flows down the hierarchy, inference flows back up.]

  10. Hierarchical models: directed acyclic graphs (DAGs)

  11. Frequentist versus Bayesian inference: a (quick) note on hypothesis testing
      Classical SPM:
      • define the null, e.g. H0: theta = 0
      • estimate parameters (obtain a test statistic)
      • apply a decision rule, e.g.: if p(t > t* | H0) ≤ 0.05, then reject H0
      Bayesian model comparison:
      • define two alternative models, e.g. m0: theta = 0 versus m1 with a non-trivial prior on theta
      • apply a decision rule, e.g.: if P(m0 | y) ≥ 0.95, then accept m0
      [Figure: sampling distribution of the test statistic over the space of all data sets.]
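
A toy side-by-side sketch of the two decision rules on a one-sample problem. Everything here is an assumption for illustration: equal prior model probabilities, known variances, and the 0.05 / 0.95 thresholds shown on the slide:

```python
import numpy as np
from scipy import stats

def decisions(y, sigma2=1.0, tau2=1.0):
    """Classical test of H0: mu = 0 versus Bayesian comparison of
    m0: mu = 0 against m1: mu ~ N(0, tau2), for y_i ~ N(mu, sigma2)."""
    n = len(y)
    # Classical: one-sample z-test (known variance, for simplicity).
    z = y.mean() / np.sqrt(sigma2 / n)
    reject_h0 = 2 * stats.norm.sf(abs(z)) < 0.05
    # Bayesian: model evidences with mu integrated out under each model.
    log_e0 = stats.norm.logpdf(y, 0.0, np.sqrt(sigma2)).sum()
    cov1 = sigma2 * np.eye(n) + tau2 * np.ones((n, n))
    log_e1 = stats.multivariate_normal.logpdf(y, mean=np.zeros(n), cov=cov1)
    p_m0 = 1.0 / (1.0 + np.exp(log_e1 - log_e0))   # equal priors assumed
    accept_m0 = p_m0 >= 0.95
    return reject_h0, accept_m0

rng = np.random.default_rng(3)
print(decisions(rng.normal(0.8, 1.0, 16)))   # (reject H0?, accept m0?)
```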

  12. Family-level inference: trading model resolution against statistical power
      Model selection error risk:
      [Figure: four candidate models (combinations of connection patterns A/B and input u) with posterior probabilities P(m1|y) = 0.04, P(m2|y) = 0.25, P(m3|y) = 0.01, P(m4|y) = 0.7; even the best model falls short of decisive evidence.]

  13. Family-level inference: trading model resolution against statistical power
      Family inference (pool statistical evidence):
      [Figure: the same four models grouped into two families; pooling the posteriors P(m1|y) = 0.04, P(m2|y) = 0.25, P(m3|y) = 0.01, P(m4|y) = 0.7 yields P(f1|y) = 0.05 and P(f2|y) = 0.95.]
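
The pooling step is just a sum of posterior model probabilities within each family. The grouping below is a hypothetical split chosen to be consistent with the slide's numbers:

```python
import numpy as np

p_m = np.array([0.04, 0.25, 0.01, 0.70])   # P(m_i | y), the slide's values
families = [[0, 2], [1, 3]]                 # assumed family membership
p_f = np.array([p_m[idx].sum() for idx in families])
print(p_f)   # [0.05, 0.95], matching P(f1|y) and P(f2|y) on the slide
```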

  14. Overview of the talk (outline repeated from slide 2; next: 2 Numerical Bayesian inference methods)

  15. Sampling methods
      MCMC example: Gibbs sampling. Alternately draw each unknown from its conditional distribution given the current values of all the others; the resulting Markov chain converges to the joint posterior.
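
A minimal Gibbs sampler sketch (not the example from the talk): sampling a bivariate Gaussian with correlation rho, where each full conditional is known in closed form, x1 | x2 ~ N(rho·x2, 1 − rho²) and symmetrically for x2:

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n_samples = 0.8, 5000
x1, x2 = 0.0, 0.0
samples = np.empty((n_samples, 2))
for t in range(n_samples):
    # Draw each coordinate from its full conditional given the other.
    x1 = rho * x2 + np.sqrt(1 - rho**2) * rng.standard_normal()
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal()
    samples[t] = x1, x2

# After discarding burn-in, the empirical correlation approximates rho.
print(np.corrcoef(samples[1000:].T))
```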

  16. Variational methods: VB / EM / ReML
      → VB: maximize the free energy F(q) with respect to the approximate posterior q(theta), under some simplifying constraint (e.g. mean field, Laplace). Since F(q) = ln p(y|m) - KL[q(theta) || p(theta|y,m)], maximizing F tightens a lower bound on the log evidence while driving q(theta) towards the exact posterior.
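
A toy sketch of the free energy on a conjugate model where everything is analytic, so the bound can be checked numerically. The model (y_i ~ N(theta, sigma2), theta ~ N(0, tau2)), the Gaussian family q(theta) = N(m, v), and all numbers are assumptions for illustration:

```python
import numpy as np

def free_energy(y, m, v, sigma2=1.0, tau2=4.0):
    """F(q) = <ln p(y, theta)>_q + H[q] for q(theta) = N(m, v)."""
    n = len(y)
    exp_loglik = (-0.5 * n * np.log(2 * np.pi * sigma2)
                  - (np.sum((y - m) ** 2) + n * v) / (2 * sigma2))
    exp_logprior = (-0.5 * np.log(2 * np.pi * tau2)
                    - (m ** 2 + v) / (2 * tau2))
    entropy = 0.5 * np.log(2 * np.pi * np.e * v)
    return exp_loglik + exp_logprior + entropy

sigma2, tau2 = 1.0, 4.0
y = np.array([0.5, 1.2, 0.3, 0.9])
# In this conjugate case the exact posterior moments maximize F(q),
# where F equals ln p(y|m); any other q gives a strictly lower value.
v_star = 1.0 / (len(y) / sigma2 + 1.0 / tau2)
m_star = v_star * y.sum() / sigma2
print(free_energy(y, m_star, v_star, sigma2, tau2))  # = ln p(y|m)
print(free_energy(y, 0.0, 1.0, sigma2, tau2))        # lower bound only
```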

  17. Overview of the talk (outline repeated from slide 2; next: 3 SPM applications)

  18. [Figure: the SPM analysis pipeline: realignment → smoothing → general linear model → statistical inference via Gaussian field theory (p < 0.05); segmentation and normalisation to a template. Bayesian extensions: posterior probability maps (PPMs), multivariate decoding, dynamic causal modelling.]

  19. aMRI segmentation: mixture of Gaussians (MoG) model
      [Figure: graphical model linking the i-th voxel value to its i-th voxel label via class frequencies, class means and class variances; tissue classes: grey matter, white matter, CSF.]
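
A toy EM sketch in the spirit of this slide: a 1-D mixture of Gaussians over voxel intensities y, with K classes described by frequencies pi_k, means mu_k and variances s2_k. This is a generic MoG/EM illustration, not SPM's segmentation routine; data and initialization are made up:

```python
import numpy as np
from scipy.stats import norm

def mog_em(y, K=3, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    pi = np.full(K, 1.0 / K)
    mu = rng.choice(y, K)                # initialize means at data points
    s2 = np.full(K, y.var())
    for _ in range(n_iter):
        # E-step: posterior class responsibilities for each voxel.
        r = pi * norm.pdf(y[:, None], mu, np.sqrt(s2))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update class frequencies, means and variances.
        nk = r.sum(axis=0)
        pi = nk / len(y)
        mu = (r * y[:, None]).sum(axis=0) / nk
        s2 = (r * (y[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, s2, r.argmax(axis=1)  # labels = MAP class per voxel

# Synthetic 'intensities' from three well-separated classes:
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(m, 0.5, 300) for m in (1.0, 4.0, 7.0)])
pi, mu, s2, labels = mog_em(y)
print(np.sort(mu))   # recovered class means, roughly [1, 4, 7]
```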

  20. Decoding of brain images: recognizing brain states from fMRI
      [Figure: trial structure (fixation cross, pace, response); bar plots comparing the log-evidence of X-Y sparse mappings (effect of lateralization) and of X-Y bilateral mappings (effect of spatial deployment).]

  21. fMRI time series analysis: spatial priors and model comparison
      [Figure: hierarchical model for an fMRI time series: GLM coefficients with a prior variance, AR coefficients (correlated noise), and a prior variance on the data noise. PPMs: regions best explained by a short-term memory model (with its design matrix X) versus regions best explained by a long-term memory model (with its design matrix X).]

  22. Dynamic Causal Modelling: network structure identification
      [Figure: four candidate models m1-m4, each a network over regions V1, V5 and PPC driven by a visual stimulus (stim), with attention modulating a different connection in each model; a bar plot of the models' marginal likelihoods favours m4; estimated effective synaptic strengths are shown for the best model (m4).]

  23. DCMs and DAGs: a note on causality
      [Figure: three-node networks driven by an input u, unrolled over time; cyclic connections between regions become acyclic once time is made explicit.]

  24. Dynamic Causal Modelling: model comparison for group studies
      [Figure: differences in log-model evidences between m1 and m2 across subjects.]
      Fixed effect: assume all subjects correspond to the same model.
      Random effect: assume different subjects might correspond to different models.
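
A sketch of the fixed-effect case: if all subjects are assumed to use the same model, subject-wise log evidences simply add. The numbers are illustrative, not from the talk; a random-effect analysis would instead treat the model as drawn per subject from an unknown distribution (Stephan et al., 2009):

```python
import numpy as np

# Hypothetical log evidences, one row per subject, columns [m1, m2].
log_ev = np.array([[-310.0, -305.5],
                   [-412.3, -409.1],
                   [-287.9, -290.2]])

group_log_ev = log_ev.sum(axis=0)            # fixed-effect pooling
p_m = np.exp(group_log_ev - group_log_ev.max())
p_m /= p_m.sum()                              # posterior P(m | all data)
print(p_m)
```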

  25. I thank you for your attention.

  26. A note on statistical significance: lessons from the Neyman-Pearson lemma
      • Neyman-Pearson lemma: the likelihood ratio (or Bayes factor) test, which rejects the null when p(y|H1) / p(y|H0) ≥ u, is the most powerful test of its size for testing the null.
      • What is the threshold u above which the Bayes factor test yields a type I error rate of 5%?
      [Figure: ROC analysis (1 - type II error rate against type I error rate): MVB (Bayes factor), u = 1.09, power = 56%; CCA (F-statistic), F = 2.20, power = 20%.]
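
A simulation sketch of the slide's question on a toy Gaussian problem: calibrate the Bayes factor threshold u to a 5% type I error rate, then read off the power. The effect size and variances are made-up assumptions; the slide's u = 1.09 comes from its own (different) analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_sims, effect = 20, 20000, 0.5

def bayes_factor(y, tau2=1.0, sigma2=1.0):
    """BF = p(ybar|H1) / p(ybar|H0) for H0: mu = 0 vs H1: mu ~ N(0, tau2),
    with y_i ~ N(mu, sigma2); ybar is sufficient for the ratio."""
    m = len(y)
    v0, v1 = sigma2 / m, sigma2 / m + tau2
    ybar = y.mean()
    return np.sqrt(v0 / v1) * np.exp(0.5 * ybar**2 * (1 / v0 - 1 / v1))

# Simulate the BF's sampling distribution under H0 and under H1.
bf_h0 = np.array([bayes_factor(rng.normal(0, 1, n)) for _ in range(n_sims)])
bf_h1 = np.array([bayes_factor(rng.normal(effect, 1, n)) for _ in range(n_sims)])

u = np.quantile(bf_h0, 0.95)        # threshold giving a 5% type I error
print(u, (bf_h1 >= u).mean())       # calibrated threshold and power
```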
