
Comments on Hierarchical models, and the need for Bayes


Presentation Transcript


  1. IWSM, Chania, July 2002
Comments on Hierarchical models, and the need for Bayes
Peter Green, University of Bristol, UK
P.J.Green@bristol.ac.uk

  2. Complex data structures
• Multiple sources of variability
• >1 stratum
• Measurement error, indirect observation
• Random effects, latent variables
• Hierarchical population structure (multi-level models)
• Experimental regimes, missing data

  3. Complex data structures, ctd.
• … all features prevalent in complex biomedical data, especially
⇒ need for Hierarchical Models
• e.g. many talks here at IWSM17
• generalised linear models are just not enough (and that’s not because of linearity or exponential families)
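To make the hierarchy concrete (an illustration, not from the slides): the simplest multi-level model has observations y_ij ~ N(θ_j, σ²) within groups j, with group means θ_j ~ N(μ, τ²) drawn from a population distribution, so that within-group variability σ² and between-group variability τ² are modelled jointly rather than estimated and plugged in separately.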

  4. Inference in hierarchical models
• it is important to propagate all sources of variability
• plug-in estimates generally lead to under-estimating variability of quantities of interest - AVOID!
• we need a coherent calculus of uncertainty

  5. Inference in hierarchical models, ctd.
• a coherent calculus of uncertainty?
• we have one - it’s called Probability!
⇒ full probability modelling of all variables
• reported inference: joint distribution of unknowns of interest, given observed data
• how? Bayes’ theorem
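Spelled out (the slide leaves the formula implicit): Bayes’ theorem combines prior and likelihood into the posterior, p(θ | y) ∝ p(y | θ) p(θ), and every reported inference is a summary of this joint posterior distribution.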

  6. Costs and benefits
• costs
  - more modelling work
  - computational issues (?)
• benefits
  - valid analysis
  - avoiding ad-hoc decisions
  - counts all data once and once only!

  7. Costs and benefits, ctd.
• by-product: simultaneous, coherent inference about multiple targets
• and the old question: what about sensitivity to prior assumptions?
• if sensitivity analysis reveals strong dependence on the prior among reasonable prior choices, how can you trust the non-Bayesian analysis?

  8. A simple prediction problem (an example where plugging in is wrong)
• We make 10 observations; their mean is 15 and standard deviation 2.
• What is the chance that the next observation will be more than 19?

  9. … prediction, continued
• Can’t do much without assumptions - let’s suppose the data are normal…
• … 19 is 2 s.d.’s more than the mean, and the probability of that under the normal distribution is 2.3%

  10. … prediction, continued
• But this supposes that 15 and 2 are the population mean and s.d.
• We ought to allow for our uncertainty in these numbers - they are only estimates
• This is awkward to do for a non-Bayesian

  11. … prediction, continued
The Bayesian answer:
(1) if the mean μ and s.d. σ were known, the answer would be 1 − Φ((19 − μ)/σ)
(2) we should average this quantity over the posterior distribution of (μ, σ) - I did this and got 4.5% - twice the ‘plug-in’ answer!
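A minimal numerical check of both answers, as a sketch (not part of the talk): assuming the standard noninformative prior p(μ, σ²) ∝ 1/σ², the posterior predictive distribution for the next observation is Student-t with n − 1 degrees of freedom, location x̄ and scale s·sqrt(1 + 1/n), so both probabilities are one-liners in SciPy.

```python
from math import sqrt
from scipy.stats import norm, t

n, xbar, s = 10, 15.0, 2.0   # sample size, sample mean, sample s.d.
threshold = 19.0

# Plug-in answer: pretend (xbar, s) are the true (mu, sigma).
plug_in = norm.sf(threshold, loc=xbar, scale=s)
print(f"plug-in:  {plug_in:.3f}")   # 0.023, the 2.3% on slide 9

# Bayesian answer under the noninformative prior: the posterior
# predictive is t with n-1 df, location xbar, scale s*sqrt(1 + 1/n).
pred_scale = s * sqrt(1 + 1 / n)
bayes = t.sf((threshold - xbar) / pred_scale, df=n - 1)
print(f"Bayesian: {bayes:.3f}")     # about 0.044-0.045, roughly twice plug-in
```

The inflation factor sqrt(1 + 1/n) and the heavier t tails are exactly the extra uncertainty about (μ, σ) that the plug-in answer ignores.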

  12. Summary (1)
• Bayes inference is completely sound mathematically - ‘coherent’
• All your conclusions are self-consistent
• Handles prediction properly
• Allows sequential updating
• No logical somersaults (confidence intervals, hypothesis tests)
• Bayes estimators are often more accurate

  13. Summary (2)
• But it does require more input than just the data
• Sensitivity to priors should be checked
• Computation is an issue except in very simple problems - that’s true for non-Bayes too

  14. Reading
• Migon, H.S. and Gamerman, D. Statistical Inference: an Integrated Approach. Arnold, 1999.
• Box, G.E.P. and Tiao, G.C. Bayesian Inference in Statistical Analysis. Addison-Wesley, 1973.
• Carlin, B.P. and Louis, T.A. Bayes and Empirical Bayes Methods for Data Analysis. Chapman and Hall, 1996.
• Gelman, A., Carlin, J.B., Stern, H.S. and Rubin, D.B. Bayesian Data Analysis. Chapman and Hall, 1995.

  15. Professor Peter Green
Department of Mathematics
University of Bristol
Bristol BS8 1TW, UK
tel: +44 117 928 7967, fax: 7999
P.J.Green@bristol.ac.uk
