
Welcome to Amsterdam!




Presentation Transcript


  1. Welcome to Amsterdam!

  2. Welcome to Amsterdam!

  3. Bayesian Modeling for Cognitive Science: A WinBUGS Workshop

  4. Contributors Michael Lee http://www.socsci.uci.edu/~mdlee/

  5. Contributors Dora Matzke http://dora.erbe-matzke.com/

  6. Contributors Ruud Wetzels http://www.ruudwetzels.com/

  7. Contributors EJ Wagenmakers http://www.ejwagenmakers.com/

  8. Assistants Don van Ravenzwaaij http://www.donvanravenzwaaij.com

  9. Assistants Gilles Dutilh http://gillesdutilh.com/

  10. Assistants Helen Steingröver

  11. Why We Like Bayesian Modeling • It is fun. • It is cool. • It is easy. • It is principled. • It is superior. • It is useful. • It is flexible.

  12. Our Goals This Week Are… • For you to experience some of the possibilities that WinBUGS has to offer. • For you to get some hands-on training by trying out some programs. • For you to work at your own pace. • For you to get answers to questions when you get stuck.

  13. Our Goals This Week Are NOT… • For you to become a Bayesian graphical modeling expert in one week. • For you to gain deep insight into the statistical foundations of Bayesian inference. • For you to get frustrated when the programs do not work or you do not understand the materials (please ask questions).

  14. Logistics • You should now have the course book, information on how to get wireless access, and a USB stick. The stick contains a pdf of the book and the computer programs.

  15. Logistics • Brief plenary lectures are at 09:30 and 14:00. • All plenary lectures are in this room. • All practicals are in the computer rooms on the next floor. • Coffee and tea are available in the small room opposite the computer rooms.

  16. What is Bayesian Inference? Why be Bayesian?

  17. What is Bayesian Inference?

  18. What is Bayesian Inference? “Common sense expressed in numbers”

  19. What is Bayesian Inference? “The only statistical procedure that is coherent, meaning that it avoids statements that are internally inconsistent.”

  20. What is Bayesian Inference? “The only good statistics”

  21. Outline • Bayes in a Nutshell • The Bayesian Revolution • This Course

  22. Bayesian Inference in a Nutshell • In Bayesian inference, uncertainty or degree of belief is quantified by probability. • Prior beliefs are updated by means of the data to yield posterior beliefs.

  23. Bayesian Parameter Estimation: Example • We prepare for you a series of 10 factual questions of equal difficulty. • You answer 9 out of 10 questions correctly. • What is your latent probability θ of answering any one question correctly?

  24. Bayesian Parameter Estimation: Example • We start with a prior distribution for θ. This reflects all we know about θ prior to the experiment. Here we make a standard choice and assume that all values of θ are equally likely a priori.

  25. Bayesian Parameter Estimation: Example • We then update the prior distribution by means of the data (technically, the likelihood) to arrive at a posterior distribution. • The posterior distribution is a compromise between what we knew before the experiment and what we have learned from the experiment. The posterior distribution reflects all that we know about θ.

  26. [Plot of the analytical posterior for θ.] Mode = 0.9; 95% credible interval: (0.59, 0.98).
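These posterior summaries can be checked by hand: with a uniform Beta(1, 1) prior and 9 successes out of 10, the posterior is Beta(10, 2). The sketch below (plain Python, not part of the workshop materials; the workshop itself uses WinBUGS) uses the closed-form CDF that Beta distributions with integer parameters have:

```python
# Posterior after a uniform prior and 9/10 correct answers: Beta(10, 2).
# For integer parameters the Beta CDF has a closed binomial form;
# for Beta(10, 2) it reduces to F(x) = 11*x**10 - 10*x**11.

def posterior_cdf(x: float) -> float:
    return 11 * x**10 - 10 * x**11

def quantile(p: float, tol: float = 1e-10) -> float:
    """Invert the CDF by bisection on [0, 1]."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if posterior_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

mode = (10 - 1) / (10 + 2 - 2)            # mode of Beta(a, b) is (a-1)/(a+b-2)
ci = (quantile(0.025), quantile(0.975))   # central 95% credible interval

print(f"mode = {mode:.2f}")
print(f"95% credible interval: ({ci[0]:.2f}, {ci[1]:.2f})")
```

This reproduces the values on the slide: mode 0.90 and credible interval (0.59, 0.98).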

  27. Outline • Bayes in a Nutshell • The Bayesian Revolution • This Course

  28. The Bayesian Revolution • Until about 1990, Bayesian statistics could only be applied to a select subset of very simple models. • More recently, Bayesian statistics has undergone a transformation: with current numerical techniques, Bayesian models are “limited only by the user’s imagination.”

  29. The Bayesian Revolution in Statistics

  30. The Bayesian Revolution in Statistics

  31. Why Bayes is Now Popular Markov chain Monte Carlo!

  32. Markov Chain Monte Carlo • Instead of calculating the posterior analytically, numerical techniques such as MCMC approximate the posterior by drawing samples from it. • Consider again our earlier example…

  33. [Histogram of MCMC samples from the posterior.] Mode = 0.89; 95% credible interval: (0.59, 0.98). With 9000 samples, almost identical to the analytical result.
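WinBUGS handles the sampling for you, but the core idea can be sketched in a few lines. Below is a minimal Metropolis sampler for the same Beta(10, 2) posterior (an illustration only; the step size, burn-in, and starting value are arbitrary choices, not anything prescribed by the workshop):

```python
import math
import random

random.seed(1)

def log_posterior(theta: float) -> float:
    """Unnormalized log posterior: uniform prior x binomial likelihood, 9/10 correct."""
    if not 0 < theta < 1:
        return -math.inf
    return 9 * math.log(theta) + math.log(1 - theta)

def metropolis(n_samples: int, burn_in: int = 2000, step: float = 0.1):
    theta = 0.5                              # arbitrary starting value
    samples = []
    for i in range(n_samples + burn_in):
        proposal = theta + random.gauss(0, step)
        # Accept with probability min(1, posterior ratio)
        if math.log(random.random()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        if i >= burn_in:
            samples.append(theta)
    return samples

samples = sorted(metropolis(9000))           # 9000 samples, as on the slide
lo, hi = samples[int(0.025 * 9000)], samples[int(0.975 * 9000)]
mean = sum(samples) / len(samples)
print(f"posterior mean ~ {mean:.2f}, 95% interval ~ ({lo:.2f}, {hi:.2f})")
```

The sample-based interval lands close to the analytical (0.59, 0.98), which is the point of the slide: the approximation is essentially as good as the exact answer.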

  34. Want to Know More About MCMC?

  35. MCMC • With MCMC, the models you can build and estimate are said to be “limited only by the user’s imagination”. • But how do you get MCMC to work? • Option 1: write the code yourself. • Option 2: use WinBUGS!

  36. Outline • Bayes in a Nutshell • The Bayesian Revolution • This Course

  37. Bayesian Cognitive Modeling: A Practical Course • …is a course book under development, used at several universities. • …is still regularly updated. • …will eventually be published by Cambridge University Press. • …greatly benefits from your suggestions for improvement! [e.g., typos, awkward sentences, new exercises, new applications, etc.]

  38. Bayesian Cognitive Modeling: A Practical Course • …requires you to run computer code. Do not mindlessly copy-paste the code, but study it first, and try to discover why it does its job. • …did not print very well (i.e., the quality of some of the pictures is below par). You will receive a better version tomorrow!

  39. WinBUGS: Bayesian inference Using Gibbs Sampling. You want to have this installed (plus the registration key).

  40. WinBUGS • Knows many probability distributions (likelihoods); • Allows you to specify a model; • Allows you to specify priors; • Will then automatically run the MCMC sampling routines and produce output.

  41. WinBUGS knows many statistical distributions (e.g., the binomial distribution, the Gaussian distribution, the Poisson distribution). These distributions form the elementary building blocks from which you may construct infinitely many models.
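For the running question-answering example, combining two such building blocks is enough. A model along these lines (variable names `theta`, `k`, and `n` are illustrative; `dbeta` and `dbin` are WinBUGS's built-in Beta and binomial distributions) would read:

```
# Inferring a binomial rate: k correct answers out of n questions
model{
   theta ~ dbeta(1,1)     # uniform prior on the rate
   k ~ dbin(theta,n)      # binomial likelihood
}
```

Note that in WinBUGS the model is declarative: you state how the data relate to the parameters, and the sampler works out the posterior.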

  42. WinBUGS & R • WinBUGS produces MCMC samples. • We want to analyze the output in a nice program, such as R or Matlab. • This can be accomplished using the R package “R2WinBUGS”, or the Matlab function “matbugs”.

  43. R: “Here are the data and a bunch of commands” WinBUGS: “OK, I did what you wanted, here are the samples you asked for”

  44. Matlab: “Here are the data and a bunch of commands” WinBUGS: “OK, I did what you wanted, here are the samples you asked for”
