
Exploring subjective probability distributions using Bayesian statistics


Presentation Transcript


  1. Exploring subjective probability distributions using Bayesian statistics. Tom Griffiths, Department of Psychology and Cognitive Science Program, University of California, Berkeley

  2. Perception is optimal Körding & Wolpert (2004)

  3. Cognition is not

  4. Bayesian models of cognition
  • What would an “ideal cognizer” do?
    • “rational analysis” (Anderson, 1990)
  • How can structured representations be combined with statistical inference?
    • graphical models, probabilistic grammars, etc.
  • What knowledge guides human inferences?
    • questions about priors and likelihoods

  5. Exploring subjective distributions Natural statistics in cognition (joint work with Josh Tenenbaum) Markov chain Monte Carlo with people (joint work with Adam Sanborn)

  6. Exploring subjective distributions Natural statistics in cognition (joint work with Josh Tenenbaum) Markov chain Monte Carlo with people (joint work with Adam Sanborn)

  7. Natural statistics [figure: images of natural scenes illustrating a prior distribution p(x)]

  8. Predicting the future. How often is Google News updated? Let t = time since the last update and t_total = time between updates. What should we guess for t_total given t?

  9. Bayesian inference: p(t_total | t) ∝ p(t | t_total) p(t_total), i.e. posterior probability ∝ likelihood × prior

  10. Bayesian inference: p(t_total | t) ∝ p(t | t_total) p(t_total) (posterior ∝ likelihood × prior). Assuming t is a random sample from the interval (0, t_total), the likelihood is p(t | t_total) = 1/t_total, so p(t_total | t) ∝ (1/t_total) p(t_total)
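
To make the prediction rule concrete, here is a minimal numerical sketch (an illustration, not code from the talk): discretize t_total on a grid, combine an assumed prior with the 1/t_total likelihood, and read off the posterior median as a point prediction (the optimal guess under absolute-error loss). The grid range, the power-law exponent, and the observed t are placeholder values.

    import numpy as np

    # Illustration only: grid range, prior exponent, and observed t are
    # placeholder values, not taken from the talk.

    def posterior_median(t, prior_pdf, grid):
        """Posterior median of t_total given an observed t."""
        likelihood = np.where(grid > t, 1.0 / grid, 0.0)  # p(t | t_total) = 1/t_total
        posterior = likelihood * prior_pdf                # unnormalized p(t_total | t)
        posterior /= posterior.sum()
        return grid[np.searchsorted(np.cumsum(posterior), 0.5)]

    grid = np.linspace(1.0, 1000.0, 100_000)
    power_law_prior = grid ** -1.5     # assumed power-law prior p(t_total)

    # e.g. the page was last updated t = 30 minutes ago
    print(posterior_median(30.0, power_law_prior, grid))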

  11. The effects of priors

  12. Evaluating human predictions
  • Different domains with different priors:
    • a movie has made $60 million [power-law]
    • your friend quotes from line 17 of a poem [power-law]
    • you meet a 78 year old man [Gaussian]
    • a movie has been running for 55 minutes [Gaussian]
    • a U.S. congressman has served for 11 years [Erlang]
  • Prior distributions derived from actual data
  • Use 5 values of t for each
  • People predict t_total
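
The slides do not give the empirical prior distributions, so the sketch below uses arbitrary parametric stand-ins (a power-law prior and a Gaussian prior) purely to illustrate how the shape of the prior changes the optimal prediction for the same observed t; none of the numbers are from the study.

    import numpy as np

    # Illustrative only: the priors below are arbitrary stand-ins, not the
    # empirical distributions derived from actual data in the study.

    grid = np.linspace(1.0, 500.0, 50_000)

    def predict(t, prior_pdf):
        """Posterior median of t_total given t, under the 1/t_total likelihood."""
        posterior = np.where(grid > t, prior_pdf / grid, 0.0)
        posterior /= posterior.sum()
        return grid[np.searchsorted(np.cumsum(posterior), 0.5)]

    power_law = grid ** -1.5                               # e.g. movie grosses
    gaussian = np.exp(-0.5 * ((grid - 75.0) / 15.0) ** 2)  # e.g. lifespans

    for t in (10.0, 30.0, 50.0, 70.0):
        print(t, predict(t, power_law), predict(t, gaussian))
    # Power-law priors give roughly multiplicative predictions, while the
    # Gaussian prior pulls every prediction toward its mean.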

  13. [Figure: human predictions compared with Bayesian predictions under the empirical prior, a parametric prior, and Gott’s rule]

  14. Exploring subjective distributions Natural statistics in cognition (joint work with Josh Tenenbaum) Markov chain Monte Carlo with people (joint work with Adam Sanborn)

  15. Markov chain Monte Carlo

  16. Metropolis-Hastings algorithm (Metropolis et al., 1953; Hastings, 1970). Step 1: propose a state x(t+1) given the current state x(t), using a proposal distribution assumed to be symmetric: Q(x(t+1) | x(t)) = Q(x(t) | x(t+1)). Step 2: decide whether to accept, with probability given by either the Metropolis acceptance function, min(1, p(x(t+1)) / p(x(t))), or the Barker acceptance function, p(x(t+1)) / (p(x(t+1)) + p(x(t)))
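
The following is a compact sketch of the algorithm as stated on this slide, applied to a generic one-dimensional target density; the Gaussian target and random-walk proposal are illustrative choices, not part of the talk.

    import numpy as np

    rng = np.random.default_rng(0)

    def target(x):
        # Unnormalized target density p(x); a standard normal, purely for illustration.
        return np.exp(-0.5 * x ** 2)

    def sample(n_steps, step_size=1.0, rule="metropolis"):
        """Metropolis-Hastings with a symmetric random-walk proposal, so the
        Q terms cancel and only the ratio of target densities matters."""
        x = 0.0
        chain = []
        for _ in range(n_steps):
            proposal = x + rng.normal(scale=step_size)        # Step 1: propose a state
            if rule == "barker":
                a = target(proposal) / (target(proposal) + target(x))
            else:                                             # Metropolis rule
                a = min(1.0, target(proposal) / target(x))
            if rng.random() < a:                              # Step 2: accept or reject
                x = proposal
            chain.append(x)
        return np.array(chain)

    print(sample(20_000).std(), sample(20_000, rule="barker").std())  # both close to 1

Both acceptance rules leave the target distribution invariant; the Barker rule is the one that matters for the later slides, where probability-matching choices play the role of the acceptance step.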

  17. A task: ask subjects which of two alternatives comes from a target category, e.g. “Which animal is a frog?”

  18. A Bayesian analysis of the task Assume:

  19. Response probabilities: if people probability-match to the posterior, the response probability is equivalent to the Barker acceptance function for the target distribution p(x|c)
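
A toy simulation of this equivalence (all specifics below are invented for illustration): if a simulated subject chooses between the currently retained stimulus and a proposed alternative by probability matching to p(x|c), the sequence of retained stimuli is a Markov chain whose acceptance rule is exactly the Barker function, so its stationary distribution is p(x|c).

    import numpy as np

    rng = np.random.default_rng(1)

    def category_density(x):
        # Stand-in for a subject's category distribution p(x | c): a Gaussian
        # over a single stimulus parameter, for illustration only.
        return np.exp(-0.5 * ((x - 5.0) / 1.0) ** 2)

    def choice_trials(n_trials, proposal_sd=1.0):
        """Each trial pairs the currently retained stimulus with a proposed
        alternative; the simulated subject picks each option with probability
        proportional to p(x | c), i.e. the Barker acceptance rule."""
        current = 5.0
        chosen = []
        for _ in range(n_trials):
            alternative = current + rng.normal(scale=proposal_sd)
            p_alt, p_cur = category_density(alternative), category_density(current)
            if rng.random() < p_alt / (p_alt + p_cur):   # probability matching
                current = alternative
            chosen.append(current)
        return np.array(chosen)

    samples = choice_trials(10_000)
    print(samples.mean(), samples.std())   # should approach 5.0 and 1.0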

  20. Collecting the samples [figure: a sequence of trials, each asking “Which is the frog?” (Trial 1, Trial 2, Trial 3, ...)]

  21. Sampling from natural categories. Examined distributions for four natural categories: giraffes, horses, cats, and dogs. Presented stimuli with nine-parameter stick figures (Olman & Kersten, 2004)

  22. Choice task

  23. Samples from Subject 3 (projected onto a plane)

  24. Mean animals by subject [figure: mean stick figures for each of subjects S1 through S8 in each category: giraffe, horse, cat, dog]

  25. Marginal densities (aggregated across subjects)
  • Giraffes are distinguished by neck length, body height, and body tilt
  • Horses are like giraffes, but with shorter bodies and nearly uniform necks
  • Cats have longer tails than dogs

  26. Markov chain Monte Carlo with people
  • Probabilistic models can guide the design of experiments to measure psychological variables
  • Markov chain Monte Carlo can be used to sample from subjective probability distributions
    • category distributions (Metropolis-Hastings)
    • prior distributions (Gibbs sampling)
  • Effective for exploring large stimulus spaces in which the distribution of interest is concentrated on a small part of the space

  27. Conclusion
  • Probabilistic models give us a way to explore the knowledge that guides people’s inferences
  • Basic problem for both cognition and perception: identifying subjective probability distributions
  • Two strategies:
    • natural statistics
    • Markov chain Monte Carlo with people
