
Presentation Transcript


  1. Perceptual inference and learning (Collège de France 2008). Abstract: We start with a statistical formulation of Helmholtz's ideas about neural energy to furnish a model of perceptual inference and learning that can explain a remarkable range of neurobiological facts. Using constructs from statistical physics, it can be shown that the problems of inferring the causes of our sensory inputs and learning causal regularities in the sensorium can be resolved using exactly the same principles. Furthermore, inference and learning can proceed in a biologically plausible fashion. The ensuing scheme rests on empirical Bayes and hierarchical models of how sensory information is generated. The use of hierarchical models enables the brain to construct prior expectations in a dynamic and context-sensitive fashion. This scheme provides a principled way to understand many aspects of the brain's organization and responses.

  2. Overview • Inference and learning under the free energy principle • Hierarchical Bayesian inference • A simple experiment • Bird songs (inference) • Structural and dynamic priors • Prediction and omission • Perceptual categorisation • Bird songs (learning) • Repetition suppression • The mismatch negativity

  3. Exchange with the environment. [figure: external states (the environment) and internal states (the agent, m), coupled through sensation and action and separated by a Markov blanket]

  4. The free-energy principle. Action serves to minimise a bound on surprise; perception serves to optimise that bound. The ensemble density and its parameters underwrite perceptual inference, perceptual learning and perceptual uncertainty.
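A compact statement of the bound this slide refers to; the notation below is assumed for illustration rather than taken from the slide. With sensory data y, hidden causes ϑ and a recognition (ensemble) density q(ϑ), the free energy is an upper bound on surprise:

```latex
F \;=\; \big\langle -\ln p(y,\vartheta) \big\rangle_{q} \;+\; \big\langle \ln q(\vartheta) \big\rangle_{q}
  \;=\; -\ln p(y) \;+\; D_{\mathrm{KL}}\!\big[\, q(\vartheta)\,\big\|\, p(\vartheta \mid y) \,\big]
  \;\ge\; -\ln p(y)
```

Perception reduces F by making q(ϑ) approximate the posterior (shrinking the KL term), while action changes y itself, so minimising F keeps surprise low under both routes.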

  5. Hierarchical models and message passing. [figure: top-down messages carrying predictions (generation) and bottom-up messages carrying prediction error (recognition), exchanged between levels over time]
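A sketch of the kind of hierarchical generative model these message-passing schemes invert; this is the generic form used in the associated papers, and the particular symbols (g, f for the mappings, z, w for random fluctuations) are assumptions rather than slide content:

```latex
% Hierarchical dynamic model over levels i = 2,\dots,m, with causes v and hidden states x
\begin{aligned}
y             &= g\!\big(x^{(1)}, v^{(1)}\big) + z^{(1)}\\
\dot{x}^{(1)} &= f\!\big(x^{(1)}, v^{(1)}\big) + w^{(1)}\\
v^{(i-1)}     &= g\!\big(x^{(i)}, v^{(i)}\big) + z^{(i)}\\
\dot{x}^{(i)} &= f\!\big(x^{(i)}, v^{(i)}\big) + w^{(i)}
\end{aligned}
```

Top-down messages are the predictions g(x^{(i)}, v^{(i)}); bottom-up messages are the precision-weighted prediction errors on the causes at the level below, e.g. ξ^{(i)} = Π^{(i)} (v^{(i-1)} − g(x^{(i)}, v^{(i)})).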

  6. Empirical Bayes and hierarchical models. D-step, perceptual inference: recurrent message passing among neuronal populations, with top-down predictions changing to suppress bottom-up prediction error. E-step, perceptual learning: associative plasticity, modulated by precision. M-step, perceptual uncertainty: encoding of precision through classical neuromodulation or plasticity in lateral connections. [figure labels: bottom-up, lateral and top-down connections] Friston K, Kilner J, Harrison L. A free energy principle for the brain. J. Physiol. Paris, 2006.
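As a toy illustration of the D-step, the following Python sketch runs a minimal two-level linear predictive-coding loop in which an expected cause is updated by precision-weighted prediction errors until the top-down prediction suppresses the bottom-up error. The linear mapping, precisions and update rate are assumptions chosen for the example, not parameters from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy generative model: y = W @ v + noise (one sensory level, one level of causes).
W = rng.normal(size=(8, 2))
v_true = np.array([1.0, -0.5])
y = W @ v_true + 0.05 * rng.normal(size=8)

# D-step: gradient descent on free energy with respect to the expected cause mu,
# driven by precision-weighted prediction errors (zero-mean Gaussian prior on the cause).
pi_y, pi_v = 1.0, 0.5   # sensory and prior precisions (assumed values)
mu = np.zeros(2)        # expected cause

for _ in range(200):
    eps_y = y - W @ mu                                   # bottom-up (sensory) prediction error
    eps_v = mu                                           # error on the prior expectation
    mu += 0.05 * (pi_y * (W.T @ eps_y) - pi_v * eps_v)   # prediction changes to suppress error

print("inferred cause:", mu)         # approaches v_true (shrunk slightly toward the prior)
print("residual error:", np.linalg.norm(y - W @ mu))
```

In the full scheme the mapping W would itself be optimised more slowly (the E-step), and the precisions pi_y, pi_v more slowly still (the M-step).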

  7. Neural implementation in cortical hierarchies (cf. evidence accumulation models). [figure: superficial pyramidal cells carrying bottom-up prediction errors; spiny stellate and small basket neurons in layer 4; deep pyramidal cells carrying top-down predictions; double bouquet cells]

  8. Overview • Inference and learning under the free energy principle • Hierarchical Bayesian inference • A simple experiment • Bird songs (inference) • Structural and dynamic priors • Prediction and omission • Perceptual categorisation • Bird songs (learning) • Repetition suppression • The mismatch negativity

  9. A brain imaging experiment with sparse visual stimuli. [figure: spatial scales of V1 interactions (Angelucci et al.): classical receptive field (CRF) in V1 ~1°; horizontal connections in V1 ~2°; feedback from V2 ~5°; feedback from V3 ~10°. Stimulus conditions: random and unpredictable versus coherent and predictable; classical versus extra-classical receptive field effects in V1 and V2] Is there top-down suppression of prediction error in V1 when the stimulus is coherent?

  10. Suppression of prediction error with coherent stimuli. [figure: regional responses (90% confidence intervals) in V1, V2, V5 and pCG under random, stationary and coherent stimulation, showing condition-dependent decreases and increases] Harrison et al., NeuroImage 2006.

  11. Overview • Inference and learning under the free energy principle • Hierarchical Bayesian inference • A simple experiment • Bird songs (inference) • Structural and dynamic priors • Prediction and omission • Perceptual categorisation • Bird songs (learning) • Repetition suppression • The mismatch negativity

  12. Synthetic song-birds. [figure: a neuronal hierarchy driving a syrinx]
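In the published versions of these birdsong simulations (e.g. Friston and Kiebel), the neuronal hierarchy is built from Lorenz-like attractors whose states drive the frequency and amplitude of chirps produced by a synthetic syrinx. The slides do not give the equations, so the Python sketch below, with invented parameter values and an invented mapping from attractor states to syrinx controls, is only an illustrative stand-in:

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Classic Lorenz system; an attractor of this kind plays the role of a
    central pattern generator in the synthetic birdsong model."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

# Integrate with a simple Euler scheme (step size chosen for illustration only).
dt, n_steps = 0.01, 2000
states = np.zeros((n_steps, 3))
states[0] = np.array([1.0, 1.0, 28.0])
for t in range(1, n_steps):
    states[t] = states[t - 1] + dt * lorenz(states[t - 1])

# Hypothetical mapping from attractor states to syrinx control signals:
# one state modulates pitch and another modulates amplitude.
freq_hz = 2500.0 + 50.0 * states[:, 1]        # assumed frequency channel
amp = np.clip(states[:, 2] / 50.0, 0.0, 1.0)  # assumed amplitude channel
print(freq_hz[:5], amp[:5])
```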

  13. Song recognition with DEM
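Slide 13 only names the method; in the accompanying papers, recognition with DEM (dynamic expectation maximisation) updates conditional expectations in generalised coordinates of motion by a gradient descent on free energy. A schematic form of that update, with notation assumed rather than taken from the slide:

```latex
% DEM recognition dynamics for the conditional expectation \tilde{\mu}
% in generalised coordinates of motion (notation assumed):
\dot{\tilde{\mu}} \;=\; D\tilde{\mu} \;-\; \frac{\partial F}{\partial \tilde{\mu}}
% D is a derivative (shift) operator on generalised motion, so that at the
% free-energy minimum the motion of the expectation equals the expected motion.
```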

  14. … and broken birds

  15. … omitting the last chirps

  16. Omission and violation of predictions: a stimulus but no percept; a percept but no stimulus.

  17. Overview • Inference and learning under the free energy principle • Hierarchical Bayesian inference • A simple experiment • Bird songs (inference) • Structural and dynamic priors • Prediction and omission • Perceptual categorisation • Bird songs (learning) • Repetition suppression • The mismatch negativity

  18. A simple song: encoding sequences in terms of attractor manifolds.
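A sketch of how a sequence can be encoded by a hierarchy of attractor manifolds, in the style used for these song models: the states of a slow attractor enter as causes (control parameters) of a faster attractor, which in turn predicts the sensory input. The specific coupling shown is an assumption, not taken from the slide:

```latex
% Two-level attractor hierarchy (assumed form)
\begin{aligned}
\dot{x}^{(2)} &= f^{(2)}\!\big(x^{(2)}\big) + w^{(2)}            && \text{slow attractor: song identity / sequence}\\
v^{(1)}       &= g^{(2)}\!\big(x^{(2)}\big) + z^{(2)}            && \text{its states act as causes for the level below}\\
\dot{x}^{(1)} &= f^{(1)}\!\big(x^{(1)}, v^{(1)}\big) + w^{(1)}   && \text{fast attractor: individual chirps}\\
y             &= g^{(1)}\!\big(x^{(1)}\big) + z^{(1)}            && \text{sensory prediction (sonogram)}
\end{aligned}
```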

  19. Categorising sequences. [figure: 90% confidence regions]

  20. Overview • Inference and learning under the free energy principle • Hierarchical Bayesian inference • A simple experiment • Bird songs (inference) • Structural and dynamic priors • Prediction and omission • Perceptual categorisation • Bird songs (learning) • Repetition suppression • The mismatch negativity

  21. Repetition suppression and the MMN. The MMN is an enhanced negativity seen in response to any change (deviant), relative to the response to the standard. [figure: main effect of faces; suppression of inferotemporal responses to repeated faces (Henson et al. 2000)]

  22. Prediction and error. [figure: prediction and error, hidden states and level-2 causes plotted over time for a simple chirp, alongside its sonogram (frequency in Hz against time in seconds); prediction error encoded by superficial pyramidal cells]

  23. Simulating ERPs to repeated chirps. Perceptual inference: suppressing error over peristimulus time. Perceptual learning: suppression over repetitions.
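A toy numerical illustration of the two effects named on this slide (it is not the DEM simulation behind the figures): within each trial, fast inference suppresses prediction error over peristimulus time, while slower learning across trials reduces the initial error, mimicking repetition suppression of the evoked response. All rates and the one-dimensional stimulus are invented for the example:

```python
import numpy as np

n_trials, n_time = 8, 100
infer_rate, learn_rate = 0.1, 0.4    # assumed fast (within-trial) and slow (across-trial) rates
stimulus = 1.0                       # assumed scalar feature of the chirp
prediction = 0.0                     # slowly learned expectation

erp = np.zeros((n_trials, n_time))   # prediction error as a proxy for the evoked response
for trial in range(n_trials):
    mu = prediction                  # start each trial from the learned expectation
    for t in range(n_time):
        error = stimulus - mu
        erp[trial, t] = error
        mu += infer_rate * error     # perceptual inference: error suppressed over peristimulus time
    prediction += learn_rate * (stimulus - prediction)   # perceptual learning across repetitions

# Peak "ERP" amplitude shrinks with repetition (repetition suppression).
print(np.round(erp.max(axis=1), 3))
```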

  24. The MMN. [figure: responses to the first presentation (before learning) and the last presentation (after learning); enhanced N1 (primary area), MMN (secondary area), P300 (tertiary area)?]

  25. Summary • A free energy principle can account for several aspects of action and perception • The architecture of cortical systems speaks to hierarchical generative models • Estimation of hierarchical dynamic models corresponds to a generalised deconvolution of inputs to disclose their causes • This deconvolution can be implemented in a neuronally plausible fashion by constructing a dynamic system that self-organises, when exposed to inputs, to suppress its free energy • Minimisation of free energy proceeds over many spaces, including the state of a model (perception), its parameters (learning), its hyperparameters (salience and attention) and the model itself (selection in somatic or evolutionary time).
