
Spike Train decoding



  1. Spike Train decoding

  2. Spike Train decoding

  3. Spike Train decoding

  4. Summary • Decoding of stimulus from response • Two-choice case • Discrimination • ROC curves • Population decoding • MAP and ML estimators • Bias and variance • Fisher information, Cramér-Rao bound • Spike train decoding
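
The estimators and bound named on this slide can be written out; a minimal LaTeX sketch of the standard definitions, with stimulus s, response r, and estimator bias b_est(s) (notation assumed, not transcribed from the deck):

    \hat{s}_{\mathrm{ML}} = \arg\max_s \, p[r \mid s],
    \qquad
    \hat{s}_{\mathrm{MAP}} = \arg\max_s \, p[s \mid r] = \arg\max_s \, p[r \mid s]\, p[s]

    I_F(s) = \left\langle -\frac{\partial^2 \ln p[r \mid s]}{\partial s^2} \right\rangle_{r \mid s},
    \qquad
    \sigma_{\mathrm{est}}^2(s) \;\ge\; \frac{\big(1 + b_{\mathrm{est}}'(s)\big)^2}{I_F(s)}
    \quad \text{(Cram\'er-Rao bound)}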

  5. Chapter 4

  6. Entropy

  7. Entropy
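
Slides 6-7 are title-only in this transcript; as a minimal sketch of the standard discrete entropy definition they presumably cover, H = -\sum_r P[r] \log_2 P[r], computed here from a response histogram (the function name and example distribution are illustrative):

    import numpy as np

    def entropy_bits(p):
        """Discrete entropy H = -sum_r P[r] log2 P[r], in bits."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                         # treat 0 log 0 as 0
        return -np.sum(p * np.log2(p))

    # A uniform distribution over 8 responses carries 3 bits.
    print(entropy_bits(np.ones(8) / 8))      # -> 3.0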

  8. Mutual information: H_noise < H

  9. Mutual information
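
Slide 8's inequality H_noise < H is what makes the mutual information positive: I_m = H - H_noise, where H is the entropy of the full response distribution and H_noise is the average entropy of the response given each stimulus. A minimal sketch under that definition (the joint-distribution array and function names are illustrative):

    import numpy as np

    def entropy_bits(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def mutual_information_bits(p_rs):
        """I_m = H[r] - H_noise, with H_noise = sum_s P[s] H[r|s].
        p_rs[i, j] is the joint probability P[r = i, s = j]."""
        p_rs = np.asarray(p_rs, dtype=float)
        p_r = p_rs.sum(axis=1)               # response marginal P[r]
        p_s = p_rs.sum(axis=0)               # stimulus marginal P[s]
        H_total = entropy_bits(p_r)
        H_noise = sum(p_s[j] * entropy_bits(p_rs[:, j] / p_s[j])
                      for j in range(p_rs.shape[1]) if p_s[j] > 0)
        return H_total - H_noise

    # Independent response and stimulus give zero mutual information.
    print(mutual_information_bits(np.full((2, 2), 0.25)))   # -> 0.0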

  10. KL divergence
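
A minimal LaTeX statement of the KL divergence and its relation to mutual information (standard definitions, not transcribed from the slide):

    D_{\mathrm{KL}}(P \| Q) = \sum_r P[r] \log_2 \frac{P[r]}{Q[r]} \;\ge\; 0,
    \qquad
    I_m = D_{\mathrm{KL}}\big( P[r, s] \,\big\|\, P[r]\, P[s] \big)

so the mutual information is the KL divergence between the joint distribution and the product of its marginals, and it vanishes only when response and stimulus are independent.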

  11. Continuous variables

  12. Entropy maximization

  13. Entropy maximization
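
The entropy-maximization slides are title-only here; a minimal sketch of the histogram-equalization result cited in the closing summary (for a response bounded between 0 and r_max, entropy is maximized when the input-output function follows the cumulative distribution of the stimulus; the Gaussian stimulus samples below are illustrative):

    import numpy as np

    def equalizing_nonlinearity(stimuli, r_max=1.0):
        """Map stimulus s to rate r = r_max * F(s), where F is the
        empirical cumulative distribution of the stimulus ensemble.
        The resulting responses are roughly uniform on [0, r_max],
        which maximizes entropy for a bounded response."""
        s_sorted = np.sort(np.asarray(stimuli, dtype=float))
        def response(s):
            rank = np.searchsorted(s_sorted, s, side="right")
            return r_max * rank / len(s_sorted)
        return response

    rng = np.random.default_rng(0)
    contrasts = rng.normal(size=10_000)          # illustrative stimuli
    tuning = equalizing_nonlinearity(contrasts)
    rates = tuning(contrasts)                    # ~uniform on [0, 1]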

  14. Population of neurons

  15. Retinal Ganglion Cell Receptive Fields

  16. Retinal Ganglion Cell Receptive Fields

  17. Retinal Ganglion Cell Receptive Fields

  18. Retinal Ganglion Cell Receptive Fields

  19. Retinal Ganglion Cell Receptive Fields
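
The receptive-field slides are image-only in the deck; a minimal frequency-domain sketch of the whitening argument from the closing summary (natural-image power falls off roughly as 1/k^2, so an information-maximizing filter grows with spatial frequency k until noise takes over, giving a band-pass, center-surround-like profile; the noise level is an illustrative parameter):

    import numpy as np

    def whitening_filter_gain(k, noise_power=1e-2):
        """Filter amplitude that flattens a ~1/k^2 signal spectrum,
        attenuated where the signal-to-noise ratio becomes small."""
        k = np.asarray(k, dtype=float)
        signal_power = 1.0 / np.maximum(k, 1e-6) ** 2   # ~natural images
        whiten = 1.0 / np.sqrt(signal_power)            # flattens signal
        snr = signal_power / noise_power
        return whiten * snr / (1.0 + snr)               # noise roll-off

    k = np.linspace(0.1, 50.0, 500)      # spatial frequency (illustrative units)
    gain = whitening_filter_gain(k)      # rises, peaks, then falls: band-pass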

  20. Temporal processing in LGN

  21. Temporal processing in LGN

  22. Temporal processing in LGN

  23. Temporal vs spatial coding

  24. Entropy of spike trains

  25. Entropy of spike trains

  26. Entropy of spike trains

  27. Entropy of spike trains • Spike train mutual information measurements quantify stimulus specific aspects of neural encoding. • Mutual information of bullfrog peripheral auditory neurons was estimated • 1.4 bits/sec for broadband noise stimulus • 7.8 bits/sec for bullfrog call-like stimulus
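
One way to estimate spike-train entropy, discussed in the chapter, is to chop the train into short binary words and compute the entropy of the word distribution; subtracting the noise entropy measured across repeated presentations of the same stimulus then gives the mutual information. A minimal sketch (bin size, word length, and the synthetic spike trains are illustrative, not the bullfrog data quoted above):

    import numpy as np
    from collections import Counter

    def word_entropy_bits(spike_words):
        """Entropy of the distribution of binary spike 'words'
        (rows of 0/1 spike counts per bin), in bits per word."""
        counts = Counter(map(tuple, spike_words))
        total = sum(counts.values())
        p = np.array([c / total for c in counts.values()])
        return -np.sum(p * np.log2(p))

    rng = np.random.default_rng(1)
    # 5 bins per word, spike probability 0.1 per bin (illustrative).
    words = (rng.random((5000, 5)) < 0.1).astype(int)
    H_total = word_entropy_bits(words)   # total entropy per word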

  28. Summary • Information theory quantifies how much a response says about a stimulus • Stimulus, response entropy • Noise entropy • Mutual information, KL divergence • Maximizing information transfer yields biological receptive fields • Factorial codes • Equalization • Whitening • Spike train mutual information
