Presentation Transcript


  1. TNI: Computational Neuroscience Instructors: Peter Latham Maneesh Sahani Peter Dayan TAs: Arthur Guez, aguez@gatsby.ucl.ac.uk Marius Pachitariu, marius@gatsby.ucl.ac.uk Website: http://www.gatsby.ucl.ac.uk/~aguez/tn1/ Lectures: Tuesday/Friday, 11:00-1:00. Review: Tuesday, starting at 4:30. Homework: Assigned Friday, due Friday (1 week later). first homework: assigned Oct. 7, due Oct. 14.

  2. What is computational neuroscience? Our goal: figure out how the brain works.

  3. [Image of neural tissue; scale bar: 10 microns] There are about 10 billion cubes of this size in your brain!

  4. How do we go about making sense of this mess? David Marr (1945-1980) proposed three levels of analysis: 1. the problem (computational level) 2. the strategy (algorithmic level) 3. how it’s actually done by networks of neurons (implementational level)

  5. Example #1: memory. the problem: recall events, typically based on partial information.

  6. Example #1: memory. the problem: recall events, typically based on partial information. associative or content-addressable memory. an algorithm: dynamical systems with fixed points. [Diagram: trajectories converging to fixed points in activity space; axes r1, r2, r3]

  7. Example #1: memory. the problem: recall events, typically based on partial information. associative or content-addressable memory. an algorithm: dynamical systems with fixed points. neural implementation: Hopfield networks. x_i = sign(Σ_j J_ij x_j)
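
As a concrete illustration of the update rule above, here is a minimal sketch of storage and recall in a Hopfield network. The network size, number of patterns, Hebbian storage rule, and synchronous update schedule are illustrative choices, not anything specified on the slide:

```python
import numpy as np

rng = np.random.default_rng(0)

# Store P random binary (+1/-1) patterns in N neurons with a Hebb rule:
# J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0)          # no self-connections

# Recall: start from a corrupted version of pattern 0 and iterate
# x_i <- sign(sum_j J_ij x_j) until the network reaches a fixed point.
x = patterns[0].copy()
flip = rng.choice(N, size=20, replace=False)
x[flip] *= -1                   # corrupt 20% of the bits

for _ in range(50):
    x_new = np.sign(J @ x)
    x_new[x_new == 0] = 1       # break ties deterministically
    if np.array_equal(x_new, x):
        break                   # fixed point reached
    x = x_new

print("overlap with stored pattern:", (x @ patterns[0]) / N)  # ~1.0
```

With the load well below capacity (5 patterns in 100 neurons), the corrupted cue falls into the basin of the stored pattern, which is exactly the "recall from partial information" computation above.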

  8. Example #2: vision. the problem (Marr): 2-D image on retina → 3-D reconstruction of a visual scene.

  9. Example #2: vision. the problem (modern version): 2-D image on retina → recover the latent variables. [Cartoon scene by a bad artist: house, sun, tree]

  10. Example #2: vision. the problem (modern version): 2-D image on retina → recover the latent variables. [Cartoon scene by a bad artist: house, sun, tree, cloud]

  11. Example #2: vision. the problem (modern version): 2-D image on retina → reconstruction of latent variables. an algorithm: graphical models. [Diagram: latent variables x1, x2, x3 connected to a low-level representation r1, r2, r3, r4]

  12. Example #2: vision. the problem (modern version): 2-D image on retina → reconstruction of latent variables. an algorithm: graphical models. [Same diagram, with arrows labeled "inference" running from the low-level representation r1, r2, r3, r4 up to the latent variables x1, x2, x3]

  13. Example #2: vision. the problem (modern version): 2-D image on retina → reconstruction of latent variables. an algorithm: graphical models. implementation in networks of neurons: no clue.
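
To make the algorithmic level concrete, here is a toy sketch of inference in a small graphical model: binary latent variables x generate a low-level representation r, and we compute P(x | r). The mixing matrix, Gaussian noise model, and flat prior are all invented for illustration; real scenes need approximate inference rather than enumeration:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)

# Toy generative model in the spirit of the slide: binary latents
# x1..x3 produce a low-level representation r1..r4 through a mixing
# matrix W, plus Gaussian noise. W and sigma are made up.
W = rng.normal(size=(4, 3))
sigma = 0.5                                   # observation noise std

x_true = np.array([1, 0, 1])
r = W @ x_true + sigma * rng.normal(size=4)   # observed features

# Inference = computing P(x | r). With 3 binary latents we can
# enumerate all 2^3 states exactly.
posterior = {}
for x in product([0, 1], repeat=3):
    x = np.array(x)
    log_lik = -np.sum((r - W @ x) ** 2) / (2 * sigma**2)
    posterior[tuple(x)] = np.exp(log_lik)     # flat prior cancels
Z = sum(posterior.values())
for state, p in sorted(posterior.items(), key=lambda kv: -kv[1])[:3]:
    print(state, p / Z)
# The true state (1, 0, 1) should get most of the probability mass.
```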

  14. Comment #1: the problem: … the algorithm: … neural implementation: …

  15. Comment #1: the problem: easier; the algorithm: harder; neural implementation: harder. (often ignored!!!)

  16. Comment #1: the problem: easier; the algorithm: harder; neural implementation: harder. A common approach: experimental observation → model. Usually very underconstrained!!!!

  17. Comment #1: the problem: easier; the algorithm: harder; neural implementation: harder. Example i: CPGs (central pattern generators). [Plots: firing rate vs. time for two coupled neurons] Too easy!!!

  18. Comment #1: the problem: easier; the algorithm: harder; neural implementation: harder. Example ii: single cell modeling. C dV/dt = -g_L(V – V_L) – g_K n^4(V – V_K) – … ; dn/dt = … … lots and lots of parameters … which ones should you use?
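
A sketch of what simulating such a model involves: forward-Euler integration of a leak-plus-potassium skeleton of the equation above. Every parameter value and the sigmoid form of n_inf below are placeholders, which is exactly the slide's point about underconstrained parameters (a real model would add Na and other currents, each with its own parameters, before it could even spike):

```python
import numpy as np

# Forward-Euler integration of the membrane equation on the slide,
#   C dV/dt = -g_L (V - V_L) - g_K n^4 (V - V_K) + I_ext,
#   dn/dt   = (n_inf(V) - n) / tau_n.
# All parameter values here are illustrative, not fitted.
C, g_L, g_K = 1.0, 0.1, 3.6          # uF/cm^2, mS/cm^2
V_L, V_K = -65.0, -77.0              # mV
I_ext = 2.0                          # uA/cm^2

def n_inf(V):                        # steady-state K activation (toy sigmoid)
    return 1.0 / (1.0 + np.exp(-(V + 53.0) / 15.0))

tau_n = 5.0                          # ms, voltage-independent for simplicity

dt, T = 0.01, 200.0                  # ms
V, n = -65.0, n_inf(-65.0)
for t in np.arange(0, T, dt):
    dV = (-g_L * (V - V_L) - g_K * n**4 * (V - V_K) + I_ext) / C
    dn = (n_inf(V) - n) / tau_n
    V += dt * dV
    n += dt * dn

# Without a Na current this skeleton just relaxes to a fixed point.
print(f"V = {V:.1f} mV after {T:.0f} ms")
```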

  19. Comment #1: the problem: easier; the algorithm: harder; neural implementation: harder. Example iii: network modeling. lots and lots of parameters × thousands

  20. Comment #2: the problem: easier; the algorithm: harder; neural implementation: harder. You need to know a lot of math!!!!! [Diagrams from earlier slides: fixed points in activity space (r1, r2, r3); graphical model with latents x1, x2, x3 over representation r1, r2, r3, r4]

  21. Comment #3: the problem: easier; the algorithm: harder; neural implementation: harder. This is a good goal, but it’s hard to do in practice. Our actual bread and butter: 1. Explaining observations (mathematically). 2. Using sophisticated analysis to design simple experiments that test hypotheses.

  22. Comment #3: Two experiments: - record, using loose patch, from a bunch of cells in culture - block synaptic transmission - record again - found quantitative support for the balanced regime. J. Neurophys., 83:808-827, 828-835, 2000

  23. Comment #3: Two experiments: - perform whole cell recordings in vivo - stimulate cells with a current pulse every couple hundred ms - build current-triggered PSTH - showed that the brain is intrinsically very noisy, and is likely to be using a rate code. Nature, 466:123-127 (2010)
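
A sketch of the analysis step in that experiment: building a current-triggered PSTH by aligning spike times to the onset of each current pulse and histogramming them. The spike and pulse times below are synthetic stand-ins, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic recording: background spikes plus a weak, jittered
# response ~20 ms after each current pulse.
pulse_times = np.arange(0.0, 60.0, 0.3)               # a pulse every 300 ms
spikes = rng.uniform(0.0, 60.0, size=3000)            # background spikes
spikes = np.sort(np.concatenate(
    [spikes, pulse_times + 0.02 + 0.005 * rng.normal(size=pulse_times.size)]))

# Current-triggered PSTH: histogram spike times relative to pulse onset.
window = (-0.05, 0.15)                                 # s around each pulse
bins = np.arange(window[0], window[1] + 0.005, 0.005)  # 5 ms bins
counts = np.zeros(len(bins) - 1)
for t0 in pulse_times:
    rel = spikes[(spikes >= t0 + window[0]) & (spikes < t0 + window[1])] - t0
    counts += np.histogram(rel, bins=bins)[0]

rate = counts / (len(pulse_times) * 0.005)             # spikes/s per bin
print("peak rate %.1f Hz at t = %.0f ms" %
      (rate.max(), 1000 * bins[:-1][rate.argmax()]))
```

The trial-to-trial variability around the PSTH peak is what carries the argument about intrinsic noise and rate coding.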

  24. Comment #4: the problem: easier; the algorithm: harder; neural implementation: harder. some algorithms are easy to implement on a computer but hard in a brain, and vice-versa. these are linked!!!

  25. Comment #4: hard for a brain, easy for a computer: matrix inversion (A^-1), addition (z = x + y), integration (∫dx), … easy for a brain, hard for a computer: associative memory.

  26. Comment #4: the problem: easier; the algorithm: harder; neural implementation: harder. some algorithms are easy to implement on a computer but hard in a brain, and vice-versa. we should be looking for the vice-versa ones. it can be hard to tell which is which. these are linked!!!

  27. Basic facts about the brain

  28. Your brain

  29. Your cortex unfolded: neocortex (cognition), a 6-layer sheet ~30 cm across and ~0.5 cm thick; subcortical structures (emotions, reward, homeostasis, much much more).

  30. Your cortex unfolded: [callout highlighting 1 cubic millimeter, ~3×10^-5 oz]

  31. 1 mm^3 of cortex: 50,000 neurons; 10,000 connections/neuron (=> 500 million connections); 4 km of axons.

  32. 1 mm^3 of cortex: 50,000 neurons; 10,000 connections/neuron (=> 500 million connections); 4 km of axons. 1 mm^2 of a CPU: 1 million transistors; 2 connections/transistor (=> 2 million connections); 0.002 km of wire.

  33. 1 mm^3 of cortex: 50,000 neurons; 10,000 connections/neuron (=> 500 million connections); 4 km of axons. Whole brain (2 kg): 10^11 neurons; 10^15 connections; 8 million km of axons. 1 mm^2 of a CPU: 1 million transistors; 2 connections/transistor (=> 2 million connections); 0.002 km of wire. Whole CPU: 10^9 transistors; 2×10^9 connections; 2 km of wire.

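A quick back-of-envelope check that the per-mm^3 numbers are consistent with the whole-brain numbers, assuming uniform cortex-like tissue and a brain volume of ~2×10^6 mm^3 (a 2 kg brain at roughly the density of water; both assumptions, not measurements):

```python
# Scale the per-mm^3 figures from the slide up to the whole brain.
neurons_per_mm3 = 5e4
connections_per_neuron = 1e4
axon_km_per_mm3 = 4.0
brain_volume_mm3 = 2e6           # assumed: ~2 kg at ~1 g/cm^3

print(f"neurons:     {neurons_per_mm3 * brain_volume_mm3:.0e}")   # ~1e11
print(f"connections: {neurons_per_mm3 * connections_per_neuron * brain_volume_mm3:.0e}")  # ~1e15
print(f"axon length: {axon_km_per_mm3 * brain_volume_mm3:.0e} km")  # ~8e6 km
```

All three whole-brain figures on the slide fall straight out of the per-mm^3 numbers under these assumptions.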

  35. [Diagram: a neuron with dendrites (input), soma (spike generation), and axon (output); inset voltage trace: ~1 ms spikes rising from -50 mV to +20 mV, 100 ms of time shown]

  36.-37. [Diagrams: current flow across a synapse (two animation frames)]

  38. [Voltage trace: spikes rising from -50 mV to +20 mV; 100 ms of time shown]

  39. neuron j → neuron i. neuron j emits a spike: an EPSP appears in V on neuron i [trace, ~10 ms wide].

  40. neuron j → neuron i. neuron j emits a spike: an IPSP appears in V on neuron i [trace, ~10 ms wide].

  41. neuron j → neuron i. neuron j emits a spike: an IPSP with amplitude = wij appears in V on neuron i [trace, ~10 ms wide].

  42. neuron j → neuron i. neuron j emits a spike: an IPSP with amplitude = wij appears in V on neuron i; the amplitude wij changes with learning [trace, ~10 ms wide].
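
One way to picture what the weights w_ij do: to first approximation, the voltage on neuron i is a weighted sum of PSP-shaped kernels, one per presynaptic spike, V_i(t) = Σ_j w_ij Σ_k K(t − t_jk). The kernel shape, weights, and spike times below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

dt, T = 0.1, 200.0                       # ms
t = np.arange(0.0, T, dt)

def psp_kernel(tau=10.0):                # alpha function, ~10 ms time scale
    s = np.arange(0.0, 5 * tau, dt)
    return (s / tau) * np.exp(1 - s / tau)

K = psp_kernel()
w = np.array([0.5, -0.3, 0.8])           # w_ij: + gives EPSPs, - gives IPSPs
spike_times = [rng.uniform(0, T, 8) for _ in w]   # spikes of 3 inputs

V = np.zeros_like(t)
for w_ij, spikes_j in zip(w, spike_times):
    train = np.zeros_like(t)
    train[(spikes_j / dt).astype(int)] = 1.0      # mark each spike time
    V += w_ij * np.convolve(train, K)[: len(t)]   # weighted PSPs sum in V

print("peak depolarization: %.2f (arbitrary units)" % V.max())
```

Learning, in this picture, is just movement of the entries of w.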

  43. [Diagram: current flow across a synapse with weight wij]

  44. A bigger picture view of the brain

  45. x (latent variables) → r (peripheral spikes) → sensory processing → r̂ ("direct" code for latent variables) → cognition / memory / action selection → r̂' ("direct" code for motor actions) → motor processing → r' (peripheral spikes) → x' (motor actions). [Everything between the peripheral spikes is the brain.]

  46. Who is walking behind the picket fence?

  47.-49. [Three animation frames showing the population response r]
