
SNN Machine learning



Presentation Transcript


  1. SNN Machine learning — Bert Kappen, Luc Janss, Wim Wiegerinck, Vicenc Gomez, Alberto Llera, Mohammad Azar, Bart van den Broek, 2 vacancies; Ender Akay, Willem Burgers

  2. Activities
  • Approximate inference: graphical models, analytical methods, sampling methods
  • Control theory: approximate inference, reinforcement learning, interaction modeling
  • Neuroscience: adaptive BCI, ECoG, neural networks
  • Bioinformatics: genetic linkage analysis, genome-wide association studies, missing person identification
  • Smart Research: wine portal, petro-physical expert system, credit card fraud detection
  • Promedas: www.promedas.nl/live, UMCU, Promedu

  3. Approximate inference • Control theory • Neural networks • ABCI • ECoG • GWAS • Promedas

  4. Graphical models — What are the probabilities given the evidence? Intractable for a large number of variables: 2^n terms for n binary variables.
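To make the 2^n blow-up concrete, here is a brute-force sketch on a toy chain of binary variables with pairwise factors exp(J·x_i·x_{i+1}), x_i ∈ {-1, +1} — an illustrative stand-in, not the Promedas model. Computing a conditional marginal by enumeration visits every joint state:

```python
import itertools
import math

# Brute-force inference in a toy chain-shaped graphical model: computing a
# conditional marginal by enumeration visits all 2^n joint states, which is
# exactly what makes exact inference intractable for large n.

def brute_force_marginal(n, J, evidence):
    """P(x_0 = +1 | evidence), with evidence = {variable index: value}."""
    num = den = 0.0
    for state in itertools.product([-1, 1], repeat=n):   # 2^n terms
        if any(state[i] != v for i, v in evidence.items()):
            continue
        w = math.exp(J * sum(state[i] * state[i + 1] for i in range(n - 1)))
        den += w
        if state[0] == 1:
            num += w
    return num / den

# Evidence at the far end of a 10-variable chain still biases x_0:
p = brute_force_marginal(10, 0.5, {9: 1})    # enumerates 2^10 = 1024 states
```

Already at n = 30 this loop would visit about 10^9 states, which is why the structure of the graph has to be exploited instead.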

  5. Junction tree method — Complexity reduction: 2^n → 2^k (n=8, k=3). State of the art for intermediate-size problems; no solution for large problems.
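The reduction from 2^n to roughly n·2^k (k the clique size) can be sketched with variable elimination on the same kind of toy binary chain, where k = 2 — the idea the junction tree generalises from chains to trees of cliques. This is a sketch, not the Promedas implementation:

```python
import math

# Variable elimination on a binary chain with coupling J: summing out one
# variable at a time costs about n * 2^k operations (k = clique size, here 2)
# instead of 2^n for full enumeration.

def chain_marginal(n, J, evidence):
    """P(x_{n-1} = +1 | evidence) for the chain with coupling J."""
    vals = lambda i: [evidence[i]] if i in evidence else [-1, 1]
    msg = {-1: 1.0, 1: 1.0}                  # running message over x_i
    for i in range(n - 1):
        msg = {xn: sum(msg[x] * math.exp(J * x * xn) for x in vals(i))
               for xn in (-1, 1)}            # sum out x_i
    last = {x: msg[x] for x in vals(n - 1)}
    return last.get(1, 0.0) / sum(last.values())

p = chain_marginal(10, 0.5, {0: 1})          # ~10 * 2^2 operations, not 2^10
```

The same answer as brute-force enumeration, at linear instead of exponential cost in the chain length.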

  6. Approximate inference

  7. Optimal control theory — Minimize the expected cost, an error cost at the end plus the cost to reach the target: C = ⟨ φ(x(T)) + ∫₀ᵀ dt ( ½ u(t)ᵀ R u(t) + V(x(t), t) ) ⟩. The optimal solution is hard to compute.

  8. Optimal control theory Optimal control as a sum over trajectories, Kappen PRL 2005

  9. Linear Bellman equation • Efficient computation of optimal controls • Linear superposition of solutions • Qualitatively different results for high and low noise
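The "sum over trajectories" view can be sketched in a Monte Carlo estimator for the simplest case dx = u dt + dW with quadratic control cost and an end cost φ(x_T) (after Kappen, PRL 2005). The linear Bellman equation turns the optimal control into a weighted average over *uncontrolled* noisy trajectories; the dynamics, cost, and parameter values below are all illustrative:

```python
import numpy as np

# Path-integral control sketch: u*(x0, 0) is estimated as
#   E[dW_0 * exp(-phi(x_T)/lam)] / (dt * E[exp(-phi(x_T)/lam)])
# over uncontrolled sample paths of dx = dW with noise variance lam per
# unit time.

rng = np.random.default_rng(0)

def pi_control(x0, T, dt, lam, phi, n_samples=20000):
    steps = int(round(T / dt))
    x = np.full(n_samples, x0)
    dW0 = np.zeros(n_samples)
    for k in range(steps):
        dW = rng.normal(0.0, np.sqrt(lam * dt), n_samples)
        if k == 0:
            dW0 = dW                         # remember the first noise kick
        x += dW                              # uncontrolled dynamics, u = 0
    w = np.exp(-phi(x) / lam)                # weight each path by its end cost
    return float((w * dW0).sum() / (w.sum() * dt))

# Quadratic end cost around a target at x = 1; the estimate should be close
# to the analytic linear-quadratic answer 2*(1 - x0)/(1 + 2*T) = 2/3 here.
u = pi_control(x0=0.0, T=1.0, dt=0.02, lam=1.0, phi=lambda x: (x - 1.0) ** 2)
```

Because the weights only involve the cost of uncontrolled paths, controls for different end costs can be computed from the same sample set — the linear superposition property mentioned above.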

  10. Efficient computation Theodorou, Schaal, USC 2009

  11. Linear superposition of solutions — Da Silva, Popovic, MIT 2009

  12. Qualitatively different results for high and low noise • Delayed choice • Optimal control predicts when to act • More noise means more delay

  13. Results — (figure: trajectory and control signal for small vs. large noise)

  14. Modelling neural networks with activity-dependent synapses • Dynamic synapses • Recurrent connectivity and dynamic synapses • Associative memory • Dynamical memories • Storage capacity • Sensitivity to external stimuli • Relation to up-down states and power laws • Discussion — Marro, Torres, Mejias

  15. a) Electrophysiological preparation in pyramidal neurons (layer 5) for a pairing experiment. b) Pairing: several current pulses (during 200 ms) in the pre- and post-synaptic neuron (4-8 action potentials, AP) are injected 30 times, once every 20 s. c) Before: the response to stimuli is variable and noisy. d) After: optimal response to the first current pulse and a decreased response to the following pulses. e) The effect of “pairing” is robust and Hebbian. Markram and Tsodyks, Nature 1996: dynamic synapses

  16. (a) Intracellular recording in the primary visual cortex of a halothane-anesthetized cat reveals a rhythmic sequence of depolarized and hyperpolarized membrane potentials. (b) Expansion of three of the depolarizing sequences for clarity. (c) Autocorrelogram of the intracellular recording reveals a marked periodicity of about one cycle per three seconds. (d) Simultaneous intracellular and extracellular recordings of the slow oscillation in ferret visual cortical slices maintained in vitro. Note the marked synchrony between the two recordings. The intracellular recording is from a layer 5 intrinsically bursting neuron. The trigger level for the window discriminator of the extracellular multi-unit recording is indicated. (e) The depolarized state at three different membrane potentials. (f) Autocorrelogram of the intracellular recording in (d) shows a marked periodicity of approximately once per 4 seconds. Sanchez-Vives and McCormick 2000

  17. Phenomenological model: Tsodyks and Markram (1997)
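A minimal simulation of the Tsodyks-Markram picture of a depressing synapse: a fraction x of synaptic resources recovers towards 1 with time constant tau_rec and is depleted by a fraction U at each presynaptic spike, the response being proportional to U·x. Parameter values here are illustrative, not fitted to the 1997 paper:

```python
import math

# Tsodyks-Markram depressing synapse: resource recovery between spikes,
# depletion at each spike, response amplitude proportional to U * x.

def tm_depression(spike_times, tau_rec=0.8, U=0.5):
    x, t_last, responses = 1.0, None, []
    for t in spike_times:
        if t_last is not None:               # exponential recovery since last spike
            x = 1.0 - (1.0 - x) * math.exp(-(t - t_last) / tau_rec)
        responses.append(U * x)              # EPSP amplitude for this spike
        x -= U * x                           # resource depletion by the spike
        t_last = t
    return responses

# A regular 20 Hz train depresses towards a steady-state amplitude:
r = tm_depression([i * 0.05 for i in range(10)])
```

This reproduces the qualitative effect in the pairing experiment above: a strong response to the first pulse and a decreasing response to the following ones.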

  18. Attractor neural networks

  19. Associative memory with “static synapses” (D_j = 1, F_j = 1) • Hopfield network
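The static-synapse baseline can be sketched as a standard Hopfield network: Hebbian couplings J_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ and asynchronous sign updates, so a corrupted cue relaxes to the nearest stored pattern. Sizes are toy choices, with the load P/N far below the capacity limit:

```python
import numpy as np

# Hopfield associative memory with static synapses: Hebbian storage of P
# random patterns, then recall of a corrupted cue by asynchronous dynamics.

rng = np.random.default_rng(1)
N, P = 200, 3
patterns = rng.choice([-1, 1], size=(P, N))        # stored patterns xi^mu
J = (patterns.T @ patterns).astype(float) / N      # Hebbian couplings
np.fill_diagonal(J, 0.0)

def recall(s, sweeps=5):
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):               # asynchronous updates
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

cue = patterns[0].copy()
cue[rng.choice(N, 20, replace=False)] *= -1        # corrupt 10% of the bits
overlap = float(recall(cue) @ patterns[0]) / N     # m = 1 means perfect recall
```

With dynamic synapses (the D_j, F_j factors above), these fixed-point memories can destabilise into the dynamical memories discussed on the following slides.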

  20. Oscillations occur for P > 1 and more realistic neuron models (Pantic et al., Neural Comp. 2002)

  21. Phase portrait

  22. Storage capacity

  23. Sensitivity to external stimuli: h_i + δ ξ_i^μ as the stimulus grows

  24. Discussion • Synapses show a high variability with diverse origins: the stochastic opening of vesicles, variations in the glutamate concentration across synapses, or the spatial heterogeneity of the synaptic response in the dendritic tree (Franks et al. 2003). • Due to synapse dynamics, the neural activity loses stability, which increases the sensitivity to changing external stimuli: the concept of dynamical memories. • Synaptic depression reduces memory capacity. • Synaptic facilitation improves short-term memories.

  25. Adaptive BCI • A BCI device is called adaptive if it is able to change during performance in order to improve it. • Proposed approach: Use Error Related Potentials to provide the device with feedback about its own performance. Llera, Gomez, van Gerven, Jensen

  26. General idea • Use error-related potentials to provide the device with feedback about its own performance. • Can error-related potentials be classified at the single-trial level?

  27. Experimental design • An MEG experiment has been designed to gain insight into error-related fields in a BCI context. • The protocol was carefully chosen to avoid lateralization due to movement on the screen. • The protocol is intended to provide data containing error-related fields and minimal extra input.

  28. Classification methods with the best results so far • Transformation to 28 frequencies in the 3-30 Hz range • Normalization • 273 channels • 6 time steps of 100 ms • 150 trials per subject • Linear support vector machine
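The pipeline above can be sketched on synthetic data with the slide's dimensions (273 channels, 6 windows of 100 ms, 150 trials, power at 28 frequencies in 3-30 Hz). The sampling rate (600 Hz), the random data, and the perceptron used as a stand-in for the linear SVM are all assumptions, so the numbers say nothing about real error-related fields:

```python
import numpy as np

# Band-power features per channel/window/frequency, z-score normalization,
# and a simple linear classifier (perceptron, standing in for the SVM).

rng = np.random.default_rng(2)
fs, n_ch, n_win, n_trials = 600, 273, 6, 150
freqs = np.arange(3, 31)                             # 28 frequencies

def band_power(trial):
    """trial: (n_ch, n_win * 60) raw samples -> flat power-feature vector."""
    feats = []
    for w in np.split(trial, n_win, axis=1):         # 6 windows of 100 ms
        spec = np.abs(np.fft.rfft(w, axis=1)) ** 2
        f = np.fft.rfftfreq(w.shape[1], 1.0 / fs)
        idx = [int(np.argmin(np.abs(f - fq))) for fq in freqs]
        feats.append(spec[:, idx].ravel())           # coarse 10 Hz bins
    return np.concatenate(feats)

X = np.stack([band_power(rng.normal(size=(n_ch, n_win * 60)))
              for _ in range(n_trials)])
X = (X - X.mean(0)) / (X.std(0) + 1e-12)             # normalization
y = rng.choice([-1, 1], n_trials)

w_vec = np.zeros(X.shape[1])
for _ in range(10):                                  # perceptron training
    for xi_, yi in zip(X, y):
        if yi * (xi_ @ w_vec) <= 0:
            w_vec += yi * xi_
acc = float((np.sign(X @ w_vec) == y).mean())        # training accuracy
```

With 45864 features and only 150 trials the training set is trivially separable; honest performance estimates on real data require cross-validation.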

  29. Illustration on toy data • One-dimensional feature space. • Two Gaussian distributions, one per class. • The boundary is learned with the delta rule each time an error potential is detected. • Since error-potential classification is not 100% accurate, two undesirable effects can occur: not learning when we should (prob2), and learning when we should not (prob1). • Assume the probability of errors is the same for both classes.
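The toy setting can be simulated directly: a 1-D threshold between two Gaussian classes, updated by the delta rule only when an "error potential" is detected, with a real error missed with probability p_miss and a correct trial falsely triggering learning with probability p_false. Class means, detection rates, and the learning rate are illustrative:

```python
import numpy as np

# Adaptive boundary learning from unreliable error-potential feedback.

rng = np.random.default_rng(3)
mu0, mu1, eta = -1.0, 1.0, 0.05
p_miss, p_false = 0.3, 0.1
b = 2.0                                        # badly initialised boundary

for _ in range(2000):
    cls = int(rng.integers(2))                 # true class, 0 or 1
    x = rng.normal(mu1 if cls else mu0, 1.0)
    y = x - b                                  # linear output; predict 1 if y > 0
    err = (y > 0) != bool(cls)
    detected = (err and rng.random() > p_miss) or \
               (not err and rng.random() < p_false)
    if detected:
        t = -1.0 if y > 0 else 1.0             # feedback says: flip the answer
        b += eta * (y - t)                     # delta rule on the threshold
# b drifts from 2.0 towards the optimal boundary at 0
```

As long as false detections are rarer than correct ones, the imperfect feedback still pulls the boundary towards the optimum; it only adds fluctuation around it.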

  30. ECoG connectivity patterns during a motor response task — We have computed the brain connectivity patterns associated with a simple motor response task from ECoG data recordings: • Functional connectivity: Gaussian graphical models — time domain; provides a symmetric independence matrix; does not capture time evolution; assumes normally distributed residuals. • Effective connectivity: Directed Transfer Function (DTF), Granger causality — frequency domain; provides a non-symmetric causal matrix; does capture time evolution; assumes a good fit of the MVAR model. Gomez, Ramsey
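The Gaussian graphical model side can be sketched in a few lines: the symmetric independence structure is read off the precision matrix K = Σ⁻¹, with the partial correlation between channels i and j given by -K_ij / √(K_ii K_jj). Three synthetic "channels" forming a chain 0 → 1 → 2 stand in for the ECoG electrodes:

```python
import numpy as np

# Partial correlations from the precision matrix of a Gaussian model.

rng = np.random.default_rng(4)
n = 50000
x0 = rng.normal(size=n)
x1 = x0 + rng.normal(size=n)                 # chain: 0 -> 1 -> 2
x2 = x1 + rng.normal(size=n)
X = np.stack([x0, x1, x2])

K = np.linalg.inv(np.cov(X))
pcorr = -K / np.sqrt(np.outer(np.diag(K), np.diag(K)))

# Channels 0 and 2 are marginally correlated but conditionally independent
# given channel 1, so their partial correlation is ~0:
marginal_02 = float(np.corrcoef(X)[0, 2])
partial_02 = float(pcorr[0, 2])
```

This is exactly the distinction exploited on the next slides: clusters of marginal correlation versus the sparser conditional-independence structure.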

  31. ECoG connectivity patterns during a motor response task • 104 electrodes (101 after preprocessing), implanted on the left hemisphere • Two days, 40 trials per condition per day • Sampling rate 512 Hz: 1792 samples per trial; 22 bits; band-pass filter 0.15-134.4 Hz • Inter-electrode distance: 1 cm

  32. ECoG connectivity patterns during a motor response task — The Gaussian model reveals clusters of correlated activity and significant differences between stimulus and response states related to motor areas.

  33. ECoG connectivity patterns during a motor response task • With Granger causality we are able to identify a set of source electrodes (red dots) which drive another subset of target electrodes. • Sources are similar in both conditions, although targets differ for stimulus and response conditions.

  34. Bayesian variable selection: causal modeling or prediction? • Stochastic search multiple-regression building (using a Gibbs sampling algorithm based on George & McCulloch 1995) • Efficient in large-p problems (500K predictors) • Extended in a hierarchical model to estimate the shrinkage parameter from the data, which we have shown to avoid overfitting • Model averaging using half-certain associations was shown to improve prediction substantially. Janss, Franke, Buitelaar
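A much-simplified sketch of the stochastic-search idea: a Metropolis walk over inclusion indicators, scored by BIC as a cheap stand-in for the marginal likelihood used in the George & McCulloch (1995) Gibbs sampler (the real system also estimates the shrinkage hierarchically). Inclusion frequencies over the visited models give the model-averaged view of "half-certain" predictors. Data are simulated, with 3 of 50 predictors truly active:

```python
import numpy as np

# Stochastic search over inclusion indicators gamma for linear regression,
# with BIC standing in for the marginal likelihood.

rng = np.random.default_rng(5)
n, p = 200, 50
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]                  # only 3 active predictors
y = X @ beta + rng.normal(size=n)

def bic(gamma):
    k = int(gamma.sum())
    if k == 0:
        rss = float(y @ y)
    else:
        b, res, *_ = np.linalg.lstsq(X[:, gamma], y, rcond=None)
        rss = float(res[0]) if res.size else float(((y - X[:, gamma] @ b) ** 2).sum())
    return n * np.log(rss / n) + k * np.log(n)

gamma = np.zeros(p, bool)
score = bic(gamma)
incl = np.zeros(p)
draws = 2000
for _ in range(draws):
    j = rng.integers(p)                      # propose flipping one indicator
    prop = gamma.copy()
    prop[j] = ~prop[j]
    s = bic(prop)
    if np.log(rng.random()) < (score - s) / 2:   # accept with exp(-dBIC / 2)
        gamma, score = prop, s
    incl += gamma
incl /= draws                                # posterior inclusion frequencies
```

True predictors end up with inclusion frequencies near 1 and noise predictors near 0; averaging predictions over the visited models is what improves prediction in the full system.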

  35. Some extensions / research topics • Use of prior information to help (constrain) finding interactions between predictors: interactions selected within pathways • Multi-phenotype modelling and prediction using an embedded eigenvector decomposition in a Bayesian hierarchical model • Multi-layered variable selection to model genetic effects on brain fMRI, which models cognitive tasks and psychiatric disorders (figure: pathways over predictors x1 … x500000, eigenvector latent variables u, phenotypes y, and fMRI data per voxel) Janss, Franke, Buitelaar

  36. PROMEDAS • PRObabilistic • Medical • Diagnostic • Advisory System

  37. Why? • Increasing complexity of diagnostics • Failures in medical practice: 98,000 patients in the US die each year as a result of medical errors; wrong diagnoses are frequent (8-40%) • Increasing healthcare costs • Availability of electronic data

  38. Input: • patient data, complaints, symptoms, lab results • Output: • diagnoses, suggestions for follow-up tests • Users: • general practitioners • education • management • medical specialists

  39. Graphical models

  40. Graphical models

  41. Exponential complexity

  42. Trees are networks without loops • Computation on trees is very fast • Approximate the Promedas graph by a tree

  43. Message passing • Exact on trees • Good on networks with few loops • Degrades with • the number of loops • the connection strength
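Sum-product message passing on a small binary tree makes the "exact on trees" point concrete: messages are passed from the leaves to the root, giving the root's marginal at a cost linear in the number of edges instead of exponential in the number of nodes. The factors and tree below are toy choices, not Promedas content:

```python
import numpy as np

# Sum-product on a binary tree with pairwise factors psi(x_i, x_j).

psi = np.array([[1.2, 0.8],
                [0.8, 1.2]])                   # attractive pairwise factor

children = {0: [1, 2], 1: [3, 4], 2: [], 3: [], 4: []}   # node 0 is the root

def upward_message(i, evidence):
    """Vector over x_i summarising the subtree rooted at i."""
    m = np.zeros(2)
    if i in evidence:
        m[evidence[i]] = 1.0                   # clamp observed node
    else:
        m[:] = 1.0
    for c in children[i]:
        m = m * (psi @ upward_message(c, evidence))  # sum out x_c
    return m

def root_marginal(evidence):
    b = upward_message(0, evidence)
    return b / b.sum()

p = root_marginal({3: 1})                      # observing leaf 3 pulls the root
```

On graphs with loops the same message updates are run iteratively (loopy belief propagation), which is only approximate and, as the slide notes, degrades with the number of loops and the connection strength.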

  44. Medical content • Internal medicine for specialists • 4000 diagnoses, 4000 symptoms, 60,000 relations • Informal clinical evaluation: 50 test patients, scored by correct diagnoses in the top 3 • 6% all in the top 3 • 26% two in the top 3 • 54% one in the top 3 • 14% not correct • Integrated in the UMCU since October 2008; about 1200 sessions per month.

  45. Future plans • Promedas is being commercialized by a new company, Promedas BV. • Possible markets: • Web application or CD-ROM • Integrated in a hospital information system • Telemedicine • ….

  46. Project team • Algorithms & software: SNN, Radboud University Nijmegen • Medical content: UMC Utrecht www.promedas.nl

  47. Education N & S for this track • Bachelor: • Neural networks and information theory • Neurophysics • Master: • Machine Learning • Computational Neuroscience • SNN Colloquia
