
Self-organizing maps: A visualization technique with data dimension reduction


Presentation Transcript


  1. Self-organizing maps: A visualization technique with data dimension reduction. Juan López González, University of Oviedo. Inverted CERN School of Computing, 24-25 February 2014

  2. General overview • Lecture 1 • Machine learning • Introduction • Definition • Problems • Techniques • Lecture 2 • ANN introduction • SOM • Simulation • SOM-based models

  3. LECTURE 1 Self-organizing maps. A visualization technique with data dimension reduction.

  4. 1. Artificial neural networks (ANN)

  5. 1.1. Introduction

  6. 1.2. Types • 1.2.1. Feedforward NN • 1.2.2. Recurrent NN • 1.2.3. Self-organizing NN • 1.2.4. Others

  7. 1.2.1. Feedforward NN

  8. 1.2.1. Feedforward NN • Single-layer feedforward

  9. 1.2.1. Feedforward NN • Multi-layer feedforward • Supervised learning • Backpropagation learning algorithm
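
The slides name backpropagation without detail. As a minimal sketch of the idea (not code from the presentation): a two-layer feedforward network trained by gradient descent, where the layer sizes, sigmoid activation, learning rate, and toy XOR data are all illustrative assumptions.

```python
import numpy as np

# Minimal multi-layer feedforward network trained with backpropagation.
# All sizes and hyperparameters are illustrative, not from the slides.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(2, 3))   # input (2) -> hidden (3)
b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 1))   # hidden (3) -> output (1)
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy supervised data (XOR)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

eta = 0.5
for _ in range(10_000):
    h = sigmoid(X @ W1 + b1)             # forward pass: hidden layer
    out = sigmoid(h @ W2 + b2)           # forward pass: output layer
    d_out = (out - y) * out * (1 - out)  # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)   # error propagated back to hidden layer
    W2 -= eta * h.T @ d_out              # gradient-descent weight updates
    b2 -= eta * d_out.sum(axis=0)
    W1 -= eta * X.T @ d_h
    b1 -= eta * d_h.sum(axis=0)
```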

  10. 1.2.2. Recurrent neural networks

  11. 1.2.2. Recurrent neural networks • Elman networks • ‘Context units’ • Maintain state

  12. 1.2.2. Recurrent neural networks • Hopfield network • Symmetric connections • Associative memory

  13. 1.2.2. Recurrent neural networks • Modular neural networks • The human brain is not a massive network but a collection of small networks

  14. 1.2.3. Self-organizing networks • A set of neurons learns to map points in an input space to coordinates in an output space

  15. 1.2.4. Others • Holographic associative memory • Instantaneously trained networks • Learning vector quantization • Neuro-fuzzy networks • …

  16. 2. Self-organizing maps • 2.1. Motivation • 2.2. Goal • 2.3. Main properties • 2.4. Elements • 2.5. Algorithm

  17. 2.1. Motivation • Topographic maps • Different sensory inputs (motor, visual, auditory…) are mapped in areas of the cerebral cortex in an orderly fashion

  18. 2.1. Motivation • “The spatial location of an output neuron in a topographic map corresponds to a particular domain or feature drawn from the input space” • Examples: auditory cortical fields, motor somatotopic maps

  19. 2.2. Goal • Transform an incoming signal of arbitrary dimension into a 1-, 2-, or 3-dimensional discrete map in a topologically ordered fashion

  20. 2.3. Main properties • Transform a continuous input space into a discrete output space • Dimension reduction • Winner-takes-all neuron • Ordered feature map: inputs with similar characteristics produce similar outputs

  21. 2.3.1. Dimension reduction • Curse of dimensionality (Richard E. Bellman): the amount of data needed grows exponentially with the dimensionality • Types • Feature extraction: reduce the input data (feature vector) • Feature selection: select a subset (remove redundant and irrelevant data)
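
To make the two reduction types concrete, here is a minimal numpy sketch; the data, the chosen columns, and the use of PCA for extraction are illustrative assumptions, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))   # 100 samples, 10 features

# Feature selection: keep a subset of the original columns
# (here, hypothetically dropping the redundant/irrelevant ones).
selected = X[:, [0, 3, 7]]

# Feature extraction: build new features from all columns
# (here via PCA computed with an SVD).
Xc = X - X.mean(axis=0)                    # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
extracted = Xc @ Vt[:3].T                  # project onto top 3 components
```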

  22. 2.4. Elements • …of machine learning • A pattern exists • We don’t know how to solve it mathematically • A lot of data • (a1,b1,..,n1), (a2,b2,..,n2) … (aN,bN,..,nN)

  23. 2.4. Elements • Lattice of neurons • Size? • Weights

  24. 2.4. Elements • Learning rate • Neighborhood function • Learning-rate function
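
These elements are usually realized as simple functions of the training step. A sketch of typical choices follows; the Gaussian neighborhood and exponential decay schedules are common conventions, not mandated by the slides.

```python
import numpy as np

def neighborhood(dist_on_lattice, sigma):
    """Gaussian neighborhood: 1 at the BMU, decaying with lattice distance."""
    return np.exp(-dist_on_lattice**2 / (2.0 * sigma**2))

def learning_rate(n, eta0=0.1, tau=1000.0):
    """Exponentially decaying learning rate: eta(n) = eta0 * exp(-n / tau)."""
    return eta0 * np.exp(-n / tau)

def sigma_fn(n, sigma0=5.0, tau=1000.0):
    """The neighborhood radius shrinks over time as well."""
    return sigma0 * np.exp(-n / tau)
```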

  25. 2.5. Algorithm • Initialization • Input-data preprocessing • Normalizing • Discrete or continuous variables? • Weight initialization • Random weights
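
A minimal initialization sketch, assuming min-max normalization and a 10x10 lattice (the slides leave the size question open):

```python
import numpy as np

rng = np.random.default_rng(2)

def normalize(X):
    """Scale each feature to [0, 1] (one common normalization choice)."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    return (X - lo) / np.where(hi > lo, hi - lo, 1.0)

rows, cols, dim = 10, 10, 3                # 10x10 lattice, 3-D inputs
weights = rng.random((rows, cols, dim))    # random initial weights in [0, 1]
```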

  26. 2.5. Algorithm (2) • Sampling • Take a sample x(n) from the input space • Matching • Find the BMU i(x), i.e. the neuron minimizing ||x(n) − w_j(n)|| over all j • Update weights, i.e. w_j(n+1) = w_j(n) + η(n) h_j,i(x)(n) [x(n) − w_j(n)]
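
Putting the three steps together, a compact training loop in numpy; the lattice size, decay schedules, and Gaussian neighborhood are illustrative assumptions consistent with the elements above.

```python
import numpy as np

rng = np.random.default_rng(3)
rows, cols, dim = 10, 10, 3
weights = rng.random((rows, cols, dim))    # random initial weights
X = rng.random((500, dim))                 # normalized input data (toy)

# Lattice coordinates of every neuron, used for neighborhood distances
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                            indexing="ij"), axis=-1).astype(float)

n_steps, eta0, sigma0, tau = 2000, 0.1, 5.0, 1000.0
for n in range(n_steps):
    x = X[rng.integers(len(X))]                      # 1. sampling
    dists = np.linalg.norm(weights - x, axis=-1)     # 2. matching
    bmu = np.unravel_index(dists.argmin(), dists.shape)
    eta = eta0 * np.exp(-n / tau)                    # decayed learning rate
    sigma = sigma0 * np.exp(-n / tau)                # decayed radius
    d2 = ((grid - grid[bmu])**2).sum(axis=-1)        # squared lattice distance
    h = np.exp(-d2 / (2.0 * sigma**2))               # Gaussian neighborhood
    weights += eta * h[..., None] * (x - weights)    # 3. weight update
```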

  27. 3. Practical exercise

  28. 4. Map examples • 4.1. Digit recognition • 4.2. Finnish phonetics • 4.3. Semantic map of word context

  29. 4.1. Digit recognition

  30. 4.2. Finnish phonetics

  31. 4.3. Semantic map of word context

  32. 5. Other SOM-based models • 5.1. TASOM • 5.2. GSOM • 5.3. MuSOM

  33. 5.1. TASOM • Time-adaptive self-organizing maps • Deals with non-stationary input distributions • Adaptive learning rates: η(w,x) • Adaptive neighborhood rates: T(w,x)
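
A loose sketch of the TASOM idea only: each neuron keeps its own learning rate that tracks how far the current input is from its weight vector, so the map keeps adapting when the input distribution drifts. The update rule below is a hypothetical simplification, not the exact TASOM equations.

```python
import numpy as np

def update_neuron_rates(etas, weights, x, beta=0.1, scale=1.0):
    """Nudge each neuron's learning rate toward a bounded function of
    its current input-weight distance (hypothetical simplification)."""
    d = np.linalg.norm(weights - x, axis=-1)   # input-weight distances
    target = d / (d + scale)                   # bounded in [0, 1)
    return etas + beta * (target - etas)       # move rates toward target
```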

  34. 5.2. GSOM • Growing self-organizing maps • Deals with identifying suitable sizes for SOMs • Spread factor • New nodes grow at the boundaries • Good when the clusters are unknown

  35. 5.3. MuSOM • Multimodal SOM • High-level classification from sensory integration

  36. Q & A
