
Current-mode WTA circuit (signal = current): local competition; local competitions on the network


Presentation Transcript


Attention-Aided Perception in Sparse-Coding Networks
Janusz A. Starzyk, Yinyin Liu
Ohio University, Athens, OH

[Figure: hierarchical sparse-coding network. The input pattern enters at the input layer; primary levels h and h+1 are linked through a secondary level s; neuron N4^(level+1) has a set of pre-synaptic neurons on the level below, and N4^(level) has a set of post-synaptic neurons on the level above; dual neurons Nface and Nhouse at the top carry the sparse representations selected under "Attention Face" / "Attention House". In the local-competition inset, links l1, l2, l3 compete and the losing branches l1 and l3 are logically cut off. Network image: http://parasol.tamu.edu/groups/amatogroup/research/NeuronPRM/]

SPARSE CODING AND SPARSE STRUCTURE
• Goal: produce a sparse neural representation, i.e. "sparse coding".
• The ~10^12 neurons in the human brain are sparsely connected: on average, each neuron is connected to other neurons through about 10^4 synapses (a back-of-the-envelope calculation below makes the sparsity concrete).
• A sparse structure enables efficient computation and saves energy and cost in implementing a memory in hardware.
• Extreme case: the "grandmother cell" (J. Y. Lettvin), a single neuron on the top level that represents and recognizes an object [C. Connor, "Friends and grandmothers," Nature, vol. 435, June 2005].
• Dual neurons such as Nface and Nhouse form such a sparse representation. How to find it?

HIERARCHICAL SELF-ORGANIZING MEMORY
• Hierarchical learning network with primary levels and secondary levels.
• Secondary neurons provide "full connectivity" within the sparse structure; multiple layers have to be used to transmit enough information.
• Adding more secondary levels increases the sparsity.
• Finding the sensory input activation pathway requires competition: find the neurons with stronger activities and suppress the ones with weaker activities.

OLIGARCHY-TAKES-ALL (OTA)
• The signal goes through the network layer by layer, and a local WTA competition is run on each layer. A current-mode WTA circuit (the signal is a current) implements the local competition and the local competitions on the network.
• Example: N4^(level+1) is the winner among neurons 4, 5, and 6 (the post-synaptic set of N4^(level)), so the signal from N4^(level) goes to N4^(level+1) while the losing branches are logically cut off.
• Each level keeps multiple local winner neurons, so multiple winner neurons remain on the top level: oligarchy-takes-all (OTA). The oligarchy is the obtained sparse representation.
• OTA provides coding redundancy and robustness, and increases the representational memory capacity.
• The result is sparse coding in a sparsely-connected network; a code sketch of the layer-wise competition follows below.

ATTENTION-AIDED PERCEPTION
• Visual attention is cognitive control over perception and representation building.
• Object-based attention: when several objects are in the visual scene simultaneously, attention helps recognize the attended object.
• The top-down process for attention-aided perception combines local competition and feedback.
• One candidate competition mechanism: a top-down feedback signal synchronizes the activity of the target neurons that represent the attended object (see the sketch at the end of this section).
• A similar mechanism can also be applied to invariant representation building through continuous observation of various patterns of the same object.
• Cortical learning is unsupervised: it finds the sensory input activation pathway, so that a small group of neurons becomes active on the top level, representing the object.
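To make the sparsity figures concrete: the poster states only the two powers of ten; the rest is simple arithmetic.

```latex
\underbrace{10^{12}}_{\text{neurons}} \times \underbrace{10^{4}}_{\text{synapses/neuron}}
  \approx 10^{16}\ \text{connections},
\qquad
\text{full connectivity: } 10^{12} \times 10^{12} = 10^{24}\ \text{connections}
```

So the brain realizes only about 10^16 / 10^24 = 10^-8 of all possible pairwise connections, which is why secondary levels are needed to approximate "full connectivity".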
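A minimal sketch of the layer-wise local competition behind OTA, assuming fixed, non-overlapping competition groups; the names local_wta, ota_forward, and group_size are illustrative choices, not from the poster:

```python
import numpy as np

def local_wta(activations, group_size):
    """Local winner-take-all: within each group of neurons, keep only the
    strongest activation; the losing branches are logically cut off (zeroed)."""
    out = np.zeros_like(activations)
    for start in range(0, len(activations), group_size):
        stop = min(start + group_size, len(activations))
        winner = start + int(np.argmax(activations[start:stop]))
        out[winner] = activations[winner]
    return out

def ota_forward(x, weights, group_size=3):
    """Propagate the input layer by layer, running a local WTA on each level.
    The several winners left on the top level are the 'oligarchy', i.e. the
    sparse representation of the input."""
    a = np.asarray(x, dtype=float)
    for W in weights:
        a = local_wta(W @ a, group_size)
    return a

# Toy usage: 9 input neurons -> 9 -> 6, with groups of 3 competing locally.
rng = np.random.default_rng(0)
weights = [rng.random((9, 9)), rng.random((6, 9))]
code = ota_forward(rng.random(9), weights)
print(code)  # at most one nonzero entry per group of 3
```

With group_size = 3, each level keeps one winner per group, so two winners survive on the 6-neuron top level: an oligarchy rather than a single "grandmother cell". Stacking several such layers is also how the sparse structure transmits enough information upward.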
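A similarly hedged sketch of the object-based attention mechanism: the poster describes top-down feedback that synchronizes the target neurons; here that is approximated by an additive feedback-weighted drive applied before the local competition (a stand-in, not the poster's circuit). It reuses local_wta from the sketch above; attended_wta and gain are invented names:

```python
def attended_wta(activations, feedback, group_size=3, gain=0.5):
    """Local WTA biased by a top-down attention signal.

    `feedback` marks the neurons belonging to the attended object's
    representation (e.g. 1.0 along the 'face' pathway, 0.0 elsewhere).
    Adding gain * feedback before the competition makes the attended
    pathway more likely to win, so the attended object's neurons stay
    active when several objects are in the scene at once.
    """
    biased = np.asarray(activations, dtype=float) + gain * np.asarray(feedback)
    return local_wta(biased, group_size)
```

The same bias, driven by winners from a previous observation instead of an attention cue, would let repeated views of one object keep activating the same pathway, which is the invariant-representation use the poster mentions.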
