


Presentation Transcript


  1. word

  2. A simple semantic network (O’Kane & Treves, 1992), updated to remove the ‘memory glass’ problem (Fulvi Mari & Treves, 1998) and reduced to a Potts model (Kropff & Treves, 2005): cortical modules, structured long-range connectivity, Potts units with dilute connectivity... but all cortical modules share the same organization. Local attractor states (with the “0” state included) become the S+1 Potts states; global activity patterns become sparse global patterns, i.e. sparse Potts patterns. Storage capacity: p_c ∝ S ?!?!  p_c ∝ C S² !!
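
To make the sparse Potts patterns concrete, here is a minimal sketch of how such patterns could be generated, assuming the setup described on this slide (each unit is either in the quiescent “0” state or in one of S active states, with sparseness a). The function name is illustrative, and for simplicity the patterns are uncorrelated, unlike the factor-generated patterns used in the simulations later in the talk.

```python
import numpy as np

def make_sparse_potts_patterns(p, N, S, a, seed=None):
    """Generate p random sparse Potts patterns over N units.

    Each unit takes the quiescent state 0 with probability 1 - a, or one of
    the S active states (labelled 1..S), chosen uniformly, with probability a.
    """
    rng = np.random.default_rng(seed)
    active = rng.random((p, N)) < a                # which units are active
    states = rng.integers(1, S + 1, size=(p, N))   # active state labels 1..S
    return np.where(active, states, 0)

# Example with values quoted on slide 10: N = 300 units, a = 0.25, S = 5 active states
patterns = make_sparse_potts_patterns(p=100, N=300, S=5, a=0.25, seed=0)
```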

  3. Simulations which include a model of neuronal fatigue show that the Potts semantic network can hop from global attractor to global attractor: latching dynamics.
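
The slide’s simulations are not reproduced here; as a hedged illustration of the mechanism only (adaptation, i.e. fatigue, destabilises the currently retrieved attractor so the state hops to another one), here is a toy sketch over p discrete attractors. All names, parameters, and the affinity matrix J are assumptions for illustration, not the actual Potts dynamics.

```python
import numpy as np

def toy_latching(J, steps=200, fatigue_rate=0.05, recovery_rate=0.01, seed=0):
    """Toy latching dynamics over p discrete attractors.

    J[i, j] is an assumed 'transition affinity' (< 1) from attractor i to j.
    The currently active attractor accumulates fatigue until another
    attractor becomes more favourable, producing a hop (a 'latch').
    """
    rng = np.random.default_rng(seed)
    p = J.shape[0]
    fatigue = np.zeros(p)
    current = int(rng.integers(p))
    trajectory = [current]
    for _ in range(steps):
        fatigue[current] += fatigue_rate           # the active attractor tires
        fatigue *= 1.0 - recovery_rate             # all attractors slowly recover
        support = J[current] - fatigue             # pull towards other attractors
        support[current] = 1.0 - fatigue[current]  # support for staying put
        current = int(np.argmax(support))
        trajectory.append(current)
    return trajectory

# Example: 5 attractors with random affinities below 1; the trajectory hops
# from attractor to attractor instead of settling permanently.
J = np.random.default_rng(1).random((5, 5)) * 0.8
print(toy_latching(J, steps=30))
```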

  4. Hauser, Chomsky & Fitch

  5. Latching dynamics, if transition probabilities are structured, might be a neural model for infinite recursion

  6. Monkey recordings by Moshe Abeles et al.

  7. How might a capacity for indefinite latching have evolved? [Slide diagram; labels: +L, p, semantics, AM, C, S, long-range conn., local conn.] Storage capacity (max p to allow cued retrieval): p_c ∝ C S². Latching onset (min p to ensure a recursive process): p_l ∝ S ? A spontaneous transition to infinite recursion?
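
To make the contrast between the two scalings concrete, a small hedged sketch; the constants k_c and k_l and the grid of C, S values are placeholders, not values from the model, and only the trend matters: the window p_l < p < p_c, in which cued retrieval and a recursive latching process can coexist, opens up once C and S are large enough.

```python
# Storage capacity p_c ~ k_c * C * S**2 versus latching onset p_l ~ k_l * S,
# with placeholder constants; prints whether a coexistence window exists.
k_c, k_l = 0.05, 3.0
for C in (12, 25, 50, 100):
    for S in (3, 5, 10):
        p_c = k_c * C * S ** 2
        p_l = k_l * S
        window = "yes" if p_c > p_l else "no"
        print(f"C={C:3d} S={S:2d}  p_l ~ {p_l:5.1f}  p_c ~ {p_c:6.1f}  window: {window}")
```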

  8. G. Elston et al.

  9. [Slide diagram: dynamics ranging from random to deterministic.] We need to confirm the crucial quantitative relationships, e.g. that in a multi-factor coding model (with correlated patterns) p_c ∝ C S² ? p_l ∝ S ? Latching may be a neural basis for infinite recursion only if transition probabilities are structured, so that dynamics are neither random nor deterministic. Emilio Kropff has taken care of that (J Nat Comput, 2006).

  10. Computer simulations of Frontal Latching Networks with: N = 300 Potts units; a = 0.25 sparse coding; S = 3, 4, 5, 7, 10 (+1 quiescent “0”) states; C = 12, 17, 25, 50, 100 connections; p = 25-400 patterns, generated by 20 relevant factors. How to quantify retrieval? And latching?
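
As a hedged sketch of what this parameter sweep could look like in code (the specific p values within the 25-400 range are illustrative choices, and the factor-based pattern generation is not included):

```python
from itertools import product

# Parameter sweep of the Frontal Latching Network simulations (slide 10).
N = 300                               # Potts units
a = 0.25                              # sparse coding level
S_values = [3, 4, 5, 7, 10]           # active states per unit (plus the quiescent "0" state)
C_values = [12, 17, 25, 50, 100]      # dilute connections per unit
p_values = [25, 50, 100, 200, 400]    # stored patterns; illustrative points in the 25-400 range

runs = [{"N": N, "a": a, "S": S, "C": C, "p": p}
        for S, C, p in product(S_values, C_values, p_values)]
print(len(runs), "parameter combinations")
```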

  11. Retrieval and latching appear to coexist only above critical values of both C and S. Is that, for FLNs, a percolation phase transition?
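
The slides do not spell out how retrieval and latching are quantified; one hedged possibility, with illustrative function names and a simplified chance-corrected overlap, is sketched below.

```python
import numpy as np

def potts_overlap(state, pattern, S, a):
    """Chance-corrected overlap between a network state and a stored pattern.

    Both arguments are length-N arrays of state labels (0 = quiescent,
    1..S = active).  A perfect match gives ~1, an unrelated state ~0.
    """
    N = len(pattern)
    match = np.sum((state == pattern) & (pattern > 0))
    chance = a * a / S * N            # expected matches between unrelated sparse states
    return (match - chance) / (a * N - chance)

def count_latches(overlaps, threshold=0.7):
    """Count latching transitions in a (time, p) array of overlaps: how often
    the best-retrieved pattern changes while retrieval stays above threshold."""
    best = np.argmax(overlaps, axis=1)
    retrieved = overlaps[np.arange(len(best)), best] > threshold
    changes = (np.diff(best) != 0) & retrieved[:-1] & retrieved[1:]
    return int(np.sum(changes))
```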
