
From place cells in rats to human syntax: the construction of a cognitive map of grammar

Gideon Borensztajn (gideonbor@gmail.com), Institute of Phonetic Sciences, University of Amsterdam. CLS-workshop on Learnability and Computational Models of Language Acquisition, ILLC, March 11, 2013.


Presentation Transcript


  1. Gideon Borensztajn (gideonbor@gmail.com), Institute of Phonetic Sciences, University of Amsterdam. CLS-workshop on Learnability and Computational Models of Language Acquisition, ILLC, March 11, 2013. From place cells in rats to human syntax: the construction of a cognitive map of grammar

  2. What does rat navigation have to do with the human ability to talk? Well, actually quite a lot. [images: human brain, rat brain]

  3. Place cells and cognitive map theory • Certain cells in the rat hippocampus fire only when the rat is in a specific location in the maze. • Together these place cells encode a mental, or cognitive, map of the surrounding environment [O'Keefe and Nadel, 1978]. • The map is constructed gradually from spatial episodic experiences, by linking overlapping spatial cues. [figure: spatial sensitivity of different place cells]

  4. Function of the cognitive map A cognitive map of structured episodic memories allows the brain to • combine memories flexibly and productively, by making associative jumps between linked memories. • mentally explore new routes in the maze. • make transitive inferences (if A > B ∧ B > C then A > C). • Similar abilities are needed for productive language use • novel sentences are built by reusing stored fragments. Claim: humans construct a cognitive map of syntactic relations from episodic linguistic experiences

  5. Item-based learning Child language acquisition may shed light on the transition from episodic to semantic memory: • According to Usage-Based Grammar [Tomasello, 2003], children initially memorize and imitate complete utterances (the holophrastic stage). • Then follows a stage of item-based speech, often organized around so-called verb islands. • Subsequently, children start breaking down the item-based constructions, introducing variables in slots, as in “Where's the X?”, “I wanna X”, etc. • The local scope of children's categories (e.g., verb islands) gradually expands to system-wide scope, while converging to an abstract and adult-like language. Claim: grammar acquisition reflects a process of memory consolidation from episodic to semantic memory. Grammaticalization = semantization

  6. Language processing & memory Two kinds of declarative memory, semantic and episodic [Tulving, 1972]: • Episodic memory is a person's memory of personally experienced events or episodes, embedded in a temporal and spatial context (e.g., me lining up in front of the bakery). • Semantic memory is a person's general world knowledge, including language, in the form of concepts that are systematically related to each other (e.g., “bread”). • Relation: episodic memories bind together sequences of items stored in semantic memory [e.g., Shastri, 2002; Eichenbaum, 2004].

  7. Semantic-episodic :: rules vs. exemplars • Claim: the duality of rules versus exemplars arises from the semantic-episodic memory interaction during language processing • Episodic memory → the item-based nature of language (with a role for concrete constructions, i.e., sentence fragments, larger than rules) • Semantic memory → abstract, rule-based grammar (context-free grammar) • The construct-i-con is an instantiation of the episodic-semantic memory system for language • How to model this?

  8. Data oriented parsing [figure: parse tree of “The company sold 1,214 cars in the U.S. last year”, decomposed into subtrees] • In Data-Oriented Parsing (DOP) [e.g., Bod, 1993, 1998] the primitive elements of the grammar are subtrees of arbitrary size • They vary in size, form and level of abstraction, from complete sentences to abstract rules • A derivation of a sentence in DOP is a sequence of subtrees, combined by a substitution operation.
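The substitution operation that combines DOP subtrees can be sketched in a few lines. This is an illustrative toy, not the DOP implementation itself: subtrees are encoded as nested tuples, a bare nonterminal string marks an open substitution site, and the three fragments below are made-up stand-ins for stored fragments.

```python
# Toy sketch of a DOP derivation by leftmost substitution.
# Subtrees are nested tuples ("Label", child, ...); a bare nonterminal
# string is an open substitution site. Fragments are illustrative.

def substitute(tree, fragment):
    """Replace the leftmost open site matching the fragment's root label."""
    if isinstance(tree, str):                      # an open site (or a word)
        return fragment if tree == fragment[0] else tree
    children = list(tree[1:])
    for i, child in enumerate(children):
        new_child = substitute(child, fragment)
        if new_child is not child:                 # a substitution happened
            children[i] = new_child
            return (tree[0],) + tuple(children)
    return tree                                    # nothing matched here

def derive(fragments):
    """Combine a sequence of subtrees into one tree, left to right."""
    tree = fragments[0]
    for frag in fragments[1:]:
        tree = substitute(tree, frag)
    return tree

def leaves(tree):
    """Collect the terminal yield of a (fully substituted) tree."""
    if isinstance(tree, str):
        return [tree]
    return [w for child in tree[1:] for w in leaves(child)]

fragments = [
    ("S", "NP", ("VP", ("VBD", "sold"), "NP")),    # large stored fragment
    ("NP", ("DT", "the"), ("NN", "company")),
    ("NP", ("CD", "1,214"), ("NNS", "cars")),
]
```

`leaves(derive(fragments))` then yields `["the", "company", "sold", "1,214", "cars"]`, illustrating how one derivation mixes a large remembered fragment with smaller, more abstract pieces.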

  9. Towards a neural model of language processing and acquisition • Next goal: explain the transition from concrete (imitative) to abstract (productive) language use from a neural perspective, in terms of changes in the representations and organization of the memory system • Propose an explicit model of episodic-semantic interaction in language, where • language processing is modeled as retrieval from memory, both episodic (fragments) and semantic (productive rules) • language acquisition is modeled as a transition from episodic to semantic linguistic memory (→ increasing abstraction of children's language)

  10. Properties of episodic memory A model of episodic memory must take into account that • All attended episodic experiences leave physical memory traces in the brain. • Sequentiality: episodes are construed as temporal sequences that bind together static semantic elements, within a certain context [e.g., Eichenbaum, 2004]. • Episodic memories are content addressable: their retrieval can be primed by cues from semantic memory. We will use these properties of the human memory system in designing a computational model of language processing, conceived as retrieval from episodic memory.

  11. Episodic grammar [figure: S → NP VP treelet] • Suppose that the primitive elements of a grammar, corresponding to context-free rewrite rules and words, are stored within treelets in a structured network. • A network of such treelets constitutes a semantic memory, corresponding to a context-free grammar. • (later we will get rid of the labels, and situate treelets in a continuous “substitution space”)

  12. Distributed episodic memory of a sentence [figure: a network of treelets (semantic memory / grammar), and the same network with integrated episodic memory after processing “boy eats mango” and “girl eats apple”] • The derivation of a sentence is a sequence of visits to treelets; it describes a path through the network • In the process treelets serially bind to each other, while leaving traces in the local treelet memories. • Proposal: the episodic memory of a sentence consists of physical memory traces distributed across the treelets visited in the derivation.

  13. Episodic traces [figure: treelets with traces after processing “boy eats mango” (orange) and “girl eats apple” (blue)] • Traces are encoded as x-y, where x = sentence number in the corpus and y = position in the derivation (top-down or left-corner) → this implements pointers.
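The trace bookkeeping described above can be sketched directly. The rule strings and the two toy derivations below are illustrative stand-ins for the treelets in the figure:

```python
# Sketch: each treelet keeps a local list of episodic traces, encoded
# as (sentence number x, position in derivation y) pairs -- the x-y
# pointers of the slide. Rule names are illustrative.

class Treelet:
    def __init__(self, rule):
        self.rule = rule
        self.traces = []      # (sentence number x, derivation position y)

def store_episode(grammar, sent_id, derivation):
    """Visit each treelet in derivation order, leaving a trace behind."""
    for pos, rule in enumerate(derivation, start=1):
        grammar.setdefault(rule, Treelet(rule)).traces.append((sent_id, pos))

grammar = {}
store_episode(grammar, 1, ["START* -> S", "S -> NP VP", "NP -> boy",
                           "VP -> VT NP", "VT -> eats", "NP -> mango"])
store_episode(grammar, 2, ["START* -> S", "S -> NP VP", "NP -> girl",
                           "VP -> VT NP", "VT -> eats", "NP -> apple"])
```

After both sentences, the shared treelets hold traces from both episodes (e.g. `grammar["VT -> eats"].traces == [(1, 5), (2, 5)]`), while item-specific treelets hold a single trace: the episodic memory of each sentence is distributed across the network.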

  14. Parsing as a priming effect [figure: activation of traces during parsing, with common-history values CH=1..4] • When parsing a novel sentence, the traces in a visited treelet are primed, and trigger memories of stored sentences (content addressability). • Each trace (ex) receives an activation value (A), whose strength depends on the common history (CH) of the pending derivation (d) with the stored derivation (x) • CH is given as the number of derivation steps shared between d and x. • Based on this one can define a probabilistic model
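The common-history idea can be made concrete. The slides do not give the exact activation function, so the sketch below makes two illustrative assumptions: CH is computed as the shared derivation prefix, and the activation of a stored derivation is simply CH + 1 (so a stored sentence can still propose the very first step), normalized over candidate continuations.

```python
# Illustrative sketch of priming by common history (CH). The prefix
# definition of CH and the activation A = CH + 1 are assumptions, not
# taken from the slides.

def common_history(pending, stored):
    """Count initial derivation steps shared by the two derivations."""
    ch = 0
    for a, b in zip(pending, stored):
        if a != b:
            break
        ch += 1
    return ch

def next_step_probs(pending, corpus):
    """P(next rule), proportional to the summed activation of stored
    derivations that match the whole pending prefix and continue it."""
    scores = {}
    for stored in corpus:
        ch = common_history(pending, stored)
        if ch == len(pending) and ch < len(stored):
            step = stored[ch]
            scores[step] = scores.get(step, 0.0) + (ch + 1)
    total = sum(scores.values())
    return {s: v / total for s, v in scores.items()}
```

With two stored derivations that share the prefix `["S -> NP VP"]`, the model splits probability mass between their continuations; the longer the shared history with one exemplar, the more strongly that exemplar dominates the choice.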

  15. Episodic left corner parsing • Based on the probabilistic left-corner chart parsers of [van Uytsel, 2001] and [Stolcke, 1995] • Episodic probabilities for Pshift, Pproj and Patt are no longer estimated in advance, but computed on-the-fly from trace activations • Rules → treelets (= rules containing traces) • States → treelet states q, which are of the form q = {G; X ← jλ • iμ ; Eq}, where Eq is the set of traces (stored in the treelet) with activations • The activation, or CH, of a trace ej in state q is updated using dynamic programming. Borensztajn, G. & Zuidema, W. (2011), Episodic grammar: a computational model of the interaction between episodic and semantic memory in language processing. Proc. CogSci 2011
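A treelet state of the form q = {G; X ← jλ • iμ ; Eq} can be sketched as a dotted rule that carries its local trace set along. The field names and the `advance` operation are illustrative, not taken from the paper:

```python
# Sketch of an episodic treelet state: a dotted rule (symbols recognized
# left of the dot, symbols still predicted right of it) plus the local
# trace set Eq with activations (CH values). Names are illustrative.
from dataclasses import dataclass, field

@dataclass
class TreeletState:
    lhs: str                 # X in  X <- lambda . mu
    recognized: tuple        # lambda: symbols already recognized
    predicted: tuple         # mu: symbols still predicted
    traces: dict = field(default_factory=dict)   # trace id -> activation

    def advance(self, symbol):
        """Move the dot across the next predicted symbol, carrying Eq."""
        assert self.predicted and self.predicted[0] == symbol
        return TreeletState(self.lhs,
                            self.recognized + (symbol,),
                            self.predicted[1:],
                            dict(self.traces))
```

Each shift/project/attach step of the parser would then create advanced copies of such states, with the trace activations updated by dynamic programming as the slide describes.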

  16. Results of episodic left corner parser on syntactic parsing task [table: parsing results, not transcribed] * For this work, results are on section 22 of the WSJ, sentences <= 20 ** No smoothing; results do not generalize to the entire section

  17. From supervised to unsupervised • In the episodic grammar the (labeled) treelets were given innately (copied from a treebank, supervised). • What we really want is to learn abstract rules of grammar from (episodic) linguistic experience alone (unsupervised). • Suppose we had a large repository of blank treelets (without labels) • The labels of the treelets are replaced by vectors within a high-dimensional “substitution space” • Grammar acquisition then amounts to the construction of a cognitive map of syntactic relations, through topological self-organization. Borensztajn, G., Zuidema, W. & Bod, R. (2009), The hierarchical prediction network: towards a neural theory of grammar acquisition. Proc. CogSci 2009

  18. The Hierarchical Prediction Network (HPN): from theory to model [figure: HPN architecture] Features: • No labels, no fixed associations with non-terminals • Hierarchical temporal compression: simplified model of MPF • Prototypical, graded categories • Continuous category space • Pointers stored in local memories • Dynamic, serial binding

  19. A cognitive map of syntactic relations: the HPN “substitution space” [figure: substitution space with simple lexical units (the, tomato, happy, under, eat) and complex units (X1, X2, X3) near regions for NP (noun), VP (verb), PP (prep)] • A node's position in a continuous substitution space defines its graded membership in one or more syntactic categories. • Substitutability is given as the topological distance in substitution space, and it is learned gradually. • Conventional syntactic categories correspond to regions in this space.
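Graded category membership via distance in substitution space can be sketched as follows. The prototype coordinates and the inverse-distance weighting are illustrative assumptions; the slide only commits to distance-based, graded membership:

```python
# Sketch: a node's position in a continuous substitution space gives it
# graded membership in category regions. Coordinates and the
# inverse-distance weighting are illustrative choices.
import math

def distance(u, v):
    """Euclidean distance, standing in for topological distance."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def graded_membership(node, prototypes):
    """Soft membership in each category, higher for nearer prototypes."""
    weights = {cat: 1.0 / (distance(node, p) + 1e-9)
               for cat, p in prototypes.items()}
    total = sum(weights.values())
    return {cat: w / total for cat, w in weights.items()}

prototypes = {"NP": (0.0, 0.0), "VP": (1.0, 1.0), "PP": (1.0, 0.0)}
```

A node at (0.1, 0.0) then belongs mostly to the NP region but retains small, graded membership in VP and PP, which is exactly the departure from discrete labels that the slide describes.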

  20. Learning a grammar from episodes in HPN • Work in progress: • Learning a grammar in HPN amounts to self-organization of the network topology, driven by episodic experience (i.e., the shortest derivation): • Fast, one-shot learning of exemplars by storing traces (episodic memory). • Slow learning of a topology, by adjusting the vectors of treelets bound within a derivation (semantic memory). • The episodic component prefers parses that are compatible with seen exemplars, which in turn affects the topological organization of the semantic memory. → Grammar acquisition is construed as memory consolidation from episodic to semantic memory. • Demonstrates gradual “decontextualization” of episodic memories (traces) into abstract grammar (topology).
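The slow, semantic half of this scheme can be sketched as a simple attraction update: after a derivation, the vectors of treelets that bound to each other are pulled together. The symmetric update rule and the learning rate are illustrative assumptions in the spirit of self-organizing maps, not details taken from the slides:

```python
# Sketch of slow topology learning: pull together the vectors of
# treelets bound within a derivation. The learning rate and the
# symmetric attraction rule are illustrative assumptions.

def consolidate(vectors, bound_pairs, lr=0.1):
    """Move each bound pair of treelet vectors toward each other."""
    for a, b in bound_pairs:
        va, vb = vectors[a], vectors[b]
        vectors[a] = [x + lr * (y - x) for x, y in zip(va, vb)]
        vectors[b] = [y + lr * (x - y) for x, y in zip(va, vb)]

vectors = {"NP-site": [0.0, 0.0], "NP-root": [1.0, 0.0]}
consolidate(vectors, [("NP-site", "NP-root")])
```

One update moves the pair from distance 1.0 to 0.8; over many episodes, treelets that repeatedly substitute for one another cluster into regions, which is the topological self-organization (and the gradual decontextualization) described above.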

  21. Dual role of the hippocampus • “Cognitive maps” (place cells) are situated in the hippocampus • The hippocampus functions as a “gateway” for episodic memories • Involved in flexible and productive use of memory, needed for novel problem solving (Eichenbaum et al., 1990) • Also involved in binding in language (e.g., Opitz, 2010) • Common explanation? Both episodic reconstruction and on-line processing involve flexible, dynamic binding • Allows efficient storage of episodes (without massive binding) • Allows systematic and productive use of language • The hippocampus implements a switchboard function

  22. The hippocampus implements a “switchboard” function [figure: switchboard architecture with complex units (“treelets”), subunits, a switchboard buffer, serial transmission of a stored address, and lexical units (boy, walks, feeds, the, who)] Borensztajn, G., Zuidema, W. & Bechtel, W. (in press). Systematicity and the need for encapsulated representations. In Calvo, P. and Symons, J., (eds.) Systematicity and Cognitive Architecture (MIT Press).

  23. Conclusions • Episodic grammar is a promising framework that can bridge the fields of computational linguistics and cognitive science. • Explanatory value: it gives a neural perspective on the trade-off between rule-based and exemplar-based models of language processing. • It offers a quantitative evaluation of an original hypothesis about the relation between semantic and episodic memory, and an account of memory consolidation. • A new, cognitively inspired computational approach to unsupervised grammar induction as a gradual transition between episodic and semantic memory (consolidation); much work remains to be done.

  24. Thank you! • Questions? email: gideonbor@gmail.com webpage: staff.science.uva.nl/~gideon
