The Neural Basis of Thought and Language

Presentation Transcript


  1. The Neural Basis of Thought and Language: Final Review Session

  2. Administrivia • Final in class next Tuesday, May 9th • Be there on time! • Format: • closed books, closed notes • short answers, no blue books • And then you’re done with the course!

  3. The Second Half [course-map diagram: second-half topics Motor Control, Grammar, Metaphor, and Bayes Nets with the models Bailey Model, ECG, KARMA, Bayesian Model of HSP, and SHRUTI, laid out along the abstraction hierarchy from Biology and Computational Neurobiology through Structured Connectionism and Computation up to Cognition and Language; Midterm vs. Final material]

  4. Overview • Bailey Model: feature structures, Bayesian model merging, recruitment learning • KARMA: X-schema, frames, aspect, event-structure metaphor, inference • Grammar Learning: parsing, construction grammar, learning algorithm • SHRUTI • FrameNet • Bayesian Model of Human Sentence Processing

  5. Full Circle [diagram: Neural System & Development, Motor Control & Visual System, Spatial Relation, Verbs & Spatial Relation, Grammar, Metaphor, and Psycholinguistics Experiments arranged around the course's converging constraints: Embodied Representation, Structured Connectionism, Probabilistic Algorithms]

  6. Q & A

  7. How can we capture the differences between "Harry walked into the cafe.", "Harry is walking into the cafe.", and "Harry walked into the wall."?

  8. "Harry walked into the café." [analysis pipeline diagram: Utterance + Constructions → Analysis Process → Semantic Specification → Simulation, with General Knowledge and the Belief State feeding into the Simulation]

  9. The INTO construction
     construction INTO
        subcase of Spatial-Relation
        form
           self_f.orth ← "into"
        meaning: Trajector-Landmark
           evokes Container as cont
           evokes Source-Path-Goal as spg
           trajector ↔ spg.trajector
           landmark ↔ cont
           cont.interior ↔ spg.goal
           cont.exterior ↔ spg.source

  10. The Spatial-Phrase construction
      construction SPATIAL-PHRASE
         constructional
            constituents
               sr : Spatial-Relation
               lm : Ref-Expr
         form
            sr_f before lm_f
         meaning
            sr_m.landmark ↔ lm_m

  11. The Directed-Motion construction
      construction DIRECTED-MOTION
         constructional
            constituents
               a : Ref-Exp
               m : Motion-Verb
               p : Spatial-Phrase
         form
            a_f before m_f
            m_f before p_f
         meaning
            evokes Directed-Motion as dm
            self_m.scene ↔ dm
            dm.agent ↔ a_m
            dm.motion ↔ m_m
            dm.path ↔ p_m

      schema Directed-Motion
         roles
            agent : Entity
            motion : Motion
            path : SPG
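
As a rough illustration only (the data layout below is invented, not ECG's actual SemSpec format), analyzing "Harry walked into the café" with these constructions yields bindings that could be rendered as nested structures:

      # Hypothetical rendering of the resulting semantic specification (Python sketch).
      semspec = {
          "Directed-Motion": {
              "agent":  {"name": "Harry"},            # filled by a : Ref-Exp
              "motion": {"schema": "WALK"},           # filled by m : Motion-Verb
              "path": {                               # filled by p : Spatial-Phrase ("into the café")
                  "SPG": {
                      "trajector": "Harry",
                      "source": "cafe.exterior",      # INTO: cont.exterior ↔ spg.source
                      "goal":   "cafe.interior",      # INTO: cont.interior ↔ spg.goal
                  },
              },
          },
      }
      print(semspec["Directed-Motion"]["path"]["SPG"]["goal"])   # cafe.interior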

  12. What exactly is simulation? • Belief update plus X-schema execution [WALK x-schema diagram: controller states ready → start → ongoing → finish → done, with an iterate loop; belief nodes include time of day, hungry, meeting, at goal, cafe]
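
A minimal sketch of "belief update plus x-schema execution" (the class and attribute names below are invented for illustration; this is not the course simulator):

      # Toy WALK x-schema: stepping the controller updates the belief state.
      class WalkSchema:
          def __init__(self, walker, goal):
              self.walker, self.goal = walker, goal
              self.state = "ready"                       # ready -> ongoing -> done

          def start(self):
              if self.state == "ready":
                  self.state = "ongoing"

          def finish(self, beliefs):
              if self.state == "ongoing":
                  self.state = "done"
                  beliefs[(self.walker, "at")] = self.goal   # belief update: walker is now at the goal

      beliefs = {}
      walk = WalkSchema("Harry", "cafe")
      walk.start()
      walk.finish(beliefs)
      print(walk.state, beliefs)                         # done {('Harry', 'at'): 'cafe'}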

  13. [simulation of "Harry walked into the café.": the WALK x-schema executes with walker=Harry, goal=cafe; the trace marks the ready and done states]

  14. "Harry is walking to the café." [same analysis pipeline: Utterance + Constructions → Analysis Process → Semantic Specification → Simulation, drawing on General Knowledge and the Belief State]

  15. [simulation of "Harry is walking to the café.": WALK with walker=Harry, goal=cafe over the full controller with ready, start, ongoing, finish, done, plus interrupt/suspended/resume, abort/cancelled, and iterate]

  16. "Harry has walked into the wall." [same analysis pipeline: Utterance + Constructions → Analysis Process → Semantic Specification → Simulation, drawing on General Knowledge and the Belief State]

  17. Perhaps a different sense of INTO?
      construction INTO
         subcase of spatial-prep
         form
            self_f.orth ← "into"
         meaning
            evokes Trajector-Landmark as tl
            evokes Container as cont
            evokes Source-Path-Goal as spg
            tl.trajector ↔ spg.trajector
            tl.landmark ↔ cont
            cont.interior ↔ spg.goal
            cont.exterior ↔ spg.source

      construction INTO
         subcase of spatial-prep
         form
            self_f.orth ← "into"
         meaning
            evokes Trajector-Landmark as tl
            evokes Impact as im
            evokes Source-Path-Goal as spg
            tl.trajector ↔ spg.trajector
            tl.landmark ↔ spg.goal
            im.obj1 ↔ tl.trajector
            im.obj2 ↔ tl.landmark

  18. [simulation of "Harry has walked into the wall.": WALK with walker=Harry, goal=wall over the full controller with ready, start, ongoing, finish, done, plus interrupt/suspended/resume, abort/cancelled, and iterate]

  19. Map down to timeline [diagram: the x-schema stages ready, start, ongoing, finish, done projected onto a timeline with marks S, E, R; consequence]

  20. further questions?

  21. What about… "Harry walked into trouble", or, for stronger emphasis, "Harry walked into trouble, eyes wide open."?

  22. Metaphors • metaphors are mappings from a source domain to a target domain • metaphor maps specify the correlations between source-domain entities/relations and target-domain entities/relations • they also allow inferences to transfer from the source domain to the target domain (possibly, but less frequently, vice versa) • <TARGET> is <SOURCE>

  23. Event Structure Metaphor • Target Domain: event structure • Source Domain: physical space • States are Locations • Changes are Movements • Causes are Forces • Causation is Forced Movement • Actions are Self-propelled Movements • Purposes are Destinations • Means are Paths • Difficulties are Impediments to Motion • External Events are Large, Moving Objects • Long-term Purposeful Activities are Journeys

  24. KARMA • DBN to represent target domain knowledge • Metaphor maps link target and source domain • X-schema to represent source domain knowledge

  25. Metaphor Maps • map entities and objects between embodied and abstract domains • invariantly map the aspect of the embodied domain event onto the target domain by setting the evidence for the status variable based on controller state (event structure metaphor) • project x-schema parameters onto the target domain
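
For illustration only (the states and status values below are invented, not KARMA's actual variables), a metaphor map can be pictured as a table that turns the source-domain controller state into evidence on a target-domain status variable:

      # Hypothetical event-structure metaphor map (sketch, not KARMA's implementation).
      EVENT_STRUCTURE_MAP = {
          # embodied (source) controller state -> abstract (target) event status
          "ready":     "enabled",
          "ongoing":   "in-progress",
          "done":      "completed",
          "cancelled": "failed",
      }

      def project(controller_state, target_beliefs):
          """Set evidence for the target-domain status variable from the x-schema state."""
          target_beliefs["status"] = EVENT_STRUCTURE_MAP[controller_state]
          return target_beliefs

      print(project("ongoing", {}))   # {'status': 'in-progress'}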

  26. further questions?

  27. How do you learn… the meanings of spatial relations, the meanings of verbs, the metaphors, and the constructions?

  28. How do you learn… the meanings of spatial relations, the meanings of verbs, the metaphors, and the constructions? That’s the Regier model. (first half of semester)

  29. How do you learn… the meanings of spatial relations, the meanings of verbs, the metaphors, and the constructions? VerbLearn

  30. [Bayesian model merging illustration: training examples data #1, data #2, data #3, data #4 being merged into word senses]

  31. Computational Details • maximum a posteriori (MAP) hypothesis: we want the best model given the data • trade-off between the complexity of the model and its ability to explain the data • penalize complex models, i.e. those with too many word senses (the prior) • how likely is the data given this model? (the likelihood)
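
Written out (standard Bayes; no course-specific notation assumed), the MAP criterion picks the model M that maximizes the posterior given data D:

      M* = argmax_M P(M|D) = argmax_M P(D|M) · P(M) / P(D) = argmax_M P(D|M) · P(M)

where the prior P(M) penalizes models with many word senses and the likelihood P(D|M) measures how well the model explains the data.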

  32. How do you learn… the meanings of spatial relations, the meanings of verbs, the metaphors, and the constructions? conflation hypothesis (primary metaphors)

  33. How do you learn… the meanings of spatial relations, the meanings of verbs, the metaphors, and the constructions? construction learning

  34. Usage-based Language Learning [diagram: Comprehension analyzes (Utterance, Situation) pairs against the current Constructions to produce a (partial) Analysis; Production generates an Utterance from (Comm. Intent, Situation); Acquisition hypothesizes new constructions from partial analyses and reorganizes the grammar]

  35. Main Learning Loop
      while <utterance, situation> available and cost > stoppingCriterion
         analysis = analyzeAndResolve(utterance, situation, currentGrammar);
         newCxns = hypothesize(analysis);
         if cost(currentGrammar + newCxns) < cost(currentGrammar)
            addNewCxns(newCxns);
         if (reorganize == true)   // frequency depends on learning parameter
            reorganizeCxns();
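
The same loop as a compact Python skeleton (hypothetical; the helper functions are stubs standing in for the real analyzer, hypothesizer, MDL cost, and reorganizer):

      # Skeleton of the main learning loop; all components are passed in as stubs.
      def learn(examples, grammar, cost, analyze, hypothesize, reorganize,
                stopping_criterion=0.0, reorganize_every=10):
          for i, (utterance, situation) in enumerate(examples):
              if cost(grammar) <= stopping_criterion:
                  break
              analysis = analyze(utterance, situation, grammar)
              new_cxns = hypothesize(analysis)
              if cost(grammar + new_cxns) < cost(grammar):      # MDL-style acceptance test
                  grammar = grammar + new_cxns
              if i % reorganize_every == 0:                     # frequency is a learning parameter
                  grammar = reorganize(grammar)
          return grammar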

  36. Three ways to get new constructions • Relational mapping: throw the ball → THROW < BALL • Merging: throw the block, throwing the ball → THROW < OBJECT • Composing: throw the ball, ball off, you throw the ball off → THROW < BALL < OFF

  37. Minimum Description Length • Choose grammar G to minimize cost(G|D): • cost(G|D) = α · size(G) + β · complexity(D|G) • Approximates Bayesian learning; cost(G|D) ≈ posterior probability P(G|D) • Size of grammar = size(G) ≈ 1/prior P(G) • favor fewer/smaller constructions/roles; isomorphic mappings • Complexity of data given grammar ≈ 1/likelihood P(D|G) • favor simpler analyses (fewer, more likely constructions) • based on derivation length + score of derivation
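
Making the slide's correspondences explicit (a standard identity, not course-specific): minimizing the two-term cost amounts to maximizing the (log) posterior,

      cost(G|D) = α · size(G) + β · complexity(D|G) ≈ -log P(G) - log P(D|G) = -log P(G|D) + const

so size(G) plays the role of the negative log prior and complexity(D|G) the role of the negative log likelihood.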

  38. further questions?

  39. Connectionist Representation How can entities and relations be represented at the structured connectionist level? Or: how can we represent "Harry walked to the café" in a connectionist model?

  40. SHRUTI • entity, type, and predicate focal clusters • An “entity” is a phase in the rhythmic activity. • Bindings are synchronous firings of role and entity cells • Rules are interconnection patterns mediated by coincidence detector circuits that allow selective propagation of activity • An episode of reflexive processing is a transient propagation of rhythmic activity

  41. "Harry walked to the café." • asserting that walk(Harry, café) • Harry fires in phase with the agent role • cafe fires in phase with the goal role [focal-cluster diagram: the entity cluster Harry (+, ?), the type cluster cafe (+e, +v, ?e, ?v), and the predicate cluster walk (+, -, ?) with role nodes agt and goal]

  43. Activation Trace for walk(Harry, café) [plot: activation of +:walk, walk-agt, walk-goal, +:Harry, and +e:cafe over phases 1-4; +:Harry fires in the same phase as walk-agt, +e:cafe in the same phase as walk-goal]
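
A toy sketch of the binding-by-synchrony idea behind this trace (the phase numbers and node names are illustrative only; this is not SHRUTI's implementation):

      # Binding by temporal synchrony: a role and a filler are bound iff they fire in the same phase.
      firing_phase = {
          "+:Harry":   1,   # entity Harry fires in phase 1
          "+e:cafe":   2,   # cafe fires in phase 2
          "walk-agt":  1,   # the agent role of walk also fires in phase 1
          "walk-goal": 2,   # the goal role of walk also fires in phase 2
      }

      def bound(role, filler):
          return firing_phase[role] == firing_phase[filler]

      print(bound("walk-agt", "+:Harry"))    # True  -> Harry is the agent
      print(bound("walk-goal", "+:Harry"))   # False
      print(bound("walk-goal", "+e:cafe"))   # True  -> the cafe is the goal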

  44. further questions?

  45. Human Sentence Processing Can we use any of the mechanisms we just discussed to predict reaction time / behavior when human subjects read sentences?

  46. Good and Bad News • Bad news: • No, not as is. • ECG, the analysis process, and the simulation process are represented at a higher computational level of abstraction than human sentence processing (they lack timing information, constraints on cognitive capacity, etc.) • Good news: • we can construct a Bayesian model of human sentence processing behavior that borrows the same insights

  47. Bayesian Model of Sentence Processing • Do you wait for a sentence boundary to interpret the meaning of a sentence? No! • As words come in, we construct • a partial meaning representation • candidate interpretations, if the input is ambiguous • expectations for the next words • Model • probability of each interpretation given the words seen so far • Stochastic CFGs, N-grams, lexical valence probabilities

  48. SCFG + N-gram: the Stochastic CFG component [parse-tree diagram contrasting the Main Verb reading "The cop arrested the detective …", with arrested as a past-tense verb (VBD) heading the VP, and the Reduced Relative reading "The cop arrested by …", with arrested as a past participle (VBN) followed by a by-PP]

  49. SCFG + N-gram: the N-Gram component [the same two parses, Main Verb vs. Reduced Relative, for "The cop arrested the detective" / "The cop arrested by …", now scored by the N-gram model]
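
A toy illustration of ranking the two interpretations as words come in (all numbers invented; this is not the lecture's actual model or its probabilities):

      # Rank Main Verb vs. Reduced Relative for "the cop arrested ..." by combining
      # per-interpretation component probabilities (SCFG, N-gram, lexical valence).
      import math

      interpretations = {
          "main-verb":        {"scfg": 0.70, "ngram": 0.20, "valence": 0.60},   # arrested as VBD
          "reduced-relative": {"scfg": 0.05, "ngram": 0.01, "valence": 0.40},   # arrested as VBN
      }

      def log_score(components):
          # assume independence of the components: multiply probabilities (sum of logs)
          return sum(math.log(p) for p in components.values())

      for interp in sorted(interpretations, key=lambda i: log_score(interpretations[i]), reverse=True):
          print(interp, round(log_score(interpretations[interp]), 2))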
