
Presentation Transcript


  1. Announcements • Sample exam questions: this week (Thursday) you will submit your Qs into the dropbox • Bring completed homework • For next class: give the sentence-completion survey to friends

  2. Psy1302 Psychology of Language Lecture 11 & 12: Sentence Comprehension II

  3. Models of Sentence Processing • Garden-Path Model • Autonomous • Late Closure • Minimal Attachment • Constraint-Based Model • Interactive • Lexical biases, referential contexts, and structural biases: cues from multiple sources constrain interpretation

  4. Traditional Views (contrasting lexical and syntactic ambiguity) • The Constraint-Satisfaction Model SAYS this is not the right characterization! • Table from the MacDonald, Pearlmutter, & Seidenberg paper

  5. Experiment to test the 2 models (Tanenhaus, Spivey-Knowlton, Eberhard, & Sedivy, 1995) Method: Eye-Tracking During Listening

  6. Setting-Up the Experiment: RC Group 1 Put the frog on the napkin in the box.

  7. Setting-Up the Experiment: RC Group 2 Put the frog that is on the napkin in the box.

  8. Setting-Up the Experiment: RC Which group was garden-pathed? • Group 1: Put the frog on the napkin in the box. • Group 2: Put the frog that is on the napkin in the box.

  9. Setting-Up the Experiment: RC What is a Relative Clause? • Relative clause: a subordinate clause that modifies the noun • Group 1: Put the frog on the napkin in the box. (REDUCED relative clause, AMBIGUOUS at “on”) • Group 2: Put the frog that is on the napkin in the box. (NON-REDUCED relative clause, UNAMBIGUOUS at “on”)

  10. Setting-Up the Experiment: Garden-Path Garden-Path Model • How does the model explain the difficulty of parsing: Put the frog on the napkin in the box. • The sentence is processed using two simple rules: Late Closure and Minimal Attachment. Sometimes these simple rules lead one down the incorrect path, and reanalysis is necessary.

  11. Setting-Up the Experiment: Garden-Path • Late Closure: When possible, attach incoming lexical items into the clause or phrase currently being processed (i.e., the lowest possible nonterminal node dominating the last item analyzed). • Minimal Attachment: Attach incoming lexical items into the phrase-marker being constructed with the fewest nodes consistent with the well-formedness rules of the language. [Tree diagram for “put the frog on …”: where does the PP headed by “on” attach, to the VP (VP-attachment) or to the NP “the frog” (NP-attachment)?]

  12. Setting-Up the Experiment: Garden-Path 2 Attachments & 2 Meanings • VP attachment: the PP is the destination of “put” (“put the frog on(to) the napkin …”) • NP attachment: the PP is a modifier of “frog” (“the frog (that is) on the napkin …”) [Tree diagrams contrasting the two structures]

  13. Setting-Up the Experiment: Garden-Path Where to attach “on”: VP or NP? • 1. It CANNOT attach within the NP: that would require NP → NP PP, which adds nodes and violates Minimal Attachment. • 2. Attaching to the VP (VP → V NP PP) violates neither rule, so the parser VP-attaches.

  14. Setting-Up the Experiment: Garden-Path Garden-Path Model • How does the model explain the difficulty of parsing: Put the frog on the napkin in the box. • Answer: 1. The syntactic processor first VP-attaches “on” (put the frog on …). 2. On encountering the second preposition, “in” (of “in the box”), the parser does not know how to incorporate the word. 3. Reanalysis is needed because the first parse was incorrect → longer processing time.
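A toy sketch of this serial parse-and-reanalyze story (my caricature, not the actual Garden-Path model; the function name and cost units are invented for illustration): the parser greedily VP-attaches each incoming PP and pays extra when a later PP forces it to reanalyze an earlier one.

```python
# Caricature of the garden-path account (invented costs, hypothetical helper):
# the parser VP-attaches each PP while the verb still has a destination slot,
# and reanalyzes an earlier attachment when a later PP cannot be incorporated.

def parse_pps(pps, vp_slots=1):
    """Return (attachments, processing_cost) for a list of incoming PPs."""
    cost = 0
    attachments = []
    for pp in pps:
        if vp_slots > 0:
            attachments.append((pp, "VP"))  # first-pass VP attachment (cheap)
            vp_slots -= 1
            cost += 1
        else:
            # No slot left: reanalyze the previous PP as an NP modifier,
            # freeing the destination slot for the new PP (expensive).
            attachments[-1] = (attachments[-1][0], "NP")
            attachments.append((pp, "VP"))
            cost += 3
    return attachments, cost

# Ambiguous sentence: "Put the frog on the napkin in the box."
print(parse_pps(["on the napkin", "in the box"]))
# Unambiguous control: only one PP competes for the destination slot.
print(parse_pps(["in the box"]))
```

The second PP triggers the reanalysis step, mirroring the longer processing times the model predicts for the ambiguous sentences.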

  15. Setting-Up the Experiment: Garden-Path Constraint-Satisfaction Model • How does the model explain the difficulty of parsing: Put the frog on the napkin in the box. • The model uses information from multiple sources to constrain interpretation. In this case, the lexical and contextual information likely does not support the intended interpretation, or favors another one.

  16. Setting-Up the Experiment: Constraint-Satisfaction Model • How does the model explain the difficulty of parsing: Put the frog on the napkin in the box. • BIG Q: What kinds of information can be used to constrain interpretation? Examples: lexical biases, referential context

  17. Setting-Up the Experiment: Constraint-Satisfaction Model Lexical Biases • The type of syntactic/semantic environments in which a word appears. Example: • “Put” almost always appears with a VP-attached PP (destination): “Put the car in the garage” • “Choose” rarely does so: “Choose the car in the garage”

  18. Setting-Up the Experiment: Constraint-Satisfaction Model • How does the model explain the difficulty of parsing: Put the frog on the napkin in the box. • Lexical biases support VP-attachment: • “Put” almost always appears with a VP-attached PP (destination) • “on” is a locative preposition, and “on the napkin” is a location, i.e., compatible with the destination required by “put”

  19. Setting-Up the Experiment: Constraint-Satisfaction Model Referential Context • Pick a frog. Which frog did you pick? • Modifiers pick out a member of a set • When there are 2+ referents, modifiers help differentiate the referent in question

  20. Experiment to test the 2 models (Tanenhaus, Spivey-Knowlton, Eberhard, & Sedivy, 1995) • FINALLY, the experiment!!! • Do we consider the referential context in parsing? • More specifically, WHEN do we consider it? http://www.ircs.upenn.edu/Trueswellabs/video.html

  21. Put the frog on the napkin in the box. • Do we consider the referential context in parsing? • More specifically, WHEN do we consider it? • 1-Referent display: 1 frog OR 2-Referents display: 2 frogs

  22. Put the frog on the napkin in the box. • NAPKIN is a potential destination. • 1-Referent display: 1 frog OR 2-Referents display: 2 frogs

  23. Put the frog on the napkin in the box. • BY THE GARDEN-PATH MODEL: regardless of whether there are 1 or 2 referents, during the first pass NAPKIN is considered as a destination. • 1-Referent display: 1 frog OR 2-Referents display: 2 frogs

  24. Put the frog on the napkin in the box. • BY THE CONSTRAINT-SATISFACTION MODEL (which takes referential context into consideration early): • With 1 referent, NAPKIN is considered as a destination • With 2 referents, NAPKIN could potentially be a modifier of FROG, and NOT a destination

  25. Tanenhaus, Spivey-Knowlton, Eberhard, & Sedivy (1995) Method: Eye-Tracking During Listening • AMBIGUOUS SENTENCE HEARD: Put the frog on the napkin … into the box. • UNAMBIGUOUS SENTENCE HEARD: Put the frog that is on the napkin … into the box. • Each sentence heard with both the 1-referent and the 2-referents display.

  26. PUT THE FROG ON THE NAPKIN IN THE BOX. [Display: correct destination (the box) vs. incorrect destination (the empty napkin); compared for the reduced relative vs. the unreduced relative (“that is”)]

  27. Typical eye movements for the ambiguous sentence: Put the frog on the napkin in the box. [Numbered fixation sequences (1–4) overlaid on the 1-referent (1 frog) and 2-referents (2 frogs) displays]

  28. Typical eye movements for the ambiguous sentence: Put the frog on the napkin in the box. [Same displays with additional fixations A and B marked]

  29. Constraint-Satisfaction Model • Highly interactive • Limited parallel processing • If all information converges on a single analysis, processing is serial • If it does not, several analyses may be maintained

  30. How are cues combined? (Interactive Activation Unfolding in Time) [Diagram: Verb argument structure (prob. of PP; e.g., put, choose), Noun argument structure (prob. of PP; e.g., frog), Preposition (prob. of NP- vs. VP-attachment; e.g., of, on), and Referential context all feed activation of “PP NP-attached” vs. “PP VP-attached”]

  31. How are cues combined? (Interactive Activation Unfolding in Time) • Selection of VP- vs. NP-attachment for Put the frog on … • When presented with: 1 referent vs. 2 referents [Same cue diagram]

  32. How are cues combined? (Interactive Activation Unfolding in Time) [Diagram frame: the verb cue fires: PUT (V) takes NP, PP]

  34. How are cues combined? (Interactive Activation Unfolding in Time) [Diagram frame: the noun cue: FROG (N), no bias]

  35. How are cues combined? (Interactive Activation Unfolding in Time) [Diagram frame: the preposition cue: ON (P), 95% NP-attach, 5% VP-attach]

  38. How are cues combined? (Interactive Activation Unfolding in Time) [Diagram frame: the referential-context cue: 1 referent]

  40. How are cues combined? (Interactive Activation Unfolding in Time) [Diagram frame: the referential-context cue: 2 referents]

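The cue-combination diagrams above can be summarized numerically. This is an illustrative toy, not the model from the paper: only the 95%/5% preposition bias comes from the slides; the other numbers and the multiply-then-normalize scheme are my assumptions.

```python
# Toy cue combination for NP- vs. VP-attachment of the PP "on the napkin".
# Each cue is a (support_for_NP, support_for_VP) pair; all numbers except
# the 95/5 preposition bias (from the slides) are invented for illustration.

def combine_cues(cues):
    """Multiply per-cue support and normalize into (P(NP-attach), P(VP-attach))."""
    p_np, p_vp = 1.0, 1.0
    for np_support, vp_support in cues:
        p_np *= np_support
        p_vp *= vp_support
    total = p_np + p_vp
    return p_np / total, p_vp / total

one_referent = [
    (0.05, 0.95),  # verb bias: "put" strongly expects a VP-attached destination PP
    (0.50, 0.50),  # noun bias: "frog" is neutral
    (0.95, 0.05),  # preposition bias for "on" (from the slides)
    (0.30, 0.70),  # referential context: one frog, no modifier needed (invented)
]
two_referents = one_referent[:3] + [(0.70, 0.30)]  # two frogs: a modifier helps

print(combine_cues(one_referent))   # VP-attachment (destination) wins
print(combine_cues(two_referents))  # NP-attachment (modifier) wins
```

With these made-up weights, flipping only the referential-context cue flips the preferred attachment, which is the constraint-satisfaction prediction for the 1- vs. 2-referent displays.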

  42. Break!

  43. Moving on to Assigned Readings • Garden Path Model vs. Constraint Satisfaction Model • Ferreira & Clifton (1986) • Trueswell, Tanenhaus, & Garnsey (1994)

  44. Subtext • These experiments test hypotheses • What was being tested? • What was found? • Multiple experiments • How did each experiment replicate or extend previous findings? • How did each experiment support or refute previous findings?

  45. Outline • Stats Terms Simplified • t-tests • ANOVAs, Main effects and Interactions • Regressions, Correlations • Assigned Papers

  46. T-tests and ANOVAs • T-test: compares 2 means • ANOVA (Analysis of Variance): compares multiple means • Yields significance of main effects and interaction effects
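To make “compare 2 means” concrete, here is a minimal pooled (equal-variance) independent-samples t statistic computed by hand; the data are made up, and in practice you would get the p-value from a stats package.

```python
import math
from statistics import mean, variance

# Sketch: an independent-samples t statistic (equal-variance, pooled).
# Hypothetical data; the groups could be, e.g., reading times in the
# ambiguous vs. unambiguous conditions.
def t_statistic(a, b):
    na, nb = len(a), len(b)
    # Pool the two sample variances, weighting each by its degrees of freedom
    pooled = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(pooled * (1 / na + 1 / nb))

group1 = [5.1, 4.9, 5.4, 5.0, 5.2]
group2 = [5.9, 6.1, 5.8, 6.0, 6.2]
print(t_statistic(group1, group2))  # large negative t: group2 mean is higher
```

A t far from zero (relative to its degrees of freedom) indicates the two means reliably differ.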

  47. Hypothetical Experiment (Example of Main & Interaction Effects) • Dependent Measure: number of girlfriends • Independent Measures: • Wealth of bachelors according to income (Rich, Poor) • Looks of the same bachelors according to Oprah (Handsome, Ugly)

  48. Design: 2 x 2 [Table: # of GFs measured in each of the four cells: Rich/Handsome, Rich/Ugly, Poor/Handsome, Poor/Ugly]

  49. [Four hypothetical outcome panels, each plotting #GFs (Many/Few) for handsome vs. ugly bachelors across Rich vs. Poor, illustrating different patterns of main effects and interactions; in one panel rich handsome bachelors have the Most girlfriends and poor ugly bachelors the Least]
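In a 2 x 2 design, the main effects and the interaction can be read straight off the cell means. A sketch with hypothetical cell means for the bachelor example (all numbers invented):

```python
# Hypothetical cell means for the 2 x 2 (wealth x looks) design.
cells = {
    ("rich", "handsome"): 9,  # most girlfriends
    ("rich", "ugly"):     5,
    ("poor", "handsome"): 4,
    ("poor", "ugly"):     1,  # least girlfriends
}

def main_effect(cells, factor_index, level_a, level_b):
    """Difference between the marginal means of two levels of one factor."""
    a = [v for k, v in cells.items() if k[factor_index] == level_a]
    b = [v for k, v in cells.items() if k[factor_index] == level_b]
    return sum(a) / len(a) - sum(b) / len(b)

def interaction(cells):
    """Difference of differences: does the looks effect depend on wealth?"""
    rich_diff = cells[("rich", "handsome")] - cells[("rich", "ugly")]
    poor_diff = cells[("poor", "handsome")] - cells[("poor", "ugly")]
    return rich_diff - poor_diff

print(main_effect(cells, 0, "rich", "poor"))      # wealth main effect
print(main_effect(cells, 1, "handsome", "ugly"))  # looks main effect
print(interaction(cells))                         # interaction contrast
```

An ANOVA tests whether each of these three contrasts is reliably different from zero.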

  50. Hypothetical Experiment (Example of ANOVAs: F1 vs. F2) • Is a female model more attractive in a short or a long skirt? • The model is pictured in 10 different short skirts and 10 different long skirts • 30 males rated the model’s attractiveness in each skirt (1 = not attractive to 7 = extremely attractive)
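F1 (by-subjects) and F2 (by-items) ANOVAs differ mainly in which sampling unit you collapse over before analysis. A sketch of that aggregation step with made-up ratings (3 raters, 4 skirts; all names are hypothetical):

```python
# Made-up ratings: (subject, item, condition) -> attractiveness rating (1-7).
ratings = {
    ("s1", "skirtA", "short"): 6, ("s1", "skirtB", "short"): 5,
    ("s1", "skirtC", "long"):  3, ("s1", "skirtD", "long"):  4,
    ("s2", "skirtA", "short"): 7, ("s2", "skirtB", "short"): 6,
    ("s2", "skirtC", "long"):  4, ("s2", "skirtD", "long"):  3,
    ("s3", "skirtA", "short"): 5, ("s3", "skirtB", "short"): 6,
    ("s3", "skirtC", "long"):  2, ("s3", "skirtD", "long"):  4,
}

def cell_means(ratings, unit_index):
    """Collapse over the other unit: unit_index 0 keeps subjects (F1), 1 keeps items (F2)."""
    groups = {}
    for (subj, item, cond), r in ratings.items():
        key = ((subj, item)[unit_index], cond)
        groups.setdefault(key, []).append(r)
    return {k: sum(v) / len(v) for k, v in groups.items()}

by_subjects = cell_means(ratings, 0)  # each subject's mean per condition -> F1 ANOVA
by_items = cell_means(ratings, 1)     # each item's mean per condition -> F2 ANOVA
print(by_subjects[("s1", "short")])
print(by_items[("skirtA", "short")])
```

The F1 analysis then runs the ANOVA over subject means (generalizing to new subjects), and the F2 analysis over item means (generalizing to new skirts).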
