The role of background knowledge in sentence processing



The Role of Background Knowledge in Sentence Processing

Raluca Budiu

July 9, 2001

Thesis Committee:

John Anderson, Chair

Jaime Carbonell

David Plaut

Lynne Reder, Department of Psychology



Ambiguity of Language

My mouse behaves erratically lately.

-- From an e-mail to CS facilities

Could you pass me the salt?

That's the sun of the egg.

-- Child speaking about the yolk of a fried egg



Language and Noise

  • Communication channels are noisy

  • People make mistakes

  • We understand how unfair the death penalty is.

  • -- George W. Bush, speaking of the death tax

  • Listeners ignore semantic inconsistencies

  • When an aircraft crashes, where should the survivors be buried?



Insight from this Research

Flexibility: stretching words’ meanings

Reliability: ignoring noise & semantic inconsistencies



The Sentence-Processing Model

(Diagram: a sentence, together with prior knowledge such as "Noah took two animals of each kind on the ark", "Napoleon was defeated at Waterloo in 1815", and "Plato was Socrates' student", feeds into the model, which outputs a sentence interpretation.)



Main Contribution

A model of language comprehension that:

  • Offers a unified explanation of several complex linguistic phenomena

  • Is incremental (on-line)

  • Is as fast as humans

  • Uses prior knowledge and sentence context to understand vague words

  • Is based on the ACT-R theory (Anderson & Lebiere, 1998)



ACT-R

  • A cognitive architecture based on production systems

  • A rigorous framework for building, running and testing computational models

  • Based on verified assumptions about human cognition (e.g., memory properties, attention)

  • Produces quantitative predictions about human behavior (e.g., accuracy and latency in a task)
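ACT-R's quantitative predictions rest on activation-based retrieval. A minimal sketch of the standard equations (the symbols follow Anderson & Lebiere, 1998; the default latency factor below is an arbitrary assumption of this sketch):

```python
import math

def activation(base_level, source_weights, strengths):
    """Total activation of a chunk i: A_i = B_i + sum_j W_j * S_ji.

    base_level     -- B_i, reflecting the chunk's recency and frequency of use
    source_weights -- W_j, attentional weighting of each context element j
    strengths      -- S_ji, association strength from context element j to chunk i
    """
    return base_level + sum(w * s for w, s in zip(source_weights, strengths))

def retrieval_latency(a, latency_factor=1.0):
    """Retrieval time falls off exponentially with activation: T = F * exp(-A)."""
    return latency_factor * math.exp(-a)
```

Higher association strengths from the current context thus translate directly into faster retrievals, which is how the model's latencies can be compared with human reading times.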



Research Methodology

(Flowchart: a computational ACT-R model generates predictions for an experiment; both the model and human subjects yield quantitative measures, which are compared. If they do not match, the model is revised and the cycle repeats.)



Evaluation of the Model

The model

  • Can comprehend

    • Literal or metaphoric, distorted or undistorted sentences

    • Isolated or in-discourse sentences

  • Can explain patterns of text recall

  • Compares well with people on psycholinguistic experiments

  • Is fast, accurate, and scalable



Outline

  • Introduction

  • The sentence-processing model

    • Evaluation

  • Comprehension of sentences in discourse

    • Evaluation

  • Scalability

  • Future work and conclusions



The Sentence-Processing Model

(Diagram: an input sentence (words plus thematic roles) and background knowledge, such as "Noah took two animals of each kind on the ark", "Napoleon was defeated at Waterloo in 1815", and "Plato was Socrates' student", feed into the model, which outputs a sentence interpretation.)



Propositional Representation

(Diagram: the sentence "Noah took the animals on the ark" is encoded as an Ark proposition whose slots are verb = take, agent = Noah, patient = animals, place-oblique = ark. Each slot is stored as a triple, e.g., Parent = Ark Prop, Child = animals, Type = patient.)

Associations & Activation

(Diagram: the concept Noah is associated with the propositions "Noah is Lamech's son" and "Noah took the animals on the ark", and with related concepts such as Moses, Napoleon, Patriarch, and take; activation spreads along these associations.)


Search

(Diagram: reading "Noah" in "Noah took the animals on the ark" triggers a search of memory; activation spreading from "Noah" retrieves the associated proposition "Noah is Lamech's son" as a candidate interpretation.)


Match

(Diagram: the word "Noah" is matched against the retrieved proposition "Noah is Lamech's son"; the match succeeds.)


Match

(Diagram: the next word, "took", is matched against the same proposition, "Noah is Lamech's son"; its verb "is" does not match "took", so the match fails.)


Search

(Diagram: after the mismatch, the model searches again with both "Noah" and "took"; activation from the two words retrieves "Noah took the animals on the ark" as the new candidate interpretation.)



Final Interpretation

(Diagram: the model settles on the proposition "Noah took the animals on the ark" as the final interpretation of the sentence.)


Failures of Comprehension

(Diagram: for "Noah offered burnt offerings on the altar", the word "offered" (role: verb) fails to match the retrieved Lamech proposition; a bug is recorded and no interpretation is found.)


Summary of the Model

(Flowchart: read a word; search memory; if a match is found, integrate it into the interpretation, otherwise record a bug. Repeat until the end of the sentence, then check whether an interpretation was found.)



Answering True/False Queries

  • False = a bug OR no final interpretation found

  • True = no bug AND final interpretation found
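The model's word-by-word loop and the true/false rule can be sketched together. The `search`, `match`, and `integrate` callbacks below stand in for the model's memory operations and are assumptions of this sketch:

```python
def comprehend(words, search, match, integrate):
    """Read each word, search memory for a candidate proposition, and either
    integrate a successful match into the running interpretation or record a bug."""
    interpretation, bugs = None, []
    for word in words:
        candidate = search(word, interpretation)
        if candidate is not None and match(word, candidate):
            interpretation = integrate(word, candidate, interpretation)
        else:
            bugs.append(word)  # mismatch: remember the offending word
    return interpretation, bugs

def answer_query(interpretation, bugs):
    """True = no bug AND a final interpretation found; False otherwise."""
    return interpretation is not None and not bugs
```

With trivial callbacks that always match, every word is integrated and the query comes out True; any bug or missing interpretation flips the answer to False.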



Outline

  • Introduction

  • The sentence-processing model

    • Empirical evaluation

      • Moses illusion

      • Metaphor-position effects

  • Comprehension of sentences in discourse

    • Evaluation

  • Scalability

  • Future work and conclusions



Moses Illusion

  • How many animals of each kind did Moses take on the ark?

  • Good vs. bad distortions

  • How many animals of each kind did Adam take on the ark?



Moses-Illusion Data

  • Illusion rates for good and bad distortions (Ayers, Reder & Anderson, 1996)

  • Percent correct distortions in the gist task (Ayers et al., 1996)

  • Reading times in the literal and gist task (Reder & Kusbit, 1991)


Simulation of Moses Illusion

(Diagram: for "How many animals did Moses take on the ark?", the Ark proposition (verb = take, agent = Noah, patient = animals, place-oblique = ark) still matches despite the distorted agent Moses, so the model answers as if the question were undistorted: the illusion. With the bad distortion Adam, the competing Zoo proposition is retrieved instead, a bug is recorded, and no interpretation is found.)



Metaphor Comprehension

  • Effects of position on metaphor understanding (Gerrig & Healy, 1983)

  • Metaphor-familiarity effects (Budiu & Anderson, 1999)

  • Understanding metaphoric/literal sentences in context (Budiu & Anderson, 2000)


Metaphor Position

(Diagram: reading times for humans vs. the model. For the metaphor-initial sentence "Drops of molten silver filled the night sky.", humans took 4.21 s and the model 4.30 s; for the metaphor-final sentence "The night sky was filled with drops of molten silver.", humans took 3.53 s and the model 3.68 s.)



Outline

  • Introduction

  • The sentence-processing model

    • Evaluation

  • Comprehension of sentences in discourse

    • Evaluation

  • Scalability

  • Future work and conclusions



Sentences in Discourse

Create background knowledge from discourse propositions

King Lear’s story

King Lear decided to divide his kingdom

King Lear had three daughters

Goneril and Regan declare their grand love

Cordelia refuses to make an insincere speech

Cordelia is disinherited

Cordelia marries the king of France
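One way to picture this mechanism in code (the activation boost and its recency increment are assumptions of this sketch, not values from the thesis):

```python
def add_discourse(background, discourse_props, boost=1.0, recency_step=0.1):
    """Merge discourse propositions into background knowledge, giving each a
    small activation boost that grows with recency, so recently read
    propositions are favored when the model searches for an interpretation."""
    knowledge = dict(background)  # proposition -> base-level activation
    for i, prop in enumerate(discourse_props):
        knowledge[prop] = boost + i * recency_step
    return knowledge

lear_story = [
    "King Lear decided to divide his kingdom",
    "King Lear had three daughters",
    "Cordelia is disinherited",
]
kb = add_discourse({"Plato was Socrates' student": 0.0}, lear_story)
```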


Novel Sentences

Use a partially matching interpretation to relate to discourse

(Diagram: for the novel sentence "Cordelia marries the king of France", the word "married" fails to match the retrieved Prop 5, "Cordelia is disinherited", so a bug is recorded and at first no interpretation is found; at the end of the sentence, the partially matching Prop 5 is nevertheless used to integrate the sentence into the discourse.)



Outline

  • Introduction

  • The sentence-processing model

    • Evaluation

  • Comprehension of sentences in discourse

    • Evaluation

  • Scalability

  • Future work and conclusions


The Role of Background Knowledge in Sentence Processing

(Diagram: our experiments probe comprehension at two depths, shallow (answering true/false queries) and deep (comprehension of novel sentences), plus metaphor in discourse. Prior experiments comparing metaphoric vs. literal reading times found either the same speed (Ortony et al., 1978; Inhoff et al., 1984; Shinjo & Myers, 1987; Keysar, 1990) or slower metaphor reading (Gibbs, 1990; Onishi & Murphy, 1993).)



Metaphoric Sentences in Context

During history seminars, a massive young man always yawned and never paid any attention to the discussions. He was a very good linebacker who had been all-state in football. The seminar always came after his training sessions, so he was very tired.

True or false:

Read new:

The bear yawned in class

The bear slept quietly

The athlete yawned in class

The athlete slept quietly


Metaphoric Sentences in Context

(Diagram: processing the test sentences. For "The bear yawned in class", the word "bear" initially yields no interpretation; the resulting bug is reevaluated, and bug-based integration relates the bear to the athlete in the discourse, so an interpretation is found. "The athlete yawned in class" finds an interpretation directly, while the false probes "The bear slept quietly" and "The athlete slept quietly" yield no interpretation.)



Outline

  • Introduction

  • The sentence-processing model

    • Evaluation

  • Comprehension of sentences in discourse

    • Evaluation

  • Scalability

  • Future work and conclusions



Computational Constraints

  • Speed

  • Accuracy

  • Scalability

    • Word database

    • Sentence database



Scalability Test

  • 436 noun-verb-noun sentences (Brown corpus via PennTreebank project)

  • 999 distinct words

  • Each word appears in at most 9 propositions

  • Associations based on LSA similarity measures (Landauer & Dumais, 1997)

  • Test for comprehension of a known sentence
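A sketch of how LSA-style similarities could supply word-to-proposition association strengths (the linear scaling into strengths is an assumption of this sketch; the thesis builds on Landauer & Dumais, 1997):

```python
import math

def cosine(u, v):
    """Cosine similarity between two word vectors, as used by LSA."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def association_strengths(word_vec, prop_vecs, scale=1.0):
    """Map LSA similarities to word -> proposition association strengths."""
    return {name: scale * cosine(word_vec, vec)
            for name, vec in prop_vecs.items()}
```

Propositions whose vectors lie close to the current word's vector receive more spreading activation, so they surface first during the model's search.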



Model Performance



Summary

  • A model of sentence comprehension with a strong associative mechanism to speed up the search for an interpretation

  • It offers a unified explanation for a variety of empirical psycholinguistic data

  • It is scalable

  • It is implemented in ACT-R



Future Work

  • Extend the model to other empirical phenomena (e.g., priming, text inference, lexical ambiguity)

  • Identify the ACT-R assumptions that are fundamental

  • Eliminate some of the limitations



Conclusions

  • Context can help the comprehension of metaphoric or semantically flawed sentences

  • Semantic associations between words are a powerful mechanism that allows fast and flexible comprehension

  • “Peripheral” language phenomena can shed light on deep cognitive processes



Limitations of the Model

  • No syntactic processing

  • Atomic word-phrases (e.g., drops of molten silver)

  • Rudimentary discourse processing

  • Cannot account for sentences containing similar words (e.g., George W. Bush is the son of George Bush)

  • Relationship between discourse and background knowledge

  • Word similarities are not derived from empirical ratings

  • No thematic-role cues

