Parsers and Grammars

Colin Phillips


Outline

  • The Standard History of Psycholinguistics

  • Parsing and rewrite rules

  • Initial optimism

  • Disappointment and the DTC

  • Emergence of independent psycholinguistics

  • Reevaluating relations between competence and performance systems


Standard View

[Diagram, built up across several slides: arithmetic problems (324 + 697 = ?, 217 x 32 = ?) are handled by specialized algorithms that draw on something deeper, knowledge of arithmetic. By analogy, understanding and speaking are specialized algorithms that draw on grammatical knowledge (competence): a recursive characterization of the well-formed expressions of the language. The grammar is annotated as 'precise, but ill-adapted to real-time operation'; the processing algorithms as 'well-adapted to real-time operation, but maybe inaccurate'.]


Grammatical Knowledge

  • How is grammatical knowledge accessed in syntactic computation for...
    (a) grammaticality judgment
    (b) understanding
    (c) speaking

  • Almost no proposals under standard view

  • This presents a serious obstacle to unification at the level of syntactic computation


Townsend & Bever (2001, ch. 2)

  • “Linguists made a firm point of insisting that, at most, a grammar was a model of competence - that is, what the speaker knows. This was contrasted with effects of performance, actual systems of language behaviors such as speaking and understanding. Part of the motive for this distinction was the observation that sentences can be intuitively ‘grammatical’ while being difficult to understand, and conversely.”


Townsend & Bever (2001, ch. 2)

  • “…Despite this distinction the syntactic model had great appeal as a model of the processes we carry out when we talk and listen. It was tempting to postulate that the theory of what we know is a theory of what we do, thus answering two questions simultaneously.
    1. What do we know when we know a language?
    2. What do we do when we use what we know?”


Townsend & Bever (2001, ch. 2)

  • “…It was assumed that this knowledge is linked to behavior in such a way that every syntactic operation corresponds to a psychological process. The hypothesis linking language behavior and knowledge was that they are identical.”


Miller (1962)

1. Mary hit Mark.               K(ernel)
2. Mary did not hit Mark.       N
3. Mark was hit by Mary.        P
4. Did Mary hit Mark?           Q
5. Mark was not hit by Mary.    NP
6. Didn’t Mary hit Mark?        NQ
7. Was Mark hit by Mary?        PQ
8. Wasn’t Mark hit by Mary?     PNQ


Miller (1962)

[Figure: the transformational cube. The eight sentence types (K, N, P, Q, NP, NQ, PQ, PNQ) sit at the vertices of a cube whose edges connect sentence types differing by a single transformation.]
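To make the cube concrete, here is a minimal sketch (not from the slides) that treats each sentence type as the set of transformations (Negative, Passive, Question) applied to the kernel, so that "distance on the cube" is just the number of transformations on which two types differ.

```python
# A sketch (not from the slides): the transformational cube as a
# three-bit feature space. Each sentence type is the set of
# transformations applied to the kernel K; distance on the cube is the
# number of transformations on which two types differ.

SENTENCE_TYPES = {
    "K":   set(),             # Mary hit Mark.
    "N":   {"N"},             # Mary did not hit Mark.
    "P":   {"P"},             # Mark was hit by Mary.
    "Q":   {"Q"},             # Did Mary hit Mark?
    "NP":  {"N", "P"},        # Mark was not hit by Mary.
    "NQ":  {"N", "Q"},        # Didn't Mary hit Mark?
    "PQ":  {"P", "Q"},        # Was Mark hit by Mary?
    "PNQ": {"P", "N", "Q"},   # Wasn't Mark hit by Mary?
}

def cube_distance(a, b):
    """Number of single transformations separating two sentence types."""
    return len(SENTENCE_TYPES[a] ^ SENTENCE_TYPES[b])

print(cube_distance("K", "P"))    # 1: add Passive
print(cube_distance("K", "NP"))   # 2: add Negative and Passive
print(cube_distance("P", "NQ"))   # 3: remove Passive, add Negative and Question
```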


Townsend & Bever (2001, ch. 2)

  • “The initial results were breathtaking. The amount of time it takes to produce a sentence, given another variant of it, is a function of the distance between them on the sentence cube (Miller & McKean 1964).”

  • “…It is hard to convey how exciting these developments were. It appeared that there was to be a continuing direct connection between linguistic and psychological research. […] The golden age had arrived.”


Townsend & Bever (2001, ch. 2)

  • “Alas, it soon became clear that either the linking hypothesis was wrong, or the grammar was wrong, or both.”


Townsend & Bever (2001, ch. 2)

  • “The moral of this experience is clear. Cognitive science made progress by separating the question of what people understand and say from how they understand and say it. The straightforward attempt to use the grammatical model directly as a processing model failed. The question of what humans know about language is not only distinct from how children learn it, it is distinct from how adults use it.”


A Simple Derivation

S (starting axiom)
1. S -> NP VP
2. VP -> V NP
3. NP -> D N
4. N -> Bill
5. V -> hit
6. D -> the
7. N -> ball

[Tree diagram, built top-down across successive slides: S is expanded to NP and VP, VP to V and NP, and the object NP to D and N; the words Bill, hit, the, and ball are then inserted, yielding [S [NP Bill] [VP [V hit] [NP [D the] [N ball]]]].]

A Simple Derivation (rebuilt left-to-right)

[The same rules and the same final tree, but assembled incrementally across successive slides in the order the words arrive: Bill, then its NP, then hit and V, the and D, ball and N, the object NP, VP, and finally S.]

Transformations

wh-movement

X wh-NP Y

1 2 3

--> 2 1 0 3


Transformations

VP-ellipsis

X VP1 Y VP2 Z

1 2 3 4 5

--> 1 2 3 0 5

condition: VP1 = VP2
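The numbered-term notation above can be read as a structural description (a segmentation of the string into numbered terms) plus a structural change (a reordering of those indices, with 0 marking a null position). Below is a minimal sketch, not from the slides; the example strings are invented, and details such as do-support and auxiliary inversion are ignored.

```python
# A sketch (not from the slides) of a transformation as a structural
# description plus a structural change: the input is segmented into
# numbered terms, and the change reorders those indices, with 0 standing
# for a null (empty) position.

def apply_transformation(terms, change):
    """Rearrange numbered terms according to the structural change."""
    return " ".join(terms[i] if i != 0 else "_" for i in change)

# wh-movement:   X  wh-NP  Y   ==>   2 1 0 3
wh_terms = {1: "you think Bill hit", 2: "who", 3: "yesterday"}
print(apply_transformation(wh_terms, [2, 1, 0, 3]))
# -> who you think Bill hit _ yesterday

# VP-ellipsis:   X  VP1  Y  VP2  Z   ==>   1 2 3 0 5   (condition: VP1 = VP2)
ve_terms = {1: "Mary", 2: "left early", 3: "and John did", 4: "left early", 5: "too"}
print(apply_transformation(ve_terms, [1, 2, 3, 0, 5]))
# -> Mary left early and John did _ too
```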


Difficulties

  • How to build structure incrementally in right-branching structures

  • How to recognize output of transformations that create nulls


Summary

  • Running the grammar ‘backwards’ is not so straightforward - problems of indeterminacy and incrementality

  • Disappointment in empirical tests of Derivational Theory of Complexity

  • Unable to account for processing of local ambiguities


Standard View

[Diagram repeated: understanding and speaking as specialized algorithms drawing on grammatical knowledge (competence), a recursive characterization of the well-formed expressions of the language.]


Grammatical Knowledge

  • How is grammatical knowledge accessed in syntactic computation for...
    (a) grammaticality judgment
    (b) understanding
    (c) speaking

  • Almost no proposals under standard view

  • This presents a serious obstacle to unification at the level of syntactic computation


Arguments for Architecture

1. Available grammars don’t make good parsing devices

2. Grammaticality ≠ Parsability

3. Failure of DTC

4. Evidence for parser-specific structure

5. Parsing/production have distinct properties

6. Possibility of independent damage to parsing/production

7. Competence/performance distinction is necessary, right?




Grammar as Parser - Problems

  • Incremental structure building with PS Rules (e.g. S -> NP VP)

    • delay

    • prediction/guessing

  • Indeterminacy (how to recover nulls created by transformations)


Grammar as Parser - Solutions

  • Lexicalized grammars make incremental structure-building much easier (available in HPSG, minimalism, LFG, Categorial Grammar, etc.)

[Tree diagram for 'sat on the rug', built with the rules VP -> V PP and PP -> P NP: [VP [V sat] [PP [P on] [NP the rug]]].]


Grammar as Parser - Solutions

  • Lexicalized grammars make incremental structure-building much easier (available in HPSG, minimalism, LFG, Categorial Grammar, etc.)

[The same tree, now projected from lexical entries rather than phrase-structure rules: 'sit' carries the subcategorization frame comp: __ P, and 'on' carries comp: __ N, so [VP [V sat] [PP [P on] [NP the rug]]] can be built word by word.]
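Below is a minimal sketch (not from the slides) of what word-by-word structure building driven by lexical frames might look like. The lexicon, category labels, and attachment policy are invented for illustration; multiword NPs are treated as single chunks, and the final step of joining the subject NP and the VP into S is omitted.

```python
# A sketch (not from the slides) of incremental structure building driven
# by lexical subcategorization frames. Each entry names the phrase the
# word projects and the complement it still expects; an incoming phrase
# attaches to the most recent head waiting for that category.

LEXICON = {                   # word: (projected phrase, expected complement)
    "the cat": ("NP", None),
    "sat":     ("VP", "PP"),  # sit: comp __ P(P)
    "on":      ("PP", "NP"),  # on:  comp __ N(P)
    "the rug": ("NP", None),
}

def build(words):
    expecting = []                       # open phrases still waiting for a complement
    roots = []                           # top-level phrases built so far
    for w in words:
        phrase, wants = LEXICON[w]
        node = [phrase, w]
        if expecting and expecting[-1][1] == phrase:
            head, _ = expecting.pop()    # satisfy the most recent open expectation
            head.append(node)
        else:
            roots.append(node)           # nothing is expecting this phrase yet
        if wants:
            expecting.append((node, wants))
        print(f"after {w!r}: {roots}")
    return roots

build(["the cat", "sat", "on", "the rug"])
# after 'the cat': [['NP', 'the cat']]
# after 'sat': [['NP', 'the cat'], ['VP', 'sat']]
# after 'on': [['NP', 'the cat'], ['VP', 'sat', ['PP', 'on']]]
# after 'the rug': [['NP', 'the cat'], ['VP', 'sat', ['PP', 'on', ['NP', 'the rug']]]]
```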


Grammar as Parser - Solutions

  • Problem of seeking nulls in movement structures


Transformations

wh-movement

X wh-NP Y

1 2 3

--> 2 1 0 3


Transformations

VP-ellipsis

X VP1 Y VP2 Z

1 2 3 4 5

--> 1 2 3 0 5

condition: VP1 = VP2


Grammar as Parser - Solutions

  • Problem of seeking nulls in movement structures

  • …becomes problem of seeking licensing features for displaced phrases, e.g. for wh-phrase, seek Case assigner and thematic role assigner.

  • Requirement to find licensing features is a basic component of all syntactic composition


Incremental Structure Building

  • An investigation of the grammatical consequences of incremental, left-to-right structure building









Incremental Structure Building

[Abstract tree diagrams, built up across two slides with nodes A, B, C and then D, illustrating the key point: a constituent formed earlier is destroyed by the addition of new material.]





Incremental Structure Building

[Tree diagrams for 'the cat sat on the rug', built word by word. At the point where only 'the cat sat on' has been built, [sat on] is a temporary constituent, which is destroyed as soon as the NP [the rug] is added.]


Incremental Structure Building

Conflicting Constituency Tests

Verb + Preposition sequences can undergo coordination…

(1) The cat sat on and slept under the rug.

…but cannot undergo pseudogapping (Baltin & Postal, 1996)

(2) *The cat sat on the rug and the dog did the chair.



Incremental Structure Building

[Tree diagrams for 'the cat sat on and slept under ...', built word by word: coordination of [sat on] and [slept under] applies early, before the V+P constituent is destroyed.]



Incremental Structure Building

[Tree diagrams for 'the cat sat on the rug and the dog did ...', built word by word: pseudogapping applies too late, after the V+P constituent [sat on] has already been destroyed.]


Incremental Structure Building

  • Constituency Problem: different diagnostics of constituency frequently yield conflicting results

  • Incrementality Hypothesis
    (a) Structures are assembled strictly incrementally
    (b) Syntactic processes see a ‘snapshot’ of a derivation - they target constituents that are present when the process applies
    (c) Conflicts reflect the simple fact that different processes have different linear properties

  • Applied to interactions among binding, movement, ellipsis, prosodic phrasing, clitic placement, islands, etc. (Phillips 1996, in press; Richards 1999, 2000; Guimaraes 1999; etc.)
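A minimal sketch (not from the slides) of the 'snapshot' idea: hand-coded parse states for 'the cat sat on the rug' after each word, with a check of whether the string [sat on] forms a constituent at that point. The intermediate structures and labels are simplified stand-ins for the slides' trees.

```python
# A sketch (not from the slides): snapshots of the incremental parse,
# with a constituency check at each step. Labels are simplified.

SNAPSHOTS = [
    ("the cat",                ["NP", "the cat"]),
    ("the cat sat",            ["S", ["NP", "the cat"], ["V", "sat"]]),
    ("the cat sat on",         ["S", ["NP", "the cat"], ["?", ["V", "sat"], ["P", "on"]]]),
    ("the cat sat on the rug", ["S", ["NP", "the cat"],
                                     ["VP", ["V", "sat"],
                                            ["PP", ["P", "on"], ["NP", "the rug"]]]]),
]

def yields(node, out):
    """Record the word-string yield of every subtree in 'out'; return this node's yield."""
    if isinstance(node, str):
        return node
    words = " ".join(yields(child, out) for child in node[1:])
    out.add(words)
    return words

for prefix, tree in SNAPSHOTS:
    spans = set()
    yields(tree, spans)
    print(f"after '{prefix}': [sat on] a constituent? {'sat on' in spans}")
# Only the intermediate snapshot 'the cat sat on' treats [sat on] as a constituent,
# which is why early processes (coordination) can target it while late ones
# (pseudogapping) cannot.
```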


Interim Conclusion

  • Grammatical derivations look strikingly like the incremental derivations of a parsing system

  • But we want to be explicit about this, so...


Computational Modeling

(Schneider 1999; Schneider & Phillips, 1999)


Arguments for Architecture

1. Available grammars don’t make good parsing devices

2. Grammaticality ≠ Parsability

3. Failure of DTC

4. Evidence for parser-specific structure

5. Parsing/production have distinct properties

6. Possibility of independent damage to parsing/production

7. Competence/performance distinction is necessary, right?


Townsend & Bever (2001, ch. 2)

  • “Linguists made a firm point of insisting that, at most, a grammar was a model of competence - that is, what the speaker knows. This was contrasted with effects of performance, actual systems of language behaviors such as speaking and understanding. Part of the motive for this distinction was the observation that sentences can be intuitively ‘grammatical’ while being difficult to understand, and conversely.”


Grammaticality ≠ Parsability

  • “It is straightforward enough to show that sentence parsing and grammaticality judgments are different. There are sentences which are easy to parse but ungrammatical (e.g. that-trace effects), and there are sentences which are extremely difficult to parse, but which may be judged grammatical given appropriate time for reflection (e.g. multiply center embedded sentences). This classic argument shows that parsing and grammar are not identical, but it tells us very little about just how much they have in common.”

    (Phillips, 1995)


Grammaticality ≠ Parsability

  • Grammatical sentences that are hard to parse

    • The cat the dog the rat bit chased fled

    • John gave the man the dog bit a sandwich

  • Ungrammatical sentences that are understandable

    • Who do you think that left?

    • The children is happy

    • The millionaire donated the museum a painting


Grammaticality ≠ Parsability

  • Grammatical sentences that are hard to parse

    • The cat the dog the rat bit chased fled

    • John gave the man the dog bit a sandwich

  • Can arise independent of grammar

    • Resource (memory) limitations

    • Incorrect choices in ambiguity


(Preliminary)

  • Incomplete structural dependencies have a cost (that’s what makes center embedding hard)


A Contrast

(Gibson 1998)

  • Relative Clause within a Sentential Complement (RC within SC):

    The fact [CP that the employee [RC who the manager hired] stole office supplies] worried the executive.

  • Sentential Complement within a Relative Clause (SC within RC):

    #The executive [RC who the fact [CP that the employee stole office supplies] worried] hired the manager.

  • RC within SC is easier to process than SC within RC

  • Compare the embedded portions: [SC that the employee [RC who the manager hired] stole …] vs. [RC who the fact [SC that the employee stole office supplies] worried]

  • The contrast is motivated by off-line complexity ratings


Grammaticality ≠ Parsability

  • Ungrammatical sentences that are understandable

    • Who do you think that left?

    • The children is happy

    • The millionaire donated the museum a painting

  • System can represent illegal combinations (e.g. categories are appropriate, but feature values are inappropriate)

  • The fact that understandable errors are (i) diagnosable and (ii) nearly grammatical should not be overlooked


Grammaticality ≠ Parsability

  • Are the parser’s operations fully grammatically accurate?


Standard View

[Diagram repeated: understanding and speaking as specialized algorithms, 'well-adapted to real-time operation but maybe inaccurate', drawing on grammatical knowledge (competence), a recursive characterization of the well-formed expressions of the language.]


Grammatical Accuracy in Parsing

  • The grammar looks rather like a parser

  • BUT, does the parser look like a grammar? I.e., are the parser’s operations fully grammatically accurate at every step… even in situations where such accuracy appears quite difficult to achieve?

(Phillips & Wong 2000)


Self-Paced Reading

[Moving-window display, shown one frame per slide; each frame reveals the next word and masks the rest:]

-- --- ------- ------- ---- --- ----.
We --- ------- ------- ---- --- ----.
-- can ------- ------- ---- --- ----.
-- --- measure ------- ---- --- ----.
-- --- ------- reading ---- --- ----.
-- --- ------- ------- time --- ----.
-- --- ------- ------- ---- per ----.
-- --- ------- ------- ---- --- word.

(e.g. Phillips & Wong 2000)
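A minimal sketch (not from the slides) of the logic of a moving-window self-paced reading trial: one word is visible at a time, and the latency between key presses is recorded as that word's reading time. Real experiments use dedicated presentation software; this is an illustration only.

```python
# A sketch (not from the slides) of a moving-window self-paced reading trial.

import time

def mask(words, visible_index):
    """Show one word; replace every other word with dashes of the same length."""
    return " ".join(w if i == visible_index else "-" * len(w)
                    for i, w in enumerate(words))

def run_trial(sentence):
    words = sentence.split()
    reading_times = []
    for i in range(len(words)):
        print(mask(words, i))
        t0 = time.perf_counter()
        input()                                  # participant presses Enter to advance
        reading_times.append(time.perf_counter() - t0)
    return list(zip(words, reading_times))

if __name__ == "__main__":
    for word, rt in run_trial("We can measure reading time per word."):
        print(f"{word:>10}  {rt * 1000:6.0f} ms")
```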


Grammatical Accuracy in Parsing

Wh-Questions

[Built up across several slides: starting from 'Englishmen cook wonderful dinners', the object is replaced by a wh-phrase ('Englishmen cook what'), which is then fronted, leaving a gap in object position: 'What do Englishmen cook (gap)'.]

(Phillips & Wong 2000)


Grammatical Accuracy in Parsing

Long-distance Wh-Questions

[Built up across several slides:]

Few people think that anybody realizes that Englishmen cook wonderful dinners.
Few people think that anybody realizes that Englishmen cook what
What do few people think that anybody realizes that Englishmen cook (gap)

(Phillips & Wong 2000)


Grammatical Accuracy in Parsing

‘Parasitic Gaps’

The plan to remove the equipment ultimately destroyed the building.

[Annotated across several slides: 'the equipment' is the direct object NP of the embedded clause ('to remove the equipment') inside the subject NP; 'the building' is the direct object NP of the main clause.]

(Phillips & Wong 2000)


Grammatical Accuracy in Parsing

‘Parasitic Gaps’

[Built up across several slides:]

What did the plan to remove the equipment ultimately destroy (gap)   [gap in main-clause object position]
What did the plan to remove (gap) ultimately destroy the building   [gap inside the subject NP]

(Phillips & Wong 2000)


Grammatical Accuracy in Parsing

‘Parasitic Gaps’

What did [Subject the plan to remove (gap) ] ultimately destroy the building

Island Constraint
A wh-phrase cannot be moved out of a subject.

(Phillips & Wong 2000)


Grammatical Accuracy in Parsing

‘Parasitic Gaps’

What did the plan [Infinitive to remove (parasitic gap) ] ultimately destroy (gap)

Generalization: the good gap ‘rescues’ the bad gap

(Phillips & Wong 2000)


Grammatical Accuracy in Parsing

‘Parasitic Gaps’

What did the plan [Finite that removed (gap) ] ultimately destroy (gap)

Revised Generalization (informal)
Only mildly bad gaps can be rescued by good gaps.


Grammaticality Ratings

Ratings from 50 subjects


Grammatical Accuracy in Parsing

A ‘Look-Ahead’ Problem

What did the plan [Infinitive to remove (gap) ] ultimately destroy (gap)

The good gap rescues the bad gap

BUT

The bad gap appears before the good gap … a look-ahead problem


Grammatical Accuracy in Parsing

A ‘Look-Ahead’ Problem

What did the plan [Infinitive to remove …] ultimately destroy …

Question
When the parser reaches the embedded verb (‘remove’), does it construct a dependency - even though the gap would be a ‘bad’ gap?


Grammatical Accuracy in Parsing

A ‘Look-Ahead’ Problem

Infinitive:  What did the plan to remove … ultimately destroy …   [positing a gap at ‘remove’ is Risky]
Finite:      What did the plan that removed … ultimately destroy …   [positing a gap at ‘removed’ is Reckless]



Grammatical Accuracy in Parsing

Question

What do speakers do when they get to the verb embedded inside the subject NP?

(i) RISKY: create a gap in infinitival clauses only - violates a constraint, but may be rescued

(ii) RECKLESS: create a gap in all clause types - violates a constraint; cannot be rescued

(iii) CONSERVATIVE: do not create a gap

(Phillips & Wong 2000)


Grammatical Accuracy in Parsing

Materials

a. … what … infinitival verb ...   [infinitive, gap ok]   <- a gap here is RISKY
b. … whether … infinitival verb ...   [infinitive, no gap]
c. … what … finite verb ...   [finite, gap not ok]   <- a gap here is RECKLESS
d. … whether … finite verb ...   [finite, no gap]

(Phillips & Wong 2000)



Grammatical Accuracy in Parsing

Materials

a. The outspoken environmentalist worked to investigate what the local campaign to preserve the important habitats had actually harmed in the area that the birds once used as a place for resting while flying south. [infinitive, gap]

b. …whether the local campaign to preserve… [infinitive, no gap]

c. …what the local campaign that preserved… [finite, gap]

d. …whether the local campaign that preserved … [finite, no gap]

(Phillips & Wong 2000)
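The four conditions above form a 2x2 design: extraction type (what vs. whether) crossed with the clause type inside the subject NP (infinitival vs. finite relative). A minimal sketch (not from the slides) of how the conditions are generated from a shared frame, using item (a) above; the factor labels follow the slides.

```python
# A sketch (not from the slides): generating the 2x2 materials by crossing
# extraction type with clause type inside the subject NP.

from itertools import product

FRAME = ("The outspoken environmentalist worked to investigate {wh} the local "
         "campaign {clause} the important habitats had actually harmed in the "
         "area that the birds once used as a place for resting while flying south.")

CLAUSE = {"infinitive": "to preserve", "finite": "that preserved"}

for wh, (clause_label, clause) in product(["what", "whether"], CLAUSE.items()):
    gap_status = "gap" if wh == "what" else "no gap"    # 'whether' items contain no gap
    print(f"[{clause_label}, {gap_status}]", FRAME.format(wh=wh, clause=clause))
```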


Grammatical Accuracy in Parsing

Infinitive:  What did the plan to remove … ultimately destroy …   [Risky]
Finite:      What did the plan that removed … ultimately destroy …   [Reckless]

(Phillips & Wong 2000)



Grammatical Accuracy in Parsing

Conclusion

  • Structure-building is extremely grammatically accurate, even when the word-order of a language is not cooperative

  • Constraints on movement are violated in exactly the environments where the grammar allows the violation to be forgiven (may help to explain discrepancies in past studies)

  • Such accuracy is required if grammatical computation is to be understood as real-time on-line computation


Arguments for Architecture

1. Available grammars don’t make good parsing devices

2. Grammaticality ≠ Parsability

3. Failure of DTC

4. Evidence for parser-specific structure

5. Parsing/production have distinct properties

6. Possibility of independent damage to parsing/production

7. Competence/performance distinction is necessary, right?


Derivational Theory of Complexity

  • ‘The psychological plausibility of a transformational model of the language user would be strengthened, of course, if it could be shown that our performance on tasks requiring an appreciation of the structure of transformed sentences is some function of the nature, number and complexity of the grammatical transformations involved.’ (Miller & Chomsky 1963: p. 481)


Miller (1962)

1. Mary hit Mark.               K(ernel)
2. Mary did not hit Mark.       N
3. Mark was hit by Mary.        P
4. Did Mary hit Mark?           Q
5. Mark was not hit by Mary.    NP
6. Didn’t Mary hit Mark?        NQ
7. Was Mark hit by Mary?        PQ
8. Wasn’t Mark hit by Mary?     PNQ

[Figure: the transformational cube, repeated from earlier.]


Derivational Theory of Complexity

  • Miller & McKean (1964): matching sentences with the same meaning or ‘kernel’

  • Joe warned the old woman. (K) / The old woman was warned by Joe. (P)   1.65 s

  • Joe warned the old woman. (K) / Joe didn’t warn the old woman. (N)   1.40 s

  • Joe warned the old woman. (K) / The old woman wasn’t warned by Joe. (PN)   3.12 s
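A minimal sketch (not from the slides) of the DTC prediction these numbers bear on: matching time should grow with the number of transformations separating the two sentences, i.e. their distance on the transformational cube. The times are the ones quoted above.

```python
# A sketch (not from the slides): matching times versus transformational distance.

TRANSFORMS = {"K": set(), "N": {"N"}, "P": {"P"}, "PN": {"P", "N"}}
OBSERVED_SECONDS = {("K", "P"): 1.65, ("K", "N"): 1.40, ("K", "PN"): 3.12}

for (a, b), seconds in OBSERVED_SECONDS.items():
    distance = len(TRANSFORMS[a] ^ TRANSFORMS[b])
    print(f"{a} / {b}: {distance} transformation(s), {seconds:.2f} s")
# Under the DTC, the two-transformation pair (K / PN) should be slowest,
# which is what the quoted times show.
```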


McMahon (1963)

a. i. seven precedes thirteen K (true)

ii. thirteen precedes seven K (false)

b. i. thirteen is preceded by seven P (true)

ii. seven is preceded by thirteen P (false)

c. i. thirteen does not precede seven N (true)

ii. seven does not precede thirteen N (false)

d. i. seven is not preceded by thirteen PN (true)

ii. thirteen is not preceded by seven PN (false)


Easy Transformations

  • Passive

    • The first shot the tired soldier the mosquito bit fired missed.

    • The first shot fired by the tired soldier bitten by the mosquito missed.

  • Heavy NP Shift

    • I gave a complete set of the annotated works of H.H. Munro to Felix.

    • I gave to Felix a complete set of the annotated works of H.H. Munro.

  • Full Passives

    • Fido was kissed (by Tom).

  • Adjectives

    • The {red house/house which is red} is on fire.


Failure of DTC?

  • Any DTC-like prediction is contingent on a particular theory of grammar, which may be wrong

  • It’s not surprising that transformations are not the only contributor to perceptual complexity

    • memory demands, which a transformation may increase or decrease

    • ambiguity, where the grammar does not help

    • difficulty of access


Arguments for Architecture

1. Available grammars don’t make good parsing devices

2. Grammaticality ≠ Parsability

3. Failure of DTC

4. Evidence for parser-specific structure

5. Parsing/production have distinct properties

6. Possibility of independent damage to parsing/production

7. Competence/performance distinction is necessary, right?


Garden Paths & Temporary Ambiguity

  • The horse raced past the barn fell.

  • Weapons test scores a hit.

  • John gave the man the dog bit a sandwich.

  • Grammar can account for the existence of global ambiguities (e.g. ‘Visiting relatives can be boring’), but not local ambiguities … since the grammar does not typically assemble structure incrementally


Garden Paths & Temporary Ambiguity

  • Ambiguity originally studied as a test of solutions to the incrementality problem

  • Heuristics & Strategies (e.g. Bever, 1970)

    • NP V => subject verb

    • V NP => verb object

    • V NP NP => verb object object

  • Garden paths used as evidence for effects of heuristics


Garden Paths & Temporary Ambiguity

  • Heuristics & Strategies

    • NP V => subject verb
      The horse raced past the barn fell

    • V NP => verb object
      The student knew the answer was wrong

    • V NP NP => verb object object
      John gave the man the dog bit a sandwich
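A minimal sketch (not from the slides) of how a template-matching heuristic of this kind mis-handles a garden-path sentence. The category tags and the greedy matching policy are invented for illustration.

```python
# A sketch (not from the slides): Bever-style perceptual templates applied
# greedily to a part-of-speech sequence.

HEURISTICS = [
    (("NP", "V", "NP", "NP"), "verb + object + object"),
    (("NP", "V", "NP"),       "verb + object"),
    (("NP", "V"),             "subject + verb"),
]

def apply_heuristics(tagged):
    """Match the longest heuristic template at the start of the tagged input."""
    cats = tuple(cat for _, cat in tagged)
    for template, relation in HEURISTICS:
        if cats[:len(template)] == template:
            span = " ".join(word for word, _ in tagged[:len(template)])
            return f"'{span}' -> {relation}"
    return "no template matched"

garden_path = [("the horse", "NP"), ("raced", "V"), ("past the barn", "PP"), ("fell", "V")]
print(apply_heuristics(garden_path))
# -> 'the horse raced' -> subject + verb
# The reduced-relative reading ('the horse [that was] raced past the barn')
# is never considered, so the final verb 'fell' cannot be integrated.
```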


Ambiguity Resolution

  • Observation: ‘heuristics’ miss a generalization about how ambiguities are preferentially resolved

  • Kimball (1973): seven principles of surface structure parsing (e.g. Right Association)

  • Frazier (1978), Fodor & Frazier (1978): Minimal Attachment, Late Closure

  • Various others, much controversy...


Ambiguity Resolution

  • Assumptions

    • grammatical parses are accessed (unclear how)

    • simplest analysis of ambiguity chosen (uncontroversial)

    • structural complexity affects simplicity (partly controversial)

    • structural complexity determines simplicity (most controversial)


Ambiguity Resolution

  • Relevance to architecture of language

    • Comprehension-specific heuristics which compensate for the inadequacy of the grammar imply an independent system

    • Comprehension-specific notions of structural complexity are compatible with an independent system

  • If the grammar says nothing about ambiguity, and structural complexity is irrelevant to ambiguity resolution, as some argue, then ambiguity is irrelevant to the question of parser-grammar relations.


Arguments for Architecture

1. Available grammars don’t make good parsing devices

2. Grammaticality ≠ Parsability

3. Failure of DTC

4. Evidence for parser-specific structure

5. Parsing/production have distinct properties

6. Possibility of independent damage to parsing/production

7. Competence/performance distinction is necessary, right?


Parsing ≠ Production

  • Parsing generates meaning from form

  • Production generates form from meaning

  • Different ‘bottlenecks’ in the two areas

    • garden paths in comprehension

    • word-category constraint in production errors

    • etc., etc.

  • Lexical access: speaking and recognizing words differ, but do we assume that this reflects different systems?

  • Contemporary production theories are now incremental structure-building systems, more similar to comprehension models


Arguments for Architecture

1. Available grammars don’t make good parsing devices

2. Grammaticality ≠ Parsability

3. Failure of DTC

4. Evidence for parser-specific structure

5. Parsing/production have distinct properties

6. Possibility of independent damage to parsing/production

7. Competence/performance distinction is necessary, right?


Competence & Performance

  • Different kinds of formal systems: ‘competence systems’ and ‘performance systems’

  • The difference between what a system can generate given unbounded resources and what it can generate given bounded resources

  • The difference between a cognitive system and its behavior


Competence & Performance

(1) It’s impossible to deny the distinction between cognitive states and actions, the distinction between knowledge and its deployment.

(2) How to distinguish ungrammatical-but-comprehensible examples (e.g. John speaks fluently English) from hard-to-parse examples.

(3) How to distinguish garden-path sentences (e.g. The horse raced past the barn fell) from ungrammatical sentences.

(4) How to distinguish complexity-overload sentences (e.g. The cat the dog the rat chased saw fled) from ungrammatical sentences.


Competence & Performance

“It is straightforward enough to show that sentence parsing and grammaticality judgments are different. There are sentences which are easy to parse but ungrammatical (e.g. that-trace effects), and there are sentences which are extremely difficult to parse, but which may be judged grammatical given appropriate time for reflection (e.g. multiply center embedded sentences). This classic argument shows that parsing and grammar are not identical, but it tells us very little about just how much they have in common.”

(Phillips, 1995)

This argument is spurious!


Summary

  • Motivation for combining learning theories with theories of adult knowledge is well understood; much more evidence is needed.

  • Theories of comprehension and production have long been thought to be independent of ‘competence’ models. In fact, combining them is quite feasible; if so, it becomes possible to investigate linguistic knowledge in real time.