LING 408/508: Computational Techniques for Linguists
Lecture 27, 10/29/2012
Outline
  • Cocke-Kasami-Younger (CKY) algorithm
  • Backpointers for CKY
  • Bottom-up parsing
  • Structural ambiguity
  • Short assignment #17
CKY demo
  • http://www.diotavelli.net/people/void/demos/cky.html
  • (The orientation of the table is different from that shown in class)
Pseudocode for CKY

1. Base case: width-1 cells (the lower diagonal of the table)

    for i = 0 to N-1:
        for each rule A → w, where w is the word spanning (i, i+1):
            add A to table[i, i+1]

2. Inductive case: the rest of the table

    for w = 2 to N:
        for i = 0 to N-w:
            for k = 1 to w-1:
                if:
                    A → B C is a rule, and
                    B ∈ table[i, i+k], and
                    C ∈ table[i+k, i+w]
                then:
                    add A to table[i, i+w]

3. If S ∈ table[0, N], return True

N = length of the input sentence

w = width of the constituent we are looking for

i = starting position of the possible constituent

k = width of the left sub-constituent (the split point is at i+k)

Three nested loops over positions, so O(N³)
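
For concreteness, here is a minimal Python sketch of this recognizer. The function name, the grammar representation (dicts mapping words and nonterminal pairs to candidate parent labels), and the mini-grammar at the bottom are illustrative assumptions, not code from the course.

    from collections import defaultdict

    def cky_recognize(words, lexical_rules, binary_rules):
        """Minimal CKY recognizer sketch.
        lexical_rules: word -> set of nonterminals A with A -> word
        binary_rules:  (B, C) -> set of nonterminals A with A -> B C
        Returns True iff 'S' is in table[0, N]."""
        n = len(words)
        table = defaultdict(set)            # (i, j) -> nonterminals spanning words i..j

        # 1. Base case: width-1 cells along the diagonal
        for i, word in enumerate(words):
            table[(i, i + 1)] |= lexical_rules.get(word, set())

        # 2. Inductive case: build wider cells from pairs of smaller ones
        for width in range(2, n + 1):
            for i in range(0, n - width + 1):
                for k in range(1, width):
                    for B in table[(i, i + k)]:
                        for C in table[(i + k, i + width)]:
                            table[(i, i + width)] |= binary_rules.get((B, C), set())

        # 3. Accept iff the start symbol covers the whole sentence
        return 'S' in table[(0, n)]

    # Hypothetical mini-grammar for "i shot an elephant" (cf. example 3 below)
    lexical = {'i': {'NP'}, 'shot': {'V'}, 'an': {'DT'}, 'elephant': {'N'}}
    binary = {('NP', 'VP'): {'S'}, ('V', 'NP'): {'VP'}, ('DT', 'N'): {'NP'}}
    print(cky_recognize('i shot an elephant'.split(), lexical, binary))   # True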

Return value
  • 3. If S ∈ table[0, N], return True
  • The algorithm as presented so far only tells us whether a successful parse of the entire sentence exists.
  • We want to return the parse tree itself (next section)
Outline
  • Cocke-Kasami-Younger (CKY) algorithm
  • Backpointers for CKY
  • Bottom-up parsing
  • Structural ambiguity
  • Short assignment #17
Backpointers
  • Each cell will store a dictionary, where:
    • A key is a constituent label (a string)
      • Nonterminals such as N, DT, NP, VP, S, etc.
      • A cell can have multiple keys because a word sequence can be ambiguous between constituent categories
    • A value is a list of tuples
      • Each tuple is a pair of span tuples (two integers each), indicating the two constituents that combine to form the higher-level constituent
      • It is a list because there may be multiple ways to combine two constituents into the same higher-level constituent
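
As a concrete sketch of this convention, the inductive-step update could record a backpointer as shown below; the helper name add_entry and its argument order are illustrative assumptions, not course code.

    def add_entry(table, i, k, j, parent):
        """Record that `parent` spans (i, j), built from children spanning (i, k) and (k, j)."""
        cell = table.setdefault((i, j), {})      # each cell is a dict: label -> list of backpointers
        cell.setdefault(parent, []).append(((i, k), (k, j)))

    # Example 1 below then corresponds to:
    #   add_entry(table, 0, 1, 5, 'S')   =>   table[(0, 5)] == {'S': [((0, 1), (1, 5))]}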
Backpointers: example 1
  • S → NP VP
  • There is an NP in cell (0, 1) and a VP in cell (1, 5). Therefore, there is an S in cell (0, 5).
  • Cell (0, 5) stores: { 'S': [ ((0, 1), (1, 5)) ] }
Backpointers: example 2 (see next slide for illustration)
  • VP → V NP | VP PP
  • There is a V in cell (1, 2) and an NP in cell (2, 7). Therefore, there is a VP in cell (1, 7).
  • There is a VP in cell (1, 4) and a PP in cell (4, 7). Therefore, there is a VP in cell (1, 7).
  • Cell (1, 7) stores: { 'VP': [ ((1, 2), (2, 7)), ((1, 4), (4, 7)) ] }

CKY chart for "i shot an elephant in my pajamas" (word boundaries 0-7). The cells shown include:

    row 0 (i):        NP in (0, 1);  S in (0, 7)
    row 1 (shot):     V in (1, 2);   VP in (1, 4);  VP in (1, 7)
    row 2 (an):       DT in (2, 3);  NP in (2, 4);  NP in (2, 7)
    row 3 (elephant): N in (3, 4);   X in (3, 7)
    row 4 (in):       P in (4, 5);   PP in (4, 7)
    row 5 (my):       DT in (5, 6);  NP in (5, 7)
    row 6 (pajamas):  N in (6, 7)

Grammar:

    S → NP VP
    PP → P NP
    NP → DT N | DT X | 'i'
    X → N PP
    VP → V NP | VP PP
    DT → 'an' | 'my'
    N → 'elephant' | 'pajamas'
    V → 'shot'
    P → 'in'

The VP in (1, 7) is built in two ways:

    The span of V is (1, 2) and the span of NP is (2, 7), so the span of the VP is (1, 7).
    The span of VP is (1, 4) and the span of PP is (4, 7), so the span of the VP is (1, 7).

Ex. 3: Backpointers for "i shot an elephant"

Chart cells (word boundaries 0-4):

    (0, 1) i:        { 'NP': [] }
    (1, 2) shot:     { 'V': [] }
    (2, 3) an:       { 'DT': [] }
    (3, 4) elephant: { 'N': [] }
    (2, 4):          { 'NP': [ ((2, 3), (3, 4)) ] }
    (1, 4):          { 'VP': [ ((1, 2), (2, 4)) ] }
    (0, 4):          { 'S': [ ((0, 1), (1, 4)) ] }

Recursively follow backpointers to find the parse

Starting from 'S' in cell (0, 4) of the chart above and following the backpointers downward yields the parse tree:

    (S (NP i)
       (VP (V shot)
           (NP (DT an)
               (N elephant))))
Backpointers: example 4
  • N → shot
  • V → shot
  • The word at index 4 is 'shot'.
  • Cell (4, 5) stores: { 'N': [ ], 'V': [ ] }

  • When we recursively follow backpointers, since a cell may have multiple nonterminals, the return value should be a list of parses.
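
Below is a hedged Python sketch of that recursion. It assumes the table convention above (each cell is a dict mapping a label to a list of span-pair backpointers) and that the binary rules are available as a dict such as {'S': [('NP', 'VP')], ...}; the function name and that grammar format are assumptions for illustration, not the assignment solution.

    def build_trees(table, grammar, words, label, i, j):
        """Return all parse trees (as nested tuples) for `label` over the span (i, j).
        table:   {(i, j): {label: [((i, k), (k, j)), ...]}}   -- the convention above
        grammar: {parent: [(B, C), ...]}                      -- assumed binary-rule format"""
        if j == i + 1:                                        # width-1 cell: a lexical entry
            return [(label, words[i])]
        trees = []
        for (li, lk), (rk, rj) in set(table[(i, j)][label]):  # dedupe repeated backpointers
            for B, C in grammar.get(label, []):
                # only combine child labels that are actually present in the two sub-cells
                if B not in table.get((li, lk), {}) or C not in table.get((rk, rj), {}):
                    continue
                for left in build_trees(table, grammar, words, B, li, lk):
                    for right in build_trees(table, grammar, words, C, rk, rj):
                        trees.append((label, left, right))
        return trees

    # build_trees(table, grammar, words, 'S', 0, len(words)) returns the list of parses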
Outline
  • Cocke-Kasami-Younger (CKY) algorithm
  • Backpointers for CKY
  • Bottom-up parsing
  • Structural ambiguity
  • Short assignment #17
Top-down parsing is not directed by the input
  • Top-down parsing generates predicted trees according to the CFG rules; these predictions are ruled out only when they fail to match the symbols in the input string
  • Example:
    • S → 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
    • Suppose the input is 9. Since the parsing algorithm (as presented) pushes parser states onto the stack and tries the rule alternatives from left to right, the parser will have to construct and reject many trees before predicting the terminal 9
  • Parsing could be more efficient if the input symbol predicts which CFG rule should be used
Bottom-up parsing
  • Scan input symbols from left to right
  • As we go from left to right:
    • When there is a CFG rule that generates the input read so far, either directly or via already-constructed constituents, construct a parent node (a reduction)
  • Similar to shift-reduce parsing
Bottom-up parse of a flight left
  • S → NP VP
  • NP → DT N
  • DT → a
  • N → flight
  • VP → V
  • V → left
  • Read 'a'. Construct DT → a (i.e., shift 'a', then reduce 'a' to DT)
  • Read 'flight'. Construct N → flight
  • Pop DT and N, construct NP → DT N
  • Read 'left'. Construct V → left
  • Pop V, construct VP → V
  • Pop NP and VP, construct S → NP VP
  • S is the only symbol left on the stack and all input has been read. We are done.

Nodes constructed on the stack, in order (as nested tuples):

    (DT, a)
    (N, flight)
    (NP, (DT, a), (N, flight))
    (V, left)
    (VP, (V, left))
    (S, (NP, (DT, a), (N, flight)), (VP, (V, left)))
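
To make the walkthrough concrete, here is a small Python sketch of this greedy strategy: reduce whenever some rule's right-hand side matches the top of the stack, otherwise shift. The function name and the rule format are assumptions for illustration.

    def greedy_shift_reduce(words, rules, start='S'):
        """Eager shift-reduce recognizer sketch.
        rules: list of (lhs, rhs) pairs, e.g. ('NP', ('DT', 'N')) or ('DT', ('a',))."""
        stack, remaining = [], list(words)
        while True:
            for lhs, rhs in rules:                                 # try to reduce first
                n = len(rhs)
                if n <= len(stack) and [node[0] for node in stack[-n:]] == list(rhs):
                    children = tuple(stack[-n:])
                    del stack[-n:]
                    stack.append((lhs,) + children)                # build the parent node
                    break
            else:
                if remaining:                                      # no reduction applies: shift
                    stack.append((remaining.pop(0),))              # terminal node
                else:
                    break                                          # nothing left to shift or reduce
        return len(stack) == 1 and stack[0][0] == start

    rules = [('S', ('NP', 'VP')), ('NP', ('DT', 'N')), ('DT', ('a',)),
             ('N', ('flight',)), ('VP', ('V',)), ('V', ('left',))]
    print(greedy_shift_reduce('a flight left'.split(), rules))     # True

On 'a flight left' this accepts. With the extended grammar on the next slide, the same always-reduce-first strategy builds the S too early and then cannot attach 'boston', which is exactly the failure described next and the motivation for backtracking.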

Failure: a flight left boston
  • S → NP VP
  • NP → DT N | boston
  • DT → a
  • N → flight
  • VP → V | V NP
  • V → left
  • After scanning ‘a flight left’, we have an S. Parsing isn’t complete because there is another word.
  • Read ‘boston’, construct NP.
  • Cannot shift because no input symbols remain.

Cannot reduce, because there is no rule NONTERM → S NP.

Parsing fails.

Stack at the point of failure (bottom to top):

    (S, (NP, (DT, a), (N, flight)), (VP, (V, left)))
    (NP, boston)

Options: can either shift or reduce ('a flight left boston')
  • S → NP VP
  • NP → DT N | boston
  • DT → a
  • N → flight
  • VP → V | V NP
  • V → left
  • After scanning ‘left’ and reducing it to a V, we can either:
    • Reduce: construct VP → V
    • Shift: scan 'boston', then construct NP → boston

The two resulting stacks (bottom to top):

    After the reduction:  (NP, (DT, a), (N, flight)),  (VP, (V, left))
    After the shift:      (NP, (DT, a), (N, flight)),  (V, left),  (NP, boston)

Bottom-up parsing with backtracking: stack of stacks
  • S → NP VP
  • NP → DT N | boston
  • DT → a
  • N → flight
  • VP → V | V NP
  • V → left
  • Now we have a stack of stacks.
  • When there are multiple options for the next action, copy the current stack into new stacks, each of which proceeds with one of the possible actions
  • Always parse with the top stack.
Bottom-up parsing with backtracking: stack of stacks (continued)

After building (V, left), the top stack reduces V to VP, while the bottom stack shifts 'boston':

Stack that reduces first:

    (NP, (DT, a), (N, flight)),  (VP, (V, left))
    reduce:  (S, (NP, (DT, a), (N, flight)), (VP, (V, left)))
    shift and reduce 'boston':  (NP, boston) is built, but nothing can combine with the S
    Reject this parse; pop the stack.

Stack that shifts first:

    (NP, (DT, a), (N, flight)),  (V, left),  (NP, boston)
    reduce:  (NP, (DT, a), (N, flight)),  (VP, (V, left), (NP, boston))
    reduce:  (S, (NP, (DT, a), (N, flight)), (VP, (V, left), (NP, boston)))
    Accept this parse.
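
Here is a small Python sketch of the stack-of-stacks idea: each parser state pairs a stack with the remaining input, one successor state is pushed per possible action, and we always work on the topmost state. The function name and rule format are assumptions (the same illustrative format as the earlier sketch); it returns the first complete parse it finds and assumes the grammar has no cyclic unary rules.

    def backtracking_shift_reduce(words, rules, start='S'):
        """Shift-reduce parsing with backtracking via a stack of parser states."""
        agenda = [([], list(words))]                  # the stack of (stack, remaining input) states
        while agenda:
            stack, remaining = agenda.pop()           # always parse with the top state
            if len(stack) == 1 and stack[0][0] == start and not remaining:
                return stack[0]                       # accept: a complete parse
            successors = []
            if remaining:                             # option 1: shift the next word
                successors.append((stack + [(remaining[0],)], remaining[1:]))
            for lhs, rhs in rules:                    # option 2: every applicable reduction
                n = len(rhs)
                if n <= len(stack) and [node[0] for node in stack[-n:]] == list(rhs):
                    successors.append((stack[:-n] + [(lhs,) + tuple(stack[-n:])], remaining))
            agenda.extend(successors)                 # a state with no successors is simply dropped
        return None                                   # reject: no parse found

    rules = [('S', ('NP', 'VP')), ('NP', ('DT', 'N')), ('NP', ('boston',)),
             ('DT', ('a',)), ('N', ('flight',)), ('VP', ('V',)), ('VP', ('V', 'NP')),
             ('V', ('left',))]
    print(backtracking_shift_reduce('a flight left boston'.split(), rules))
    # prints the S tree over the whole sentence as nested tuples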

Problems of bottom-up parsing
  • Doesn’t take advantage of CFG rules to direct parse
  • Like top-down parsing, there is a potentially huge search space of possible parser actions
    • Search procedure: should you shift or reduce?
    • A parsing algorithm imposes a strategy that prioritizes shifting or reducing
    • But any such strategy can be defeated by a suitably chosen CFG and input sentence, forcing a worst-case number of parsing steps
  • Like top-down parsing, there can be repeated parsing of constituents
    • A rejected parse might build a constituent that is necessary in a correct parse.
    • The stack that leads to the correct parse might have to re-parse that constituent.
Outline
  • Cocke-Kasami-Younger (CKY) algorithm
  • Backpointers for CKY
  • Bottom-up parsing
  • Structural ambiguity
  • Short assignment #17
Structural ambiguity
  • A sentence is ambiguous if there are multiple derivations from a CFG that produce that sentence
    • Each derivation can be represented by a different phrase structure tree
  • Ambiguity is rampant in natural language
  • Can result in a number of phrase structures that grows exponentially with sentence length
Sources of ambiguity
  • Lexical ambiguity
    • fish: Noun or Verb
  • PP attachment ambiguity
    • I saw the man on the hill with a telescope
    • Multiple attachment sites for PP:

NP → NP PP

VP → VP PP

Sources of ambiguity
  • Compound nouns

NP → NN

NP → NN NN

NP → NP NP

  • Example: water meter cover screw (Berwick)
    • [water meter] [cover screw]
    • [[water meter] cover] screw
    • [water [meter cover]] screw
    • water [[meter cover] screw]
    • water [meter [cover screw]]
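
The number of such bracketings grows quickly with the length of the compound. As a small arithmetic sketch (ordinary Catalan-number counting, not course code): if NP → NP NP is the only combining rule, an n-noun compound has C(n-1) binary bracketings, so the four-word example above has 5.

    def catalan(m):
        """m-th Catalan number via the standard recurrence."""
        c = [1] + [0] * m
        for i in range(1, m + 1):
            c[i] = sum(c[j] * c[i - 1 - j] for j in range(i))
        return c[m]

    for n in (2, 3, 4, 5, 6):
        print(n, 'nouns:', catalan(n - 1), 'binary bracketings')   # 4 nouns -> 5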
Sources of ambiguity
  • Conjunctions: two constituents of the same category can be conjoined to form a larger constituent of that category

VP → VP and VP

NP → NP or NP

S → S and S

  • (Partial) sentence from a 30-million-word corpus:

Combine grapefruit with bananas, strawberries and bananas, bananas and melon balls, raspberries or strawberries and melon balls, seedless white grapes and melon balls, or pineapple cubes with orange slices...

    • The number of parses with 10 conjuncts is 103,049
More syntactic ambiguity
  • Prepositional phrases
    • They cooked the beans in the pot with handles on the stove
  • Particle vs. preposition
    • A good pharmacist dispenses with accuracy
    • The puppy tore up the staircase
  • Complement structures
    • The tourists objected to the guide they couldn’t hear
    • She knows you like the back of her hand
More syntactic ambiguity
  • Gerund vs. participial adjective
    • Visiting relatives can be boring
    • Changing schedules frequently confused passengers
  • Modifier scope within NPs
    • impractical design requirements
    • plastic cup holder
  • Multiple gap constructions
    • The chicken is ready to eat
    • The contractors are rich enough to sue
  • Coordination scope
    • Mice can squeeze into holes or cracks in the wall
Outline
  • Cocke-Kasami-Younger (CKY) algorithm
  • Backpointers for CKY
  • Bottom-up parsing
  • Structural ambiguity
  • Short assignment #17
Short assignment #17 (due 10/31)
  • Show a table with backpointers for a CKY parse of the string abab using this CFG. Follow backpointers and draw all parse trees.

S → X Y

X → A Y | a

Y → B X | b

A → a

B → b