
CSA350: NLP Algorithms

Sentence Parsing I
  • The Parsing Problem
  • Parsing as Search
  • Top Down/Bottom Up Parsing Strategies

References
  • This lecture is based on material found in
    • Jurafsky & Martin chapter 10
  • Relevant material available from Vince.

Why not use FS techniques for parsing NL sentences?
  • Descriptive Adequacy
    • Some NL phenomena cannot be described within a finite-state framework.
    • Example: central embedding
  • Notational Adequacy
    • The elegance with which the notation describes real-world objects. Elegance implies:
      • Notation which allows short descriptions.
      • Notation which exploits similarities between different structures and permits general properties to be stated.
      • Representation of dependency and hierarchy.

Central Embedding
  • The following sentences (the indices pair each subject noun with its verb)
    • The cat₁ spat₁
    • The cat₁ the boy₂ saw₂ spat₁
    • The cat₁ the boy₂ the girl₃ liked₃ saw₂ spat₁
  • Require at least a grammar of the form S → AⁿBⁿ
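
As an illustration (a sketch of my own, not from the slides; the name anbn is invented): once we move to a context-free, DCG-style formalism, the AⁿBⁿ pattern that defeats finite-state machinery takes only two rules.

% anbn//0 accepts exactly the strings aⁿbⁿ, for n ≥ 0.
anbn --> [].
anbn --> [a], anbn, [b].

% ?- phrase(anbn, [a, a, b, b]).   % succeeds
% ?- phrase(anbn, [a, b, b]).      % fails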

DCG-style Grammar/Lexicon

s --> np, vp.
s --> aux, np, vp.
s --> vp.
np --> det, nom.
nom --> noun.
nom --> noun, nom.
nom --> nom, pp.
pp --> prep, np.
np --> pn.
vp --> v.
vp --> v, np.

det --> [that];[this];[a].
noun --> [book];[flight];[meal];[money].
v --> [book];[include];[prefer].
aux --> [does].
prep --> [from];[to];[on].
pn --> ['Houston'];['TWA'].
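
Assuming the rules above are loaded into a standard Prolog system (SWI-Prolog is assumed here), the grammar can be exercised directly with phrase/2. The queries below are a minimal sketch, not part of the original slides.

% Recognise the imperative "book that flight" (via s --> vp and vp --> v, np):
% ?- phrase(s, [book, that, flight]).                     % succeeds
% Recognise "does this flight include a meal" (via s --> aux, np, vp):
% ?- phrase(s, [does, this, flight, include, a, meal]).   % succeeds
% Caution: asking for further solutions can backtrack into the
% left-recursive rule nom --> nom, pp and loop; this is the
% left-recursion problem noted later under "Top Down Parsing - Remarks".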

Parse Tree

A valid parse tree for a grammar G is a tree

    • whose root is the start symbol for G
    • whose interior nodes are nonterminals of G
    • in which the children of each node T (taken from left to right) correspond to the symbols on the right-hand side of some production for T in G.
    • whose leaf nodes are terminal symbols of G.
  • Every sentence generated by a grammar has a corresponding valid parse tree
  • Every valid parse tree exactly covers a sentence generated by the grammar
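
A standard DCG idiom makes this correspondence concrete: give each nonterminal an extra argument so that a successful parse also returns the valid parse tree as a Prolog term. The fragment below is a sketch of my own (only the rules needed for "book that flight" are shown), not part of the lecture material.

s(s(NP, VP))     --> np(NP), vp(VP).
s(s(VP))         --> vp(VP).
np(np(Det, Nom)) --> det(Det), nom(Nom).
nom(nom(N))      --> noun(N).
vp(vp(V, NP))    --> v(V), np(NP).
vp(vp(V))        --> v(V).

det(det(that))   --> [that].
noun(n(flight))  --> [flight].
v(v(book))       --> [book].

% ?- phrase(s(Tree), [book, that, flight]).
% Tree = s(vp(v(book), np(det(that), nom(n(flight))))).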

Parsing Problem

Given grammar G and sentence A, find all valid parse trees for G that exactly cover A.

For example, the parse tree for "book that flight":

[S [VP [V book] [NP [Det that] [Nom [N flight]]]]]

Soundness and Completeness
  • A parser is sound if every parse tree it returns is valid.
  • A parser is complete for grammar G if for all s ∈ L(G)
    • it terminates
    • it produces the corresponding parse tree
  • For many purposes, we settle for sound but incomplete parsers

Parsing as Search
  • Search within a space defined by
    • Start State
    • Goal State
    • State to state transformations
  • Two distinct parsing strategies:
    • Top down
    • Bottom up
  • Different parsing strategy, different state space, different problem.
  • Parsing strategy ≠ search strategy
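
This separation can be made concrete with a tiny Prolog skeleton (a sketch; goal/1 and transform/2 are hypothetical predicates that a particular parsing strategy would supply). The parsing strategy fixes the state space through transform/2 and goal/1; the search strategy is the order in which alternatives are explored, here simply Prolog's own depth-first backtracking.

% search(+State): succeed if a goal state is reachable from State.
search(State) :-
    goal(State).
search(State) :-
    transform(State, Next),
    search(Next).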

Top Down
  • Each state is a tree (which encodes the current state of the parse).
  • A top-down parser tries to build the tree from the root node S down to the leaves, replacing each node that bears a non-terminal label with the RHS of a corresponding grammar rule.
  • Nodes with pre-terminal (word class) labels are compared to input words.

Top Down Search Space

[Figure (not reproduced): the top-down search space, from the start node to the goal node.]

Bottom Up
  • Each state is a forest of trees.
  • Start node is a forest of nodes labelled with pre-terminal categories (word classes derived from lexicon)
  • Transformations look for places where RHS of rules can fit.
  • Any such place is replaced with a node labelled with LHS of rule.
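
A minimal bottom-up recogniser along these lines can be written in a few lines of Prolog. This is my own illustration, not the lecture's implementation: it tracks only category symbols rather than a forest of trees, assumes the grammar is stored as rule(LHS, RHS) facts, and assumes there are no empty or cyclic unary right-hand sides (otherwise the termination problem noted later arises).

% A toy rule set sufficient for the pre-terminal string of "book that flight".
rule(s,   [np, vp]).
rule(s,   [vp]).
rule(np,  [det, nom]).
rule(nom, [noun]).
rule(vp,  [v, np]).
rule(vp,  [v]).

% reduce(+Forest0, -Forest): find a place where the RHS of some rule
% fits and replace that stretch of the forest by the rule's LHS.
reduce(Forest0, Forest) :-
    append(Prefix, Rest, Forest0),
    rule(LHS, RHS),
    append(RHS, Suffix, Rest),
    append(Prefix, [LHS|Suffix], Forest).

% bu_recognise(+Forest): keep reducing until only the start symbol is left.
bu_recognise([s]).
bu_recognise(Forest0) :-
    reduce(Forest0, Forest),
    bu_recognise(Forest).

% ?- bu_recognise([v, det, noun]).   % pre-terminals of "book that flight": succeeds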

Bottom Up Search Space

[Figure (not reproduced): the bottom-up search space.]

Top Down vs Bottom Up: General

Top down
  • For: never wastes time exploring trees that cannot be derived from S.
  • Against: can generate trees that are not consistent with the input.

Bottom up
  • For: never wastes time building trees that cannot lead to input text segments.
  • Against: can generate subtrees that can never lead to an S node.

Top Down Parsing - Remarks
  • Top-down parsers do well if there is useful grammar driven control: search can be directed by the grammar.
  • Left recursive rules can cause problems.
  • A top-down parser will do badly if there are many different rules for the same LHS. Consider if there are 600 rules for S, 599 of which start with NP, but one of which starts with V, and the sentence starts with V.
  • Top-down is unsuitable for rewriting parts of speech (preterminals) with words (terminals). In practice that is always done bottom-up as lexical lookup.
  • Useless work: expands constituents that are possible top-down but not actually present in the input.
  • Repeated work: anywhere there is common substructure

Bottom Up Parsing - Remarks
  • Empty categories: termination problem unless rewriting of empty constituents is somehow restricted (but then it’s generally incomplete)
  • Inefficient when there is great lexical ambiguity (grammar driven control might help here)
  • Conversely, it is data-directed: it attempts to parse the words that are there.
  • Both TD (LL) and BU (LR) parsers can do work exponential in the sentence length on NLP problems
  • Useless work: locally possible, but globally impossible.
  • Repeated work: anywhere there is common substructure

Development of a Concrete Strategy
  • Combine best features of both top down and bottom up strategies.
    • Top down, grammar directed control.
    • Bottom up filtering.
  • Examination of alternatives in parallel uses too much memory.
  • Depth first strategy using agenda-based control.
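
One way to realise a depth-first strategy with agenda-based control is sketched below. This is my own illustration, not the lecture's implementation: a state pairs the symbols still to be expanded with the remaining input, the agenda is a list of states, and pushing successor states onto the front of the agenda (LIFO) gives depth-first search. The grammar and lexicon are assumed to be stored as rule(LHS, RHS) and word(Category, Word) facts.

% Toy grammar and lexicon for "book that flight".
rule(s,   [np, vp]).    rule(s,   [vp]).
rule(np,  [det, nom]).  rule(nom, [noun]).
rule(vp,  [v, np]).     rule(vp,  [v]).

word(det, that).  word(noun, flight).  word(v, book).

% agenda_parse(+Words): top-down, depth-first recognition under
% explicit agenda-based control.
agenda_parse(Words) :-
    process([state([s], Words)]).

% process(+Agenda): pop the first state; stop if it is a goal state
% (nothing left to expand, no input left), otherwise push all of its
% successor states onto the front of the agenda.
process([state([], []) | _]).
process([State | Agenda0]) :-
    findall(Next, step(State, Next), Nexts),
    append(Nexts, Agenda0, Agenda),
    process(Agenda).

% step(+State0, -State): match a pre-terminal against the next input
% word, or expand the leftmost nonterminal with a grammar rule
% (top-down, grammar-directed control).
step(state([Sym|Syms], [Word|Words]), state(Syms, Words)) :-
    word(Sym, Word).
step(state([Sym|Syms], Words), state(NewSyms, Words)) :-
    rule(Sym, RHS),
    append(RHS, Syms, NewSyms).

% ?- agenda_parse([book, that, flight]).   % succeeds

Note that this sketch has no bottom-up filtering: it will happily expand s --> np, vp even though the first input word, book, cannot begin an NP in this lexicon.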

Top Down Algorithm
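
The algorithm itself appeared as a figure on the original slide and is not reproduced in this transcript. As a stand-in, here is a minimal sketch of a top-down, depth-first, left-to-right recogniser written directly as a Prolog predicate (my own formulation, not necessarily the figure's); it reuses the rule/2 and word/2 facts from the agenda sketch above.

% td_recognise(+Symbols, +Words0, -Words): expand the leftmost symbol.
% A pre-terminal must match the next input word; a nonterminal is
% replaced by the right-hand side of one of its rules.  Depth-first
% search comes for free from Prolog's backtracking.
td_recognise([], Words, Words).
td_recognise([Sym|Syms], [Word|Words], Rest) :-
    word(Sym, Word),
    td_recognise(Syms, Words, Rest).
td_recognise([Sym|Syms], Words, Rest) :-
    rule(Sym, RHS),
    append(RHS, Syms, NewSyms),
    td_recognise(NewSyms, Words, Rest).

% ?- td_recognise([s], [book, that, flight], []).   % succeeds

With this rule order, the first attempt expands s --> np, vp and then np --> det, nom before det is compared with the first word, book, and fails; that behaviour is exactly the inefficiency discussed on the next slide.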

A Problem with the Algorithm
  • Note that the first three steps of the parse involve a failed attempt to expand the first rule, S → NP VP.
  • The parser recursively expands the leftmost NT of this rule (NP).
  • While all this work is going on, the input is not even consulted!
  • Only when a terminal symbol is encountered is the input compared and the failure discovered.
  • This is pretty inefficient.
