Presentation Transcript

Incrementally Learning Parameter of Stochastic CFG using Summary Stats

Written by: Brent Heeringa

Tim Oates

Goals:
  • To learn the syntax of utterances

Approach:

  • SCFG (Stochastic Context-Free Grammar)

M = <V, E, R, S>

V - finite set of non-terminals

E - finite set of terminals

R - finite set of rules; each rule r has a probability p(r).

The probabilities p(r) of rules with the same left-hand side sum to 1

S - start symbol
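
To make the definition concrete, here is a minimal Python sketch of the tuple M = <V, E, R, S>; the toy grammar at the bottom is an illustrative assumption, not an example from the paper.

```python
# Minimal sketch of an SCFG M = <V, E, R, S>; the toy grammar is an assumption.
from dataclasses import dataclass

@dataclass
class SCFG:
    nonterminals: set   # V - finite set of non-terminals
    terminals: set      # E - finite set of terminals
    rules: dict         # R - maps (lhs, rhs_tuple) to the probability p(r)
    start: str          # S - start symbol

    def probabilities_ok(self):
        # Probabilities of rules sharing a left-hand side must sum to 1.
        totals = {}
        for (lhs, _rhs), p in self.rules.items():
            totals[lhs] = totals.get(lhs, 0.0) + p
        return all(abs(t - 1.0) < 1e-9 for t in totals.values())

grammar = SCFG(
    nonterminals={"S", "NP", "VP"},
    terminals={"dogs", "bark"},
    rules={("S", ("NP", "VP")): 1.0,
           ("NP", ("dogs",)): 1.0,
           ("VP", ("bark",)): 1.0},
    start="S",
)
assert grammar.probabilities_ok()
```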

Problems with most SCFG Learning Algorithms

1) Expensive storage: a corpus of complete sentences must be stored

2) Time-consuming: the algorithm needs to make repeated passes over all of the data

Learning SCFG
  • Inducing context-free structure from a corpus (sentences)
  • Learning the production (rule) probabilities
Learning SCFG - Cont.
General method: Inside/Outside algorithm

Expectation-Maximization (EM)

Find the expected counts of the rules

Maximize the likelihood given both the expected counts and the corpus

Disadvantage of the Inside/Outside algorithm:

The entire sentence corpus must be stored in some representation (e.g., a chart parse)

Expensive storage (unrealistic for a human agent!)

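For concreteness, here is a minimal sketch of the inside (chart) pass that Inside/Outside builds its expectations on, for a toy grammar in Chomsky Normal Form. The grammar, probabilities, and sentence are illustrative assumptions; the outside pass and the EM re-estimation step are omitted.

```python
# Minimal sketch of the inside pass for a CNF grammar (toy data, assumptions).
from collections import defaultdict

rules = {
    ("S", ("NP", "VP")): 1.0,   # binary rules: (A, (B, C)) -> p
    ("NP", ("dogs",)): 0.4,     # lexical rules: (A, (w,)) -> p
    ("NP", ("cats",)): 0.6,
    ("VP", ("bark",)): 1.0,
}

def inside_probability(words, rules, start="S"):
    n = len(words)
    beta = defaultdict(float)   # beta[(i, j, A)] = P(A derives words[i..j])
    for i, w in enumerate(words):               # base case: A -> w
        for (A, rhs), p in rules.items():
            if rhs == (w,):
                beta[(i, i, A)] += p
    for span in range(2, n + 1):                # recursion: A -> B C
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):
                for (A, rhs), p in rules.items():
                    if len(rhs) == 2:
                        B, C = rhs
                        beta[(i, j, A)] += p * beta[(i, k, B)] * beta[(k + 1, j, C)]
    return beta[(0, n - 1, start)]

print(inside_probability(["dogs", "bark"], rules))  # 0.4
```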
Proposed Algorithm
  • Use Unique Normal Form (UNF)
    • Replace each terminal rule A -> z with 2 new rules (as sketched below)
      • A -> D, with p[A -> D] = p[A -> z]
      • D -> z, with p[D -> z] = 1
    • No two productions have the same right-hand side
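
A minimal sketch of the UNF rewrite described above, using the rule representation from the earlier SCFG sketch; the fresh non-terminal naming D_z is an assumption.

```python
# Sketch of the UNF step: each rule A -> z (z a terminal) becomes
# A -> D_z with the same probability, plus D_z -> z with probability 1.
def to_unf(rules, terminals):
    new_rules = {}
    for (lhs, rhs), p in rules.items():
        if len(rhs) == 1 and rhs[0] in terminals:
            d = "D_" + rhs[0]                 # fresh non-terminal for terminal z
            new_rules[(lhs, (d,))] = p        # A -> D_z,  p[A -> D_z] = p[A -> z]
            new_rules[(d, (rhs[0],))] = 1.0   # D_z -> z,  p[D_z -> z] = 1
        else:
            new_rules[(lhs, rhs)] = p
    return new_rules
```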
Learning SCFG - Proposed Algorithm - Cont.
  • Use histograms
    • Each rule r has 2 histograms (HOr, HLr)
Proposed Algorithm - Cont.
    • HOr - constructed when parsing the sentences in O
    • HLr - continually updated throughout the learning process
  • HLr is rescaled to a fixed size h
    • Why?!
    • Recently used rules have more impact on the histogram (see the sketch after this list)
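
One plausible reading of the rescaling step is sketched below: HLr is taken here to record, per rule, how often the rule was used in recent sentences, and after each update it is scaled back to a fixed total mass h so that recently used rules dominate. The exact bookkeeping in the paper may differ; the function and parameter names are assumptions.

```python
# Hedged sketch: keep HL_r at a fixed total mass h so old counts decay
# and recently used rules have more impact on the histogram.
def update_hl(hl, times_used_in_last_sentence, h=50.0):
    """hl: dict mapping a usage count k to its (decayed) mass, for one rule r."""
    k = times_used_in_last_sentence
    hl[k] = hl.get(k, 0.0) + 1.0          # record the latest sentence
    total = sum(hl.values())
    if total > h:                         # rescale back down to size h
        scale = h / total
        for key in hl:
            hl[key] *= scale
    return hl
```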
Comparing HLr and HOr
  • Compare the two histograms using relative entropy
  • If T decreases - increase the probabilities of the rules used
    • (if s is large, increase the probabilities of the rules used when parsing the last sentence)
  • If T increases - decrease the probabilities of the rules used

(e.g., pt+1(r) = 0.01 · pt(r))
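
A hedged sketch of the comparison and update step: the relative entropy between the normalized histograms HLr and HOr is computed, and the probabilities of the rules used in the last parse are nudged up or down (and renormalized per left-hand side) depending on how it changed. The function names, the multiplicative update, and the handling of empty bins are assumptions, not the paper's exact procedure.

```python
# Hedged sketch: relative entropy between HL_r and HO_r, plus a multiplicative
# probability update; the constants and the update rule are assumptions.
import math

def relative_entropy(p_hist, q_hist):
    """KL(P || Q) over bins present in both normalized histograms (zeros skipped)."""
    p_total, q_total = sum(p_hist.values()), sum(q_hist.values())
    d = 0.0
    for k, v in p_hist.items():
        p = v / p_total
        q = q_hist.get(k, 0.0) / q_total
        if p > 0 and q > 0:
            d += p * math.log(p / q)
    return d

def adjust_probabilities(rules, used_rules, entropy_dropped, factor=0.01):
    """Nudge the probabilities of rules used in the last parse, then renormalize."""
    for r in used_rules:
        rules[r] *= (1 + factor) if entropy_dropped else (1 - factor)
    totals = {}
    for (lhs, _rhs), p in rules.items():
        totals[lhs] = totals.get(lhs, 0.0) + p
    for (lhs, rhs) in rules:
        rules[(lhs, rhs)] /= totals[lhs]
    return rules
```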

Comparing the Inside/Outside Algorithm with the Proposed Algorithm

Inside/Outside:

  • Complexity: O(n³)
  • Good: needs only 3-5 iterations
  • Bad: must store the complete sentence corpus

Proposed Algorithm:

  • Complexity: O(n³)
  • Bad: needs 500-1000 iterations
  • Good: memory requirement is constant!