
PROBABILISTIC CFGs & PROBABILISTIC PARSING

Università di Venezia

3 October 2003


Probabilistic CFGs

  • Context-Free Grammar rules are of the form:

    • S → NP VP

  • In a Probabilistic CFG, we assign a probability to these rules:

    • S → NP VP, P(S → NP VP | S)
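The probabilities of all rules sharing a left-hand side must sum to one. For instance (illustrative values, chosen to match the Prolog fragment below):

    • VP → V NP, P = 0.7
    • VP → VP PP, P = 0.3

so that P(VP → V NP | VP) + P(VP → VP PP | VP) = 1.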


Why PCFGs?

DISAMBIGUATION: with a PCFG, probabilities can be used to choose the most likely parse

ROBUSTNESS: rather than excluding ill-formed or unusual input outright, a PCFG can simply assign it a very low probability

LEARNING: plain CFGs cannot be learned from positive data alone, whereas a PCFG's rule probabilities can be estimated from a corpus



PCFGs in Prolog (courtesy Doug Arnold)

% S → NP VP, rule probability 1.0; P0 is the probability of the whole parse
s(P0, [s,NP,VP]) --> np(P1,NP), vp(P2,VP), { P0 is 1.0*P1*P2 }.

…

% VP → V NP, rule probability 0.7
vp(P0, [vp,V,NP]) --> v(P1,V), np(P2,NP), { P0 is 0.7*P1*P2 }.
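A minimal way to run the fragment in SWI-Prolog (the lexical rules below are hypothetical, added only to make the example self-contained; their probabilities are illustrative):

% hypothetical lexical entries with illustrative probabilities
np(0.1, [np,astronomers]) --> [astronomers].
np(0.18, [np,stars]) --> [stars].
v(1.0, [v,saw]) --> [saw].

% phrase/2 parses a word list, returning the tree and its probability
?- phrase(s(P, Tree), [astronomers, saw, stars]).

which succeeds with Tree = [s,[np,astronomers],[vp,[v,saw],[np,stars]]] and P ≈ 0.0126 (= 1.0 × 0.1 × 0.7 × 1.0 × 0.18).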



Independence assumptions

PCFGs specify a language model, just like n-grams

As with n-grams, we again need to make independence assumptions: the probability of a subtree is assumed to be independent of:

  • where in the sentence the words it dominates occur (place invariance)
  • everything in the tree outside the subtree (context-freeness)
  • how its root nonterminal was derived (ancestor-freeness)
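Under these assumptions, the probability of a parse tree t is simply the product of the probabilities of the rules used to build it:

P(t) = Π P(A → B | A), taken over all the rule applications A → B in t

This is exactly what the Prolog fragment above computes, one multiplication per node.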



Using PCFGs to disambiguate: “Astronomers saw stars with ears”
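As a worked reconstruction, using the toy grammar given for this sentence in Manning and Schütze (chapter 11, see Readings):

S → NP VP (1.0), VP → V NP (0.7), VP → VP PP (0.3), NP → NP PP (0.4), PP → P NP (1.0),
NP → astronomers (0.1), NP → stars (0.18), NP → ears (0.18), V → saw (1.0), P → with (1.0)

  • PP attached inside the NP (stars [with ears]):
    P(t1) = 1.0 × 0.1 × 0.7 × 1.0 × 0.4 × 0.18 × 1.0 × 1.0 × 0.18 = 0.0009072
  • PP attached to the VP ([saw stars] [with ears]):
    P(t2) = 1.0 × 0.1 × 0.3 × 0.7 × 1.0 × 0.18 × 1.0 × 1.0 × 0.18 = 0.0006804

Since P(t1) > P(t2), the PCFG prefers the NP attachment as the most likely parse.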




Parsing with PCFGs: A comparison with HMMs

An HMM defines a REGULAR GRAMMAR: all of its rules have the form X → a Y or X → a, where X and Y are states and a is an output symbol.
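For instance (one standard construction, given here only as an illustration): a transition from state X to state Y that emits symbol a corresponds to the rule X → a Y with probability P(a | X) · P(Y | X), so string probabilities are again products of rule probabilities. Such rules can only grow a string from left to right, whereas CFG rules build nested constituents.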


Parsing with CFGs: A comparison with HMMs


Inside and outside probabilities (cf. forward and backward probabilities for HMMs)
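In the standard notation of Manning and Schütze (chapter 11), where N^j_pq means that nonterminal N^j dominates the words w_p … w_q:

  • Outside probability: α_j(p,q) = P(w_1 … w_(p-1), N^j_pq, w_(q+1) … w_m), the probability of generating the words outside the span together with an N^j covering positions p through q
  • Inside probability: β_j(p,q) = P(w_p … w_q | N^j_pq), the probability that N^j derives exactly the words from p through q

Like the forward and backward probabilities of an HMM, both can be computed recursively (inside bottom-up over increasing span length, outside top-down), and their products yield the expected rule counts used by the Inside-Outside re-estimation algorithm.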









Learning probabilities

Read off the rules used in the treebank analyses

Estimate probabilities by relative frequency: P(A → B) = C(A → B) / C(A)
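For example (with illustrative counts): if VP occurs 10,000 times among the treebank analyses and 7,000 of those occurrences expand as VP → V NP, then P(VP → V NP | VP) = 7000 / 10000 = 0.7, which is the value used in the Prolog fragment above.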


Probabilistic lexicalised PCFGs (Collins, 1997; Charniak, 2000)




Readings

  • Manning and Schütze, chapters 11 and 12


Acknowledgments

  • Some slides and the Prolog code are borrowed from Doug Arnold

  • Thanks also to Chris Manning & Diego Molla

