
Using Percolated Dependencies in PBSMT

CLUKI XII: April 24, 2009. Ankit K. Srivastava and Andy Way, Dublin City University.


Presentation Transcript


  1. CLUKI XII: April 24, 2009 Using Percolated Dependencies in PBSMT Ankit K. Srivastava and Andy Way Dublin City University

  2. About

  3. Syntactic Parsing and Head Percolation

  4. Parsing I: Constituency Structure Vinken will join the board as a nonexecutive director Nov 29 (ROOT (S (NP (NNP Vinken)) (VP (MD will) (VP (VB join) (NP (DT the) (NN board)) (PP (IN as) (NP (DT a) (JJ nonexecutive) (NN director))) (NP (NNP Nov) (CD 29))))))

  5. Parsing II: Dependency Structure Vinken will join the board as a nonexecutive director Nov 29
     HEAD → DEPENDENT: join → Vinken | join → will | board → the | join → board | join → as | director → a | director → nonexecutive | as → director | 29 → Nov | join → 29

  6. Parsing III: Head Percolation • It is straightforward to convert a constituency tree into an unlabeled dependency tree (Gaifman 1965) • Use head percolation tables to identify the head child in a constituency representation (Magerman 1995) • The dependency tree is obtained by recursively applying head-child and non-head-child heuristics (Xia & Palmer 2001) • Example: for (NP (DT the) (NN board)), the table entry "NP right NN/NNP/CD/JJ" selects the rightmost NN/NNP/CD/JJ child as head, giving (NP-board (DT the) (NN board)), so "the" is dependent on "board"
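The percolation procedure described above can be sketched in Python: parse the bracketed constituency string, look up each phrase label in a head table to pick the head child, and emit (head, dependent) pairs for the non-head children. The head-table entries below are illustrative toy values (only the NP entry comes from the slide), not Magerman's (1995) actual table, and words are compared by surface string for simplicity.

```python
import re

def parse_sexpr(s):
    """Parse a bracketed constituency string into (label, children) tuples;
    preterminals look like ("NN", ["board"])."""
    tokens = re.findall(r"\(|\)|[^\s()]+", s)
    def walk(i):
        label = tokens[i + 1]          # tokens[i] == "("
        i += 2
        children = []
        while tokens[i] != ")":
            if tokens[i] == "(":
                child, i = walk(i)
                children.append(child)
            else:                      # leaf word
                children.append(tokens[i])
                i += 1
        return (label, children), i + 1
    tree, _ = walk(0)
    return tree

# Toy head-percolation table: phrase label -> (search direction, preferred labels).
# Illustrative placeholders; only the NP row is taken from the slide.
HEAD_TABLE = {
    "ROOT": ("left",  ["S"]),
    "S":    ("right", ["VP", "S"]),
    "VP":   ("left",  ["VB", "VP", "MD"]),
    "NP":   ("right", ["NN", "NNP", "CD", "JJ"]),
    "PP":   ("left",  ["IN"]),
}

def head_word(tree):
    """Return the lexical head of a subtree by recursively picking head children."""
    label, children = tree
    if len(children) == 1 and isinstance(children[0], str):
        return children[0]             # preterminal: (POS word)
    direction, prefs = HEAD_TABLE.get(label, ("left", []))
    kids = children if direction == "left" else list(reversed(children))
    for pref in prefs:
        for kid in kids:
            if kid[0] == pref:
                return head_word(kid)
    return head_word(kids[0])          # fallback: first child in search order

def dependencies(tree, deps=None):
    """Collect (head, dependent) pairs: each non-head child depends on the head."""
    if deps is None:
        deps = []
    label, children = tree
    if len(children) == 1 and isinstance(children[0], str):
        return deps
    h = head_word(tree)
    for kid in children:
        if head_word(kid) != h:
            deps.append((h, head_word(kid)))
        dependencies(kid, deps)
    return deps

tree = parse_sexpr("(NP (DT the) (NN board))")
print(dependencies(tree))  # -> [('board', 'the')]
```

Applied to the slide's example NP, the rightmost NN child "board" is chosen as head and "the" becomes its dependent, exactly as on the slide.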

  7. Parsing IV: Three Parses • Constituency (phrase-structure) parses: CON (requires a CON parser) • Dependency (head-dependent) parses: DEP (requires a DEP parser) • Percolated (head-dependent) parses: PERC (requires a CON parser + heuristics)

  8. Phrase-Based Statistical Machine Translation

  9. PBSMT I: Framework • ê = argmax_e p(e|f) = argmax_e p(f|e) · p(e) • Decoder, Translation Model, Language Model • PBSMT framework in Moses (Koehn et al., 2007) • Phrase Table in Translation Model := align words + extract phrases + score phrases • Different methods to extract phrases • Moses phrase extraction as baseline system…
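The noisy-channel decomposition on this slide amounts to ranking candidate outputs e by the product of a translation model p(f|e) and a language model p(e), usually computed in log space. A minimal sketch, with hypothetical toy log-probabilities standing in for the real models:

```python
# Hypothetical toy models for one French input with two English candidates:
# log p(f|e) from the translation model, log p(e) from the language model.
log_tm = {"the board": -1.2, "board the": -1.1}
log_lm = {"the board": -0.5, "board the": -3.0}

def best_translation(candidates):
    """argmax_e p(f|e) * p(e), computed as a sum of log-probabilities."""
    return max(candidates, key=lambda e: log_tm[e] + log_lm[e])

print(best_translation(["the board", "board the"]))  # -> the board
```

Here the language model outweighs the slightly better translation score of the disfluent candidate, which is the point of the p(f|e)·p(e) factorization.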

  10. PBSMT II: Non-syntactic Phrase Extraction • … baseline Moses • Get word alignments (src2tgt, tgt2src) • Perform grow-diag-final heuristics (Koehn et al., 2003) • Extract phrase pairs consistent with the word alignments • String-based (non-syntactic) phrases: STR

  11. PBSMT III: Syntactic Phrase Extraction • Get word alignments (src2tgt, tgt2src) • Parse src sentences • Parse tgt sentences • Use Tree Aligner to align subtree nodes (Zhechev 2009) • Extract surface-level chunks from parallel treebanks • Previously, Tinsley et al., 2007 & Hearne et al., 2008 • Syntactic phrases: CON DEP PERC

  12. System Design

  13. System I: Tools and Resources • English-French parallel corpora • Phrase Structure Parsers (En, Fr) • Dependency Structure Parsers (En, Fr) • Head Percolation tables (En, Fr) • Statistical Tree Aligner • Giza++ Word Aligner • SRILM (Language Modeling) Toolkit • Moses Decoder

  14. System II: # Entries in Phrase Tables (Europarl) • PERC is a unique knowledge source… but is it useful?

  15. System III: Combinations • Concatenate phrase tables and re-estimate probabilities • 15 different systems: Σ C(4, r) for 1 ≤ r ≤ 4, over {STR, CON, DEP, PERC}
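The count of 15 systems is just the number of non-empty subsets of the four phrase tables, which can be enumerated directly:

```python
from itertools import combinations

tables = ["STR", "CON", "DEP", "PERC"]

# All non-empty subsets: C(4,1) + C(4,2) + C(4,3) + C(4,4) = 4 + 6 + 4 + 1 = 15.
systems = [c for r in range(1, 5) for c in combinations(tables, r)]
print(len(systems))  # -> 15
```

Each subset corresponds to one MT system built by concatenating the chosen phrase tables and re-estimating the scores.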

  16. MT Systems and Evaluation

  17. Numbers I: Evaluation - JOC

  18. Numbers II: Evaluation - Europarl

  19. Numbers III: Uniquely best • Evaluate MT systems STR, CON, DEP, PERC at a per-sentence level (Translation Error Rate) • JOC (440 sentences): • Europarl (2000 sentences):

  20. Numbers IV: Adding +PERC: Europarl

  21. Analysis of Results

  22. Analysis I: STR • Using Moses baseline phrases (STR) is essential for coverage. SIZE matters! • However, adding any system to STR increases the baseline score. Symbiotic! • Hence, do not replace STR, but augment it.

  23. Analysis II: CON • Seems to be the best combination with STR (STR+CON seems to be the best-performing system) • Has the most chunks in common with PERC • Does PERC harm a CON system? Needs more analysis

  24. Analysis III: DEP • PERC chunks are different from DEP chunks, despite the representations being formally equivalent • PERC can substitute for DEP

  25. Analysis IV: PERC • Is a unique knowledge source • Sometimes it helps • Needs more work on finding its connection with CON / DEP

  26. Conclusion & Future Work

  27. Conclusion & Future Work • Extended Hearne et al., 2008 by: scaling up the data size from 7.7K to 100K; introducing percolated dependencies into PBSMT • Manual evaluation • More analysis of results • More combining strategies • Seek to determine if each chunk type “owns” certain sentence types

  28. Thanks <asrivastava @ computing.dcu.ie>
