
Generalized Inference with Multiple Semantic Role Labeling Systems



  1. Generalized Inference with Multiple Semantic Role Labeling Systems Peter Koomen, Vasin Punyakanok, Dan Roth, (Scott) Wen-tau Yih Department of Computer Science, University of Illinois at Urbana-Champaign

  2. Outline • System Architecture • Pruning • Argument Identification • Argument Classification • Inference [main difference from other systems] • Inference with Multiple Systems • The same approach used by the SRL system to ensure a coherent output is applied to the input produced by multiple systems.

  3. System Architecture • Identify argument candidates • Pruning • Argument Identifier • Binary classification • Classify argument candidates • Argument Classifier • Multi-class classification • Inference • Use the estimated probability distribution given by the argument classifier, and • Expressive structural and linguistic constraints. • Infer the optimal global output – modeled as a constrained optimization problem
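
A minimal sketch of how these stages could be chained; every function name below is an illustrative placeholder rather than the authors' actual code.

```python
# Illustrative pipeline skeleton; all function names are placeholders,
# not the authors' implementation.
def label_semantic_roles(sentence, parse_tree, predicate):
    candidates = prune(parse_tree, predicate)                # heuristic candidate generation
    candidates = [c for c in candidates
                  if is_argument(c, sentence, predicate)]    # binary argument identifier
    scores = {c: classify_argument(c, sentence, predicate)   # label probability estimates
              for c in candidates}
    return ilp_inference(scores)                             # globally coherent labeling
```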

  4. Pruning [Xue & Palmer, 2004] • Significant errors due to PP attachment • Consider PP as attached to both NP and VP
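
A minimal sketch of the Xue & Palmer-style candidate collection that the pruning stage builds on; the tree interface (`label`, `children`, `parent`) is an assumption, and the PP case is the part the modified pruning is concerned with.

```python
# Sketch of Xue & Palmer (2004)-style pruning; the Node interface
# (.label, .children, .parent) is assumed for illustration.
def collect_candidates(predicate_node):
    candidates = []
    node = predicate_node
    while node.parent is not None:
        for sister in node.parent.children:
            if sister is node:
                continue
            candidates.append(sister)
            # PP attachment is a frequent error source, so a PP sister also
            # contributes its immediate children as candidates.
            if sister.label == "PP":
                candidates.extend(sister.children)
        node = node.parent
    return candidates
```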

  5. Modified Pruning

  6. Argument Identification • Argument identifier is trained with a phrase-based classifier. • Learning Algorithm – SNoW • A sparse network of linear classifiers • Weight update: a regularized variation of the Winnow multiplicative update rule • When probability estimation is needed, we use softmax
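
A small sketch of the softmax step mentioned above, turning the raw activations of the linear classifiers into a probability distribution (the activation values are made up for illustration).

```python
import math

def softmax(activations):
    """Convert raw classifier activations into a probability distribution."""
    m = max(activations)                      # subtract max for numerical stability
    exps = [math.exp(a - m) for a in activations]
    total = sum(exps)
    return [e / total for e in exps]

# e.g. activations of the binary argument identifier for (NULL, ARG)
print(softmax([0.3, 1.7]))                    # -> roughly [0.20, 0.80]
```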

  7. Argument Identification (Features) • Parse tree structure from Collins & Charniak’s parsers • Clauses, chunks and POS tags are from UPC processors

  8. Argument Classification • Similar to argument identification, using SNoW as a multi-class classifier • Classes also include NULL

  9. Inference • Occasionally, the output of the argument classifier violates some constraints. • The inference procedure [Punyakanok et al., 2004] • Input: the probability estimates (from the argument classifier) and the structural and linguistic constraints • Output: the best legitimate global predictions • Formulated as an optimization problem and solved via Integer Linear Programming. • Allows incorporating expressive (non-sequential) constraints on the variables (the argument types).

  10. Integer Linear Programming Inference • For each argument candidate a_i • Set up a Boolean variable a_{i,t} indicating whether a_i is classified as type t • Goal is to maximize Σ_{i,t} score(a_i = t) · a_{i,t} • Subject to the (linear) constraints • Any Boolean constraint can be encoded this way. • If score(a_i = t) = P(a_i = t), the objective is to find the assignment that maximizes the expected number of correct arguments while satisfying the constraints
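
A minimal sketch of this ILP using the PuLP library; `scores`, `labels`, and `overlapping_pairs` are assumed inputs, and only the no-overlap constraint from the next slides is shown here.

```python
import pulp

def ilp_inference(scores, labels, overlapping_pairs):
    """scores[i][t] is the estimated probability that candidate i has label t."""
    prob = pulp.LpProblem("srl_inference", pulp.LpMaximize)
    x = {(i, t): pulp.LpVariable(f"x_{i}_{t}", cat="Binary")
         for i in scores for t in labels}
    # Objective: maximize the expected number of correct arguments.
    prob += pulp.lpSum(scores[i][t] * x[i, t] for i in scores for t in labels)
    # Each candidate receives exactly one label (possibly NULL).
    for i in scores:
        prob += pulp.lpSum(x[i, t] for t in labels) == 1
    # Overlapping or embedding candidates cannot both be real arguments.
    for i, j in overlapping_pairs:
        prob += x[i, "NULL"] + x[j, "NULL"] >= 1
    prob.solve()
    return {i: next(t for t in labels if x[i, t].value() > 0.5) for i in scores}
```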

  11. Constraints • No overlapping or embedding arguments • If a_i and a_j overlap or embed: a_{i,NULL} + a_{j,NULL} ≥ 1

  12. Constraints • No overlapping or embedding arguments • No duplicate argument classes for A0-A5 • Exactly one V argument per predicate • If there is a C-V, there must be a V-A1-C-V pattern • If there is an R-arg, there must be an arg somewhere • If there is a C-arg, there must be an arg somewhere before it • Each predicate can take only core arguments that appear in its frame file. • More specifically, we check only the minimum and maximum ids
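
Continuing the PuLP sketch above, a few of these constraints written as linear inequalities over the same `x` variables (the label names, and the use of A0/R-A0 as the example referential pair, are assumptions for illustration).

```python
# Assumes the same x[(i, t)] binary variables and `scores`/`prob` as above,
# and that the label set contains A0-A5, V, and R-A0.
core_labels = ["A0", "A1", "A2", "A3", "A4", "A5"]

# No duplicate argument classes for A0-A5: each core label is used at most once.
for t in core_labels:
    prob += pulp.lpSum(x[i, t] for i in scores) <= 1

# Exactly one V argument per predicate.
prob += pulp.lpSum(x[i, "V"] for i in scores) == 1

# If any candidate is labeled R-A0, some candidate must be labeled A0.
for j in scores:
    prob += pulp.lpSum(x[i, "A0"] for i in scores) >= x[j, "R-A0"]
```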

  13. Results

  14. Inference with Multiple Systems • The performance of SRL heavily depends on the very first stage – pruning [IJCAI 2005] • which is derived directly from the full parse trees • Joint Inference allows improvement over semantic role labeling classifiers • Combine different SRL systems through joint inference • Systems are derived using different full parse trees

  15. Inference with Multiple Systems • Multiple Systems • Train and test with Collins' parse outputs • Train with Charniak's best parse outputs • Test with Charniak's 5-best parse outputs

  16. Naïve Joint Inference • Example: "..., traders say, unable to cool the selling panic in both stocks and futures." • [Figure: argument candidates proposed by two systems over this sentence, labeled a1-a4 for one system and b1-b3 for the other, covering spans such as "traders" and "the selling panic in both stocks and futures"]

  17. Joint Inference – Phantom Candidates • [Figure: candidates a1-a4 and b1-b4 aligned across the two systems; spans proposed by only one system become phantom candidates for the other and receive default priors]
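
A sketch of how the joint score over the union of candidates from several systems might be formed, with systems that never proposed a span ("phantom" candidates) contributing a default prior; the simple averaging and the prior value are illustrative assumptions, not the exact settings of the paper.

```python
DEFAULT_NULL_PRIOR = 0.6   # illustrative: a phantom span is probably not an argument

def default_prior(t, labels):
    """Default distribution assigned by a system that did not propose the span."""
    if t == "NULL":
        return DEFAULT_NULL_PRIOR
    return (1.0 - DEFAULT_NULL_PRIOR) / (len(labels) - 1)

def joint_scores(per_system_scores, labels):
    """per_system_scores: one {span: {label: prob}} dict per SRL system."""
    all_spans = set().union(*(s.keys() for s in per_system_scores))
    return {
        span: {
            t: sum(s[span][t] if span in s else default_prior(t, labels)
                   for s in per_system_scores) / len(per_system_scores)
            for t in labels
        }
        for span in all_spans
    }
```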

  18. Results of Joint Inference

  19. Results of Joint Inference

  20. Results of Joint Inference

  21. Results of Different Combination

  22. Conclusion • The ILP inference can naturally be extended to reason over multiple SRL systems.

  23. Thank You
