
Posterior Regularization for Structured Latent Variable Models


Presentation Transcript

Outline

- Motivation and Introduction
- Posterior Regularization
- Application
- Implementation
- Some Related Frameworks

Motivation and Introduction

Prior Knowledge

We possess a wealth of prior knowledge about most NLP tasks.

Motivation and Introduction -- Prior Knowledge

Motivation and Introduction

Leveraging Prior Knowledge

Possible approaches and their limitations

Motivation and Introduction -- Limited Approaches

Bayesian Approach: Encode prior knowledge with a prior on the parameters.

- Limitation: Our prior knowledge is not about parameters!
- Parameters are difficult to interpret, so it is hard to achieve the desired effect.

Motivation and Introduction -- Limited Approaches

Augmenting the Model: Encode prior knowledge with additional variables and dependencies.

- Limitation: May make exact inference intractable.

Posterior Regularization

- A declarative language for specifying prior knowledge
  - Constraint features and expectations
- Methods for learning with knowledge in this language
  - An EM-style learning algorithm
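As a sketch of the "declarative language" (following the notation of the PR paper rather than anything visible on this slide), prior knowledge is stated as bounds on the expectations of constraint features under the posterior over latent variables:

```latex
\mathcal{Q} \;=\; \bigl\{\, q(\mathbf{y}) \;:\; \mathbb{E}_{q}\bigl[\boldsymbol{\phi}(\mathbf{x}, \mathbf{y})\bigr] \le \mathbf{b} \,\bigr\}
```

Here $\boldsymbol{\phi}$ are the constraint features and $\mathbf{b}$ the desired bounds; learning then keeps the model's posterior close to this set.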

Posterior Regularization

Original Objective:
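The equation image on this slide did not survive extraction. In the original PR paper the objective being regularized is the penalized marginal log-likelihood, and posterior regularization subtracts a KL projection onto the constraint set $\mathcal{Q}$ (reconstructed here from the paper's notation, not from the slide):

```latex
\mathcal{L}(\theta) \;=\; \log p(\theta) \;+\; \sum_{\mathbf{x}} \log \sum_{\mathbf{y}} p_\theta(\mathbf{x}, \mathbf{y})
\qquad
J_{\mathcal{Q}}(\theta) \;=\; \mathcal{L}(\theta) \;-\; \min_{q \in \mathcal{Q}} \mathrm{KL}\bigl(q(\mathbf{y}) \,\big\|\, p_\theta(\mathbf{y} \mid \mathbf{x})\bigr)
```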

Posterior Regularization

EM-style learning algorithm
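The slide's own figure is missing; as a sketch from the standard PR derivation, the algorithm alternates a KL projection onto $\mathcal{Q}$ with the usual M-step:

```latex
\text{E-step:}\quad q^{t+1} \;=\; \operatorname*{arg\,min}_{q \in \mathcal{Q}} \mathrm{KL}\bigl(q(\mathbf{y}) \,\big\|\, p_{\theta^t}(\mathbf{y} \mid \mathbf{x})\bigr)
\qquad
\text{M-step:}\quad \theta^{t+1} \;=\; \operatorname*{arg\,max}_{\theta}\; \log p(\theta) \;+\; \mathbb{E}_{q^{t+1}}\bigl[\log p_\theta(\mathbf{x}, \mathbf{y})\bigr]
```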

Posterior Regularization

Computing the Posterior Regularizer
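The projection in the E-step is computed in the dual: with a single inequality constraint $\mathbb{E}_q[\phi] \le b$, the optimum has the form $q(y) \propto p(y \mid x)\exp(-\lambda\,\phi(y))$ for a dual variable $\lambda \ge 0$ maximizing a concave dual objective. A minimal sketch for a toy discrete posterior (function and variable names here are illustrative, not from the PR toolkit):

```python
import math

def project_posterior(p, phi, b, lr=0.5, steps=2000):
    """Minimize KL(q || p) subject to E_q[phi] <= b, for a discrete
    distribution p over len(p) outcomes, via projected gradient
    ascent on the dual variable lam >= 0.  The optimal q has the
    form q(y) proportional to p(y) * exp(-lam * phi(y))."""
    lam = 0.0
    for _ in range(steps):
        # posterior induced by the current dual variable
        w = [p[y] * math.exp(-lam * phi[y]) for y in range(len(p))]
        z = sum(w)
        q = [wi / z for wi in w]
        # dual gradient is E_q[phi] - b; ascend, then clip at lam >= 0
        grad = sum(qi * fi for qi, fi in zip(q, phi)) - b
        lam = max(0.0, lam + lr * grad)
    return q

# toy posterior over 3 latent outcomes; the feature fires only on outcome 0,
# whose unconstrained probability 0.7 exceeds the bound b = 0.5
q = project_posterior([0.7, 0.2, 0.1], [1.0, 0.0, 0.0], b=0.5)
```

At the optimum the active constraint is met with equality, so `q[0]` is driven down to (approximately) the bound `0.5` while the relative proportions of the unconstrained outcomes are preserved.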

Application

One feature for each source word m that counts how many times m is aligned to a target word in the alignment y.
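This degree feature can be computed from a set of alignment links as follows (a toy sketch with illustrative names); bounding its expectation discourages a single source word from collecting many target words:

```python
def degree_features(alignment, n_source):
    """For each source position m, count how many target positions are
    aligned to it; alignment is a set of (target i, source j) links."""
    counts = [0] * n_source
    for i, j in alignment:
        counts[j] += 1
    return counts

# three-word source sentence; source word 0 collects two target words
links = {(0, 0), (1, 0), (2, 2)}
print(degree_features(links, 3))  # prints [2, 0, 1]
```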

Application

Define a feature for each target-source position pair (i, j). The feature has expectation zero if the word pair (i, j) is aligned with equal probability in both directions.
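A sketch of the resulting expectation check (illustrative names; in the actual model the constraint is imposed jointly on the posteriors of the two directional aligners): the feature fires +1 under the forward model's link posterior and -1 under the backward model's, so its expectation is the difference of the two posterior link probabilities.

```python
def symmetry_expectations(p_fwd, p_bwd):
    """Expectation of the +1/-1 symmetry feature for each pair (i, j):
    p_fwd[i][j] - p_bwd[i][j], which is zero exactly when both
    directional models assign the link the same probability."""
    return [[pf - pb for pf, pb in zip(row_f, row_b)]
            for row_f, row_b in zip(p_fwd, p_bwd)]

p_fwd = [[0.9, 0.1], [0.2, 0.8]]
p_bwd = [[0.7, 0.3], [0.2, 0.8]]
exp = symmetry_expectations(p_fwd, p_bwd)
# entries where the two directions agree (second row) are exactly zero
```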

Application

Learning Tractable Word Alignment Models with Complex Constraints (Computational Linguistics, 2010)

Application

- Six language pairs
- Both types of constraints improve over the HMM in terms of both precision and recall
- Improvements over the HMM range from 10% to 15%
- S-HMM performs slightly better than B-HMM overall
- S-HMM performs better than B-HMM in 10 out of 12 cases
- Improves over IBM Model 4 in 9 out of 12 cases

Application

Implementation

- http://code.google.com/p/pr-toolkit/

Some Related Frameworks
