Presentation Transcript

Posterior Regularization for Structured Latent Variable Models

Li Zhonghua

I2R SMT Reading Group

Outline
  • Motivation and Introduction
  • Posterior Regularization
  • Application
  • Implementation
  • Some Related Frameworks
Motivation and Introduction

Prior Knowledge

We possess a wealth of prior knowledge about most NLP tasks.

Motivation and Introduction

Leveraging Prior Knowledge

Possible approaches and their limitations

Motivation and Introduction -- Limited Approach

Bayesian Approach : Encode prior knowledge with a prior on parameters

  • Limitation: Our prior knowledge is not about parameters!
  • Parameters are difficult to interpret, which makes the desired effect hard to obtain.
Motivation and Introduction -- Limited Approach

Augmenting Model : Encode prior knowledge with additional variables and dependencies.

Limitation: may make exact inference intractable.

Posterior Regularization
  • A declarative language for specifying prior knowledge

-- Constraint Features & Expectations

  • Methods for learning with knowledge in this language

-- EM-style learning algorithm

Posterior Regularization

Original Objective:
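The equation on this slide did not survive extraction. As a hedged reconstruction from the standard posterior regularization formulation, the original (unregularized) objective is the marginal log-likelihood of the observed data, with the latent variables y summed out:

```latex
% Original objective: marginal log-likelihood of the observations,
% with the latent variables y marginalized out
\mathcal{L}(\theta) = \sum_{x \in \mathcal{D}} \log \sum_{y} p_\theta(x, y)
```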

Posterior Regularization

EM-style learning algorithm
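The algorithm itself is not reproduced on the slide; a hedged sketch of the standard PR variant of EM follows. The M-step is unchanged, while the E-step projects the model posterior onto the constraint set Q = {q : E_q[φ(x, y)] ≤ b}:

```latex
% Modified E-step: KL projection of the current posterior onto Q
q^{(t+1)} = \arg\min_{q \in Q} \; \mathrm{KL}\!\left( q(y) \,\middle\|\, p_{\theta^{(t)}}(y \mid x) \right)

% M-step: unchanged, but uses q in place of the exact posterior
\theta^{(t+1)} = \arg\max_{\theta} \; \mathbb{E}_{q^{(t+1)}}\!\left[ \log p_\theta(x, y) \right]
```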

Posterior Regularization

Computing the Posterior Regularizer
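The projection in the E-step is usually computed in the dual: the projected posterior has the form q_λ(y) ∝ p_θ(y | x) exp(−λ·φ(x, y)), and λ ≥ 0 maximizes the dual objective −b·λ − log Σ_y p_θ(y | x) exp(−λ·φ(x, y)). A minimal sketch for a small, enumerable latent variable, using projected gradient ascent (function names and step sizes are illustrative, not from the PR toolkit):

```python
import numpy as np

def _q_lam(p, phi, lam):
    """q_lam(y) proportional to p(y) * exp(-lam . phi(y)), normalized."""
    logq = np.log(p) - phi @ lam
    logq = logq - logq.max()          # for numerical stability
    q = np.exp(logq)
    return q / q.sum()

def project_posterior(p, phi, b, lr=0.5, steps=200):
    """Project a discrete posterior p(y) onto {q : E_q[phi] <= b}
    by projected gradient ascent on the dual variables lam >= 0.

    p   : (Y,) array, model posterior over Y enumerable outcomes
    phi : (Y, K) array, constraint feature values phi_k(y)
    b   : (K,) array, constraint bounds
    """
    lam = np.zeros(len(b))
    for _ in range(steps):
        q = _q_lam(p, phi, lam)
        grad = q @ phi - b            # dual gradient: E_q[phi] - b
        lam = np.maximum(0.0, lam + lr * grad)
    return _q_lam(p, phi, lam), lam
```

With one feature whose expectation under p exceeds the bound, the projection pulls that expectation down to the bound.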

Application

Statistical Word Alignments

IBM Model 1 and HMM
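As background, a minimal sketch of one EM iteration for IBM Model 1 (illustrative only, no NULL word; this is the unregularized baseline that PR constrains):

```python
def ibm1_em_step(bitext, t):
    """One EM iteration of IBM Model 1.

    bitext : list of (source_words, target_words) sentence pairs
    t      : dict mapping (src, tgt) -> translation prob t(tgt | src)
    Returns the re-estimated translation table.
    """
    counts, totals = {}, {}
    for src, tgt in bitext:
        for tw in tgt:
            # posterior over which source word generated tw
            norm = sum(t[(sw, tw)] for sw in src)
            for sw in src:
                post = t[(sw, tw)] / norm
                counts[(sw, tw)] = counts.get((sw, tw), 0.0) + post
                totals[sw] = totals.get(sw, 0.0) + post
    return {(sw, tw): c / totals[sw] for (sw, tw), c in counts.items()}
```

Starting from a uniform table, a few iterations already concentrate probability on consistently co-occurring word pairs.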

Application

One feature for each source word m, counting how many times it is aligned to a target word in the alignment y; bounding its expectation enforces a bijectivity-style constraint (each source word should align to at most one target word).
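A hedged sketch of how such a constraint feature could be computed for a single alignment (the function name and alignment representation are assumptions for illustration):

```python
def bijectivity_features(alignment, n_src):
    """One constraint feature per source word: how many target words
    align to it in the alignment y. In PR, bounding each feature's
    expectation (E_q[phi_k] <= 1) discourages a single source word
    from generating many target words.

    alignment : list of (i, j) source-target index pairs
    n_src     : number of source words
    """
    phi = [0] * n_src
    for i, _ in alignment:
        phi[i] += 1
    return phi
```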

Application

Define a feature for each target-source position pair (i, j). The feature has expectation zero exactly when the word pair (i, j) is aligned with equal probability in both translation directions (a symmetry constraint).
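A hedged sketch of the symmetry-style feature: it fires with opposite signs in the forward and backward models, so constraining its expectation to zero forces the two directional posteriors over each position pair to agree (names and representation are illustrative):

```python
def symmetry_features(alignment, direction):
    """One signed constraint feature per (i, j) position pair.

    direction : +1 for the forward model, -1 for the backward model.
    Requiring E_q[phi_ij] = 0 across both directions means the pair
    (i, j) must be aligned with equal probability in each direction.

    alignment : list of (i, j) position pairs aligned in this direction
    """
    phi = {}
    for i, j in alignment:
        phi[(i, j)] = phi.get((i, j), 0) + direction
    return phi
```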

Application

"Learning Tractable Word Alignment Models with Complex Constraints" (Computational Linguistics, 2010)

Application
  • Six language pairs
  • Both types of constraints improve over the HMM in both precision and recall
  • Improve over the HMM by 10% to 15%
  • S-HMM performs slightly better than B-HMM
  • S-HMM performs better than B-HMM in 10 out of 12 cases
  • Improve over IBM Model 4 in 9 out of 12 cases
Implementation
  • http://code.google.com/p/pr-toolkit/

More info: http://sideinfo.wikkii.com

Many of these slides are taken from there.

Thanks!
