
Forward-backward algorithm

LING 572

Fei Xia

02/23/06

Outline
  • Forward and backward probability
  • Expected counts and update formulae
  • Relation with EM
HMM
  • An HMM is a tuple (S, Σ, π, A, B):
    • A set of states S = {s1, s2, …, sN}.
    • A set of output symbols Σ = {w1, …, wM}.
    • Initial state probabilities π = {πi}.
    • State transition probabilities A = {aij}.
    • Symbol emission probabilities B = {bijk}.
  • State sequence: X1…XT+1
  • Output sequence: o1…oT
Decoding

[Trellis diagram: state sequence X1, X2, …, XT, XT+1 with outputs o1, o2, …, oT]
  • Given the observation O1,T=o1…oT, find the state sequence X1,T+1=X1 … XT+1 that maximizes P(X1,T+1 | O1,T).

⇒ Viterbi algorithm

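The Viterbi decoder mentioned above can be sketched in Python for the arc-emission HMM defined earlier (a minimal illustration; numpy and all variable names are my own, not from the slides):

```python
import numpy as np

def viterbi(pi, A, B, O):
    """Most likely state sequence X_1 .. X_{T+1} for an arc-emission HMM.

    pi: (N,) initial state probs; A: (N, N) transition probs a_ij;
    B: (N, N, M) arc emission probs b_ijk; O: observation indices, length T.
    """
    T, N = len(O), len(pi)
    delta = np.zeros((T + 1, N))           # delta[t, j]: best path prob ending in j
    psi = np.zeros((T + 1, N), dtype=int)  # backpointers
    delta[0] = pi
    for t in range(T):
        # score[i, j]: extend the best path ending in i by the arc i -> j,
        # emitting O[t] on that arc
        score = delta[t][:, None] * A * B[:, :, O[t]]
        psi[t + 1] = score.argmax(axis=0)
        delta[t + 1] = score.max(axis=0)
    # backtrack from the best state at time T+1
    path = [int(delta[T].argmax())]
    for t in range(T, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]
```

For example, with two states where arcs into state 0 tend to emit symbol 0 and arcs into state 1 tend to emit symbol 1, decoding [0, 0, 1, 1] recovers a path that switches states mid-sequence.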
notation
Notation
  • A sentence: O1,T = o1…oT, where T is the sentence length
  • The state sequence: X1,T+1 = X1 … XT+1
  • t: time index, ranging from 1 to T+1
  • Xt: the state at time t
  • i, j: indices for states si, sj
  • k: index for word wk in the vocabulary
Forward probability

The probability of producing O1,t-1 while ending up in state si at time t:

  αi(t) = P(O1,t-1, Xt = si)

Calculating forward probability

Initialization: αi(1) = πi, 1 ≤ i ≤ N

Induction: αj(t+1) = Σi αi(t) aij bij,ot, 1 ≤ t ≤ T
(bij,ot is the prob of emitting ot on the arc from si to sj)

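The two cases above can be sketched in Python (an illustrative sketch under the slides' arc-emission notation; numpy and the names are mine):

```python
import numpy as np

def forward(pi, A, B, O):
    """Forward pass: alpha[t, i] = P(o_1 .. o_t, X_{t+1} = s_i).

    pi: (N,) initial state probs; A: (N, N) transitions a_ij;
    B: (N, N, M) arc emissions b_ijk; O: observation indices, length T.
    Returns alpha of shape (T+1, N); row 0 is the base case alpha_i(1) = pi_i.
    """
    T, N = len(O), len(pi)
    alpha = np.zeros((T + 1, N))
    alpha[0] = pi                                # initialization
    for t in range(T):
        # induction: alpha_j(t+1) = sum_i alpha_i(t) * a_ij * b_{ij, o_t}
        alpha[t + 1] = alpha[t] @ (A * B[:, :, O[t]])
    return alpha
```

The total data likelihood is then P(O) = alpha[-1].sum().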
Backward probability
  • The probability of producing the sequence Ot,T, given that at time t we are at state si:

      βi(t) = P(Ot,T | Xt = si)
Calculating backward probability

Initialization: βi(T+1) = 1, 1 ≤ i ≤ N

Induction: βi(t) = Σj aij bij,ot βj(t+1), 1 ≤ t ≤ T

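The backward pass mirrors the forward one (same assumed setup and naming as the forward sketch above):

```python
import numpy as np

def backward(A, B, O):
    """Backward pass: beta[t, i] = P(o_{t+1} .. o_T | X_{t+1} = s_i).

    A: (N, N) transitions a_ij; B: (N, N, M) arc emissions b_ijk;
    O: observation indices, length T. Returns beta of shape (T+1, N);
    the last row is the base case beta_i(T+1) = 1.
    """
    T, N = len(O), A.shape[0]
    beta = np.ones((T + 1, N))                   # initialization
    for t in range(T - 1, -1, -1):
        # induction: beta_i(t) = sum_j a_ij * b_{ij, o_t} * beta_j(t+1)
        beta[t] = (A * B[:, :, O[t]]) @ beta[t + 1]
    return beta
```

As a sanity check, Σi πi βi(1) equals P(O), the same quantity the forward pass yields as Σi αi(T+1).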
Estimating parameters
  • The prob of traversing the arc from si to sj at time t, given O (denoted by pt(i, j) in M&S):

      pt(i, j) = P(Xt = si, Xt+1 = sj | O) = αi(t) aij bij,ot βj(t+1) / P(O),  where P(O) = Σi αi(T+1)
Expected counts

Sum over the time index:

  • Expected # of transitions from state i to j in O: Σt=1..T pt(i, j)
  • Expected # of transitions from state i in O: Σt=1..T Σj pt(i, j)
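Putting the two passes together, the expected counts above can be computed as follows (a sketch; numpy and the names are mine):

```python
import numpy as np

def expected_counts(pi, A, B, O):
    """Expected arc-traversal counts p_t(i, j) for an arc-emission HMM.

    Returns p of shape (T, N, N), where p[t, i, j] is the prob of taking
    the arc s_i -> s_j at time t+1 given O, plus the two summed counts.
    """
    T, N = len(O), len(pi)
    # forward pass: alpha[t, i] = P(o_1..o_t, X_{t+1} = s_i)
    alpha = np.zeros((T + 1, N)); alpha[0] = pi
    for t in range(T):
        alpha[t + 1] = alpha[t] @ (A * B[:, :, O[t]])
    # backward pass: beta[t, i] = P(o_{t+1}..o_T | X_{t+1} = s_i)
    beta = np.ones((T + 1, N))
    for t in range(T - 1, -1, -1):
        beta[t] = (A * B[:, :, O[t]]) @ beta[t + 1]
    prob_O = alpha[T].sum()                      # P(O)
    # p_t(i, j) = alpha_i(t) * a_ij * b_{ij, o_t} * beta_j(t+1) / P(O)
    p = np.stack([alpha[t][:, None] * A * B[:, :, O[t]] * beta[t + 1]
                  for t in range(T)]) / prob_O
    trans_ij = p.sum(axis=0)        # expected # of transitions from i to j
    trans_i = trans_ij.sum(axis=1)  # expected # of transitions from i
    return p, trans_ij, trans_i
```

Each p[t] is a proper distribution over arcs, so the expected counts over all states sum to T.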
Emission probabilities

Arc-emission HMM:

  b̂ijk = Σ{t: ot = wk} pt(i, j) / Σt=1..T pt(i, j)

The inner loop for forward-backward algorithm

Given an input sequence O and the current parameters (π, A, B):

  • Calculate forward probability:
    • Base case: αi(1) = πi
    • Recursive case: αj(t+1) = Σi αi(t) aij bij,ot
  • Calculate backward probability:
    • Base case: βi(T+1) = 1
    • Recursive case: βi(t) = Σj aij bij,ot βj(t+1)
  • Calculate expected counts: pt(i, j) = αi(t) aij bij,ot βj(t+1) / P(O)
  • Update the parameters:
    • âij = Σt pt(i, j) / Σt Σj pt(i, j)
    • b̂ijk = Σ{t: ot = wk} pt(i, j) / Σt pt(i, j)
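The whole inner loop can be sketched end to end (an illustrative Baum-Welch step under the slides' arc-emission notation; numpy and the names are mine, and re-estimating π̂i as the expected frequency of starting in state i is my assumption, which the slides leave implicit):

```python
import numpy as np

def forward_backward_step(pi, A, B, O):
    """One inner-loop (EM) update for an arc-emission HMM.

    pi: (N,), A: (N, N), B: (N, N, M), O: observation indices, length T.
    Returns the updated (pi, A, B) and P(O) under the *old* parameters.
    """
    T, N, M = len(O), len(pi), B.shape[2]
    # forward and backward passes
    alpha = np.zeros((T + 1, N)); alpha[0] = pi
    for t in range(T):
        alpha[t + 1] = alpha[t] @ (A * B[:, :, O[t]])
    beta = np.ones((T + 1, N))
    for t in range(T - 1, -1, -1):
        beta[t] = (A * B[:, :, O[t]]) @ beta[t + 1]
    prob_O = alpha[T].sum()
    # expected counts: p_t(i, j)
    p = np.stack([alpha[t][:, None] * A * B[:, :, O[t]] * beta[t + 1]
                  for t in range(T)]) / prob_O
    # update formulae
    new_pi = p[0].sum(axis=1)                  # P(X_1 = s_i | O), my assumption
    new_A = p.sum(axis=0) / p.sum(axis=(0, 2))[:, None]
    new_B = np.zeros((N, N, M))
    for k in range(M):
        # numerator: sum over exactly those t where o_t = w_k
        idx = [t for t in range(T) if O[t] == k]
        if idx:
            new_B[:, :, k] = p[idx].sum(axis=0)
    new_B /= np.maximum(p.sum(axis=0)[:, :, None], 1e-300)  # avoid 0/0
    return new_pi, new_A, new_B, prob_O
```

Repeating this step never decreases P(O), which is the monotone-improvement property discussed on the Iterations slide.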
Relation to EM
  • An HMM is a PM (Product of Multinomials) model
  • The forward-backward algorithm is a special case of the EM algorithm for PM models
  • X (observed data): each data point is an observation sequence O1,T
  • Y (hidden data): the state sequence X1,T+1
  • Θ (parameters): aij, bijk, πi
Iterations
  • Each iteration provides values for all the parameters
  • The new model always improves the likelihood of the training data: P(O | Θ̂) ≥ P(O | Θ)
  • The algorithm is not guaranteed to reach the global maximum
Summary
  • A way of estimating parameters for HMMs:
    • Define forward and backward probabilities, which can be calculated efficiently with dynamic programming (DP)
    • Given an initial parameter setting, re-estimate the parameters at each iteration
    • The forward-backward algorithm is a special case of the EM algorithm for PM models
Definitions so far
  • The prob of producing O1,t-1 and ending at state si at time t: αi(t) = P(O1,t-1, Xt = si)
  • The prob of producing the sequence Ot,T, given that at time t we are at state si: βi(t) = P(Ot,T | Xt = si)
  • The prob of being at state si at time t, given O: γi(t) = P(Xt = si | O) = αi(t) βi(t) / P(O)
Emission probabilities

Arc-emission HMM: the output at time t is emitted on the transition, P(ot = wk | Xt = si, Xt+1 = sj) = bijk

State-emission HMM: the output at time t is emitted by the state alone, P(ot = wk | Xt = si) = bik