Presentation Transcript

Hidden Markov Models

David Meir Blei

November 1, 1999

What is an HMM?

Graphical Model

Circles indicate states

Arrows indicate probabilistic dependencies between states

What is an HMM?

Green circles are hidden states

Dependent only on the previous state

“The past is independent of the future given the present.”

What is an HMM?

Purple nodes are observed states

Dependent only on their corresponding hidden state

HMM Formalism

{S, K, Π, A, B}

S : {s1…sN} are the values for the hidden states

K : {k1…kM} are the values for the observations

[Diagram: a sequence of hidden states S, each emitting an observation K]

HMM Formalism

{S, K, Π, A, B}

Π = {πi} are the initial state probabilities

A = {aij} are the state transition probabilities

B = {bik} are the observation state probabilities

[Diagram: trellis of hidden states S linked by transition probabilities A, each emitting an observation K with probability B]
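Together, Π, A, and B determine a joint distribution over a hidden state sequence and its observations. The factorization below is the standard one implied by the independence assumptions on the earlier slides; it is a reconstruction, not a formula recovered from the deck:

```latex
P(x_1,\dots,x_T,\, o_1,\dots,o_T)
  = \pi_{x_1}\, b_{x_1 o_1} \prod_{t=2}^{T} a_{x_{t-1} x_t}\, b_{x_t o_t}
```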

Inference in an HMM

Compute the probability of a given observation sequence

Given an observation sequence, compute the most likely hidden state sequence

Given an observation sequence and a set of possible models, decide which model most closely fits the data

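The code sketches accompanying the slides below use one small concrete model. A minimal setup in Python with numpy; every number here is invented for illustration, and only the shapes matter (N = 2 hidden states, M = 3 observation symbols):

```python
import numpy as np

# Toy HMM, all probabilities invented for illustration.
pi = np.array([0.6, 0.4])          # initial state probabilities, pi[i]
A = np.array([[0.7, 0.3],          # transition probabilities, A[i, j]
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],     # emission probabilities, B[i, k]
              [0.1, 0.3, 0.6]])
obs = np.array([0, 2, 1, 2])       # an observation sequence (symbol indices)
```
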
Decoding

[Diagram: observation sequence o1 … ot-1, ot, ot+1 … oT]

Given an observation sequence and a model, compute the probability of the observation sequence

Decoding

[Slides 9–13: five animation frames stepping through the trellis of hidden states x1 … xT over observations o1 … oT; the accompanying equations did not survive extraction]

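These frames presumably walked through the naive computation: summing the joint probability over every possible hidden state sequence. In standard form (a reconstruction, not recovered from the deck):

```latex
P(O) = \sum_{x_1 \dots x_T} \pi_{x_1}\, b_{x_1 o_1}
       \prod_{t=2}^{T} a_{x_{t-1} x_t}\, b_{x_t o_t}
```

Evaluated directly, this sum ranges over N^T state sequences, which is what motivates the dynamic-programming solution on the next slide.
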
Forward Procedure

[Diagram: trellis of hidden states x1 … xT over observations o1 … oT]

  • Special structure gives us an efficient solution using dynamic programming.
  • Intuition: the probability of the first t observations is the same for all possible state sequences of length t+1.
  • Define: αi(t) = P(o1 … ot, xt = i), the probability of seeing the first t observations and ending in state i.
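
The frames that follow stepped through the recursion for α; only the trellis labels survived. The standard base case and induction step are:

```latex
\alpha_i(1) = \pi_i\, b_{i o_1}
\qquad
\alpha_j(t+1) = \Big[ \sum_{i=1}^{N} \alpha_i(t)\, a_{ij} \Big]\, b_{j o_{t+1}}
```
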
Forward Procedure

[Slides 15–22: animation frames stepping the α recursion through the trellis x1 … xT over observations o1 … oT; the accompanying equations did not survive extraction]
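
A minimal implementation of the forward procedure, assuming the toy pi, A, B, and obs arrays defined earlier:

```python
def forward(pi, A, B, obs):
    """Return alpha, where alpha[t, i] = P(o_1 .. o_t, x_t = i)."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]              # base case
    for t in range(1, T):
        # sum over predecessor states, then account for the new observation
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha

alpha = forward(pi, A, B, obs)
print(alpha[-1].sum())                        # P(O) = sum_i alpha_i(T)
```
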
Backward Procedure

[Diagram: trellis of hidden states x1 … xT over observations o1 … oT]

Probability of the rest of the observation sequence given the current state
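
The backward variable's definition and recursion were lost with the diagram. One standard convention (the deck's exact notation is not recoverable):

```latex
\beta_i(t) = P(o_{t+1} \cdots o_T \mid x_t = i),
\qquad
\beta_i(T) = 1,
\qquad
\beta_i(t) = \sum_{j=1}^{N} a_{ij}\, b_{j o_{t+1}}\, \beta_j(t+1)
```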

Decoding Solution

[Diagram: trellis of hidden states x1 … xT over observations o1 … oT]

Forward Procedure

Backward Procedure

Combination
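
Each of the three lines carried an identity for P(O) that was lost in extraction. With the α and β conventions above, they are standardly:

```latex
\text{Forward:}\quad  P(O) = \sum_{i=1}^{N} \alpha_i(T)
\qquad
\text{Backward:}\quad P(O) = \sum_{i=1}^{N} \pi_i\, b_{i o_1}\, \beta_i(1)
\qquad
\text{Combination:}\quad P(O) = \sum_{i=1}^{N} \alpha_i(t)\, \beta_i(t) \ \text{ for any } t
```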

Viterbi Algorithm

[Diagram: candidate paths through states x1 … xt-1 into state j, over observations o1 … oT]

The state sequence that maximizes the probability of seeing the observations to time t-1, landing in state j, and seeing the observation at time t

Viterbi Algorithm

[Diagram: trellis of hidden states x1 … xt+1 over observations o1 … oT]

Recursive Computation
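
The recursive computation itself did not survive. Writing δj(t) for the probability described on the previous slide and ψ for the backpointers, the standard form is:

```latex
\delta_j(1) = \pi_j\, b_{j o_1}
\qquad
\delta_j(t+1) = \max_i\, \delta_i(t)\, a_{ij}\, b_{j o_{t+1}}
\qquad
\psi_j(t+1) = \arg\max_i\, \delta_i(t)\, a_{ij}
```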

Viterbi Algorithm

[Diagram: trellis of hidden states x1 … xT over observations o1 … oT]

Compute the most likely state sequence by working backwards
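
A compact Viterbi implementation matching the recursion above, again assuming the toy pi, A, B, and obs from earlier:

```python
def viterbi(pi, A, B, obs):
    """Return the most likely hidden state sequence for obs."""
    T, N = len(obs), len(pi)
    delta = np.zeros((T, N))                  # delta[t, j]: best path prob ending in j
    psi = np.zeros((T, N), dtype=int)         # psi[t, j]: best predecessor of j
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A    # scores[i, j] = delta_i(t-1) * a_ij
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, obs[t]]
    # work backwards from the best final state
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

print(viterbi(pi, A, B, obs))
```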

Parameter Estimation

[Diagram: trellis with transition (A) and emission (B) parameters over observations o1 … oT]

  • Given an observation sequence, find the model that is most likely to produce that sequence.
  • There is no analytic method.
  • Given a model and an observation sequence, update the model parameters to better fit the observations.
Parameter Estimation

[Diagram: trellis with transition (A) and emission (B) parameters over observations o1 … oT]

Probability of traversing an arc

Probability of being in state i
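
These are the two posteriors at the heart of Baum-Welch re-estimation; their formulas were lost with the diagram. Writing pt(i, j) for the arc posterior (often called ξ elsewhere) and γi(t) for the state posterior, the standard forms are:

```latex
p_t(i, j) = \frac{\alpha_i(t)\, a_{ij}\, b_{j o_{t+1}}\, \beta_j(t+1)}{P(O)}
\qquad
\gamma_i(t) = \sum_{j=1}^{N} p_t(i, j) = \frac{\alpha_i(t)\, \beta_i(t)}{P(O)}
```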

Parameter Estimation

[Diagram: trellis with transition (A) and emission (B) parameters over observations o1 … oT]

Now we can compute the new estimates of the model parameters.
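
The update formulas did not survive extraction; the standard Baum-Welch re-estimates in this notation are (a reconstruction, not recovered from the deck):

```latex
\hat{\pi}_i = \gamma_i(1)
\qquad
\hat{a}_{ij} = \frac{\sum_{t=1}^{T-1} p_t(i, j)}{\sum_{t=1}^{T-1} \gamma_i(t)}
\qquad
\hat{b}_{ik} = \frac{\sum_{t \,:\, o_t = k} \gamma_i(t)}{\sum_{t=1}^{T} \gamma_i(t)}
```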

The Most Important Thing

[Diagram: trellis with transition (A) and emission (B) parameters over observations o1 … oT]

We can use the special structure of this model to do a lot of neat math and solve problems that are otherwise not solvable.
