
Hidden Markov Models




戴玉書

L. R. Rabiner and B. H. Juang, "An Introduction to Hidden Markov Models"

Ara V. Nefian and Monson H. Hayes III, "Face Detection and Recognition Using Hidden Markov Models"

- Markov Chain & Markov Models
- Hidden Markov Models
- HMM Problems
  - Evaluation
  - Decoding
  - Learning
- Application

Markov Chain & Markov Models

- The probability of each subsequent state depends only on the previous state

[Figure: a chain of states, each transition depending only on the previous state]
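The Markov property can be made concrete with a small simulation; the three states and the transition matrix below are hypothetical examples, not values from the slides:

```python
import random

# Hypothetical 3-state Markov chain: the next state depends only on the current one.
states = ["S1", "S2", "S3"]
A = [  # A[i][j] = P(next = states[j] | current = states[i])
    [0.7, 0.2, 0.1],
    [0.3, 0.5, 0.2],
    [0.2, 0.3, 0.5],
]

def simulate(start, steps, rng):
    """Walk the chain: each transition looks only at the current state."""
    path = [start]
    for _ in range(steps):
        i = states.index(path[-1])
        path.append(rng.choices(states, weights=A[i])[0])
    return path

path = simulate("S1", 5, random.Random(0))
print(path)
```

Note that `simulate` never inspects anything but the last element of `path`: that is exactly the Markov property stated above.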

Hidden Markov Models

- If we don't have complete state information, but only some observations at each state, the model becomes a hidden Markov model
- N = the number of states: S = {s_1, s_2, ..., s_N}
- M = the number of observables: V = {v_1, v_2, ..., v_M}

[Figure: a hidden state sequence q_1, q_2, q_3, q_4, ... unfolding over time, each state emitting an observation]

[Figure: an example with three hidden states and two observable symbols; the emission arrows are labeled with the probabilities 0.1, 0.3, 0.9, 0.7, 0.8, 0.2]

- M = (A, B, π)
  - A = transition probabilities: A = {a_ij}, a_ij = P(q_{t+1} = s_j | q_t = s_i)
  - B = observation probabilities: B = {b_j(k)}, b_j(k) = P(v_k at time t | q_t = s_j)
  - π = initial probabilities: π = (π_i), π_i = P(s_i)
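The parameter triple M = (A, B, π) can be written out directly; the sketch below uses 3 states and 2 observables to match the figure, but the specific numbers are illustrative assumptions:

```python
# A hypothetical HMM M = (A, B, pi): N = 3 hidden states, M = 2 observable
# symbols. The numbers are illustrative, not taken from the slides.
A = [[0.6, 0.3, 0.1],   # A[i][j] = P(q_{t+1} = s_j | q_t = s_i)
     [0.2, 0.5, 0.3],
     [0.1, 0.4, 0.5]]
B = [[0.1, 0.9],        # B[j][k] = P(o_t = v_k | q_t = s_j)
     [0.3, 0.7],
     [0.8, 0.2]]
pi = [0.5, 0.3, 0.2]    # pi[i] = P(q_1 = s_i)

# Each row of A and B, and pi itself, must be a probability distribution.
for dist in A + B + [pi]:
    assert abs(sum(dist) - 1.0) < 1e-9
```

The row-stochastic check at the end is the structural constraint every valid (A, B, π) must satisfy.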

HMM Problems: Evaluation

- Determine the probability P(O | M) that a particular sequence of symbols O was generated by a given model M

- Initialization: α_1(i) = π_i b_i(o_1)
- Forward recursion: α_{t+1}(j) = [Σ_i α_t(i) a_ij] b_j(o_{t+1})
- Termination: P(O | M) = Σ_i α_T(i)
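The forward pass can be sketched in a few lines; the two-state model below is a hypothetical example (its A, B, π are illustrative assumptions, not from the slides):

```python
# Forward-algorithm sketch: compute P(O | M) for a small hypothetical model.
A = [[0.7, 0.3], [0.4, 0.6]]        # transition probabilities a_ij
B = [[0.9, 0.1], [0.2, 0.8]]        # emission probabilities b_j(k)
pi = [0.6, 0.4]                     # initial probabilities pi_i

def forward(obs, A, B, pi):
    """alpha_t(i) = P(o_1..o_t, q_t = s_i); returns P(O | M)."""
    N = len(pi)
    # Initialization: alpha_1(i) = pi_i * b_i(o_1)
    alpha = [pi[i] * B[i][obs[0]] for i in range(N)]
    # Recursion: alpha_{t+1}(j) = (sum_i alpha_t(i) * a_ij) * b_j(o_{t+1})
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(N)) * B[j][o]
                 for j in range(N)]
    # Termination: P(O | M) = sum_i alpha_T(i)
    return sum(alpha)

p = forward([0, 1, 0], A, B, pi)
print(p)
```

The cost is O(N^2 T), versus O(N^T) for summing over all state paths naively.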

- Initialization: β_T(i) = 1
- Backward recursion: β_t(i) = Σ_j a_ij b_j(o_{t+1}) β_{t+1}(j)
- Termination: P(O | M) = Σ_i π_i b_i(o_1) β_1(i)
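The backward pass admits a matching sketch; the two-state model here is again a hypothetical example with illustrative numbers:

```python
# Backward-algorithm sketch for a hypothetical two-state model
# (A, B, pi are illustrative assumptions, not values from the slides).
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
pi = [0.6, 0.4]

def backward(obs, A, B, pi):
    """beta_t(i) = P(o_{t+1}..o_T | q_t = s_i); returns P(O | M)."""
    N = len(pi)
    beta = [1.0] * N                      # Initialization: beta_T(i) = 1
    for o in reversed(obs[1:]):           # Recursion, moving backwards in time:
        # beta_t(i) = sum_j a_ij * b_j(o_{t+1}) * beta_{t+1}(j)
        beta = [sum(A[i][j] * B[j][o] * beta[j] for j in range(N))
                for i in range(N)]
    # Termination: P(O | M) = sum_i pi_i * b_i(o_1) * beta_1(i)
    return sum(pi[i] * B[i][obs[0]] * beta[i] for i in range(N))

p = backward([0, 1, 0], A, B, pi)
print(p)
```

A useful sanity check: for the same model and observation sequence, the backward pass returns exactly the same P(O | M) as the forward pass.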

HMM Problems: Decoding

- Given a sequence of symbols O, determine the most likely sequence of hidden states Q that led to the observations
- We want to find the state sequence Q which maximizes P(Q | o_1, o_2, ..., o_T)

[Figure: one trellis step from time t-1 to t; states s_1, ..., s_i, ..., s_N each connect to s_j with transition probabilities a_1j, a_ij, a_Nj]

General idea: if the best path ending in q_t = s_j passes through q_{t-1} = s_i, then its prefix must coincide with the best path ending in q_{t-1} = s_i.

- Initialization: δ_1(i) = π_i b_i(o_1)
- Forward recursion: δ_t(j) = max_i [δ_{t-1}(i) a_ij] b_j(o_t), recording the back-pointer ψ_t(j) = argmax_i δ_{t-1}(i) a_ij
- Termination: take q_T = argmax_i δ_T(i), then backtrack q_{t-1} = ψ_t(q_t)

HMM Problems: Learning

- Given a coarse structure of the model, determine the HMM parameters M = (A, B, π) that best fit the training data
- The Baum-Welch (forward-backward) re-estimation procedure is used to determine these parameters

- Define the variable ξ_t(i, j) as the probability of being in state s_i at time t and in state s_j at time t+1, given the observation sequence o_1, o_2, ..., o_T:
  ξ_t(i, j) = α_t(i) a_ij b_j(o_{t+1}) β_{t+1}(j) / P(O | M)
- Define the variable γ_t(i) as the probability of being in state s_i at time t, given the observation sequence o_1, o_2, ..., o_T:
  γ_t(i) = α_t(i) β_t(i) / P(O | M)
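Both variables come straight out of the forward and backward passes; the sketch below computes ξ and γ for a hypothetical two-state model (illustrative numbers, not from the slides) and applies one transition re-estimation step:

```python
# Baum-Welch building blocks: xi_t(i,j) and gamma_t(i) from the forward
# and backward variables, plus one re-estimation of A. Model numbers are
# illustrative assumptions, not trained values.
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
pi = [0.6, 0.4]
obs = [0, 1, 0]
N, T = len(pi), len(obs)

# Forward variables: alpha[t][i] = P(o_1..o_t, q_t = s_i)
alpha = [[pi[i] * B[i][obs[0]] for i in range(N)]]
for t in range(1, T):
    alpha.append([sum(alpha[t-1][i] * A[i][j] for i in range(N)) * B[j][obs[t]]
                  for j in range(N)])

# Backward variables: beta[t][i] = P(o_{t+1}..o_T | q_t = s_i)
beta = [[1.0] * N for _ in range(T)]
for t in range(T - 2, -1, -1):
    beta[t] = [sum(A[i][j] * B[j][obs[t+1]] * beta[t+1][j] for j in range(N))
               for i in range(N)]

p_obs = sum(alpha[T-1][i] for i in range(N))  # P(O | M)

# xi_t(i,j) = alpha_t(i) * a_ij * b_j(o_{t+1}) * beta_{t+1}(j) / P(O | M)
xi = [[[alpha[t][i] * A[i][j] * B[j][obs[t+1]] * beta[t+1][j] / p_obs
        for j in range(N)] for i in range(N)] for t in range(T - 1)]

# gamma_t(i) = alpha_t(i) * beta_t(i) / P(O | M)
gamma = [[alpha[t][i] * beta[t][i] / p_obs for i in range(N)] for t in range(T)]

# One re-estimation step for A:
# a_ij <- expected number of i -> j transitions / expected visits to s_i
A_new = [[sum(xi[t][i][j] for t in range(T - 1)) /
          sum(gamma[t][i] for t in range(T - 1))
          for j in range(N)] for i in range(N)]
```

Iterating this update (together with the analogous ones for B and π) is one full Baum-Welch step; each iteration is guaranteed not to decrease P(O | M).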

Application

- The structure of hidden states: three states s_1, s_2, s_3
- Observation = number of islands in the vertical slice
- After character image segmentation, the following sequence of island numbers was observed in 4 slices: {1, 3, 2, 1}
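As a sketch of how such a sequence would be scored, the forward algorithm can evaluate the observed islands {1, 3, 2, 1} under a hypothetical 3-state character model; every probability below is an illustrative assumption, not a trained value:

```python
# Scoring the observed island sequence {1, 3, 2, 1} with the forward
# algorithm under a hypothetical 3-state character model.
obs = [1, 3, 2, 1]            # islands per vertical slice
symbol = {1: 0, 2: 1, 3: 2}   # map island count -> observation symbol index

A = [[0.5, 0.5, 0.0],         # assumed left-to-right structure:
     [0.0, 0.5, 0.5],         # no backward transitions between slices
     [0.0, 0.0, 1.0]]
B = [[0.6, 0.3, 0.1],         # B[j][k] = P(k+1 islands | state s_j), assumed
     [0.2, 0.3, 0.5],
     [0.5, 0.4, 0.1]]
pi = [1.0, 0.0, 0.0]          # always start in s_1

def forward(obs, A, B, pi):
    N = len(pi)
    alpha = [pi[i] * B[i][symbol[obs[0]]] for i in range(N)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(N)) * B[j][symbol[o]]
                 for j in range(N)]
    return sum(alpha)

likelihood = forward(obs, A, B, pi)
print(likelihood)  # compared across per-character models to pick the best match
```

In a recognizer, one such model would be trained per character, and the character whose model gives the highest P(O | M) wins.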

- The structure of hidden states: [Figure: face-model state structure]

- A set of face images is used to train one HMM (face detection)
  - N = 6 states
  - Images: 48, training images: 9, correct detection: 90%, image size: 60 × 90 pixels

- Each individual in the database is represented by an HMM face model (face recognition)
- A set of images representing different instances of the same face is used to train each HMM
  - N = 6 states
  - Images: 400, training set: half of the images, individuals: 40, image size: 92 × 112 pixels