
# Hidden Markov Models


## Hidden Markov Models

戴玉書

References:

• L. R. Rabiner and B. H. Juang, "An Introduction to Hidden Markov Models"

• Ara V. Nefian and Monson H. Hayes III, "Face Detection and Recognition Using Hidden Markov Models"

### Outline

• Markov Chain & Markov Models

• Hidden Markov Models

• HMM Problem

- Evaluation

- Decoding

- Learning

• Application


### Markov chain property:

• The probability of each subsequent state depends only on the previous state

[Diagram: a chain of state nodes, each transition arrow leading from one state to the next]
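In symbols, the first-order Markov property states (using $q_t$ for the state at time $t$ and $a_{ij}$ for the transition probability, consistent with the notation used later in the deck):

```latex
% First-order Markov property: the next state depends only on the current state
P(q_{t+1} = s_j \mid q_t = s_i, q_{t-1}, \ldots, q_1) = P(q_{t+1} = s_j \mid q_t = s_i) = a_{ij}
```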


### Hidden Markov Models

• An HMM is used when we do not have complete state information, but only some observation at each state

• N: the number of hidden states

• M: the number of observable symbols

[Diagram: a hidden state sequence q1 → q2 → q3 → q4 → … drawn from a set of three states, each state emitting one of two observable symbols; the arc labels show example probabilities 0.1/0.9, 0.3/0.7 and 0.8/0.2]

### Hidden Markov Models

• An HMM is defined by the parameter set M = (A, B, π):

A = $\{a_{ij}\}$ — state transition probabilities, $a_{ij} = P(q_{t+1} = s_j \mid q_t = s_i)$

B = $\{b_j(k)\}$ — observation probabilities, $b_j(k) = P(o_t = v_k \mid q_t = s_j)$

π = $(\pi_i)$ — initial probabilities, $\pi_i = P(q_1 = s_i)$
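As a concrete illustration of M = (A, B, π), here is a minimal NumPy sketch; the 3-state / 2-symbol shape mirrors the figure above, but the variable names and the specific probability values are made-up placeholders, not values from the slides:

```python
import numpy as np

# Hypothetical 3-state, 2-symbol HMM: M = (A, B, pi)
A = np.array([[0.6, 0.3, 0.1],     # A[i, j] = P(q_{t+1} = s_j | q_t = s_i)
              [0.2, 0.5, 0.3],
              [0.1, 0.2, 0.7]])
B = np.array([[0.9, 0.1],          # B[j, k] = P(o_t = v_k | q_t = s_j)
              [0.3, 0.7],
              [0.2, 0.8]])
pi = np.array([0.5, 0.3, 0.2])     # pi[i] = P(q_1 = s_i)

# Each row of A and B is a probability distribution and must sum to 1
assert np.allclose(A.sum(axis=1), 1)
assert np.allclose(B.sum(axis=1), 1)
assert np.isclose(pi.sum(), 1)
```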


### Evaluation

• Determine the probability $P(O \mid M)$ that a particular observation sequence $O = o_1 o_2 \ldots o_T$ was generated by the model

### Forward recursion

• Initialization: $\alpha_1(i) = \pi_i\, b_i(o_1)$, for $1 \le i \le N$

• Forward recursion: $\alpha_{t+1}(j) = \Big[ \sum_{i=1}^{N} \alpha_t(i)\, a_{ij} \Big] b_j(o_{t+1})$, for $1 \le t \le T-1$

• Termination: $P(O \mid M) = \sum_{i=1}^{N} \alpha_T(i)$
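A minimal NumPy sketch of this forward procedure, assuming the A/B/π array conventions from the earlier sketch (the function name and the encoding of observations as integer indices are my own choices):

```python
import numpy as np

def forward(A, B, pi, obs):
    """Return P(O | M) and the alpha matrix for an observation index sequence `obs`."""
    N, T = A.shape[0], len(obs)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]                      # initialization: alpha_1(i) = pi_i * b_i(o_1)
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]  # recursion: [sum_i alpha_t(i) a_ij] * b_j(o_{t+1})
    return alpha[-1].sum(), alpha                     # termination: P(O | M) = sum_i alpha_T(i)
```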

### Backward recursion

• Initialization: $\beta_T(i) = 1$, for $1 \le i \le N$

• Backward recursion: $\beta_t(i) = \sum_{j=1}^{N} a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)$, for $t = T-1, \ldots, 1$

• Termination: $P(O \mid M) = \sum_{i=1}^{N} \pi_i\, b_i(o_1)\, \beta_1(i)$
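A matching sketch of the backward procedure under the same assumed conventions:

```python
import numpy as np

def backward(A, B, pi, obs):
    """Return P(O | M) and the beta matrix for an observation index sequence `obs`."""
    N, T = A.shape[0], len(obs)
    beta = np.zeros((T, N))
    beta[-1] = 1.0                                        # initialization: beta_T(i) = 1
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])    # recursion: sum_j a_ij b_j(o_{t+1}) beta_{t+1}(j)
    return (pi * B[:, obs[0]] * beta[0]).sum(), beta      # termination: sum_i pi_i b_i(o_1) beta_1(i)
```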


### Decoding

• Given a sequence of symbols O, determine the most likely sequence of hidden states Q that led to the observations

• We want to find the state sequence Q which maximizes $P(Q \mid o_1, o_2, \ldots, o_T)$

[Trellis diagram: transitions $a_{1j}, a_{ij}, a_{Nj}$ lead from states $s_1, s_i, s_N$ at time $t-1$ into state $s_j$ at time $t$]

### Viterbi algorithm

General idea:

If the best path ending in $q_t = s_j$ goes through $q_{t-1} = s_i$, then its initial segment must coincide with the best path ending in $q_{t-1} = s_i$.

### Viterbi algorithm

• Initialization: $\delta_1(i) = \pi_i\, b_i(o_1)$, $\psi_1(i) = 0$

• Forward recursion: $\delta_t(j) = \max_i \big[ \delta_{t-1}(i)\, a_{ij} \big]\, b_j(o_t)$, $\psi_t(j) = \arg\max_i \big[ \delta_{t-1}(i)\, a_{ij} \big]$

• Termination: $P^* = \max_i \delta_T(i)$, $q_T^* = \arg\max_i \delta_T(i)$; the optimal path is recovered by backtracking $q_t^* = \psi_{t+1}(q_{t+1}^*)$
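A minimal NumPy sketch of the Viterbi recursion, again under the assumed A/B/π conventions (names are illustrative, not from the slides):

```python
import numpy as np

def viterbi(A, B, pi, obs):
    """Return the most likely state sequence and its probability for `obs`."""
    N, T = A.shape[0], len(obs)
    delta = np.zeros((T, N))               # delta_t(j): best path probability ending in s_j at time t
    psi = np.zeros((T, N), dtype=int)      # psi_t(j): best predecessor state for s_j at time t
    delta[0] = pi * B[:, obs[0]]           # initialization
    for t in range(1, T):
        scores = delta[t - 1, :, None] * A             # scores[i, j] = delta_{t-1}(i) * a_ij
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, obs[t]]
    # termination and backtracking
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1], delta[-1].max()
```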


### Learning problem

• Given a coarse structure of the model, determine the HMM parameters M = (A, B, π) that best fit the training data

• The Baum-Welch algorithm is used to determine these parameters

### Baum-Welch algorithm

• Define the variable $\xi_t(i,j)$ as the probability of being in state $s_i$ at time $t$ and in state $s_j$ at time $t+1$, given the observation sequence $o_1, o_2, \ldots, o_T$
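Written with the forward and backward variables from the evaluation problem, the standard expression (as in the Rabiner and Juang tutorial) is:

```latex
\xi_t(i,j) = P(q_t = s_i,\, q_{t+1} = s_j \mid O, M)
           = \frac{\alpha_t(i)\, a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)}{P(O \mid M)}
```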

### Baum-Welch algorithm

• Define the variable $\gamma_t(i)$ as the probability of being in state $s_i$ at time $t$, given the observation sequence $o_1, o_2, \ldots, o_T$
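γ follows from ξ by summing over the successor state, and the standard Baum-Welch re-estimation formulas then update the model parameters:

```latex
\gamma_t(i) = \sum_{j=1}^{N} \xi_t(i,j), \qquad
\hat{\pi}_i = \gamma_1(i), \qquad
\hat{a}_{ij} = \frac{\sum_{t=1}^{T-1} \xi_t(i,j)}{\sum_{t=1}^{T-1} \gamma_t(i)}, \qquad
\hat{b}_j(k) = \frac{\sum_{t:\, o_t = v_k} \gamma_t(j)}{\sum_{t=1}^{T} \gamma_t(j)}
```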


### Example 1 - character recognition

• The structure of hidden states: [diagram: hidden states s1, s2, s3]

• Observation = number of islands in the vertical slice

### Example 1 - character recognition

• After character image segmentation, the following sequence of island numbers was observed in 4 vertical slices: {1, 3, 2, 1}
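For illustration only, a self-contained sketch of how the observed sequence {1, 3, 2, 1} could be scored against one character model with the forward recursion; the 3-state parameters below are made-up placeholders (not trained values from the example), and island counts 1–3 are mapped to symbol indices 0–2:

```python
import numpy as np

# Hypothetical character model over states s1, s2, s3 (placeholder probabilities)
A  = np.array([[0.6, 0.4, 0.0],
               [0.0, 0.7, 0.3],
               [0.0, 0.0, 1.0]])
B  = np.array([[0.7, 0.2, 0.1],   # B[j, k] = P(island count k+1 | state s_{j+1})
               [0.1, 0.3, 0.6],
               [0.3, 0.5, 0.2]])
pi = np.array([1.0, 0.0, 0.0])

obs = [0, 2, 1, 0]                # observed island counts {1, 3, 2, 1} as 0-based symbol indices

alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]
print("P(O | M) =", alpha.sum())  # the character model with the highest score is chosen
```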

### Example 2 - face detection & recognition

• The structure of hidden states:

### Example 2 - face detection

• A set of face images is used to train a single HMM

• N = 6 states

• Dataset: 48 images (60 × 90 pixels), 9 used for training; correct detection rate: 90%

### Example 2 - face recognition

• Each individual in the database is represented by an HMM face model

• A set of images representing different instances of the same face is used to train each HMM

• N = 6 states

• Dataset: 400 images of 40 individuals (92 × 112 pixels); half of the images are used for training