
Hidden Markov Models

戴玉書

L. R. Rabiner and B. H. Juang, "An Introduction to Hidden Markov Models"

Ara V. Nefian and Monson H. Hayes III, "Face Detection and Recognition Using Hidden Markov Models"


Outline

  • Markov Chain & Markov Models

  • Hidden Markov Models

  • HMM Problem

    -Evaluation

    -Decoding

    -Learning

  • Application



Markov chain property:

  • The probability of each subsequent state depends only on the previous state:

    $P(q_t = s_j \mid q_{t-1} = s_i, q_{t-2}, \ldots, q_1) = P(q_t = s_j \mid q_{t-1} = s_i)$
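As a quick illustration, here is a minimal sketch of sampling a state sequence under this property in Python/NumPy; the three-state transition matrix and start distribution are hypothetical values, not taken from the slides:

```python
import numpy as np

# Hypothetical 3-state transition matrix: row i holds P(next state | current state i).
A = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])
pi = np.array([1.0, 0.0, 0.0])  # start distribution: always begin in state 0

def sample_chain(A, pi, T, seed=0):
    """Sample a length-T state sequence; each step depends only on the previous state."""
    rng = np.random.default_rng(seed)
    states = [rng.choice(len(pi), p=pi)]
    for _ in range(T - 1):
        states.append(rng.choice(A.shape[0], p=A[states[-1]]))
    return states

print(sample_chain(A, pi, 10))
```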


Markov Models

[Figure: state-transition diagram with three states connected by transition arrows]



Hidden Markov Models

  • We do not observe the states directly; instead, we get some observation at each state

N = the number of states: $S = \{s_1, \ldots, s_N\}$

M = the number of observables: $V = \{v_1, \ldots, v_M\}$

[Figure: hidden state sequence $q_1 \to q_2 \to q_3 \to q_4 \to \ldots$, each state emitting an observation]


Hidden Markov Models

[Figure: example HMM with three hidden states and two observable symbols; edges labeled with the probabilities 0.1, 0.3, 0.9, 0.7, 0.8, 0.2]


Hidden Markov Models

  • An HMM is specified by the parameter set $M = (A, B, \pi)$:

    $A = \{a_{ij}\}$, the state-transition probabilities: $a_{ij} = P(q_{t+1} = s_j \mid q_t = s_i)$

    $B = \{b_j(k)\}$, the observation probabilities: $b_j(k) = P(v_k \text{ at } t \mid q_t = s_j)$

    $\pi = \{\pi_i\}$, the initial state probabilities: $\pi_i = P(q_1 = s_i)$
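As a minimal sketch, the three parameter groups can be written down directly in Python/NumPy; the values below are hypothetical, for a model with N = 3 states and M = 2 observables:

```python
import numpy as np

# Hypothetical HMM M = (A, B, pi) with N = 3 states and M = 2 observable symbols.
A  = np.array([[0.6, 0.3, 0.1],   # a_ij = P(q_{t+1} = s_j | q_t = s_i)
               [0.2, 0.5, 0.3],
               [0.1, 0.4, 0.5]])
B  = np.array([[0.9, 0.1],        # b_j(k) = P(observe v_k | state s_j)
               [0.7, 0.3],
               [0.2, 0.8]])
pi = np.array([0.5, 0.3, 0.2])    # pi_i = P(q_1 = s_i)

# Each row of A and B, and pi itself, must sum to 1.
assert np.allclose(A.sum(axis=1), 1) and np.allclose(B.sum(axis=1), 1)
assert np.isclose(pi.sum(), 1)
```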



Evaluation

  • Determine the probability $P(O \mid M)$ that a particular sequence of symbols $O = o_1 o_2 \ldots o_T$ was generated by a given model $M$


Forward recursion

  • Define the forward variable $\alpha_t(i) = P(o_1 o_2 \ldots o_t, q_t = s_i \mid M)$

  • Initialization: $\alpha_1(i) = \pi_i \, b_i(o_1)$, for $1 \le i \le N$

  • Forward recursion: $\alpha_{t+1}(j) = \Big[ \sum_{i=1}^{N} \alpha_t(i) \, a_{ij} \Big] \, b_j(o_{t+1})$

  • Termination: $P(O \mid M) = \sum_{i=1}^{N} \alpha_T(i)$
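A direct NumPy translation of this recursion (a sketch, assuming A, B, and pi are arrays shaped as in the parameter example above):

```python
import numpy as np

def forward(A, B, pi, obs):
    """Forward algorithm: returns P(O | M) and the table of alpha_t(i)."""
    N, T = A.shape[0], len(obs)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]                          # initialization
    for t in range(T - 1):
        alpha[t + 1] = (alpha[t] @ A) * B[:, obs[t + 1]]  # recursion over t
    return alpha[-1].sum(), alpha                         # termination: sum_i alpha_T(i)

# p, _ = forward(A, B, pi, [0, 1, 1, 0])   # P(observing v_0 v_1 v_1 v_0 | M)
```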


Backward recursion

  • Define the backward variable $\beta_t(i) = P(o_{t+1} o_{t+2} \ldots o_T \mid q_t = s_i, M)$

  • Initialization: $\beta_T(i) = 1$

  • Backward recursion: $\beta_t(i) = \sum_{j=1}^{N} a_{ij} \, b_j(o_{t+1}) \, \beta_{t+1}(j)$

  • Termination: $P(O \mid M) = \sum_{i=1}^{N} \pi_i \, b_i(o_1) \, \beta_1(i)$
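The matching backward pass as a NumPy sketch (same assumed array shapes); for the same inputs it should return the same P(O | M) as forward():

```python
import numpy as np

def backward(A, B, pi, obs):
    """Backward algorithm: returns P(O | M) and the table of beta_t(i)."""
    N, T = A.shape[0], len(obs)
    beta = np.ones((T, N))                                # initialization: beta_T(i) = 1
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])    # recursion, backwards in t
    prob = (pi * B[:, obs[0]] * beta[0]).sum()            # termination
    return prob, beta
```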



Decoding

  • Given a sequence of symbols O, determine the most likely sequence of hidden states Q that led to the observations

  • That is, we want to find the state sequence Q that maximizes $P(Q \mid o_1, o_2, \ldots, o_T)$


Viterbi algorithm

[Figure: trellis showing states $s_1, \ldots, s_i, \ldots, s_N$ at time $t-1$, each linked to state $s_j$ at time $t$ by transition probabilities $a_{1j}, a_{ij}, a_{Nj}$]

General idea: if the best path ending in $q_t = s_j$ passes through $q_{t-1} = s_i$, then its prefix must coincide with the best path ending in $q_{t-1} = s_i$


Viterbi algorithm

  • Define $\delta_t(j)$ as the probability of the best path ending in state $s_j$ at time $t$

  • Initialization: $\delta_1(i) = \pi_i \, b_i(o_1)$

  • Forward recursion: $\delta_t(j) = \max_i \big[ \delta_{t-1}(i) \, a_{ij} \big] \, b_j(o_t)$, keeping the back-pointer $\psi_t(j) = \arg\max_i \delta_{t-1}(i) \, a_{ij}$

  • Termination: choose $q_T^* = \arg\max_i \delta_T(i)$, then backtrack $q_t^* = \psi_{t+1}(q_{t+1}^*)$
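A NumPy sketch of the recursion with back-pointers (assumes the same hypothetical A, B, pi arrays as in the earlier sketches):

```python
import numpy as np

def viterbi(A, B, pi, obs):
    """Viterbi algorithm: most likely state sequence for the observations."""
    N, T = A.shape[0], len(obs)
    delta = np.zeros((T, N))
    psi = np.zeros((T, N), dtype=int)
    delta[0] = pi * B[:, obs[0]]                      # initialization
    for t in range(1, T):
        scores = delta[t - 1, :, None] * A            # scores[i, j] = delta_{t-1}(i) * a_ij
        psi[t] = scores.argmax(axis=0)                # back-pointers
        delta[t] = scores.max(axis=0) * B[:, obs[t]]  # recursion
    path = [int(delta[-1].argmax())]                  # termination
    for t in range(T - 1, 0, -1):                     # backtrack along psi
        path.append(int(psi[t, path[-1]]))
    return path[::-1]
```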




Learning problem

  • Given a coarse structure of the model, determine the HMM parameters $M = (A, B, \pi)$ that best fit the training data, i.e. that maximize $P(O \mid M)$

  • The Baum-Welch algorithm is used to determine these parameters


Baum-Welch algorithm

  • Define the variable $\xi_t(i,j)$ as the probability of being in state $s_i$ at time $t$ and in state $s_j$ at time $t+1$, given the observation sequence $o_1, o_2, \ldots, o_T$:

    $\xi_t(i,j) = P(q_t = s_i, q_{t+1} = s_j \mid O, M) = \dfrac{\alpha_t(i) \, a_{ij} \, b_j(o_{t+1}) \, \beta_{t+1}(j)}{P(O \mid M)}$


Baum-Welch algorithm

  • Define the variable $\gamma_t(i)$ as the probability of being in state $s_i$ at time $t$, given the observation sequence $o_1, o_2, \ldots, o_T$:

    $\gamma_t(i) = P(q_t = s_i \mid O, M) = \dfrac{\alpha_t(i) \, \beta_t(i)}{P(O \mid M)}$
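Putting ξ and γ together gives the parameter re-estimation step. Below is a sketch of one Baum-Welch update in NumPy; it reuses the forward() and backward() helpers from the earlier sketches, so every name here is an assumption of those sketches rather than the slides' notation:

```python
import numpy as np

def baum_welch_step(A, B, pi, obs):
    """One EM re-estimation step for M = (A, B, pi) on a single sequence."""
    p_obs, alpha = forward(A, B, pi, obs)
    _, beta = backward(A, B, pi, obs)
    # xi[t, i, j] = alpha_t(i) * a_ij * b_j(o_{t+1}) * beta_{t+1}(j) / P(O | M)
    xi = (alpha[:-1, :, None] * A[None] *
          (B[:, obs[1:]].T * beta[1:])[:, None, :]) / p_obs
    gamma = alpha * beta / p_obs                      # gamma[t, i] = P(q_t = s_i | O, M)
    pi_new = gamma[0]                                 # expected initial occupancy
    A_new = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    B_new = np.zeros_like(B)
    for k in range(B.shape[1]):                       # expected emissions of symbol v_k
        B_new[:, k] = gamma[np.array(obs) == k].sum(axis=0)
    B_new /= gamma.sum(axis=0)[:, None]
    return A_new, B_new, pi_new
```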



Example 1 - character recognition

  • The structure of hidden states:

[Figure: chain of hidden states $s_1$, $s_2$, $s_3$]

  • Observation = the number of islands in each vertical slice of the character image


Example 1 - character recognition

  • After character image segmentation, the following sequence of island numbers was observed in 4 slices: {1, 3, 2, 1}
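For illustration, the viterbi() sketch from earlier could decode this sequence; the left-to-right model below and all of its probabilities are hypothetical placeholders, not values from the slides:

```python
import numpy as np

# Hypothetical left-to-right model; island counts 1..3 are coded as symbols 0..2.
A  = np.array([[0.5, 0.5, 0.0],   # each state either stays put or moves right
               [0.0, 0.5, 0.5],
               [0.0, 0.0, 1.0]])
B  = np.array([[0.8, 0.1, 0.1],   # made-up P(island count | state)
               [0.1, 0.6, 0.3],
               [0.6, 0.2, 0.2]])
pi = np.array([1.0, 0.0, 0.0])

obs = [0, 2, 1, 0]                # the observed island counts {1, 3, 2, 1}
print(viterbi(A, B, pi, obs))     # -> [0, 1, 1, 2] for these made-up parameters
```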


Example 2 - face detection & recognition

  • The structure of hidden states:

[Figure: hidden states corresponding to vertical facial regions, observed from top to bottom]


Example 2 - face detection

  • A set of face images is used to train one HMM face model with N = 6 states

  • Dataset: 48 images of 60 × 90 pixels; 9 images used for training; correct detection: 90%


Example 2 - face recognition

  • Each individual in the database is represented by an HMM face model with N = 6 states

  • A set of images representing different instances of the same face is used to train each HMM


Example 2 - face recognition

  • Dataset: 400 images of 40 individuals, 92 × 112 pixels each; half of the images are used for training

