Machine Learning Hidden Markov Model

Darshana Pathak, University of North Carolina at Chapel Hill

Research Seminar – November 14, 2012

Disclaimer: All the information in the following slides assumes that “There is a GREAT human mind behind every computer program.”


What is Machine Learning?
  • Making computers learn from a given task and from experience.
  • “Field of study that gives computers the ability to learn without being explicitly programmed.” – Arthur Samuel (1959)
Why Machine Learning?
  • Human learning is terribly slow! (?)
    • 6 years before starting school, then around 20 more years to become a cognitive/computer scientist...
    • Linear programming, calculus, Gaussian models, optimization techniques, and so on…
Why Machine Learning?
  • There is no copy process in human beings – but a trained computer program can be copied, giving ‘one-trial learning’ for computers.
  • Computers can be programmed to learn – both humans and computer programs make errors, but a computer’s error is predictable and can be measured.
Some more reasons…
  • Growing flood of electronic data – machines can digest huge amounts of data, which is not possible for humans.
  • Supporting computational power is also growing!
  • Data mining – to help improve decisions
    • Medical-records analysis for diagnosis
    • Speech/handwriting/face recognition
    • Autonomous driving, robots
Important Distinction
  • Machine learning focuses on prediction, based on known properties learned from the training data.
  • Data mining focuses on the discovery of (previously) unknown properties in the data.
  • Example: the purchase history/behavior of a customer.


Markov Model

Hidden Markov Model – HMM
  • A Markov model with hidden states.
  • Markov model – a stochastic model that assumes the Markov property.
  • Stochastic model – a system driven by a stochastic (random) process.
HMM – Stochastic model
  • Stochastic process vs. deterministic process
    • A stochastic process is the probabilistic counterpart of a deterministic process.
  • Examples:
    • Games involving dice, cards, and coin tosses
    • Speech, audio, and video signals
    • Brownian motion
    • Medical data of patients
    • Typing behavior (related to my project)
HMM – Markov Model
  • Markov model – a stochastic model that assumes the Markov property.
  • Markov property, i.e. the memoryless property:
    • future states of the process depend only upon the present state,
    • and not on the sequence of events that preceded it.
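The memoryless property can be seen directly in code. In this minimal sketch (the weather states and probabilities are illustrative assumptions, not from the slides), the next-state function receives only the current state – no history is needed, which is exactly the Markov property:

```python
import random

# Hypothetical two-state weather chain (states and probabilities are
# assumptions for illustration only).
STATES = ["Sunny", "Rainy"]
TRANSITIONS = {
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}

def next_state(current, rng=random):
    """Sample the next state. The signature alone shows the Markov
    property: only the current state is consulted, never the past."""
    r = rng.random()
    cumulative = 0.0
    for state, p in TRANSITIONS[current].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point round-off

def simulate(start, steps, seed=0):
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("Sunny", 5))
```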
Funny example of Markov chain
  • States: 0 – Home; 4 – Destination; 1, 2, 3 – corners.
  • [Diagram of the chain not reproduced in this transcript.]
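A sketch of the walk behind this example, under an assumed transition rule (the slide’s diagram is missing, so the rule – step to an adjacent corner with equal probability, stopping at either end – is a guess for illustration):

```python
import random

def walk(start=2, seed=0):
    """Random walk over states 0 (Home) .. 4 (Destination), with 1-3 as
    corners. 0 and 4 are treated as absorbing states (an assumption)."""
    rng = random.Random(seed)
    state, path = start, [start]
    while state not in (0, 4):
        state += rng.choice([-1, 1])  # memoryless: depends only on `state`
        path.append(state)
    return path

print(walk())  # the walk always ends at Home (0) or Destination (4)
```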
Hidden Markov Model – HMM
  • A Markov model with hidden states – a partially observable system.
  • The Markov process itself is hidden; we can only see a sequence of output symbols (observations).
HMM: Simple Example
  • Determine the average annual temperature at a particular location over a series of past years (before thermometers were invented).
  • Two annual temperature states: Hot (H) and Cold (C).
  • There is a correlation between the size of tree growth rings and temperature.
  • We can observe the tree ring size.
  • The temperature is unobserved – hidden.
HMM – Formation of problem
  • 2 hidden states: H and C.
  • 3 observed states – tree ring sizes:

Small – S, Medium – M, Large – L.

  • The model consists of the transition probabilities, the observation matrix, and the initial state distribution.
  • All matrices are row stochastic (each row sums to 1).
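These three ingredients can be written down concretely. The numeric values below are an assumption: the slides’ matrices are not reproduced in this transcript, so they are chosen to be consistent with the P(HHCC) computation worked out on a later slide (they match the example in Stamp’s tutorial listed in the references):

```python
# Rows index the hidden states H, C; columns of B index ring sizes S, M, L.
A  = [[0.7, 0.3],         # state transition probabilities a_ij
      [0.4, 0.6]]
B  = [[0.1, 0.4, 0.5],    # observation probabilities b_j(k)
      [0.7, 0.2, 0.1]]
pi = [0.6, 0.4]           # initial state distribution pi_i

# "Row stochastic" means every row sums to 1:
for row in A + B + [pi]:
    assert abs(sum(row) - 1.0) < 1e-9
print("all rows are stochastic")
```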
HMM – Formation of problem
  • Consider a 4-year sequence.
  • We observe the series of tree rings S, M, S, L:

O = (0, 1, 0, 2)

  • We need to determine the temperature (H or C) for these 4 years, i.e., the most likely state sequence of the Markov process given the observations.
HMM – Formation of problem
  • X = (x0, x1, x2, x3) – hidden state sequence
  • O = (O0, O1, O2, O3) – observation sequence
  • A = state transition probability matrix (aij)
  • B = observation probability matrix (bj(k))
HMM – Formation of problem
  • aij = P(state qj at t + 1 | state qi at t)
  • bj(k) = P(observation k at t | state qj at t)
  • P(X) = πx0 · bx0(O0) · ax0,x1 · bx1(O1) · ax1,x2 · bx2(O2) · ax2,x3 · bx3(O3)
  • P(HHCC) = 0.6(0.1)(0.7)(0.4)(0.3)(0.7)(0.6)(0.1) ≈ 0.000212
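The problem is small enough to check by brute force: enumerate all 2⁴ hidden state sequences for O = (S, M, S, L) = (0, 1, 0, 2) and compute each joint probability. The parameter values are the ones implied by the worked P(HHCC) example (an assumption, since the slides’ matrices are not reproduced in this transcript):

```python
from itertools import product

A  = [[0.7, 0.3], [0.4, 0.6]]            # a_ij, hidden states 0 = H, 1 = C
B  = [[0.1, 0.4, 0.5], [0.7, 0.2, 0.1]]  # b_j(k), observations 0 = S, 1 = M, 2 = L
pi = [0.6, 0.4]                          # initial state distribution
O  = (0, 1, 0, 2)                        # observed ring sizes S, M, S, L

def joint_prob(X):
    """P(X, O): initial term, then alternating transition/observation factors."""
    p = pi[X[0]] * B[X[0]][O[0]]
    for t in range(1, len(O)):
        p *= A[X[t - 1]][X[t]] * B[X[t]][O[t]]
    return p

probs = {X: joint_prob(X) for X in product((0, 1), repeat=4)}
print(round(probs[(0, 0, 1, 1)], 6))  # P(HHCC) -> 0.000212
print(max(probs, key=probs.get))      # most likely sequence -> (1, 1, 1, 0), i.e. CCCH
```

The same exhaustive scan also answers the “most likely state sequence” question from the previous slide, though it scales exponentially; the Viterbi algorithm does the same job in linear time over the sequence length.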
Applying HMM to Error Generation
  • Real-world data sets contain erroneous data.
  • Typing errors are very common:
    • Insertion
    • Deletion
    • Replacement
  • Is there any way to determine the most probable sequence or pattern of errors made by a typist?
Applying HMM to Error Generation
  • Examples: [example typing-error images not reproduced in this transcript]
Applying HMM to Error Generation
  • Sequence of characters / the alignment problem
HMM & Error Generation
  • Hidden states: pointer positions
  • Observations: output character sequence
  • Problems:
    • Finding a path – given an input sequence, an output character sequence, and the HMM model, determine the most probable operation sequence.
    • Training – given n pairs of input and output sequences, what is the model that maximizes the probability of the output?
    • Likelihood – given the input, the output, and the model, determine the likelihood of the observed sequence.
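The likelihood problem has an efficient standard solution, the forward algorithm. This sketch reuses the tree-ring example’s parameters (assumed values, consistent with the earlier worked computation): alpha[j] holds the probability of the observations seen so far with the chain currently in state j, so summing out the hidden states costs O(N²T) instead of enumerating all N^T sequences:

```python
A  = [[0.7, 0.3], [0.4, 0.6]]            # transition probabilities (H, C)
B  = [[0.1, 0.4, 0.5], [0.7, 0.2, 0.1]]  # observation probabilities (S, M, L)
pi = [0.6, 0.4]                          # initial state distribution

def likelihood(O):
    """P(O) via the forward algorithm."""
    n = len(pi)
    alpha = [pi[j] * B[j][O[0]] for j in range(n)]   # initialization
    for t in range(1, len(O)):                       # induction step
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][O[t]]
                 for j in range(n)]
    return sum(alpha)                                # termination

print(likelihood((0, 1, 0, 2)))  # P(O) for the ring sequence S, M, S, L
```

The training problem is solved in the same alpha/beta framework by the Baum–Welch algorithm, and the path-finding problem by Viterbi, which replaces the sum in the induction step with a max.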
References
  • Why should machines learn? – Herbert A. Simon, Department of Computer Science and Psychology, Carnegie-Mellon University, C.I.P. #425
  • A Revealing Introduction to Hidden Markov Models – Mark Stamp, Department of Computer Science, San Jose State University