Hidden Markov Models

Pairwise Alignments


Hidden Markov Models

  • Finite state automata with multiple states as a convenient description of complex dynamic programming algorithms for pairwise alignment

  • Basis for probabilistic modelling of the gapped alignment process by converting the FSA into an HMM

  • Advantages:

    1) use the resulting probabilistic model to assess the reliability of the alignment and to explore alternative alignments

    2) weighting all alternative alignments probabilistically yields scores of similarity independent of any specific alignment


Hidden Markov Models

[Figure: state diagram of the pair HMM. The match state M emits an aligned pair xi:yj with probability pxiyj; the insert states X and Y emit xi or yj against a gap with probabilities qxi and qyj. Transitions from the Begin state B or from M into X or Y have probability δ, staying in X or Y has probability ε, returning from X or Y to M has probability 1-ε-τ, staying in M has probability 1-2δ-τ, and transitions into the End state E have probability τ.]
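In tabular form, the transition probabilities of this pair HMM can be summarised as follows (a sketch; the Begin state B is assumed here to use the same outgoing probabilities as M):

```latex
% Transition probabilities (rows: current state, columns: next state).
% Assumption: the Begin state B behaves like the match state M.
\begin{array}{c|cccc}
                    & M                  & X           & Y           & E    \\ \hline
M\ (\text{and } B)  & 1-2\delta-\tau     & \delta      & \delta      & \tau \\
X                   & 1-\varepsilon-\tau & \varepsilon & 0           & \tau \\
Y                   & 1-\varepsilon-\tau & 0           & \varepsilon & \tau
\end{array}
```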


Hidden Markov Models

  • Pair Hidden Markov Models generate an aligned pair of sequences

  • Start in the Begin state B and cycle over the following two steps:

    1) pick the next state according to the transition probability distributions leaving the current state

    2) pick a symbol pair to be added to the alignment according to the emission probability distribution in the new state

  • Stop when a transition into the End state E is made
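This generation procedure can be sketched in code roughly as follows (a minimal illustration; the alphabet, emission tables and transition probabilities are made-up toy values, not taken from the slides):

```python
import random

# Toy parameters (assumptions for illustration, not values from the slides).
DELTA, EPSILON, TAU = 0.2, 0.3, 0.1   # delta, epsilon, tau of the state diagram
ALPHABET = "AC"

# Emission distributions: p(a, b) for the match state M, q(a) for X and Y.
P_MATCH = {(a, b): (0.4 if a == b else 0.1) for a in ALPHABET for b in ALPHABET}
Q_INSERT = {a: 1.0 / len(ALPHABET) for a in ALPHABET}

# Transition distributions leaving each state (Begin treated like Match).
TRANSITIONS = {
    "B": {"M": 1 - 2 * DELTA - TAU, "X": DELTA, "Y": DELTA, "E": TAU},
    "M": {"M": 1 - 2 * DELTA - TAU, "X": DELTA, "Y": DELTA, "E": TAU},
    "X": {"M": 1 - EPSILON - TAU, "X": EPSILON, "E": TAU},
    "Y": {"M": 1 - EPSILON - TAU, "Y": EPSILON, "E": TAU},
}

def sample(dist):
    """Pick a key of `dist` with probability proportional to its value."""
    return random.choices(list(dist), weights=list(dist.values()))[0]

def generate_alignment():
    """Cycle over the two steps until a transition into the End state is made."""
    x, y, columns = [], [], []
    state = "B"
    while True:
        state = sample(TRANSITIONS[state])     # 1) pick the next state
        if state == "E":                       # stop on the End state
            return "".join(x), "".join(y), columns
        if state == "M":                       # 2) emit an aligned pair a:b
            a, b = sample(P_MATCH)
            x.append(a); y.append(b); columns.append((a, b))
        elif state == "X":                     # 2) emit x_i against a gap
            a = sample(Q_INSERT)
            x.append(a); columns.append((a, "-"))
        else:                                  # 2) emit y_j against a gap
            b = sample(Q_INSERT)
            y.append(b); columns.append(("-", b))

if __name__ == "__main__":
    print(generate_alignment())
```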


Hidden Markov Models

  • State M has emission probability distribution pab for emitting an aligned pair a:b

  • States X and Y have distributions qxi and qyj for emitting symbol xi from sequence x, or symbol yj from sequence y, against a gap

  • The transition probability from M to an insert state X or Y is denoted δ and the probability of staying in an insert state by ε

  • The probability of a transition into the End state E is denoted τ

  • All algorithms discussed so far carry across to pair HMMs

  • The total probability of generating a particular alignment is just the product of the probabilities of each individual step.
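As a small worked illustration (a hypothetical example, not taken from the slides): generating the aligned pair x = A, y = AG via the state sequence B → M → Y → E has probability

```latex
% Probability of one particular generation path: one transition and one
% emission per step, followed by the final transition into E.
P(\text{B} \to \text{M} \to \text{Y} \to \text{E})
  = (1-2\delta-\tau)\, p_{\mathrm{AA}} \;\cdot\; \delta\, q_{\mathrm{G}} \;\cdot\; \tau
```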


Hidden Markov Models

Viterbi Algorithm for pair HMMs

  • Initialisation:

  • Recurrence:

  • Termination:
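A standard formulation of these three steps, using the parameterisation of the previous slides (vM(i,j), vX(i,j) and vY(i,j) denote the probability of the most probable path that emits x1..xi and y1..yj and ends in state M, X or Y, respectively):

```latex
% Standard pair-HMM Viterbi recurrences (sketch).
\begin{align*}
\text{Initialisation:}\quad & v^{M}(0,0) = 1, \qquad v^{X}(0,0) = v^{Y}(0,0) = 0,\\
                            & \text{all other } v^{\cdot}(i,0),\ v^{\cdot}(0,j) = 0.\\[4pt]
\text{Recurrence:}\quad & v^{M}(i,j) = p_{x_i y_j} \max\bigl( (1-2\delta-\tau)\, v^{M}(i-1,j-1),\ (1-\varepsilon-\tau)\, v^{X}(i-1,j-1),\ (1-\varepsilon-\tau)\, v^{Y}(i-1,j-1) \bigr),\\
                        & v^{X}(i,j) = q_{x_i} \max\bigl( \delta\, v^{M}(i-1,j),\ \varepsilon\, v^{X}(i-1,j) \bigr),\\
                        & v^{Y}(i,j) = q_{y_j} \max\bigl( \delta\, v^{M}(i,j-1),\ \varepsilon\, v^{Y}(i,j-1) \bigr).\\[4pt]
\text{Termination:}\quad & v^{E} = \tau \max\bigl( v^{M}(n,m),\ v^{X}(n,m),\ v^{Y}(n,m) \bigr).
\end{align*}
```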


Hidden Markov Models

Probabilistic model for a random alignment

[Figure: state diagram of the random model. The states X and Y emit the two sequences independently with emission probabilities qxi and qyj; each emitting state loops on itself with probability 1-η and is left with probability η, and the final transition into the End state E has probability η.]


Hidden Markov Model

  • The main states X and Y emit the two sequences independently

  • The silent state does not emit any symbol but gathers input from the X and Begin states

  • The probability of a pair of sequences according to the random model is P(x,y|R) = η²(1-η)^(n+m) ∏i qxi ∏j qyj


Hidden Markov Model

  • Allocate the terms in this expression to those that make up the probability of the Viterbi alignment, so that the log-odds ratio is the sum of the individual log-odds terms

  • Allocate one factor of (1-η) and the corresponding qa factor to each residue that is emitted in a Viterbi step

  • So the match transitions will be allocated (1-η)²qaqb, where a and b are the residues matched

  • The insert states will be allocated (1-η)qa where a is the residue inserted

  • As the Viterbi path must account for all residues, exactly (n+m) terms will be used


Hidden Markov Model

  • We can now compute the alignment score in terms of an additive model with log-odds emission scores and log-odds transition scores.

  • In practice, this is the best way to implement pair HMMs

  • Merge the emission scores with the transitions to produce scores that correspond to the standard terms used in sequence alignment by dynamic programming

  • Now the log-odds version of the Viterbi alignment algorithm can be given in a form that looks like standard pairwise dynamic programming
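As a sketch of such merged scores, following the standard treatment (the exact constants depend on the derivation used), the substitution score s(a,b), gap-open penalty d and gap-extension penalty e take the form:

```latex
% Merged log-odds scores (sketch; constants depend on the exact derivation).
\begin{align*}
s(a,b) &= \log\frac{p_{ab}}{q_a q_b} \;+\; \log\frac{1-2\delta-\tau}{(1-\eta)^2},\\
d      &= -\log\frac{\delta\,(1-\varepsilon-\tau)}{(1-\eta)(1-2\delta-\tau)},\\
e      &= -\log\frac{\varepsilon}{1-\eta}.
\end{align*}
```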



Hidden Markov Model

Optimal log-odds alignment

  • Initialisation:

  • Recursion:

  • Termination:
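One standard form of these steps, assuming the merged scores s(a,b), d and e sketched above (VM, VX and VY are the log-odds analogues of the Viterbi variables):

```latex
% Log-odds Viterbi recursion (sketch, using the scores s(a,b), d, e above).
\begin{align*}
\text{Initialisation:}\quad & V^{M}(0,0) = -2\log\eta, \qquad V^{X}(0,0) = V^{Y}(0,0) = -\infty.\\[4pt]
\text{Recursion:}\quad & V^{M}(i,j) = s(x_i, y_j) + \max\bigl( V^{M}(i-1,j-1),\ V^{X}(i-1,j-1),\ V^{Y}(i-1,j-1) \bigr),\\
                       & V^{X}(i,j) = \max\bigl( V^{M}(i-1,j) - d,\ V^{X}(i-1,j) - e \bigr),\\
                       & V^{Y}(i,j) = \max\bigl( V^{M}(i,j-1) - d,\ V^{Y}(i,j-1) - e \bigr).\\[4pt]
\text{Termination:}\quad & V = \max\bigl( V^{M}(n,m),\ V^{X}(n,m) + c,\ V^{Y}(n,m) + c \bigr).
\end{align*}
```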


Hidden Markov Model

  • The constant c in the termination has the value c = log(1-2δ-τ) - log(1-ε-τ)

  • The procedure shows how, for any pair HMM, we can derive an equivalent finite state automaton for obtaining the most probable alignment; a dynamic-programming sketch of this view is given below
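Here is a minimal sketch of the log-odds Viterbi recursion written as standard pairwise alignment with affine gap penalties (the substitution scores, gap penalties and constant c are made-up stand-ins, not values derived from the slides):

```python
NEG_INF = float("-inf")

# Toy log-odds scores (assumptions for illustration, not derived from real
# emission/transition probabilities).
def s(a, b):
    """Stand-in for the merged log-odds substitution score s(a, b)."""
    return 2.0 if a == b else -1.0

GAP_OPEN, GAP_EXTEND = 3.0, 1.0   # stand-ins for d and e
C = 0.0                           # stand-in for the termination constant c

def viterbi_log_odds(x, y):
    """Optimal log-odds alignment score of x and y (pair-HMM Viterbi in log space)."""
    n, m = len(x), len(y)
    # vm/vx/vy[i][j]: best score of a path that has emitted x[:i], y[:j]
    # and ends in state M, X or Y respectively.
    vm = [[NEG_INF] * (m + 1) for _ in range(n + 1)]
    vx = [[NEG_INF] * (m + 1) for _ in range(n + 1)]
    vy = [[NEG_INF] * (m + 1) for _ in range(n + 1)]
    vm[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if i > 0 and j > 0:   # match state M
                vm[i][j] = s(x[i - 1], y[j - 1]) + max(
                    vm[i - 1][j - 1], vx[i - 1][j - 1], vy[i - 1][j - 1])
            if i > 0:             # insert state X: x[i-1] against a gap
                vx[i][j] = max(vm[i - 1][j] - GAP_OPEN, vx[i - 1][j] - GAP_EXTEND)
            if j > 0:             # insert state Y: y[j-1] against a gap
                vy[i][j] = max(vm[i][j - 1] - GAP_OPEN, vy[i][j - 1] - GAP_EXTEND)
    return max(vm[n][m], vx[n][m] + C, vy[n][m] + C)

if __name__ == "__main__":
    print(viterbi_log_odds("ACGT", "AGT"))   # small usage example
```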

