
Multiple alignment using hidden Markov models


Presentation Transcript


  1. Multiple alignment using hidden Markov models November 21, 2001 Kim Hye Jin Intelligent Multimedia Lab marisan@postech.ac.kr

  2. Outline • Introduction • Methods and algorithms • Results • Discussion

  3. Introduction • Why HMM? • Mathematically consistent description of insertions and deletions • Theoretical insight into the difficulties of combining disparate forms of information (e.g., sequences and 3D structures) • Possible to train models from initially unaligned sequences

  4. Methods and algorithms • State transitions • The state sequence is a first-order Markov chain • Each state is hidden • Match/insert/delete states • Symbol emission

  5. Methods and algorithms [Figure: profile HMM architecture showing match, insertion, and deletion states]
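As a rough sketch of the architecture pictured above (my own illustration, not from the presentation), the following Python enumerates the transitions a length-L profile HMM topology allows between match (M), insert (I), and delete (D) states:

def profile_hmm_transitions(L):
    """List the allowed transitions of a length-L profile HMM topology."""
    edges = []
    for k in range(1, L):
        for src in ("M", "I", "D"):
            # From column k, each state may advance to M or D of column k+1,
            # or emit extra residues by (re-)entering the insert state I_k.
            edges.append((f"{src}{k}", f"M{k + 1}"))
            edges.append((f"{src}{k}", f"D{k + 1}"))
            edges.append((f"{src}{k}", f"I{k}"))
    return edges

print(profile_hmm_transitions(3))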

  6. Methods and algorithms • Replacing arbitrary scores with probabilities relative to a consensus • A model M consists of N states S1…SN • An observed sequence O consists of T symbols O1…OT from an alphabet • aij: the probability of a transition from Si to Sj • bj(x): the probability of emitting symbol x from state Sj
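To make this notation concrete, here is a minimal Python sketch of such a model; the two states and all probabilities are hypothetical toy values, not from the paper:

# Toy HMM; the numbers below are invented for illustration only.
states = ["S1", "S2"]                 # N = 2 states
alphabet = ["A", "C", "Y"]

# a[i][j] = P(transition Si -> Sj); each row sums to 1
a = {"S1": {"S1": 0.7, "S2": 0.3},
     "S2": {"S1": 0.4, "S2": 0.6}}

# b[j][x] = P(state Sj emits symbol x); each row sums to 1
b = {"S1": {"A": 0.6, "C": 0.3, "Y": 0.1},
     "S2": {"A": 0.1, "C": 0.4, "Y": 0.5}}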

  7. Methods and algorithms • Example HMM for the sequence ACCY [Figure: example model]

  8. Methods and algorithms • Forward algorithm • Computes P(O | model) as a sum over all state paths, rather than a maximum
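A minimal sketch of the forward recursion in Python, reusing the toy model above and the slide's ACCY sequence; the uniform start distribution pi is my own assumption:

def forward(O, states, a, b, pi):
    """P(O | model): sums over all state paths (a sum, not a maximum)."""
    alpha = [{j: pi[j] * b[j][O[0]] for j in states}]
    for t in range(1, len(O)):
        alpha.append({j: b[j][O[t]] * sum(alpha[t - 1][i] * a[i][j]
                                          for i in states)
                      for j in states})
    return sum(alpha[-1][j] for j in states)

pi = {"S1": 0.5, "S2": 0.5}   # assumed uniform initial distribution
print(forward("ACCY", states, a, b, pi))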

  9. Methods and algorithms • Viterbi algorithm • Finds the most likely path through the model • The path is recovered by following the back pointers
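A matching Viterbi sketch (same toy model): the recursion replaces the forward algorithm's sum with a maximum, and back pointers are followed to recover the best path.

def viterbi(O, states, a, b, pi):
    """Most likely state path through the model, with backpointer traceback."""
    delta = [{j: pi[j] * b[j][O[0]] for j in states}]
    back = []
    for t in range(1, len(O)):
        delta.append({})
        back.append({})
        for j in states:
            best = max(states, key=lambda i: delta[t - 1][i] * a[i][j])
            back[t - 1][j] = best
            delta[t][j] = delta[t - 1][best] * a[best][j] * b[j][O[t]]
    # Follow the back pointers from the best final state.
    last = max(states, key=lambda j: delta[-1][j])
    path = [last]
    for t in range(len(O) - 2, -1, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path)), delta[-1][last]

print(viterbi("ACCY", states, a, b, pi))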

  10. Methods and algorithms • Baum-Welch algorithm • A variation of the forward algorithm (combined with a backward pass) • Starts from a reasonable guess for the initial model, then scores each sequence in the training set and re-estimates the model (an EM algorithm) • Local optima problem: • forward/Viterbi algorithm • Baum-Welch algorithm
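The sketch below shows one Baum-Welch re-estimation step for the generic toy HMM above (a single training sequence, reusing states, alphabet, a, b, and pi from the earlier sketches). The paper trains profile HMMs on whole sequence sets, so this only illustrates the core idea:

def backward(O, states, a, b):
    """beta[t][i] = P(O[t+1:] | state Si at time t)."""
    T = len(O)
    beta = [{i: 1.0 for i in states} for _ in range(T)]
    for t in range(T - 2, -1, -1):
        for i in states:
            beta[t][i] = sum(a[i][j] * b[j][O[t + 1]] * beta[t + 1][j]
                             for j in states)
    return beta

def baum_welch_step(O, states, alphabet, a, b, pi):
    """One EM iteration: expected counts give re-estimated a and b."""
    T = len(O)
    alpha = [{j: pi[j] * b[j][O[0]] for j in states}]
    for t in range(1, T):
        alpha.append({j: b[j][O[t]] * sum(alpha[t - 1][i] * a[i][j]
                                          for i in states) for j in states})
    beta = backward(O, states, a, b)
    PO = sum(alpha[-1][j] for j in states)
    # gamma[t][i] = P(state Si at time t | O, model)
    gamma = [{i: alpha[t][i] * beta[t][i] / PO for i in states}
             for t in range(T)]
    new_a = {i: {j: sum(alpha[t][i] * a[i][j] * b[j][O[t + 1]] * beta[t + 1][j]
                        for t in range(T - 1)) / PO
                    / sum(gamma[t][i] for t in range(T - 1))
                 for j in states} for i in states}
    new_b = {j: {x: sum(g[j] for t, g in enumerate(gamma) if O[t] == x)
                    / sum(g[j] for g in gamma)
                 for x in alphabet} for j in states}
    return new_a, new_b

a, b = baum_welch_step("ACCY", states, alphabet, a, b, pi)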

  11. Methods and algorithms • Simulated annealing • Helps escape local optima in search of a globally better model • kT = 0: the standard Viterbi training procedure • kT is gradually lowered during training
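A sketch of the temperature idea (my own illustration, not the paper's implementation): at temperature kT a choice is sampled from Boltzmann weights over its log-scores, which reduces to the deterministic Viterbi argmax as kT approaches 0. The cooling schedule and the example scores below are hypothetical:

import math
import random

def sample_choice(log_scores, kT):
    """Sample from Boltzmann weights; kT = 0 gives the Viterbi argmax."""
    if kT == 0:
        return max(log_scores, key=log_scores.get)
    m = max(log_scores.values())          # shift for numerical stability
    weights = {s: math.exp((v - m) / kT) for s, v in log_scores.items()}
    r = random.uniform(0, sum(weights.values()))
    for s, w in weights.items():
        r -= w
        if r <= 0:
            return s
    return s

# Hypothetical cooling schedule: start hot (random), end near Viterbi training.
kT = 5.0
while kT > 0.01:
    print(kT, sample_choice({"M2": -1.2, "I1": -3.4, "D2": -2.0}, kT))
    kT *= 0.5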

  12. Methods and algorithms [Figure: ClustalW]

  13. Methods and algorithms [Figure: ClustalX]

  14. Results • len: consensus length of the alignment • ali: the number of structurally aligned sequences • %id: the percentage sequence identity among the aligned structures • Homo: the number of homologues identified in and extracted from SwissProt 30 • %id (homologues): the average percentage sequence identity in the set of homologues

  15. Results [Table of benchmark results]

  16. Discussion • HMM • A consistent theory for insertion and deletion penalties • EGF: fairly difficult alignments are handled well • ClustalW • Progressive alignment • Disparities between the sequence identity of the structures and that of the homologues • Score correlates poorly with alignment quality

  17. Discussion • The potential of HMMs for sensitive fold recognition is apparent
