This presentation by Santosh Kumar Kodicherla explores the application of Hidden Markov Models (HMM) in progressive multiple sequence alignment. Key applications of HMM include optimal alignment in biological sequences and decision trees. The presentation covers the algorithm for pairwise alignment, constructing parent-child sequences, and the probabilistic alignment process. Significant algorithms like Viterbi and recursive backtracking for generating multiple alignments are also discussed. The emphasis is on developing a robust framework for analyzing sequence similarities while addressing gaps and alignment efficiency.
A Hidden Markov model for progressive multiple alignment - Ari Loytynoja and Michel C. Milinkovitch. Presented by Santosh Kumar Kodicherla
HMM Applications • Hidden Markov Models are used to find optimal solutions in many applications, for example: 1. Predicting membrane helices. 2. Deciding whether a die is fair or loaded. 3. Decision tree applications, neural networks, etc.
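As an aside, the fair/loaded-die case above is the classic worked HMM example. Below is a minimal, self-contained sketch of Viterbi decoding for it; the transition and emission probabilities are illustrative placeholders, not values from the paper.

```python
# Minimal Viterbi decoding for the classic fair/loaded-die HMM.
# All probabilities below are illustrative, not taken from the paper.
import math

states = ["Fair", "Loaded"]
trans = {"Fair": {"Fair": 0.95, "Loaded": 0.05},
         "Loaded": {"Fair": 0.10, "Loaded": 0.90}}
emit = {"Fair": {r: 1 / 6 for r in range(1, 7)},
        "Loaded": {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.1, 6: 0.5}}

def viterbi(rolls):
    """Return the most probable hidden-state path for a sequence of die rolls."""
    # v[s] = best log-probability of any state path ending in state s so far
    v = {s: math.log(0.5) + math.log(emit[s][rolls[0]]) for s in states}
    back = []
    for roll in rolls[1:]:
        new_v, ptr = {}, {}
        for s in states:
            best_prev = max(states, key=lambda p: v[p] + math.log(trans[p][s]))
            new_v[s] = v[best_prev] + math.log(trans[best_prev][s]) + math.log(emit[s][roll])
            ptr[s] = best_prev
        back.append(ptr)
        v = new_v
    # Trace back from the best final state.
    path = [max(states, key=lambda s: v[s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi([1, 3, 6, 6, 6, 2, 6, 6, 4]))
```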
Working of HMM for Simple Pairwise Alignment • We compare the two sequences and build the unknown parent (so that similarity is maximal). • This forms the basis for the current algorithm. [Diagram: parent node with children Seq1 and Seq2]
Alignments • Pairwise Alignment:
PDGIVTSIGSNLTIACRVS
PPLASSSLGATIRLSCTLS
• Multiple Alignment:
DREIYGAVGSQVTLHCSFW
TQDERKLLHTTASLRCSLK
PAWLTVSEGANATFTCSLS
LPDWTVQNGKNLTLQCFAD
LDKKEAIQGGIVRVNCSVP
SSFTHLDQGERLNLSCSIP
DAQFEVIKGQTIEVRCESI
LSSKVVESGEDIVLQCAVN
PAVFKDNPTEDVEYCCVAD
Systems and Models • Build the multiple alignment in order of decreasing similarity. • Compute a probabilistic alignment. • Keep track of child pointers. • For each site, a vector of probabilities over the alternative characters A/C/G/T/- is calculated. • The newly generated node is aligned with the next internal sequence, and so on. • Once the root node is defined, recursive backtracking is used to generate the multiple alignment (a data-structure sketch follows below).
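A minimal sketch of how the per-site probability vectors and child pointers described above might be stored at each node of the guide tree; the class and field names (Site, Node, leaf_node) are illustrative, not the authors' implementation.

```python
# Illustrative data structure for one node of the guide tree: each aligned
# site stores a probability vector over A/C/G/T/- plus pointers back to the
# child sites (None where a gap was inserted in that child).
from __future__ import annotations
from dataclasses import dataclass, field

ALPHABET = ["A", "C", "G", "T", "-"]

@dataclass
class Site:
    probs: dict                           # e.g. {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.05, "-": 0.05}
    left_child_site: int | None = None    # index into the left child's sites (None = gap)
    right_child_site: int | None = None   # index into the right child's sites (None = gap)

@dataclass
class Node:
    name: str
    sites: list = field(default_factory=list)
    left: Node | None = None
    right: Node | None = None

def leaf_node(name: str, sequence: str) -> Node:
    """Observed characters get probability 1, as on the substitution-model slide."""
    node = Node(name)
    for ch in sequence:
        node.sites.append(Site({a: 1.0 if a == ch else 0.0 for a in ALPHABET}))
    return node

# Example leaves; an internal node would align them and fill in its own sites.
x = leaf_node("Seq1", "ACGT")
y = leaf_node("Seq2", "AGT")
```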
Substitution Model • Consider Seq x and Seq y, which generate Seq z (parent). Terms: p_a(x_i) is the probability that site x_i has character 'a'; if a character is observed it is given probability 1. Character 'a' has a background probability q_a. 'a' evolving into 'b' is represented as s_ab. When comparing characters there are two cases: substitution and gap. p(x_i, y_j) represents the probability that x_i and y_j are aligned and generate z_k. Summing over all character states 'a' at z_k:
• p(x_i, y_j) = p_{z_k}(x_i, y_j) = Σ_a p_{z_k=a}(x_i, y_j)
• p_{z_k=a}(x_i, y_j) = q_a · [Σ_b s_ab · p_b(x_i)] · [Σ_b s_ab · p_b(y_j)]
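The two formulas above can be computed directly for one pair of sites. The sketch below is a hedged illustration: the background frequencies q_a and substitution probabilities s_ab are placeholder values, not the paper's parameters.

```python
# Sketch of the parent-site probability from the substitution-model slide:
#   p_{z_k=a}(x_i, y_j) = q_a * (sum_b s_ab * p_b(x_i)) * (sum_b s_ab * p_b(y_j))
# q and s below are illustrative placeholders, not values from the paper.
ALPHABET = ["A", "C", "G", "T"]
q = {a: 0.25 for a in ALPHABET}                                                # background q_a
s = {a: {b: (0.91 if a == b else 0.03) for b in ALPHABET} for a in ALPHABET}   # substitution s_ab

def parent_site_probs(px, py):
    """px, py: dicts giving p_b(x_i) and p_b(y_j); observed characters have probability 1."""
    out = {}
    for a in ALPHABET:
        from_x = sum(s[a][b] * px.get(b, 0.0) for b in ALPHABET)
        from_y = sum(s[a][b] * py.get(b, 0.0) for b in ALPHABET)
        out[a] = q[a] * from_x * from_y
    return out

# Example: x_i is an observed 'A', y_j is an observed 'G'.
pz = parent_site_probs({"A": 1.0}, {"G": 1.0})
p_match = sum(pz.values())   # p(x_i, y_j): summed over all parent characters a
print(pz, p_match)
```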
Steps in the Algorithm: • Look-back HM model. • Pairwise alignment. • Calculate posterior probability. • Multiple alignment. • Testing the algorithm.
Look-back HM Model • Defines 3 states: match M, x-insert X, and y-insert Y. - The probability of moving from M to X or Y is represented as δ. - The probability of staying in an insert state is ε. - The remaining probability is for moving back to M. (An illustrative transition table follows below.)
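An illustrative transition table for these three states, assuming placeholder values for δ and ε (not the paper's estimates):

```python
# Illustrative transition probabilities for the three-state pair HMM
# (match M, x-insert X, y-insert Y); delta and epsilon are placeholders.
delta, epsilon = 0.05, 0.4

trans = {
    "M": {"M": 1 - 2 * delta, "X": delta,   "Y": delta},   # leave M with prob. delta each way
    "X": {"M": 1 - epsilon,   "X": epsilon, "Y": 0.0},     # stay in the insert with prob. epsilon
    "Y": {"M": 1 - epsilon,   "X": 0.0,     "Y": epsilon},
}

# Each row sums to 1, so every state has a proper distribution over next states.
assert all(abs(sum(row.values()) - 1.0) < 1e-9 for row in trans.values())
```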
Pairwise alignment: • In dynamic programming, we define a matrix and make recursive calls, choosing the best path. • Backtracking is used to recover the best path. • The Viterbi path gives the best alignment path. • This is used to find the parent vector that represents both children. (A sketch of the Viterbi recursion follows below.)
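A compact, self-contained sketch of the Viterbi dynamic program for the three-state pair HMM. The emission scores and parameters are illustrative placeholders, and emitting single characters is a simplification of the paper's probability-vector version.

```python
# Viterbi pairwise alignment with a three-state pair HMM (M, X, Y).
# All numeric parameters are illustrative placeholders.
import math

NEG_INF = float("-inf")
log = math.log
delta, epsilon = 0.05, 0.4
t = {("M", "M"): log(1 - 2 * delta), ("X", "M"): log(1 - epsilon), ("Y", "M"): log(1 - epsilon),
     ("M", "X"): log(delta), ("X", "X"): log(epsilon),
     ("M", "Y"): log(delta), ("Y", "Y"): log(epsilon)}

def emit_match(a, b):   # joint emission in the match state (placeholder values)
    return log(0.2) if a == b else log(0.05)

def emit_gap(_):        # emission of a single character in X or Y (placeholder)
    return log(0.25)

def viterbi_align(x, y):
    n, m = len(x), len(y)
    # V[state][i][j]: best log score of aligning x[:i] with y[:j], ending in state
    V = {s: [[NEG_INF] * (m + 1) for _ in range(n + 1)] for s in "MXY"}
    ptr = {s: [[None] * (m + 1) for _ in range(n + 1)] for s in "MXY"}
    V["M"][0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if i > 0 and j > 0:   # match consumes one character from each sequence
                prev = max("MXY", key=lambda s: V[s][i - 1][j - 1] + t[(s, "M")])
                V["M"][i][j] = V[prev][i - 1][j - 1] + t[(prev, "M")] + emit_match(x[i - 1], y[j - 1])
                ptr["M"][i][j] = prev
            if i > 0:             # X consumes a character from x only (gap in y)
                prev = max("MX", key=lambda s: V[s][i - 1][j] + t[(s, "X")])
                V["X"][i][j] = V[prev][i - 1][j] + t[(prev, "X")] + emit_gap(x[i - 1])
                ptr["X"][i][j] = prev
            if j > 0:             # Y consumes a character from y only (gap in x)
                prev = max("MY", key=lambda s: V[s][i][j - 1] + t[(s, "Y")])
                V["Y"][i][j] = V[prev][i][j - 1] + t[(prev, "Y")] + emit_gap(y[j - 1])
                ptr["Y"][i][j] = prev
    # Backtrack from the best final state to recover the aligned strings.
    state = max("MXY", key=lambda s: V[s][n][m])
    i, j, ax, ay = n, m, [], []
    while i > 0 or j > 0:
        prev = ptr[state][i][j]
        if state == "M":
            ax.append(x[i - 1]); ay.append(y[j - 1]); i, j = i - 1, j - 1
        elif state == "X":
            ax.append(x[i - 1]); ay.append("-"); i -= 1
        else:
            ax.append("-"); ay.append(y[j - 1]); j -= 1
        state = prev
    return "".join(reversed(ax)), "".join(reversed(ay))

print(viterbi_align("GATTACA", "GCATGC"))
```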
Multiple Alignment Observations • The pairwise algorithm works progressively from the tips of the tree towards the root. • Once the root node is defined, the multiple alignment can be generated. • If a gap is introduced in the process, the recursive call does not proceed. • In a given column most sequences are well aligned, except a few which may contain gaps. (A recursive-backtracking sketch follows below.)
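A minimal sketch of the recursive backtracking from the root: each root site is traced down through its child-site pointers, and a gap in the parent propagates '-' to every leaf in the skipped subtree. The tiny hard-coded tree and its pointers are illustrative only.

```python
# Recursive backtracking from the root of the guide tree to emit the
# multiple alignment, one column per root site.

def column(node, site_index):
    """Collect one alignment column for the subtree rooted at `node`."""
    if node["left"] is None:                 # leaf: emit its character, or a gap
        return ["-" if site_index is None else node["seq"][site_index]]
    if site_index is None:                   # gap in the parent propagates downwards
        left_idx = right_idx = None
    else:
        left_idx, right_idx = node["pointers"][site_index]
    return column(node["left"], left_idx) + column(node["right"], right_idx)

def multiple_alignment(root):
    cols = [column(root, k) for k in range(len(root["pointers"]))]
    return ["".join(row) for row in zip(*cols)]   # transpose columns into rows

# Toy example: the root aligns leaf "ACG" with leaf "AG"; site 1 is a gap
# in the right child, so that child skips it.
leaf1 = {"left": None, "right": None, "seq": "ACG"}
leaf2 = {"left": None, "right": None, "seq": "AG"}
root = {"left": leaf1, "right": leaf2,
        "pointers": [(0, 0), (1, None), (2, 1)]}  # (left child site, right child site)

print(multiple_alignment(root))   # ['ACG', 'A-G']
```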