
Patterns, Profiles, and Multiple Alignment


- Profiles and Sequence Logos
- Profile Hidden Markov Models
- Aligning Profiles
- Multiple Sequence Alignments by Gradual Sequence Addition
- Other Ways of Obtaining Multiple Alignments
- Sequence Pattern Discovery


Hidden Markov Models:

A hidden Markov model (HMM) is a statistical model

in which the system being modeled is assumed to be a Markov process (a memoryless process: its future and its past are independent given the present),

with hidden states.

- Hidden Markov Models:
- have a set of states, each of which has a limited number of transitions and emissions,
- each transition between states has an assigned probability,
- each model starts from a start state and ends in an end state.

Hidden Markov Model parameters:

- A finite set of states, Si,
- the transition probability from state Si to Sj, aij,
- the emission probability density of a symbol ω in state Si.
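These three parameter sets can be collected into one small data structure. The sketch below is illustrative only: the class layout, field names, and toy numbers are assumptions, not taken from the slides.

```python
from dataclasses import dataclass

@dataclass
class HMM:
    states: list  # finite set of states S_i
    trans: dict   # trans[(si, sj)] = a_ij, transition probability S_i -> S_j
    emit: dict    # emit[(si, w)] = probability of emitting symbol w in S_i

# Toy instance: two states, two symbols.
hmm = HMM(
    states=["S1", "S2"],
    trans={("S1", "S1"): 0.7, ("S1", "S2"): 0.3,
           ("S2", "S1"): 0.4, ("S2", "S2"): 0.6},
    emit={("S1", "a"): 0.9, ("S1", "b"): 0.1,
          ("S2", "a"): 0.2, ("S2", "b"): 0.8},
)

# Sanity check: outgoing transition probabilities of each state sum to 1.
for s in hmm.states:
    assert abs(sum(p for (u, v), p in hmm.trans.items() if u == s) - 1.0) < 1e-9
```

The closing loop checks the usual constraint that the transition probabilities leaving any state form a probability distribution.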

Hidden Markov Model parameters:

Before filling in these parameters, we first discuss:

Markov models,

the Markov assumption.

Markov Models and Assumption:

To understand HMMs, consider the weather.

Assume there are three types of weather: sunny, rainy, and foggy.

Assume the weather does not change during the day (if it is sunny, it will be sunny all day).

Markov Models and Assumption (cont.): Weather prediction asks what the weather will be tomorrow,

based on observations of the past.

Markov Models and Assumption (cont.): The weather at day n,

qn, depends on the known weather of the past days (qn-1, qn-2, …).

Markov Models and Assumption (cont.): We want to find P(qn | qn-1, qn-2, …, q1):

that is, given the past weather, the probability of each possible weather today.

Markov Models and Assumption (cont.): For example:

if we knew the weather for the last three days was q1, q2, q3,

the probability that tomorrow's weather would be q4 is:

P(q4 | q3, q2, q1)

Markov Models and Assumption (cont.):

For example:

this probability could be inferred from the statistics of past observations.

The problem: the larger n is, the more observations we must collect.

For example: if n = 6, we need 3^(6-1) = 3^5 = 243 past observation sequences.

Markov Models and Assumption (cont.):

Therefore, we make a simplifying assumption, the Markov assumption:

For a sequence q1, …, qn, the weather of tomorrow depends only on today (a first-order Markov model):

P(qn | qn-1, …, q1) = P(qn | qn-1)

Markov Models and Assumption (cont.): Examples:

The table of transition probabilities:

Markov Models and Assumption (cont.): Examples:

HMM:

Markov Models and Assumption (cont.): Examples:

Given that today the weather is sunny, what is the probability that tomorrow will be sunny and the day after rainy?

Markov assumption

Markov Models and Assumption (cont.): Examples:

If the weather yesterday was rainy and today is foggy, what is the probability that tomorrow it will be sunny?

(Apply the Markov assumption.)
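With a concrete transition table, both example questions reduce to short products of transition probabilities. The numbers below are invented placeholders for the slides' omitted table:

```python
# Invented first-order transition table: trans[today][tomorrow].
trans = {
    "sunny": {"sunny": 0.8, "rainy": 0.05, "foggy": 0.15},
    "rainy": {"sunny": 0.2, "rainy": 0.6,  "foggy": 0.2},
    "foggy": {"sunny": 0.2, "rainy": 0.3,  "foggy": 0.5},
}

# Q1: today is sunny; P(tomorrow sunny AND day after rainy)
# = t(sunny, sunny) * t(sunny, rainy), by the Markov assumption.
q1 = trans["sunny"]["sunny"] * trans["sunny"]["rainy"]

# Q2: yesterday rainy, today foggy; P(tomorrow sunny).
# The Markov assumption lets us drop yesterday entirely:
# P(sunny | foggy, rainy) = P(sunny | foggy) = t(foggy, sunny).
q2 = trans["foggy"]["sunny"]
```

With these placeholder values, q1 ≈ 0.04 and q2 = 0.2; the point is that the chain factorizes into one-step transitions, not the particular numbers.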

Hidden Markov Models (HMMs):

What is an HMM:

Suppose that you are locked in a room for several days,

and you try to predict the weather outside.

The only piece of evidence you have is whether the person who brings your daily meal is carrying an umbrella or not.

Hidden Markov Models (HMMs):

What is an HMM (cont.):

Assume the probabilities given in the table:

Hidden Markov Models (HMMs):

What is an HMM (cont.):

Now the actual weather is hidden from you.

You cannot directly observe the weather.

Hidden Markov Models (HMMs):

What is an HMM (cont.):

Finding the probability of a certain weather qi

is based on the observations xi:

Hidden Markov Models (HMMs):

What is an HMM (cont.):

Using Bayes' rule:

P(qi | xi) = P(xi | qi) P(qi) / P(xi)

For n days:

P(q1, …, qn | x1, …, xn) = P(x1, …, xn | q1, …, qn) P(q1, …, qn) / P(x1, …, xn)

Hidden Markov Models (HMMs):

What is an HMM (cont.):

We can omit the denominator, since it is the same for every weather sequence. So:

P(q1, …, qn | x1, …, xn) ∝ P(x1, …, xn | q1, …, qn) P(q1, …, qn)

With the Markov assumptions:

P(q1, …, qn | x1, …, xn) ∝ ∏i P(xi | qi) P(qi | qi-1)

Hidden Markov Models (HMMs):

Examples:

Suppose that on the day you were locked in it was sunny. The next day, the caretaker carried an umbrella into the room.

You would like to know what the weather was like on this second day.

Hidden Markov Models (HMMs):

Examples:

Calculate 3 probabilities:

Hidden Markov Models (HMMs):

Examples:

Consider the event with the highest value; it is the most likely to have happened.
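The three values can be computed directly from Bayes' rule plus the Markov assumption. The transition and umbrella probabilities below are assumed placeholders for the tables omitted from this transcript:

```python
# Posterior for the weather on day 2, given day 1 was sunny and the
# caretaker carried an umbrella on day 2. All numbers are illustrative.
trans_from_sunny = {"sunny": 0.8, "rainy": 0.05, "foggy": 0.15}
p_umbrella = {"sunny": 0.1, "rainy": 0.8, "foggy": 0.3}  # P(umbrella | weather)

# P(q2 = w | x2 = umbrella, q1 = sunny)
#   is proportional to P(umbrella | w) * t(sunny, w)
scores = {w: p_umbrella[w] * trans_from_sunny[w] for w in p_umbrella}
posterior = {w: s / sum(scores.values()) for w, s in scores.items()}

# The weather with the highest posterior is the most likely to have happened.
best = max(posterior, key=posterior.get)
```

With these placeholder numbers the unnormalized scores are 0.08 (sunny), 0.04 (rainy), and 0.045 (foggy), so "sunny" wins despite the umbrella; the ranking depends entirely on the tables used.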

Hidden Markov Models (HMMs):

Another example:

Suppose you do not know how the weather was when you were locked in. The following three days the caretaker always comes without an umbrella. Calculate the likelihood for the weather on these three days to have been:

Hidden Markov Models (HMMs):

Another example:

As you do not know how the weather is on the first day, you assume the three weather situations are equi-probable on this day, and the prior probability for sun on day one is therefore 1/3.

Hidden Markov Models (HMMs):

Another example:

Assumption:

HMMs to represent a family of sequences

Given a multiple alignment of sequences, we can use an HMM to model the sequences.

Each column of the alignment may be represented by a hidden state that produced that column.

Insertions and deletions can be represented by other states.

HMMs to represent a family of sequences (figures):

http://www.ifm.liu.se/bioinfo/assignments/hmm-profile.png

http://www.ifm.liu.se/bioinfo/assignments

Determining the states of the HMM

The structure is usually fixed, and only the number of "match" states is to be determined.

Determining the states of the HMM

An alignment column with no gaps can be considered a "match" state.

An alignment column with a majority of gaps can be considered an "insert" state.

- Determining the transition probabilities
- From a state u, the transition to another state v is represented by t(u,v).
- The summation over all states w that are connected to state u by a transition gives one: Σw t(u,w) = 1.
- The transition probabilities from a state (except the end state) always add up to 1.

- Determining the transition probabilities

- Determining the transition probabilities
- mu,v: the number of transitions from state u to state v,
- Transition probabilities t(u,v) can be estimated as t(u,v) = mu,v / Σw mu,w.
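A minimal sketch of this count-based estimate, with invented counts for the transitions observed leaving one match state:

```python
# m[u][v]: number of observed transitions from state u to state v
# in the training alignment. Counts are illustrative.
m = {"M1": {"M2": 8, "I1": 1, "D2": 1}}

def t(u, v):
    """Estimate t(u, v) = m[u][v] / sum over w of m[u][w]."""
    return m[u][v] / sum(m[u].values())

# The estimated probabilities leaving M1 sum to 1, as required.
assert abs(sum(t("M1", w) for w in m["M1"]) - 1.0) < 1e-9
```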

- Determining the emission probabilities
- The emission probabilities in a match or insert state also add up to 1.

- Determining the emission probabilities
- eMu: the emission probability for a residue from the u-th match state,

- Determining the emission probabilities
- eIu: the emission probability for a residue from the u-th insert state,
- These probabilities are usually taken from the overall amino acid composition of a selected data set.

HMMs examples:

- 5 transitions in the gap region:
- C → out,
- G → out,
- A → C,
- C → T,
- T → out
- Out transitions: 3/5
- Stay transitions: 2/5
gap region

Scoring a sequence against a profile HMM

Given a profile HMM, any given path through the model will emit a sequence with an associated probability.

The path probability is the product of all transition and emission probabilities along the path.

Scoring a sequence against a profile HMM

Viterbi algorithm:

Given a query sequence we can compute the most probable path that will emit that query sequence.

Scoring a sequence against a profile HMM

Viterbi algorithm:

Another interesting question: what is the probability that a given sequence can be generated by the hidden Markov model?

Solution: it is calculated by summing over all possible paths giving rise to the given sequence.
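Summing over all paths efficiently is known as the forward algorithm. A minimal sketch on an invented two-state, two-symbol HMM (not the profile-HMM version with match/insert/delete states):

```python
# Toy HMM; all numbers are invented for illustration.
states = ["S1", "S2"]
start = {"S1": 0.5, "S2": 0.5}
trans = {"S1": {"S1": 0.7, "S2": 0.3}, "S2": {"S1": 0.4, "S2": 0.6}}
emit = {"S1": {"a": 0.9, "b": 0.1}, "S2": {"a": 0.2, "b": 0.8}}

def p_sequence(seq):
    """P(seq) = sum over all state paths of the path probability,
    computed column by column instead of enumerating paths."""
    f = {s: start[s] * emit[s][seq[0]] for s in states}
    for x in seq[1:]:
        f = {s: sum(f[u] * trans[u][s] for u in states) * emit[s][x]
             for s in states}
    return sum(f.values())
```

For "ab" this sums the probabilities of all four possible state paths in one pass; the running cost is linear in the sequence length rather than exponential.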

Scoring a sequence against a profile HMM

Viterbi algorithm:

It will be applied to profile HMMs, similar to:

Scoring a sequence against a profile HMM

Viterbi algorithm:

At a profile position u:

Mu : Match state,

Iu : Insertion state,

Du : Delete state,

t(Mu, Mu+1) : transition probability from state Mu to Mu+1

Scoring a sequence against a profile HMM

Viterbi algorithm:

During the algorithm:

A record must be kept of the highest probability up to that point in the model for a given amount of emitted sequence.

Scoring a sequence against a profile HMM

Viterbi algorithm:

During the algorithm:

When the sequence up to and including residue xi has been emitted, the highest probability of a path ending in state Du will be written as VDu(xi).

Scoring a sequence against a profile HMM

Viterbi algorithm:

Dynamic programming table:

Columns: the states of the HMM.

Rows: the residues of the query (emitted) sequence.
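This table-filling scheme can be sketched on a deliberately tiny HMM (two generic states, two symbols, all numbers invented) rather than the full profile HMM with match/insert/delete states:

```python
# Toy HMM; all numbers are invented for illustration.
states = ["S1", "S2"]
start = {"S1": 0.5, "S2": 0.5}
trans = {"S1": {"S1": 0.7, "S2": 0.3}, "S2": {"S1": 0.4, "S2": 0.6}}
emit = {"S1": {"a": 0.9, "b": 0.1}, "S2": {"a": 0.2, "b": 0.8}}

def viterbi(seq):
    """Most probable state path emitting seq, by dynamic programming:
    one row per emitted residue, one column per state."""
    V = {s: start[s] * emit[s][seq[0]] for s in states}  # first row
    back = []  # backpointers: which previous state gave the max
    for x in seq[1:]:
        prev = {s: max(states, key=lambda u: V[u] * trans[u][s])
                for s in states}
        V = {s: V[prev[s]] * trans[prev[s]][s] * emit[s][x] for s in states}
        back.append(prev)
    # Trace back from the best final state.
    path = [max(V, key=V.get)]
    for prev in reversed(back):
        path.append(prev[path[-1]])
    return path[::-1]
```

Each row keeps, per state, only the highest probability of any path ending there, which is exactly the record the slide describes.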

Scoring a sequence against a profile HMM

Viterbi algorithm:

Recurrence relations:
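The recurrence equations themselves did not survive this transcript. In the notation defined above (VMu, VIu, VDu for the best probability of ending in the match, insert, or delete state at profile position u, and t(·,·) for transitions), the standard profile-HMM Viterbi recurrences take the following form; this is a reconstruction, not the slide's exact rendering:

```latex
V^{M}_{u}(x_i) = e_{M_u}(x_i)\,\max
\begin{cases}
V^{M}_{u-1}(x_{i-1})\, t(M_{u-1}, M_u)\\
V^{I}_{u-1}(x_{i-1})\, t(I_{u-1}, M_u)\\
V^{D}_{u-1}(x_{i-1})\, t(D_{u-1}, M_u)
\end{cases}
\qquad
V^{I}_{u}(x_i) = e_{I_u}(x_i)\,\max
\begin{cases}
V^{M}_{u}(x_{i-1})\, t(M_u, I_u)\\
V^{I}_{u}(x_{i-1})\, t(I_u, I_u)\\
V^{D}_{u}(x_{i-1})\, t(D_u, I_u)
\end{cases}
\qquad
V^{D}_{u}(x_i) = \max
\begin{cases}
V^{M}_{u-1}(x_i)\, t(M_{u-1}, D_u)\\
V^{I}_{u-1}(x_i)\, t(I_{u-1}, D_u)\\
V^{D}_{u-1}(x_i)\, t(D_{u-1}, D_u)
\end{cases}
```

Note that match and insert states consume a residue (the index moves from xi-1 to xi) while the delete state does not, and that delete states have no emission term.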

Scoring a sequence against a profile HMM

Viterbi algorithm:

Recurrence relations (log):
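In log space the products become sums, which avoids numerical underflow for long sequences. The match-state relation, for example, becomes (a reconstruction consistent with the recurrences above):

```latex
\log V^{M}_{u}(x_i) = \log e_{M_u}(x_i) + \max
\begin{cases}
\log V^{M}_{u-1}(x_{i-1}) + \log t(M_{u-1}, M_u)\\
\log V^{I}_{u-1}(x_{i-1}) + \log t(I_{u-1}, M_u)\\
\log V^{D}_{u-1}(x_{i-1}) + \log t(D_{u-1}, M_u)
\end{cases}
```

The insert- and delete-state relations transform the same way.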

Scoring a sequence against a profile HMM

Viterbi algorithm (EXAMPLE):

Scoring a sequence against a profile HMM

Viterbi algorithm (EXAMPLE):

Start Probabilities:

Emission Probabilities:

Scoring a sequence against a profile HMM

Viterbi algorithm (EXAMPLE):

Scoring a sequence against a profile HMM

Viterbi algorithm (EXAMPLE):

GGCTGATT

- Scoring a sequence against a profile HMM
- Viterbi algorithm (EXAMPLE):

Emitted sequence:        G   G   C   T   G   A   T   T

State path:      Begin   M1  M2  I3  I3  I3  I3  I3  M3   End
