## Phylogenetic Trees (2) Lecture 12


Based on: Durbin et al., Sections 7.3 and 7.8; Gusfield, *Algorithms on Strings, Trees, and Sequences*, Section 17.

### Recall: The Four-Point Condition

Theorem: A set M of L objects is additive iff any subset of four objects can be labeled i, j, k, l so that:

d(i,k) + d(j,l) = d(i,l) + d(j,k) ≥ d(i,j) + d(k,l)

We call {{i,j},{k,l}} the "split" of {i,j,k,l}.

The four-point condition does not provide an algorithm to construct a tree from a distance matrix, or to decide that no such tree exists (i.e., that the set is not additive). The first methods for constructing trees for additive sets used neighbor joining methods.

### Constructing Additive Trees: The Neighbor Joining Problem

- Let i, j be neighboring leaves in a tree, let k be their parent, and let m be any other vertex.
- The formula

  d(k,m) = (d(i,m) + d(j,m) − d(i,j)) / 2

  shows that we can compute the distances of k to all other leaves. This suggests the following method to construct a tree from a distance matrix:
- Find neighboring leaves i, j in the tree.
- Replace i, j by their parent k, and recursively construct a tree T for the smaller set.
- Add i, j as children of k in T.

### Neighbor Finding

How can we find, from distances alone, a pair of nodes which are neighboring leaves? The closest pair of nodes is not necessarily a pair of neighboring leaves. Next we show one way to find neighbors from distances.

### Neighbor Finding: The Saitou & Nei Method

Theorem (Saitou & Nei): Assume all edge weights are positive. If D(i,j) is minimal (among all pairs of leaves), then i and j are neighboring leaves in the tree. (Here D(i,j) = (L−2)d(i,j) − (r_i + r_j), with r_i = Σ_m d(i,m).)

The proof is rather involved, and will be skipped.

### Saitou & Nei Proof (to be skipped)

Notations used in the proof:

- p(i,j) = the path from vertex i to vertex j. In the example figure (leaves A, B, C, D; internal vertices E, F), P(D,C) = (e1, e2, e3) is the path (D, E, F, C).
- For a vertex i and an edge e = (i',j'): N_i(e) = |{k : e is on p(i,k)}|.
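As a concrete check of this notation, here is a small Python sketch that counts N_i(e) by enumerating leaf paths. The tree topology below (leaves A, B, C, D; internal vertices E, F; e1 = (D,E), e2 = (E,F), e3 = (F,C)) is an assumption reconstructed from the slide's figure:

```python
# Sketch: computing N_i(e) = |{k : e lies on p(i,k)}| on an example tree.
# Topology is an assumption reconstructed from the slide's figure:
# E is adjacent to A, D, F; F is adjacent to B, C, E.
adj = {
    "A": ["E"], "B": ["F"], "C": ["F"], "D": ["E"],
    "E": ["A", "D", "F"], "F": ["B", "C", "E"],
}
leaves = [v for v, nb in adj.items() if len(nb) == 1]

def path(u, v):
    """Return the list of vertices on the unique path from u to v (DFS)."""
    stack = [(u, [u])]
    while stack:
        node, p = stack.pop()
        if node == v:
            return p
        for w in adj[node]:
            if w not in p:
                stack.append((w, p + [w]))

def N(i, e):
    """Number of leaves k such that edge e is on the path p(i,k)."""
    a, b = e
    count = 0
    for k in leaves:
        p = path(i, k)
        if any({p[t], p[t + 1]} == {a, b} for t in range(len(p) - 1)):
            count += 1
    return count

e1, e2, e3 = ("D", "E"), ("E", "F"), ("F", "C")
print(N("D", e1), N("D", e2), N("D", e3), N("C", e1))  # 3 2 1 1
```

The printed values match the example counts on the slide: every path from D crosses e1, only the paths to B and C cross e2, and so on.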
Example (on the tree above): N_D(e1) = 3, N_D(e2) = 2, N_D(e3) = 1, N_C(e1) = 1.

### Saitou & Nei Proof

Proof of Theorem: Assume for contradiction that D(i,j) is minimized for i, j which are not neighboring leaves. Let (i, l, ..., k, j) be the path from i to j, and let T1 and T2 be the subtrees rooted at k and l which do not contain edges from P(i,j) (see figure). Notation: |T| = #(leaves in T).

Case 1: i or j has a neighboring leaf. WLOG j has a neighboring leaf m.

A. D(i,j) − D(m,j) = (L−2)(d(i,j) − d(j,m)) − (r_i + r_j) + (r_m + r_j) = (L−2)(d(i,k) − d(k,m)) + r_m − r_i

B. r_m − r_i ≥ (L−2)(d(k,m) − d(i,l)) + (4−L)d(k,l) (since for each edge e ∈ P(k,l), N_m(e) ≥ 2 and N_i(e) ≤ L−2)

Substituting B in A:

D(i,j) − D(m,j) ≥ (L−2)(d(i,k) − d(i,l)) + (4−L)d(k,l) = 2d(k,l) > 0,

contradicting the minimality assumption.

Case 2: Not Case 1. Then both T1 and T2 contain two neighboring leaves. WLOG |T2| ≥ |T1|. Let n, m be neighboring leaves in T1, with parent p. We shall prove that D(m,n) < D(i,j), which will again contradict the minimality assumption.

A. 0 ≤ D(m,n) − D(i,j) = (L−2)(d(m,n) − d(i,j)) + (r_i + r_j) − (r_m + r_n) (by minimality of D(i,j))

B. r_j − r_m < (L−2)(d(j,k) − d(m,p)) + (|T1| − |T2|)d(k,p)

C. r_i − r_n < (L−2)(d(i,k) − d(n,p)) + (|T1| − |T2|)d(l,p)

Adding B and C, noting that d(l,p) > d(k,p):

D. (r_i + r_j) − (r_m + r_n) < (L−2)(d(i,j) − d(n,m)) + 2(|T1| − |T2|)d(k,p)

Substituting D in the right-hand side of A:

D(m,n) − D(i,j) < 2(|T1| − |T2|)d(k,p) ≤ 0, as claimed.

### A Simpler Neighbor Finding Method

Select an arbitrary node r. For each pair of labeled nodes (i,j), let C(i,j) be defined by the following figure (figure not preserved; C(i,j) is the distance from r to the vertex at which the paths from r to i and from r to j diverge).

Claim (from final exam, Winter 02-3): Let i, j be such that C(i,j) is maximized. Then i and j are neighboring leaves.

### Neighbor Joining Algorithm

Initialization:

- Set M to contain all leaves, and select a root r.
- |M| = L. If L = 2, return a tree of two vertices.

Iteration:

- Choose i, j such that C(i,j) is maximal.
- Create a new vertex k, and set d(k,m) = (d(i,m) + d(j,m) − d(i,j)) / 2 for every other node m.
- Remove i, j, and add k to M.
- Recursively construct a tree on the smaller set, then add i, j as children of k, at distances d(i,k) and d(j,k).

### Complexity of the Neighbor Joining Algorithm

Naive implementation:

- Initialization: Θ(L²) to compute the C(i,j)'s.
- Each iteration: O(L) to update {C(i,k) : i ∈ M} for the new node k, and O(L²) to find the maximal C(i,j).

Total: O(L³).

Using a heap to store the C(i,j)'s:

- Initialization: Θ(L²) to compute and heapify the C(i,j)'s.
- Each iteration: O(1) to find the maximal C(i,j), and O(L log L) to delete {C(m,i), C(m,j)} and add C(m,k) for all vertices m.

Total: O(L² log L). (Implementation details are omitted.)

### Ultrametric Trees

A more recent (and more efficient) way of constructing and identifying additive trees.

Idea: reduce the problem to constructing trees from the "heights" of the internal nodes. For leaves i, j, D(i,j) represents the "height" of the common ancestor of i and j.

Definition: T is an ultrametric tree for a symmetric positive real matrix D if:

- The leaves of T correspond to the rows and columns of D.
- Internal nodes have at least two children, and the least common ancestor (LCA) of i and j is labeled by D(i,j).
- The labels decrease along paths from the root to the leaves.

### Centrality of Ultrametric Trees

We will study later the following question: given a symmetric positive real matrix D, is there an ultrametric tree T for D? But first we show that an algorithm that constructs ultrametric trees from a matrix (or decides that no such tree exists) can be used to construct trees for additive sets and other related problems.

### Transforming Ultrametric Trees to Weighted Trees

Use the labels to define weights for all internal edges in the natural way: the weight of an edge is the difference between the labels of its endpoints. For this, consider the labels of the leaves to be 0.
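This natural weighting can be sketched in Python. The parent-link representation and the node names below are illustrative assumptions (the labels 8, 3, 5 echo the slide's example tree):

```python
# Sketch (assumed representation): an ultrametric tree as parent links plus
# node labels; leaves carry label 0, internal labels decrease root-to-leaf.
parent = {"u": "root", "v": "root", "A": "u", "B": "u", "C": "v", "D": "v"}
label = {"root": 8.0, "u": 3.0, "v": 5.0,
         "A": 0.0, "B": 0.0, "C": 0.0, "D": 0.0}

def edge_weights(parent, label):
    """weight(child -> parent) = label[parent] - label[child]."""
    return {c: label[p] - label[c] for c, p in parent.items()}

w = edge_weights(parent, label)
print(w["u"], w["A"])  # 5.0 3.0

# Every leaf then sits at distance from the root equal to the root label:
def depth(x):
    return 0.0 if x == "root" else w[x] + depth(parent[x])

print([depth(leaf) for leaf in "ABCD"])  # [8.0, 8.0, 8.0, 8.0]
```

The final line illustrates why the result is called ultrametric: all leaves end up at the same height, namely the label of the root.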
We get an additive ultrametric tree whose height is the label of the root. Note that in this tree all leaves are at the same height (the same distance from the root); this is why it is called ultrametric.

### Transforming Weighted Trees to Ultrametric Trees

A weighted tree T can be transformed to an ultrametric tree T' as follows:

- Step 1: Pick a node k as a root, and "hang" the tree at k.
- Step 2: Let M = max_i d(i,k). M is taken to be the height of T'. Label the root by M, and label each internal node j by M − d(k,j).
- Step 3: "Stretch" the edges of the leaves so that all leaves are at distance M from the root.

### Reconstructing Weighted Trees from Ultrametric Trees

The weight of an internal edge is the difference between the labels of its endpoints. The weight of an edge to leaf i is obtained by subtracting M − d(k,i) from its current weight.

### Solving the Additive Tree Problem by the Ultrametric Problem: Outline

We solve the additive tree problem by reducing it to the ultrametric problem as follows:

- Given an input matrix D = D(i,j) of distances, transform it to a matrix D' = D'(i,j), where D'(i,j) is the height of the LCA of i and j in the corresponding ultrametric tree T'.
- Construct the ultrametric tree, T', for D'.
- Reconstruct the additive tree T from T'.

### How D' Is Constructed from D

D'(i,j) should be the height of the least common ancestor of i and j in T', the ultrametric tree hung at k. Thus D'(i,j) = M − d(k,m), where m is the vertex at which the paths from k to i and from k to j diverge, and d(k,m) is computed by:

d(k,m) = (d(k,i) + d(k,j) − d(i,j)) / 2

### The Transformation of D to D'

(Figure: the example weighted tree, its distance matrix D, and the resulting ultrametric matrix D', with M = 9.)

### Identifying Ultrametric Trees

Definition: A symmetric matrix D is ultrametric iff for every 3 indices i, j, k:

D(i,j) ≤ max{D(i,k), D(j,k)}

(i.e., among the three pairwise values, there is a tie for the maximum).

Theorem: D has an ultrametric tree iff it is ultrametric.

Proof: next lecture.
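The three-point condition is easy to test directly. A minimal Python sketch (the matrices U and V below are made-up examples for illustration):

```python
# Sketch: testing the three-point ultrametric condition
# D(i,j) <= max(D(i,k), D(j,k)) for every triple of indices, i.e.,
# among the three pairwise values, the maximum is attained at least twice.
from itertools import combinations

def is_ultrametric(D):
    n = len(D)
    for i, j, k in combinations(range(n), 3):
        vals = [D[i][j], D[i][k], D[j][k]]
        if vals.count(max(vals)) < 2:  # no tie for the maximum
            return False
    return True

# A symmetric matrix that satisfies the condition...
U = [[0, 3, 8, 8],
     [3, 0, 8, 8],
     [8, 8, 0, 5],
     [8, 8, 5, 0]]
# ...and one that violates it: the triple (0, 1, 3) has the
# strict unique maximum D(0,3) = 9.
V = [[0, 3, 8, 9],
     [3, 0, 8, 8],
     [8, 8, 0, 5],
     [9, 8, 5, 0]]

print(is_ultrametric(U), is_ultrametric(V))  # True False
```

Checking all triples costs O(n³), which is enough for the small matrices used in these examples.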