
Gibbs measures on trees


Presentation Transcript


  1. Gibbs measures on trees Elchanan Mossel, U.C. Berkeley mossel@stat.berkeley.edu, http://www.cs.berkeley.edu/~mossel/

  2. Lecture Plan • Gibbs Measures on Trees: • Uniqueness • Reconstruction • Mixing times on trees • Building Trees (Phylogeny) • Some analytical problems. • Gibbs Measures on Trees and Other Graphs • Uniqueness • Mixing Times. • Belief Propagation. • The Replica Method.

  3. Gibbs Measures • A Gibbs Measure on a (finite) graph G=(V,E) is given by • Node potentials (ψ_v : v ∈ V) and • Edge potentials (ψ_e : e ∈ E) • The probability of σ = (σ(v) : v ∈ V) ∈ A^|V| is given by • P[σ] = Z⁻¹ ∏_{v ∈ V} ψ_v[σ(v)] ∏_{e=(v,u) ∈ E} ψ_e[σ(v),σ(u)] • Gibbs measures introduced in Statistical Physics. • Essential in Machine Learning. • Also known as MRFs, Graphical Models etc.
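To make the definition concrete, here is a small illustrative Python sketch (not from the slides) that evaluates P[σ] by brute force on a three-vertex path; the node and edge potentials are made-up examples.

```python
# Illustrative sketch: brute-force evaluation of a Gibbs measure
# P[sigma] = Z^{-1} * prod_v psi_v[sigma(v)] * prod_e psi_e[sigma(u), sigma(v)]
# on a tiny graph with binary states A = {0, 1}.
from itertools import product

V = [0, 1, 2]                                      # vertices
E = [(0, 1), (1, 2)]                               # edges (a path, also a tree)
psi_v = {v: [1.0, 1.0] for v in V}                 # trivial node potentials
psi_e = {e: [[2.0, 1.0], [1.0, 2.0]] for e in E}   # "ferromagnetic" edge potentials

def weight(sigma):
    """Unnormalized Gibbs weight of a configuration sigma: V -> {0, 1}."""
    w = 1.0
    for v in V:
        w *= psi_v[v][sigma[v]]
    for (u, v) in E:
        w *= psi_e[(u, v)][sigma[u]][sigma[v]]
    return w

configs = list(product([0, 1], repeat=len(V)))
Z = sum(weight(s) for s in configs)                # partition function
P = {s: weight(s) / Z for s in configs}            # the Gibbs measure
print(P[(0, 0, 0)], sum(P.values()))               # P[all zeros], total mass 1.0
```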

  4. Uniqueness and Reconstruction • Let σ(v,L) := (σ(w) : d(v,w) = L). • Let η(v,L)(a) := P[σ(v) = a | σ(v,L)] − P[σ(v) = a] • Let Gn be a family of Gibbs measures: • Uniqueness := lim_{L→∞} sup { |η(v,L)|_∞ : v ∈ Gn } = 0 • Reconstruction := lim_{L→∞} sup { |η(v,L)|_2 : v ∈ Gn } ≠ 0 • Informally: • Uniqueness := for all values of σ(v,L >> 1), σ(v) has the same dist. • Recon. := σ(v) is typically not asymptotically independent of σ(v,L >> 1)

  5. Gibbs measures on trees • On a finite tree, a Gibbs measure P can be written as: • P[σ] = π[σ(0)] ∏_{e = u→v} M_{σ(u),σ(v)} • Using recursions it is easy to calculate P[σ(v) = · | σ(v,L)] • ⇒ Easy to determine uniqueness when the extremal boundaries σ(v,L) are known (Ising, Potts, independent sets …) • Open Problem 1: Given the d-ary tree and a general M, determine uniqueness. • Open Problem 2: Convex asymptotic geometry of P[σ(v) = · | σ(v,L)] as L → ∞ • Assume the M_e are identical. • Tree is the d-ary tree.
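The recursion the slide alludes to can be sketched as follows: an illustrative Python implementation for the special case of a binary alphabet with the same matrix M on every edge; the tree shape, ε and the leaf assignment below are invented for the example.

```python
# Compute P[sigma(root) = . | leaf values] on a finite rooted tree by passing
# likelihood vectors up from the leaves (the tree recursion from the slide).
import numpy as np

eps = 0.1
M = np.array([[1 - eps, eps], [eps, 1 - eps]])     # symmetric binary channel
children = {0: [1, 2], 1: [3, 4], 2: [5, 6], 3: [], 4: [], 5: [], 6: []}
leaf_value = {3: 1, 4: 1, 5: 0, 6: 1}              # observed boundary sigma(v, L)

def up_message(v):
    """Return the vector (P[observed leaves below v | sigma(v) = a])_a."""
    if not children[v]:
        msg = np.zeros(2)
        msg[leaf_value[v]] = 1.0
        return msg
    msg = np.ones(2)
    for c in children[v]:
        msg *= M @ up_message(c)                   # sum over the child's state
    return msg

prior = np.array([0.5, 0.5])                       # root distribution pi
post = prior * up_message(0)
post /= post.sum()                                 # P[sigma(root) = . | boundary]
print(post)
```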

  6. Gibbs measures on trees – a story • Let M_{i,j} = P[hair(daughter) = j | hair(mother) = i] • Suppose we know the tree T of all mothers going back to Eve. • “Uniqueness”: Is there any assignment of hair color to the current population that will yield information on Eve's? • “Reconstruction”: Do we expect to have information on Eve's hair color from the current population?

  7. Reconstruction: Recursive Reconstruction • T = 3-ary regular tree with M_e = M for all edges. • M = binary symmetric channel (BSC) with flip probability ε = Ising model (no external field). • Consider the recursive majority function. • Let p_n := P[ n-fold rec-maj(σ(0,n)) = σ(0) ]. • Let φ(p) = (1−ε) p + ε (1−p) and g(p) = φ³(p) + 3φ²(p)(1−φ(p)) • p_0 = 1 and p_{n+1} = g(p_n) ⇒ p_n stays bounded away from ½ if and only if (1−2ε) > 2/3. • ⇒ Reconstruction if ε < 1/6. • Von Neumann (56) for reliable noisy computation. • Later: Evans-Schulman 93, Steel 94, Mossel 98.
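A few lines of Python make the threshold visible numerically; the ε values below are just sample points on either side of 1/6.

```python
# Numerical check of the slide's recursion p_{n+1} = g(p_n) with
# phi(p) = (1-eps)p + eps(1-p) and g(p) = phi^3 + 3*phi^2*(1-phi).
# For eps < 1/6 the iterates stay bounded away from 1/2 (reconstruction);
# for eps > 1/6 they converge to 1/2.
def g(p, eps):
    phi = (1 - eps) * p + eps * (1 - p)
    return phi**3 + 3 * phi**2 * (1 - phi)

for eps in (0.10, 1 / 6, 0.25):
    p = 1.0
    for _ in range(200):
        p = g(p, eps)
    print(f"eps = {eps:.3f}: p_200 = {p:.4f}")
```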

  8. Spectral Reconstruction • Let M be the Ising (BSC) model on a b-ary tree T. • Let f(n) = Maj(σ(n)) = sign(Σ {σ(v) : v ∈ L_n}). • Theorem (Higuchi 77): lim_n P[σ(0) = f(n)] > ½ if b(1−2ε)² > 1. • ⇒ Reconstruction for the ternary tree if ε < ½(1 − (1/3)^{1/2}). • Let M be any chain and T the b-ary tree. • Let λ be the second largest eigenvalue of M in absolute value. • Claim [Kesten-Stigum 66]: b|λ|² > 1 ⇒ Reconstruction. • b|λ|² = 1 is also the threshold for census [Mossel-Peres 04] and robust [Janson-Mossel 04] reconstruction.
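A small illustrative check of the Kesten-Stigum condition for a given chain M; the BSC example and the helper names below are mine, not from the slides.

```python
# Check b * |lambda|^2 > 1, where lambda is the second largest eigenvalue of
# the chain M in absolute value.
import numpy as np

def second_eigenvalue(M):
    """Second largest eigenvalue of M in absolute value."""
    eigs = sorted(np.abs(np.linalg.eigvals(M)), reverse=True)
    return eigs[1]

def kesten_stigum(M, b):
    return b * second_eigenvalue(M) ** 2 > 1

eps = 0.1
bsc = np.array([[1 - eps, eps], [eps, 1 - eps]])   # Ising/BSC: lambda = 1 - 2*eps
print(kesten_stigum(bsc, b=2))                     # 2 * 0.8^2 = 1.28 > 1 -> True
```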

  9. Non-Reconstruction - Coupling • Copying rule. For i = +, −: • P[i → i] = θ = 1 − 2ε • P[i → Uniform] = 1 − θ = 2ε • Continuing down the tree, non-coupled elements form a branching process with parameter θ. • If 2θ ≤ 1, the branching process dies out ⇒ coupling. • ⇒ For ε ≥ ¼ no reconstruction (this is not tight!) • The threshold for reconstruction is known exactly only for the Ising (BSC) model, where it is given by 2θ² = 1.
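A toy simulation of this coupling argument on the binary tree: each child fails to couple with probability θ independently, so the uncoupled vertices form a branching process with mean offspring 2θ. The ε values, depth and number of runs below are arbitrary choices for illustration.

```python
# Track how many vertices remain uncoupled at a given depth of the binary tree.
import random

def surviving_at_level(theta, depth):
    """Number of uncoupled vertices at the given depth (one branching-process run)."""
    count = 1
    for _ in range(depth):
        count = sum(1 for _ in range(2 * count) if random.random() < theta)
        if count == 0:
            break
    return count

for eps in (0.30, 0.20):                 # 2*theta = 0.8 (dies out) vs 1.2 (may survive)
    theta = 1 - 2 * eps
    runs = [surviving_at_level(theta, 20) for _ in range(200)]
    frac = sum(r > 0 for r in runs) / len(runs)
    print(f"eps = {eps}: fraction of runs still uncoupled at depth 20 = {frac:.2f}")
```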

  10. Ising Model on Binary Trees • Three regimes, with θ = 1 − 2ε: • High temperature (2θ < 1): unique Gibbs measure; no bias at the root under either the “+” boundary or a “typical” boundary. • Intermediate (2θ > 1, 2θ² < 1): the “+” boundary biases the root; a “typical” boundary gives no bias. • Low temperature (2θ² > 1): bias under both the “+” boundary and a “typical” boundary. • The transition at 2θ² = 1 was proved by: Bleher-Ruiz-Zagrebnov 95, Evans-Kenyon-Peres-Schulman 2000, Ioffe 99, Kenyon-Mossel-Peres 2001, Martinelli-Sinclair-Weitz 2004.

  11. Reconstruction for Markov models • So the threshold b λ² = 1 is important. • But [M 2000] it is not the threshold for extremality, • not even for 2 × 2 Markov chains. • Open: What is the threshold for q = 3 Potts on the binary tree? • Very recent [Borgs-Chayes-M-Roch]: b λ² = 1 is the threshold for slightly asymmetric channels.

  12. Lecture Plan • Gibbs Measures on Trees: • Uniqueness • Reconstruction • Mixing times on trees • Building Trees (Phylogeny) • Some analytical problems. • Gibbs Measures on Trees and Other Graphs • Uniqueness • Mixing Times. • Belief Propagation. • The Replica Method.

  13. Glauber dynamics: sampling Gibbs measures Consider the following dynamics on a configuration σ of a Gibbs measure G. At rate 1: Pick a vertex v uniformly at random, and update σ(v) according to the conditional probability given {σ(w): w ~ v}. Easy: Converges to the Gibbs distribution. Hard: How quickly? Measure convergence in terms of the Markov operator.
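A minimal sketch of this dynamics for a pairwise model with one edge matrix M and trivial node potentials; the graph, ε and run length below are illustrative, not from the talk.

```python
# One step of Glauber dynamics: pick a uniform vertex and resample it from its
# conditional distribution given the neighbours.
import random
import numpy as np

eps = 0.1
M = np.array([[1 - eps, eps], [eps, 1 - eps]])     # edge potential
adj = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}       # a small tree

def glauber_step(sigma):
    v = random.choice(list(adj))
    cond = np.ones(2)
    for w in adj[v]:
        cond *= M[:, sigma[w]]                     # contribution of edge (v, w)
    cond /= cond.sum()
    sigma[v] = int(np.random.rand() < cond[1])     # resample sigma(v)
    return sigma

sigma = {v: random.randint(0, 1) for v in adj}     # arbitrary starting configuration
for _ in range(10_000):                            # run the chain for a while
    sigma = glauber_step(sigma)
print(sigma)
```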

  14. Ising Model on Binary Trees • Same phase picture as in slide 10 (θ = 1 − 2ε): unique Gibbs measure for 2θ < 1; “+”-boundary bias but no “typical”-boundary bias for 2θ > 1, 2θ² < 1; bias under both boundaries for 2θ² > 1. • Relaxation time of the Glauber dynamics: in the Reconstruction regime (2θ² > 1), τ₂ = Θ(n^{1 + 2 log₂ θ}); in the No-Reconstruction regime (2θ² < 1), τ₂ = O(1). • In Berger-Kenyon-M-Peres 05.

  15. Relaxation time for the binary tree • On trees: Fast mixing ⇔ No-Reconstruction. • Vs. common wisdom: Fast mixing ⇔ Uniqueness. • Martinelli-Sinclair-Weitz 05: • The log-Sobolev constant behaves in the same way as the spectral gap. • Study external fields and boundary conditions …

  16. Lecture Plan • Gibbs Measures on Trees: • Uniqueness • Reconstruction • Mixing times on trees • Building Trees (Phylogeny) • Some analytical problems. • Gibbs Measures on Trees and Other Graphs • Uniqueness • Mixing Times. • Belief Propagation. • The Replica Method.

  17. Phylogeny • “Phylogeny is the true evolutionary relationships between groups of living things” • [Example tree: Noah at the root; children Shem, Japheth, Ham; Ham's children Cush, Canaan, Mizraim.]

  18. Phylogenetic Inference • In “phylogenetic inference”: • The tree is unknown. • Given a sequence of collections of random variables at the leaves (“species”). • Collections are i.i.d. • Want to reconstruct the tree (un-rooted).

  19. Phylogenetic Reconstruction

  20. Markov Model of Evolution • Simplest evolution model: binary symmetric channel on each edge. • CFN Model: • Tree: T = (V,E). • Node states: s(v) ∈ {0,1}, where 0 = purines (A,G) and 1 = pyrimidines (C,T). • Mutation probabilities: one value p_e per edge (p_ra, p_rc, p_ab, p_a3, p_b1, p_b2, p_c4, p_c5 in the figure). • Each position of the sequence (…001100011101000011000100…) gives one sample of the leaf states s(1), …, s(5).
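A small simulator of the CFN model as described above: each character is generated by broadcasting a uniform root bit down the tree and flipping it independently on each edge with that edge's mutation probability. The tree shape and the p_e values below are made up for illustration.

```python
# Generate i.i.d. CFN characters on a small rooted tree.
import random

children = {"r": ["a", "c"], "a": ["1", "2", "3"], "c": ["4", "5"],
            "1": [], "2": [], "3": [], "4": [], "5": []}
p_edge = {("r", "a"): 0.1, ("r", "c"): 0.15, ("a", "1"): 0.1, ("a", "2"): 0.1,
          ("a", "3"): 0.2, ("c", "4"): 0.1, ("c", "5"): 0.1}

def sample_character():
    """One i.i.d. character: states s(v) at every node, returned as a dict."""
    s = {"r": random.randint(0, 1)}                 # uniform root state
    stack = ["r"]
    while stack:
        u = stack.pop()
        for v in children[u]:
            flip = random.random() < p_edge[(u, v)]
            s[v] = s[u] ^ flip                      # flip the bit with prob p_e
            stack.append(v)
    return s

leaves = [v for v, ch in children.items() if not ch]
data = [sample_character() for _ in range(1000)]    # 1000 i.i.d. characters
print([data[0][v] for v in leaves])                 # leaf states of one character
```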

  21. Phylogenetic Inference Problem • Given: i.i.d. samples at the leaves. • Task: Reconstruct the model, i.e. find the tree, and do so efficiently. • Efficiency: 1) Computational: running time of the reconstruction algorithm. 2) Information-theoretic: sequence length required for successful reconstruction. • Let n = # leaves (species), k = length of sequences needed. • [Figure: the 0/1 character matrix observed at the leaves s(1), …, s(5).]
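One standard distance-based ingredient (not spelled out on the slide) is to convert the leaf sequences into additive distances: under the CFN model, 1 − 2·P[disagreement] factorizes along the path between two leaves, so −½ log(1 − 2p̂) is an additive distance estimate. A short sketch, assuming aligned 0/1 sequences like those produced by the simulator above:

```python
# Estimate the additive CFN distance between two leaves from their sequences.
import math

def cfn_distance(seq_u, seq_v):
    """Estimated additive CFN distance from two aligned 0/1 sequences."""
    k = len(seq_u)
    p_hat = sum(a != b for a, b in zip(seq_u, seq_v)) / k
    if p_hat >= 0.5:                 # saturated: no usable signal at this length
        return float("inf")
    return -0.5 * math.log(1 - 2 * p_hat)

# Example with two short made-up sequences:
print(cfn_distance([0, 1, 1, 0, 1, 0, 0, 1], [0, 1, 0, 0, 1, 1, 0, 1]))
```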

  22. Phylogeny: Conjectures and Results • Reconstruction: conjectured k = O(log n) for phylogeny reconstruction. • No Reconstruction: conjectured k = poly(n). • Percolation / Random Cluster model (critical value 1/2): MS03. • Ising / CFN model (critical value 2θ² = 1): Mo04, DMR05.

  23. Polynomial Lower Bound at High Mutations • Proof: [figure — of the ≈ L·k known character values at the leaves, only about q^{-L}·k carry information about whether X = T.]

  24. Logarithmic Reconstruction • Th2 [M 2004]: If T is a tree on n leaves s.t. • for all e, θ_min < θ(e) < θ_max, with 2θ²_min > 1 and θ_max < 1, • then k = O(log n − log δ) characters suffice to infer the topology with probability 1 − δ. • Caveat: Need a balanced tree – all leaves at the same distance from a root. • Th3 [Daskalakis-M-Roch 2005]: The above result holds for general trees. • + Cameron, Hill, Rao [2006]: Experimental performance.

  25. Balanced Trees • Two-Step Algorithm [M 2004]: • 1) Reconstruct one (or a few) level(s) – using distance estimation. • 2) Infer sequences at roots using recursive majority. • 3) Start over

  26. General Trees [Daskalakis, M, Roch, 2005]

  27. Lecture Plan • Gibbs Measures on Trees: • Uniqueness • Reconstruction • Mixing times on trees • Building Trees (Phylogeny) • Some analytical problems. • Gibbs Measures on Trees and Other Graphs • Uniqueness • Mixing Times. • Belief Propagation. • The Replica Method.

  28. Main analytical problems • How to analyze recursions of the random measures η(v,L)? • No general techniques are known (some easy methods follow). • Needed for • General boundary conditions: • Worst case (Uniqueness) • Average case (Reconstruction) • Other. • Non-regular trees (strong spatial mixing) and for • families of random trees (optimal error-correcting codes).

  29. Lecture Plan • Gibbs Measures on Trees: • Uniqueness • Reconstruction • Mixing times on trees • Building Trees (Phylogeny) • Some analytical problems. • Gibbs Measures on Trees and Other Graphs • Uniqueness • Mixing Times. • Belief Propagation. • The Replica Method.

  30. Conjecture: Uniqueness on trees / graphs • Consider Gibbs measures where • all edge potentials are identical: ψ_e = ψ for all e, • all node potentials are trivial: ψ_v = 1 for all v, • the graph is regular of degree d. • Conjecture: • Gibbs measure unique on the d-regular tree ⇒ • Gibbs measure unique on any family of d-regular graphs. • Recently proved by Weitz for anti-ferromagnetic Ising models. • Trivial for random graphs.

  31. Conjecture: Uniqueness on trees / graphs • Very recently [M-Weitz-Wormald 06]: • For the hard-core model: • Non-uniqueness of the Gibbs measure on the 3-regular tree • ⇒ • Exponentially slow mixing on random 3-regular graphs. • Reconstruction on random 3-regular graphs. • Moral: Slow/rapid mixing on “typical” graphs is determined by uniqueness on trees. • Still don't really know how to prove this for • 4-regular graphs • other models.

  32. Lecture Plan • Gibbs Measures on Trees: • Uniqueness • Reconstruction • Mixing times on trees • Building Trees (Phylogeny) • Some analytical problems. • Gibbs Measures on Trees and Other Graphs • Uniqueness • Mixing Times. • Belief Propagation. • The Replica Method.

  33. Belief Propagation in AI • Belief Propagation (BP) is a popular method in AI/Coding for estimating marginal probabilities P[σ(0) = a] for a Gibbs measure G. • It is equivalent [Tatikonda-Jordan 02] to calculating marginal probabilities P[σ(0) = a] on the computation tree T(G). • In particular, uniqueness on the infinite computation tree ⇒ convergence of BP. • Uniqueness + high girth ⇒ convergence to the correct marginals. • Open “problem”: Is uniqueness needed? • Why does BP also work when the girth is small?
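For concreteness, a minimal sum-product BP sketch for a pairwise model with a single edge matrix M and node potentials ψ_v, run on a small tree where BP is exact; the graph, ε and the "observation" at vertex 3 are illustrative choices, not from the talk.

```python
# Sum-product belief propagation: pass normalized messages until they settle,
# then read off node marginals.
import numpy as np

eps = 0.1
M = np.array([[1 - eps, eps], [eps, 1 - eps]])     # edge potential psi_e
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}       # a small tree
psi = {v: np.ones(2) for v in adj}                 # node potentials psi_v
psi[3] = np.array([0.0, 1.0])                      # "observe" sigma(3) = 1

msgs = {(u, v): np.ones(2) for u in adj for v in adj[u]}   # messages u -> v

for _ in range(10):                                # a few sweeps suffice on a tree
    new = {}
    for (u, v) in msgs:
        incoming = psi[u].copy()
        for w in adj[u]:
            if w != v:
                incoming *= msgs[(w, u)]
        m = M.T @ incoming                         # sum over sigma(u)
        new[(u, v)] = m / m.sum()
    msgs = new

def marginal(v):
    """BP estimate of P[sigma(v) = .] (exact here, since the graph is a tree)."""
    b = psi[v].copy()
    for w in adj[v]:
        b *= msgs[(w, v)]
    return b / b.sum()

print(marginal(0))                                 # P[sigma(0) = . | sigma(3) = 1]
```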

  34. Belief Propagation in Coding • In coding: • BP is used to decode Low Density Parity Check codes [Gallager 62]. • Proved to be efficient without “uniqueness” [LMSS, RSU]. • Recursive analysis – up to the girth of the graph. • Open Problem: Is BP efficient beyond the girth? • Open Problem: Can LDPC codes achieve channel capacity?

  35. Replica Symmetry Breaking in Physics • In Physics: • Replicas are recursive distributional equations used to calculate probabilities for spin glasses (random codes, random SAT problems). • Symmetric replicas “↔” Belief Propagation. • Symmetry-breaking replicas “↔” Survey Propagation. • [Mezard-Montanari 06] Claim: Symmetry breaks exactly when reconstruction emerges. • Open problem/Conjecture: Is the reconstruction threshold on the d-ary tree the “right” threshold for spin glasses on random d-regular graphs?

  36. Lecture Plan • Gibbs Measures on Trees: • Uniqueness • Reconstruction • Mixing times on trees • Building Trees (Phylogeny) • Some analytical problems. • Gibbs Measures on Trees and Other Graphs • Uniqueness • Mixing Times. • Belief Propagation. • The Replica Method.

  37. A reminder: Markov Chains • A Markov Chain on a (finite) set S is given by an initial distribution π and transition probabilities t_{i,j}. • The probability of (σ(t))_{t=0}^{T} ∈ A^{T+1} is given by π[σ(0)] ∏_{t=0}^{T−1} t_{σ(t),σ(t+1)}.
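A quick sketch matching this reminder: sample a path of a small Markov chain and compute its probability with the product formula above; the 3-state chain below is an arbitrary example.

```python
# Sample a Markov chain trajectory and evaluate pi[sigma(0)] * prod_t t[sigma(t), sigma(t+1)].
import numpy as np

rng = np.random.default_rng(0)
pi = np.array([0.5, 0.3, 0.2])                     # initial distribution
t = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.1, 0.8]])                    # transition probabilities

T = 5
path = [rng.choice(3, p=pi)]
for _ in range(T):
    path.append(rng.choice(3, p=t[path[-1]]))      # step from the current state

prob = pi[path[0]] * np.prod([t[path[i], path[i + 1]] for i in range(T)])
print(path, prob)
```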
