
Optimization in mean field random models




  1. Optimization in mean field random models Johan Wästlund Linköping University Sweden

  2. Statistical Mechanics • Each particle has a spin • Energy (the Hamiltonian) depends on the spins of interacting particles • Ising model: spins ±1, H = number of interacting pairs with opposite spins
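The Ising Hamiltonian as defined on this slide can be sketched in a few lines of Python (my illustration, not from the talk; the interaction graph is an arbitrary example):

```python
# Sketch: Ising energy H = number of interacting pairs with opposite spins.
def ising_energy(spins, edges):
    """spins: list of +1/-1 values; edges: list of interacting index pairs."""
    return sum(1 for i, j in edges if spins[i] != spins[j])

# A 2x2 grid with nearest-neighbour interactions (illustrative choice)
edges = [(0, 1), (2, 3), (0, 2), (1, 3)]
spins = [1, -1, -1, 1]
print(ising_energy(spins, edges))  # 4: every interacting pair disagrees
```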

  3. Statistical Mechanics • Spin configuration σ has energy H(σ) • Gibbs measure depends on temperature T: P(σ) ∝ exp(−H(σ)/T) • T→∞: uniformly random state • T→0: ground state, i.e. σ minimizing H(σ)
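The two temperature limits can be seen directly by enumerating a tiny system. A sketch (my code; the 3-spin chain is an illustrative choice), computing the Gibbs measure P(σ) ∝ exp(−H(σ)/T) exactly:

```python
import itertools, math

def gibbs(edges, n, T):
    """Gibbs measure P(s) proportional to exp(-H(s)/T) over all 2^n spin states."""
    states = list(itertools.product([-1, 1], repeat=n))
    weights = [math.exp(-sum(s[i] != s[j] for i, j in edges) / T) for s in states]
    Z = sum(weights)                      # partition function
    return {s: w / Z for s, w in zip(states, weights)}

edges = [(0, 1), (1, 2)]                  # a 3-spin chain (illustrative)
hot = gibbs(edges, 3, T=1e6)              # T -> infinity: close to uniform
cold = gibbs(edges, 3, T=1e-2)            # T -> 0: mass on the ground states
print(max(hot.values()))                  # roughly 1/8 (uniform over 8 states)
print(cold[(1, 1, 1)] + cold[(-1, -1, -1)])  # roughly 1: the two ground states
```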

  4. Statistical Mechanics • Thermodynamic limit N →∞ • Average energy? (suitably normalized)

  5. Disordered Systems • Spin glasses • AuFe random alloy • Fe atoms interact

  6. Disordered Systems • Random interactions between Fe atoms • Sherrington–Kirkpatrick model: H(σ) = −(1/√N) Σ_{i<j} g_{i,j} σi σj

  7. Disordered Systems • Quenched random variables g_{i,j} • S-K is a mean field model: no correlation between the quenched variables • NP-hard to find the ground state given the g_{i,j}

  8. Computer Science • Test / evaluate heuristics for NP-hard problems • Average case analysis • Random problem instances

  9. Combinatorial Optimization • Minimum Matching / Assignment • Minimum Spanning Tree • Traveling Salesman • Shortest Path • … • Points with given distances, minimize total length of configuration

  10. Statistical Physics / Computer Science — dictionary • Feasible solution ↔ Spin configuration • Cost of solution ↔ Hamiltonian • Cost of minimal solution ↔ Ground state energy • Artificial parameter T ↔ Temperature • Gibbs measure ↔ Gibbs measure • N→∞ ↔ Thermodynamic limit

  11. Mean field models • Replica-cavity method has given good results for mean field models • Parisi solution of S-K model • The same methods can be applied to combinatorial optimization problems in mean field models

  12. Mean field models of distance • N points • Abstract geometry • Inter-point distances given by i.i.d. random variables • Exponential distribution easiest to analyze (pseudodimension 1)
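A mean field instance is just a symmetric matrix of i.i.d. Exp(1) "distances"; there is no geometry, and in particular no triangle inequality. A minimal sketch (my code), also checking numerically that P(ℓ < t)/t → 1 as t → 0, which is what "pseudodimension 1" refers to:

```python
import math, random
random.seed(1)

n = 6
# i.i.d. Exp(1) distances: abstract geometry, no triangle inequality implied
d = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        d[i][j] = d[j][i] = random.expovariate(1.0)

# Pseudodimension 1: P(l < t) = 1 - e^(-t) ~ t as t -> 0
t = 1e-3
print((1 - math.exp(-t)) / t)  # close to 1
```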

  13. Matching • Set of edges giving a pairing of all points

  14. Spanning tree • Network connecting all points

  15. Traveling salesman • Tour visiting all points

  16. Mean field limits • No normalization needed! (pseudodimension 1) • Matching: π²/12 ≈ 0.822 (Mézard & Parisi 1985, rigorous proof by Aldous 2000) • Spanning tree: ζ(3) = 1 + 1/8 + 1/27 + … ≈ 1.202 (Frieze 1985) • Traveling salesman: 2.0415… (Krauth–Mézard–Parisi 1989), now established rigorously!
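These constants can be checked by direct simulation. A sketch (my code, not from the talk) estimating Frieze's spanning-tree constant ζ(3) with Prim's algorithm on the complete graph with Exp(1) edge costs, sampling edge weights lazily so the full matrix is never stored:

```python
import random

def mst_cost(n, rng):
    """Prim's algorithm on K_n with i.i.d. Exp(1) edge costs, sampling each
    edge weight on the fly the first time it is considered."""
    in_tree = [False] * n
    in_tree[0] = True
    # best[v] = cheapest edge seen so far from the tree to v
    best = [rng.expovariate(1.0) for _ in range(n)]   # edges from vertex 0
    total = 0.0
    for _ in range(n - 1):
        v = min((u for u in range(n) if not in_tree[u]), key=best.__getitem__)
        total += best[v]
        in_tree[v] = True
        for u in range(n):
            if not in_tree[u]:
                best[u] = min(best[u], rng.expovariate(1.0))  # edge (v, u)
    return total

rng = random.Random(42)
runs = [mst_cost(300, rng) for _ in range(10)]
est = sum(runs) / len(runs)
zeta3 = sum(1.0 / k**3 for k in range(1, 1000))
print(est, zeta3)  # Monte Carlo estimate vs zeta(3) = 1.202...
```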

  17. Cavity results • Non-rigorous method • Aldous derived equivalent equations with the Poisson-Weighted Infinite Tree (PWIT)

  18. Cavity results • Non-rigorous quantity X = cost of minimal solution − cost of minimal solution with the root removed • Define X1, X2, X3, … similarly on the sub-trees • Leads to the equation X = min_i (ξi − Xi) • The Xi are distributed like X, and the ξi are the times of the events in a rate-1 Poisson process

  19. Cavity results • Analytically, this is equivalent to f(u) = exp(−∫_{−u}^∞ f(t) dt), where f(u) = P(X > u)

  20. Cavity results • Explicit solution f(u) = 1/(1 + e^u) • Ground state energy π²/12
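The explicit solution can be checked directly. A sketch (my computation), assuming the cavity equation takes the form f(u) = exp(−∫_{−u}^∞ f(t) dt): the integral has the closed form ln(1 + e^u), so the logistic function is an exact fixed point, and the identity ∫_0^∞ u f(u) du = π²/12 recovers the matching constant:

```python
import math

f = lambda u: 1.0 / (1.0 + math.exp(u))

# integral_{-u}^{inf} f(t) dt = ln(1 + e^u) in closed form, hence
# exp(-integral) = 1 / (1 + e^u) = f(u): a fixed point.
for u in (-3.0, -1.0, 0.0, 2.5):
    assert abs(f(u) - math.exp(-math.log1p(math.exp(u)))) < 1e-12

# Ground state energy: integral_0^inf u * f(u) du = pi^2 / 12
N, L = 400000, 40.0            # midpoint rule on [0, L]; tail is negligible
h = L / N
energy = sum((k + 0.5) * h * f((k + 0.5) * h) for k in range(N)) * h
print(energy, math.pi**2 / 12)  # both ~ 0.8225
```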

  21. Cavity results • Note that the integral is equal to the area under the curve when f(u) is plotted against f(−u) • In this case, f satisfies the equation f(u) + f(−u) = 1

  22. Cavity results

  23. K-L matching

  24. K-L matching • Similarly, the K-L matching problem leads to a pair of analogous equations • ξ has rate K and η has rate L • min^[K] stands for the K-th smallest

  25. K-L matching • Parisi (2006) showed that this system has an essentially unique solution • The ground state energy is given by an expression in x and y, which satisfy an explicit equation • For K = L = 2, this equation is • Unfortunately, the cavity method is not rigorous

  26. The exponential bipartite assignment problem

  27. The exponential bipartite assignment problem • Exact formula conjectured by Parisi (1998) • Suggests proof by induction • Researchers in discrete math, combinatorics and graph theory became interested • Generalizations…
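Parisi's conjectured formula is E[min cost] = 1/1² + 1/2² + … + 1/n² for an n×n matrix of i.i.d. Exp(1) costs, which is what makes a proof by induction so tempting. A sketch (my code) comparing the formula against a brute-force Monte Carlo estimate for n = 3:

```python
import itertools, random
random.seed(7)

def parisi(n):
    """Parisi's (1998) conjectured, now proven, expected minimum
    assignment cost for an n x n matrix of i.i.d. Exp(1) entries."""
    return sum(1.0 / k**2 for k in range(1, n + 1))

def min_assignment(c):
    """Brute force over all permutations (fine for tiny n)."""
    n = len(c)
    return min(sum(c[i][p[i]] for i in range(n))
               for p in itertools.permutations(range(n)))

n, trials = 3, 20000
est = sum(min_assignment([[random.expovariate(1.0) for _ in range(n)]
                          for _ in range(n)])
          for _ in range(trials)) / trials
print(parisi(3), est)  # formula gives 1 + 1/4 + 1/9 = 1.3611...
```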

  28. Generalizations • Generalized by Coppersmith & Sorkin to incomplete matchings • Remarkable paper by M. Buck, C. Chan & D. Robbins (2000) • Introduces weighted vertices • Extremely close to proving Parisi's conjecture!

  29. Incomplete matchings

  30. Weighted assignment problems • Weights λ1, …, λm, μ1, …, μn on the vertices • Edge (i, j) cost exponential of rate λiμj • Conjectured formula for the expected cost of the minimum assignment • Formula for the probability that a vertex participates in the solution (trivial in the less general setting!)

  31. The Buck-Chan-Robbins urn process • Balls are drawn with probabilities proportional to weight
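The urn dynamics can be simulated directly. A minimal sketch (my code; reading the process as weighted draws without replacement is my assumption), checking that the first ball drawn is ball i with probability proportional to its weight:

```python
import random
rng = random.Random(3)

def urn_draws(weights, k, rng):
    """Draw k balls, each with probability proportional to its weight,
    removing a ball from the urn once it has been drawn."""
    items = list(range(len(weights)))
    w = list(weights)
    out = []
    for _ in range(k):
        i = rng.choices(range(len(items)), weights=w)[0]
        out.append(items.pop(i))
        w.pop(i)
    return out

weights = [3.0, 1.0, 2.0]
trials = 60000
first = [urn_draws(weights, 1, rng)[0] for _ in range(trials)]
print(first.count(0) / trials)  # close to 3 / (3 + 1 + 2) = 0.5
```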

  32. Proofs of the conjectures • Two independent proofs of the Parisi and Coppersmith–Sorkin conjectures were announced on March 17, 2003 (Nair–Prabhakar–Sharma and Linusson–Wästlund)

  33. Annealing • Powerful idea: Let T→0, forcing the system to converge to its ground state • Replica-cavity approach • Simulated annealing meta-algorithm (optimization by random local moves)
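The simulated-annealing side of this idea can be sketched on a mean field TSP instance (my code; all parameter choices, cooling rate included, are illustrative). Random 2-opt-style moves are accepted by the Metropolis rule, and letting T → 0 freezes the system into a low-energy tour:

```python
import math, random
rng = random.Random(5)

n = 30
d = [[0.0] * n for _ in range(n)]          # i.i.d. Exp(1) mean field distances
for i in range(n):
    for j in range(i + 1, n):
        d[i][j] = d[j][i] = rng.expovariate(1.0)

def tour_cost(t):
    return sum(d[t[i]][t[(i + 1) % n]] for i in range(n))

tour = list(range(n))
cost = start = tour_cost(tour)
T = 2.0
while T > 1e-3:
    i, j = sorted(rng.sample(range(n), 2))
    cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # reverse a segment
    delta = tour_cost(cand) - cost
    if delta < 0 or rng.random() < math.exp(-delta / T):   # Metropolis rule
        tour, cost = cand, cost + delta
    T *= 0.999        # geometric cooling: T -> 0 forces a near-ground state
print(start, cost)    # the annealed tour is much cheaper than the initial one
```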

  34. In the mean field model • Underlying rate-1 variables Yi • ri plays the same role as T (a local temperature) • Associate the weight to vertices rather than edges

  35. Cavity/annealing method • Relax by introducing an extra vertex • Let the weight of the extra vertex go to zero • Example: assignment problem with λ1 = … = λm = 1, μ1 = … = μn = 1, and λm+1 = ε • p = P(extra vertex participates) • p/n = P(edge (m+1, n) participates)

  36. Annealing • p/n = P(edge (m+1, n) participates) • When ε→0, this is • Hence • By the Buck-Chan-Robbins urn theorem,

  37. Annealing • Hence • Inductively this establishes the Coppersmith-Sorkin formula

  38. Results with annealing • Much simpler proofs of Parisi, Coppersmith-Sorkin, Buck-Chan-Robbins formulas • Exact results for higher moments • Exact results and limits for optimization problems on the complete graph

  39. The 2-dimensional urn process • 2-dimensional time until k balls have been drawn

  40. Limit shape as n→∞ • Matching: e^{−x} + e^{−y} = 1 • TSP/2-factor: (1 + x)e^{−x} + (1 + y)e^{−y} = 1
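Assuming the matching limit shape is the curve e^{−x} + e^{−y} = 1 (an assumption on my part, but consistent with the matching constant π²/12), the connection to the constant can be checked numerically: the area under the curve is π²/6, and half of it is the cost per point. A sketch:

```python
import math

# Curve e^{-x} + e^{-y} = 1, i.e. y(x) = -ln(1 - e^{-x}).
# Substituting v = e^{-x}: area = integral_0^1 -ln(1 - v) / v dv = pi^2 / 6.
N = 200000
h = 1.0 / N
area = sum(-math.log(1.0 - (k + 0.5) * h) / ((k + 0.5) * h)
           for k in range(N)) * h          # midpoint rule avoids both endpoints
print(area, math.pi**2 / 6)  # half the area, pi^2/12, is the matching constant
```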

  41. Mean field TSP • If the edge costs are i.i.d. and satisfy P(ℓ < t)/t → 1 as t → 0 (pseudodimension 1), then as n→∞ the cost of the optimal tour converges to 2.0415… • A. Frieze proved that with high probability a 2-factor can be patched to a tour at small cost

  42. Further exact formulas

  43. LP-relaxation of matching in the complete graph Kn

  44. Future work • Explain why the cavity method gives the same equation as the limit shape in the urn process • Establish more detailed cavity predictions • Use proof method of Nair-Prabhakar-Sharma in more general settings

  45. Thank you!
