
Combinatorial and algebraic tools for multigrid






Presentation Transcript


  1. Combinatorial and algebraic tools for multigrid Yiannis Koutis Computer Science Department Carnegie Mellon University

  2. multilevel methods www.mgnet.org • 3500 citations • 25 free software packages • 10 special conferences since 1983 • Algorithms not always working • Limited theoretical understanding

  3. multilevel methods: our goals • provide theoretical understanding • solve multilevel design problems • small changes in current software • study structure of eigenspaces of Laplacians • extensions for multilevel eigensolvers

  4. Overview • Quick definitions • Subgraph preconditioners • Support graph preconditioners • Algebraic expressions • Low frequency eigenvectors and good partitionings • Multigrid introduction and current state • Multigrid – Our contributions

  5. quick definitions • Given a graph G, with weights wij • Laplacian: A(i,j) = -wij, row sums = 0 • Normalized Laplacian: D^{-1/2} A D^{-1/2} • κ(A,B) is a measure of how well B approximates A (and vice-versa)
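The two definitions above can be sketched in a few lines of NumPy (a sketch, not part of the slides; `laplacian` and `normalized_laplacian` are hypothetical helper names):

```python
import numpy as np

def laplacian(n, weighted_edges):
    """Laplacian of a weighted graph: A(i,j) = -w_ij, every row sums to 0."""
    A = np.zeros((n, n))
    for i, j, w in weighted_edges:
        A[i, j] -= w
        A[j, i] -= w
        A[i, i] += w   # diagonal accumulates the weighted degree
        A[j, j] += w
    return A

def normalized_laplacian(A):
    """D^{-1/2} A D^{-1/2}, where D is the diagonal (weighted degrees) of A."""
    d = np.sqrt(np.diag(A))
    return A / np.outer(d, d)

# Path graph on 3 nodes with unit edge weights.
A = laplacian(3, [(0, 1, 1.0), (1, 2, 1.0)])
N = normalized_laplacian(A)
```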

  6. linear systems : preconditioning • Goal: Solve Ax = b via an iterative method • A is a Laplacian of size n with m edges. Complexity depends on κ(A,I) and m • Solution: Solve B^{-1}Ax = B^{-1}b • Bz = y must be easily solvable • κ(A,B) is small • B is the preconditioner
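A minimal sketch of the preconditioned iteration (not the slides' solver): it uses the diagonal of A as a stand-in for the preconditioner B, and a small SPD matrix instead of a singular Laplacian, just to show the x ← x + B^{-1}(b - Ax) loop.

```python
import numpy as np

def preconditioned_richardson(A, b, B_solve, tol=1e-10, max_iter=500):
    """Iterate x <- x + B^{-1}(b - A x); fast when kappa(A, B) is small."""
    x = np.zeros_like(b)
    for _ in range(max_iter):
        r = b - A @ x                 # residual
        if np.linalg.norm(r) < tol:
            break
        x = x + B_solve(r)            # apply the preconditioner: solve Bz = r
    return x

# Illustration with B = diag(A) (Jacobi); the slides' B is a graph
# preconditioner, which this stand-in does not attempt to construct.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
x = preconditioned_richardson(A, b, lambda r: r / np.diag(A))
```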

  7. Overview • Quick definitions • Subgraph preconditioners • Support graph preconditioners • Algebraic expressions • Low frequency eigenvectors and good partitionings • Multigrid introduction and current state • Multigrid – Our contributions

  8. combinatorial preconditioners: the Vaidya thread • B is a sparse subgraph of A, possibly with additional edges • Solving Bz = y is performed as follows: • Gaussian elimination on degree ≤ 2 nodes of B • A new, smaller system must be solved • Recursively call the same algorithm on it to get an approximate solution.
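The elimination of low-degree nodes is a Schur complement step; a sketch (with a hypothetical helper name `eliminate_node`) shows that eliminating a degree-2 node of a Laplacian merges its two incident edges in series, so the result is again a Laplacian of a smaller graph:

```python
import numpy as np

def eliminate_node(L, k):
    """One step of Gaussian elimination (Schur complement) on node k."""
    keep = [i for i in range(L.shape[0]) if i != k]
    return L[np.ix_(keep, keep)] - np.outer(L[keep, k], L[k, keep]) / L[k, k]

# Path 0 - 1 - 2 with unit weights; eliminate the degree-2 middle node.
L = np.array([[ 1.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
L2 = eliminate_node(L, 1)   # Laplacian of a single edge of weight 1/2
```

Two unit edges in series behave like one edge of weight 1/2 (resistances add), which is exactly what the Schur complement produces.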

  9. combinatorial preconditioners: the Vaidya thread • Graph Sparsification [Spielman, Teng] • Low stretch trees [Elkin, Emek, Spielman, Teng] • Near optimal O(m poly(log n)) complexity

  10. combinatorial preconditioners: the Vaidya thread • Graph Sparsification [Spielman, Teng] • Low stretch trees [Elkin, Emek, Spielman, Teng] • Near optimal O(m poly(log n)) complexity • Focus on constructing a good B • κ(A,B) is well understood – B is sparser than A • B can look complicated even for simple graphs A

  11. Overview • Quick definitions • Subgraph preconditioners • Support graph preconditioners • Algebraic expressions • Low frequency eigenvectors and good partitionings • Multigrid introduction and current state • Multigrid – Our contributions

  12. combinatorial preconditioners: the Gremban - Miller thread • the support graph S is bigger than A [figure: example graph with edge weights]

  13. combinatorial preconditioners: the Gremban - Miller thread • the support graph S is bigger than A [figure: the graph A, the support graph S, and its quotient, with edge weights]

  14. combinatorial preconditioners: the Gremban - Miller thread • The preconditioner S is often a natural graph • S inherits the sparsity properties of A • S is equivalent to a dense graph B of size equal to that of A : κ(A,S) = κ(A,B) • Analysis of κ(A,S) made easy by work of [Maggs, Miller, Ravi, Woo, Parekh] • Existence of good S by work of [Racke]

  15. Overview • Quick definitions • Subgraph preconditioners • Support graph preconditioners • Algebraic expressions • Low frequency eigenvectors and good partitionings • Multigrid introduction and current state • Multigrid – Our contributions • Other results

  16. algebraic expressions • Suppose we are given m clusters in A • R(i,j) = 1 if the jth cluster contains node i • R is n x m • Quotient: Q = R^T A R • R is the clustering matrix
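The clustering matrix and the quotient Q = R^T A R can be sketched directly (an illustration, not the slides' code; `clustering_matrix` and `quotient` are hypothetical names):

```python
import numpy as np

def clustering_matrix(n, clusters):
    """R is n x m with R[i, j] = 1 iff node i lies in cluster j."""
    R = np.zeros((n, len(clusters)))
    for j, cluster in enumerate(clusters):
        for i in cluster:
            R[i, j] = 1.0
    return R

def quotient(A, R):
    """Q = R^T A R: the Laplacian of the quotient graph on the clusters."""
    return R.T @ A @ R

# Path on 4 nodes, unit weights, clustered into two pairs.
A = np.array([[ 1.0, -1.0,  0.0,  0.0],
              [-1.0,  2.0, -1.0,  0.0],
              [ 0.0, -1.0,  2.0, -1.0],
              [ 0.0,  0.0, -1.0,  1.0]])
R = clustering_matrix(4, [[0, 1], [2, 3]])
Q = quotient(A, R)   # Laplacian of a single edge between the two clusters
```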

  17. algebraic expressions • The inverse preconditioner • The normalized version • R^T D^{1/2} is the weighted clustering matrix

  18. Overview • Quick definitions • Subgraph preconditioners • Support graph preconditioners • Algebraic expressions • Low frequency eigenvectors and good partitionings • Multigrid introduction and current state • Multigrid – Our contributions • Other results

  19. good partitions and low frequency invariant subspaces • Suppose the graph A has a good clustering defined by the clustering matrix R • Let • Let y be any vector such that

  20. good partitions and low frequency invariant subspaces [quality test?] • Suppose the graph A has a good clustering defined by the clustering matrix R • Let • Let y be any vector such that • Theorem: The inequality is tight up to a constant for certain graphs

  21. good partitions and low frequency invariant subspaces • Let y be any vector such that • Let x be mostly a linear combination of eigenvectors corresponding to eigenvalues close to λ • Theorem: • Prove it? • We can find a random vector x and check the distance to the closest y

  22. Overview • Quick definitions • Subgraph preconditioners • Support graph preconditioners • Algebraic expressions • Low frequency eigenvectors and good partitionings • Multigrid introduction and current state • Multigrid – Our contributions

  23. multigrid – short introduction • General class of algorithms • Richardson iteration: x ← x + (b − Ax) • High frequency components of the error are reduced
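The smoothing effect can be checked numerically (a sketch; the path-graph Laplacian below stands in for a model problem). With step 1/λmax, the error propagator I − τA barely touches a low-frequency eigenvector but strongly damps a high-frequency one:

```python
import numpy as np

n = 16
# Laplacian of a path on n nodes (unit weights).
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
A[0, 0] = A[-1, -1] = 1.0

lam, V = np.linalg.eigh(A)            # ascending eigenvalues
tau = 1.0 / lam[-1]                   # Richardson step 1/lambda_max
E = np.eye(n) - tau * A               # error propagator of one smoothing step

low, high = V[:, 1], V[:, -1]         # low- and high-frequency eigenvectors
low_factor = np.linalg.norm(E @ low)    # close to 1: barely reduced
high_factor = np.linalg.norm(E @ high)  # close to 0: strongly damped
```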

  24. initial and smoothed error

  25. the basic multigrid algorithm [how many? which iteration? is the recursion needed?] • Define a smaller graph Q • Define a projection operator Rproject • Define a lift operator Rlift • Apply t rounds of smoothing • Take the residual r = b - Axold • Solve Qz = Rproject r • Form new iterate xnew = xold + Rlift z • Apply t rounds of smoothing
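The steps above can be sketched as one two-level cycle (an illustration, not the slides' implementation: the coarse operator is the quotient R^T A R, smoothing is Richardson, the coarse solve is exact, and a small 1D Dirichlet Laplacian stands in for a general A):

```python
import numpy as np

def two_level_cycle(A, b, x, R, tau, t=1):
    """Smooth, correct with the quotient Q = R^T A R, smooth again."""
    Q = R.T @ A @ R
    for _ in range(t):                  # pre-smoothing (Richardson)
        x = x + tau * (b - A @ x)
    r = b - A @ x                       # residual
    z = np.linalg.solve(Q, R.T @ r)     # solve Q z = R_project r
    x = x + R @ z                       # lift the coarse correction
    for _ in range(t):                  # post-smoothing
        x = x + tau * (b - A @ x)
    return x

n = 8
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # SPD model problem
R = np.kron(np.eye(n // 2), np.ones((2, 1)))            # pair clustering
tau = 1.0 / np.linalg.eigvalsh(A)[-1]
b = np.ones(n)
x = np.zeros(n)
for _ in range(50):
    x = two_level_cycle(A, b, x, R, tau)
```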

  26. algebraic multigrid (AMG) Goals: The range of Rproject must approximate the unreduced error very well. The error not reduced by smoothing must be reduced by the smaller grid.

  27. algebraic multigrid (AMG) Goals: The range of Rproject must approximate the unreduced error very well. The error not reduced by smoothing must be reduced in the smaller grid. • Jacobi iteration: x ← x + D^{-1}(b − Ax) • or ‘scaled’ Richardson: x ← x + τ(b − Ax) • Find a clustering • Rproject = (Rlift)^T • Q = Rproject^T A Rproject

  28. algebraic multigrid (AMG) Goals: The range of Rproject must approximate the unreduced error very well. The error not reduced by smoothing must be reduced in the smaller grid. • Jacobi iteration: x ← x + D^{-1}(b − Ax) • or ‘scaled’ Richardson: x ← x + τ(b − Ax) • Find a clustering [heuristic] • Rproject = (Rlift)^T [heuristic] • Q = Rproject^T A Rproject

  29. two level analysis • Analyze the maximum eigenvalue of • where • The matrix T1 eliminates the error in • A low frequency eigenvector has a significant component in

  30. two level analysis • Starting hypothesis: Let X be the subspace corresponding to eigenvalues smaller than λ. Let Y be the null space of Rproject. Assume ⟨X, Y⟩² is small • Two level convergence: error reduced by • Proving the hypothesis? Limited cases
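Two-level error reduction can be verified numerically on a model problem (a sketch under the same stand-in setup as before: 1D Dirichlet Laplacian, pair clustering, Richardson smoothing with exact coarse solve; this does not reproduce the slides' T1 analysis):

```python
import numpy as np

n = 8
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # SPD model problem
R = np.kron(np.eye(n // 2), np.ones((2, 1)))            # pair clustering, n x n/2
Q = R.T @ A @ R                                         # coarse operator

tau = 1.0 / np.linalg.eigvalsh(A)[-1]
S = np.eye(n) - tau * A                       # Richardson smoother
P = R @ np.linalg.solve(Q, R.T @ A)           # A-orthogonal coarse projection
E = S @ (np.eye(n) - P) @ S                   # two-level error propagator

rho = np.abs(np.linalg.eigvals(E)).max()      # spectral radius < 1: convergence
```

Since I − P is an A-orthogonal projection and the smoother is a strict contraction in the A-norm, the spectral radius stays below 1, so each two-level cycle reduces the error.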

  31. current state ‘there is no systematic AMG approach that has proven effective in any kind of general context’ [BCFHJMMR, SIAM Journal on Scientific Computing, 2003]

  32. Overview • Quick definitions • Subgraph preconditioners • Support graph preconditioners • Algebraic expressions • Low frequency eigenvectors and good partitionings • Multigrid introduction and current state • Multigrid – Our contributions

  33. our contributions – two level • There exists a good clustering given by R. The quality is measured by the condition number κ(A,S) • Q = R^T A R • Richardson’s with • Projection matrix Rproject = R^T D^{1/2}

  34. our contributions - two level analysis • Starting hypothesis: Let X be the subspace corresponding to eigenvalues smaller than λ. Let Y be the null space of Rproject = R^T D^{1/2}. Assume ⟨X, Y⟩² is small • Two level convergence: error reduced by • Proving the hypothesis? Yes! Using κ(A,S) • Result holds for t=1 smoothing • Additional smoothings do not help

  35. our contributions - recursion • There is a matrix M which characterizes the error reduction after one full multigrid cycle • We need to upper bound its maximum eigenvalue as a function of the two-level eigenvalues: the maximum eigenvalue of M is upper bounded by the sum of the maximum eigenvalues over all two-level cycles

  36. towards full convergence • Goal: The error not reduced by smoothing must be reduced by the smaller grid • A different point of view: The small grid does not reduce part of the error; it rather changes its spectral profile.

  37. full convergence for regular d-dimensional toroidal meshes • A simple change in the implementation of the algorithm: • where • T2 has eigenvalues 1 and -1 • T2 xlow = xhigh

  38. full convergence for regular d-dimensional toroidal meshes • With t = O(log log n) smoothings • Using recursive analysis: λmax(M) ≤ 1/2 • Both pre-smoothings and post-smoothings are needed • Holds for perturbations of toroidal meshes

  39. Overview • Quick definitions • Subgraph preconditioners • Support graph preconditioners • Algebraic expressions • Low frequency eigenvectors and good partitionings • Multigrid introduction and current state • Multigrid – Our contributions

  40. Thanks!
