
Toward an automatic parallel tool for solving systems of nonlinear equations



  1. Toward an automatic parallel tool for solving systems of nonlinear equations Antonio M. Vidal Jesús Peinado Departamento de Sistemas Informáticos y Computación Universidad Politécnica de Valencia

  2. Solving Systems of Nonlinear Equations Given F: R^n → R^n, find x* such that F(x*) = 0. Newton's iteration: x_{k+1} = x_k − J(x_k)^{-1} F(x_k), where J(x_k) is the Jacobian of F at x_k. Newton's Algorithm: at each step evaluate F(x_k) and J(x_k), solve the linear system J(x_k) s_k = −F(x_k), and update x_{k+1} = x_k + s_k.
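
A minimal sketch of the Newton iteration above, using NumPy with a dense direct solve; the stopping tolerance, iteration limit, and the small example system are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Newton's method: solve J(x_k) s_k = -F(x_k), then set x_{k+1} = x_k + s_k."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x, k
        s = np.linalg.solve(J(x), -Fx)   # direct (LU-based) solve of the Newton system
        x = x + s
    return x, max_iter

# Illustrative 2x2 system: x^2 + y^2 = 1 and x - y = 0
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])
root, iters = newton(F, J, x0=[1.0, 0.5])
```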

  3. Methods to solve Nonlinear Systems
  • Newton's methods: solve the linear system with a direct method (LU, Cholesky, ...). Several approaches: Newton, Shamanskii, Chord, ...
  • Quasi-Newton methods: approximate the Jacobian matrix (Broyden's method, BFGS, ...): B(x_c) ≈ J(x_c), updated by a rank-one correction B(x_+) = B(x_c) + u v^T
  • Inexact Newton methods: solve the linear system with an iterative method (GMRES, Conjugate Gradient, ...) up to the tolerance ||J(x_k) s_k + F(x_k)||_2 ≤ η_k ||F(x_k)||_2
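
As a concrete illustration of the quasi-Newton rank-one update B(x_+) = B(x_c) + u v^T, here is a minimal dense sketch of Broyden's method; keeping and re-solving the full matrix B at every step is a simplification for clarity, and the names and tolerances are assumptions rather than part of the slides.

```python
import numpy as np

def broyden(F, x0, B0, tol=1e-10, max_iter=100):
    """Broyden's method: B_+ = B_c + ((y - B_c s) s^T) / (s^T s),
    with s = x_+ - x_c and y = F(x_+) - F(x_c)."""
    x = np.asarray(x0, dtype=float)
    B = np.asarray(B0, dtype=float)
    Fx = F(x)
    for k in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            return x, k
        s = np.linalg.solve(B, -Fx)                  # quasi-Newton step with the current approximation B
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        B = B + np.outer(y - B @ s, s) / (s @ s)     # the rank-one (u v^T) correction
        x, Fx = x_new, F_new
    return x, max_iter
```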

  4. Difficulties in the solution of Nonlinear Systems by a non-expert scientist
  • Several methods to choose from
  • Slow convergence
  • Many trials are needed to find the optimum algorithm
  • If parallelization is attempted the possibilities increase dramatically: shared memory, distributed memory, message-passing environments, computational kernels, several parallel numerical libraries, ...
  • Standard libraries provide no help for solving a nonlinear system

  5. Objective
  • To build a software tool that automatically obtains the best performance from a sequential or parallel machine when solving a nonlinear system, for every problem and transparently to the user

  6. Work done
  • A set of parallel algorithms has been implemented: Newton, Quasi-Newton and Inexact Newton algorithms for symmetric and nonsymmetric Jacobian matrices
  • The implementations are independent of the problem
  • They have been tested with several problems of different kinds
  • They have been developed using the support and the philosophy of ScaLAPACK
  • They can be seen as part of a more general environment of software for message-passing machines

  7. ScaLAPACK
  • Example of distribution for solving a linear system with Jacobian matrix J and problem function F
  • Programming model: SPMD
  • Interconnection network: logical mesh
  • Two-dimensional distribution of data: block cyclic
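
To make the two-dimensional block-cyclic distribution concrete, here is a small sketch (not part of the slides) of the standard ScaLAPACK-style mapping from a global matrix entry to the process that owns it on a P_r × P_c logical mesh with mb × nb blocks; the mesh and block sizes in the example are arbitrary.

```python
def owner(i, j, mb, nb, P_r, P_c):
    """Process coordinates (p, q) owning global entry (i, j) under a 2D block-cyclic
    distribution with mb x nb blocks on a P_r x P_c process mesh (source process (0, 0))."""
    return (i // mb) % P_r, (j // nb) % P_c

# Example: 2x2 blocks on a 2x3 mesh; print the owner of each entry of a 6x6 matrix
for i in range(6):
    print([owner(i, j, mb=2, nb=2, P_r=2, P_c=3) for j in range(6)])
```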

  8. Software environment
  • USER → Automatic Parallel Tool → Numerical Parallel Algorithms
  • ScaLAPACK (Scalable Linear Algebra Package), built on PBLAS (Parallel BLAS Package), LAPACK (Linear Algebra Package) and BLAS (Basic Linear Algebra Subroutines)
  • BLACS (Basic Linear Algebra Communication Subroutines) on top of message-passing primitives (MPI, PVM, ...)
  • Other packages: MINPACK (global minimization), CERFACS iterative solvers (CG, GMRES), local packages

  9. Developing a systematic approach. How to choose the best method?
  • Specification of the problem data:
  • Starting point
  • Function F
  • Jacobian matrix J
  • Structure of the Jacobian matrix (dense, sparse, band, ...)
  • Required precision
  • Use of chaotic techniques
  • Possibilities of parallelization (function, Jacobian matrix, ...)
  • Sometimes only the function is known: prospecting with a minimal simple algorithm (Newton + finite differences + sequential approach) can be interesting

  10. The methodology (1). General scheme

  11. Developing a systematic approach. Cost of each method (flops):
  Method            flops
  Newton            C_E + k (C_E + C_J + 2n³/3)
  Shamanskii        C_E + k (C_J + 2n³/3 + m (C_E + 2n²))
  Chord             C_E + C_J + 2n³/3 + k (C_E + 2n²)
  Newton-Cholesky   C_E + k (C_E + C_J + n³/3)
  Broyden, BFGS, Newton-GMRES, Newton-CG: similar expressions in terms of C_E, C_J, n, k and the inner-iteration count m
  C_E = function evaluation cost; C_J = Jacobian matrix evaluation cost
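
The following sketch is not from the slides; it only illustrates how a cost table like this could drive an automatic choice of method. It uses the four formulas reconstructed above and a hypothetical dictionary-based interface; the numbers in the example are arbitrary.

```python
def method_costs(C_E, C_J, n, k, m):
    """Flop estimates for the direct-solver-based methods of the table above."""
    return {
        "Newton":          C_E + k * (C_E + C_J + 2 * n**3 / 3),
        "Shamanskii":      C_E + k * (C_J + 2 * n**3 / 3 + m * (C_E + 2 * n**2)),
        "Chord":           C_E + C_J + 2 * n**3 / 3 + k * (C_E + 2 * n**2),
        "Newton-Cholesky": C_E + k * (C_E + C_J + n**3 / 3),
    }

# Example: an expensive Jacobian (C_J close to n^3) makes Chord/Shamanskii attractive versus Newton
costs = method_costs(C_E=1e4, C_J=7e5, n=100, k=20, m=4)
best_method = min(costs, key=costs.get)
```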

  12. Developing a systematic approach
  • The function and the Jacobian matrix characterize the nonlinear system
  • It is important to know the features of both: sparse or dense, how they are computed (sequentially or in parallel), structure, ...
  • It is interesting to classify the problems according to their cost, especially to identify the best method, to avoid the worst method, and to decide what must be parallelized
  Classification P_ij by cost of F (rows) and cost of J (columns):
            J: O(n)   O(n²)   O(n³)   O(n⁴)   >O(n⁴)
  F: O(n)      P11    P12     P13     P14     P1+
     O(n²)     P21    P22     P23     P24     P2+
     O(n³)     P31    P32     P33     P34     P3+
     O(n⁴)     P41    P42     P43     P44     P4+
     >O(n⁴)    P+1    P+2     P+3     P+4     P++

  13. Developing a systematic approach
  • Once the best sequential option has been selected, the process can be finished
  • If the best parallel algorithm is required, the following items must be analyzed:
  • Computer architecture: machine parameters (t_f, τ, β)
  • Programming environments: PVM/MPI, ...
  • Data distribution to obtain the best parallelization
  • Cost of the parallel algorithms

  14. Developing a systematic approach
  • Data distribution: it depends on the parallel environment. In the case of ScaLAPACK, block-cyclic distribution: optimize the block size and the mesh size
  • Parallelization chances: function evaluation and/or computation of the Jacobian matrix. Parallelize the most expensive operation!
  • Cost of the parallel algorithms: use the parallel-cost table with the parameters of the parallel machine (t_f, τ, β)
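
As an illustration of how the machine parameters (t_f, τ, β) could enter the comparison, here is a minimal sketch of the classical linear communication/computation time model; the function and the example figures are assumptions, not taken from the slides.

```python
def parallel_time(flops, messages, words, t_f, tau, beta):
    """Linear cost model: computation time plus message start-ups plus transferred volume."""
    return flops * t_f + messages * tau + words * beta

# Example: the same algorithm on two hypothetical machines differing only in the network
t_fast_net = parallel_time(flops=1e9, messages=1e4, words=1e7, t_f=1e-9, tau=1e-6, beta=1e-8)
t_slow_net = parallel_time(flops=1e9, messages=1e4, words=1e7, t_f=1e-9, tau=5e-5, beta=4e-7)
```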

  15. Developing a systematic approach. Final decision for choosing the method: Cost < O(n³) => 0; Cost >= O(n³) => 1

  16. Developing a systematic approach. Final decision for parallelization: No chance of parallelization => 0; Chance of parallelization => 1
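
The two binary criteria on slides 15 and 16 suggest a pair of decision flags; the sketch below is only an assumption about how such flags might be computed, since the slides do not show the implementation.

```python
def decision_flags(cost_exponent, can_parallelize):
    """Encode the two criteria: cost bit (1 if the cost is O(n^3) or higher) and parallelization bit."""
    cost_bit = 1 if cost_exponent >= 3 else 0    # Cost >= O(n^3) => 1, otherwise 0
    par_bit = 1 if can_parallelize else 0        # Chance of parallelization => 1, otherwise 0
    return cost_bit, par_bit

# Example: an O(n^3) problem whose function can be evaluated in parallel -> (1, 1)
flags = decision_flags(cost_exponent=3, can_parallelize=True)
```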

  17. Developing a systematic approach
  • Finish or feedback: IF the selected method is satisfactory THEN finish ELSE feedback
  • Sometimes bad results are obtained due to:
  • No convergence
  • High computational cost
  • Unsatisfactory parallelization

  18. The methodology (12). Scheme of the guided process

  19. The methodology (12). Scheme of the guided process

  20. How does it work?
  • Inverse Toeplitz Symmetric Eigenvalue Problem
  • Well-known problem: starting point, function, analytical Jacobian matrix or finite-difference approximation, ...
  • Kind of problem:
  • The cost of the Jacobian matrix is high: avoid computing it. Use Chord or Broyden
  • High chance of parallelization, even if finite differences are used
  • If the speed of convergence is slow, use Broyden but insert some Newton iterations
  (Figure: analytical Jacobian vs. finite-difference Jacobian)

  21. How does it work?
  • Leakage minimization in a water distribution network
  • Well-known problem: starting point, function, analytical Jacobian matrix or finite-difference approximation, ...
  • Jacobian matrix: symmetric, positive definite
  • Kind of problem:
  • Avoid methods with a high cost per iteration, like Newton-Cholesky
  • The computation of F and J can be parallelized
  • Use Newton-CG (to speed up convergence) or BFGS

  22. Conclusions
  • Part of this work has been done in the Ph.D. thesis of J. Peinado: “Resolución Paralela de Sistemas de Ecuaciones no Lineales”. Universidad Politécnica de Valencia, Sept. 2003
  • All specifications and parallel algorithms have been developed
  • The implementation stage of the automatic parallel tool starts in January 2004 within the frame of a CICYT project: “Desarrollo y optimización de código paralelo para sistemas de Audio 3D”, TIC2003-08230-C02-02
