The Lapack for Clusters (LFC) Project

The LAPACK for Clusters (LFC) project provides an efficient and scalable implementation of LAPACK linear algebra routines for solving linear systems in parallel on clusters. It supports interactive clients such as Mathematica, MATLAB, and Python. LFC offers both in-memory routines, limited by the memory of the calling process, and disk-based routines, which pay an I/O cost on each call. The project is easy to deploy and integrates with a range of languages and technologies.

Presentation Transcript


  1. The Lapack for Clusters (LFC) Project Piotr Luszczek The MathWorks, Inc.

  2. LFC: Overview • Interactive clients • Mathematica • MATLAB • Python [Diagram: interactive clients connect through a tunnel (IP, TCP, ...) across a software firewall to the LFC server]

  3. LFC: Behind the scenes • Client call: x = lfc.gesv(A, b) • Batch mode bypass: x = b.copy(); command('pdgesv', A.id, x.id) • Server side: call pdgesv(A, x) • Transport: Internet, intranet, or shared memory, via send(c_sckt, buf) / recv(s_sckt, buf)
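The slide sketches a proxy pattern: the client keeps only matrix ids, copies b into x, and asks the server to overwrite x with the solution via a named command. The following toy sketch mimics that flow in plain Python; the `Server`, `Matrix`, and `gesv` names are hypothetical stand-ins (the real LFC server dispatches to ScaLAPACK's pdgesv over sockets, while here NumPy solves in-process).

```python
import numpy as np

class Server:
    """Toy stand-in for the LFC server: stores matrices by id, runs solvers."""
    def __init__(self):
        self.store = {}

    def put(self, mid, data):
        self.store[mid] = np.array(data, dtype=float)

    def command(self, name, a_id, x_id):
        # Mirrors the slide's command('pdgesv', A.id, x.id):
        # overwrite the buffer behind x_id with the solution of A x = b.
        if name == 'pdgesv':
            A = self.store[a_id]
            self.store[x_id] = np.linalg.solve(A, self.store[x_id])

class Matrix:
    """Client-side handle: carries an id, not the data itself."""
    _next_id = 0

    def __init__(self, server, data):
        self.server = server
        self.id = Matrix._next_id
        Matrix._next_id += 1
        server.put(self.id, data)

def gesv(server, A, b):
    # Client logic from the slide: x starts as a copy of b,
    # then the server replaces it with the solution in place.
    x = Matrix(server, server.store[b.id].copy())
    server.command('pdgesv', A.id, x.id)
    return server.store[x.id]

srv = Server()
A = Matrix(srv, [[2.0, 0.0], [0.0, 4.0]])
b = Matrix(srv, [2.0, 8.0])
x = gesv(srv, A, b)   # solves the 2x2 diagonal system
```

The point of the indirection is that large matrices never travel to the interactive client; only ids and small command messages cross the network.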

  4. LFC: Current functionality • Linear systems (via factorizations) • Cholesky: A = UᵀU or A = LLᵀ • Gaussian: PA = LU • Least squares: A = QR • Singular- and eigen-value problems • A = UΣVᵀ (thin SVD) • AV = VΛ with A = Aᴴ (symmetric AEP) • AV = VΛ (non-symmetric AEP) • Norms, condition-number estimates • Precision, data types • Single, double • Real, complex • Mixed precision (by upcasting) • User data routines • Loading/saving • MPI I/O • ... • Generating • Plug-ins • Moving • More to come… • Now working on sparse matrix support
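The factorizations listed above are the standard LAPACK building blocks. As a sketch of the underlying math, here is a sequential NumPy illustration of two of them, Gaussian elimination (PA = LU, as in dgesv) and Cholesky (A = LLᵀ, as in dposv); LFC itself would run the ScaLAPACK equivalents in parallel across the cluster.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

# Gaussian elimination, PA = LU (LAPACK's dgesv path).
x = np.linalg.solve(A, b)

# Cholesky, A = L Lᵀ, requires a symmetric positive definite matrix.
S = A @ A.T + n * np.eye(n)      # construct an SPD matrix
L = np.linalg.cholesky(S)        # S = L @ L.T, L lower triangular
y = np.linalg.solve(L, b)        # forward substitution: L y = b
z = np.linalg.solve(L.T, y)      # back substitution:   Lᵀ z = y

# Both solves should reproduce b to near machine precision.
print(np.linalg.norm(A @ x - b), np.linalg.norm(S @ z - b))
```

Cholesky does half the flops of LU and needs no pivoting, which is why LFC exposes it separately (posv) for SPD systems.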

  5. LFC: Design and implementation • Prototyping and deployment: C, MathLink, MEX, Java • Client logic (Python): PLW, Jython; talks to the server over the network • Server logic (Python): PLW, Pyrex • Building and launching the server: beowulf $ ./configure && make && make install, then beowulf $ mpiexec server • Python server: beowulf $ python setup.py install, then beowulf $ mpirun python server.py

  6. LFC: Implemented MATLAB functionality

  7. Sample LFC code: Linear system solve • MATLAB with LFC (parallel): n = lfc(1000); nrhs = 1; A = rand(n); b = rand(n, 1); x = A \ b; r = A*x - b; norm(r, 'fro') • MATLAB, no LFC (sequential): n = 1000; nrhs = 1; A = rand(n); b = rand(n, 1); x = A \ b; r = A*x - b; norm(r, 'fro') • Python with LFC (parallel): n = 1000; nrhs = 1; A = lfc.rand(n); b = lfc.rand(n, 1); x = lfc.solve(A, b); r = A*x - b; print r.norm('F')
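For readers without MATLAB or LFC, the sequential variant of the slide's example translates directly to NumPy; the `lfc.rand` / `lfc.solve` calls in the parallel versions perform the same steps, just distributed over the cluster.

```python
import numpy as np

# Sequential NumPy analog of the slide's "no LFC" snippet.
rng = np.random.default_rng(1)
n, nrhs = 1000, 1
A = rng.random((n, n))
b = rng.random((n, nrhs))
x = np.linalg.solve(A, b)          # MATLAB's x = A \ b
r = A @ x - b                      # residual
print(np.linalg.norm(r, 'fro'))    # MATLAB's norm(r, 'fro')
```

The Frobenius norm of the residual should come out near machine precision; the same three-line pattern (factor, solve, check residual) is what every snippet on this slide expresses.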

  8. LFC's C interface: Sequential calls, parallel execution • In-memory routines: LFC_gels(), LFC_gesv(), LFC_posv() (limitation: data must fit on the caller) • Disk-based routines: LFC_gels_file_all(), LFC_gesv_file_all(), LFC_posv_file_all() (limitation: the I/O cost is paid on each call) • Cluster state functions: LFC_hosts_create(), LFC_hosts_free(), LFC_get_available_CPU(), LFC_get_free_memory()

  9. LFC's ease of deployment • Only one file to download • Just type: ./configure && make && make install • LFC clients: C, Mathematica, MATLAB, Python • Serial interface (local addressing): LAPACK, BLAS • Parallel interface (global addressing): ScaLAPACK, PBLAS • BLAS: a portable version ships with LFC; a platform-specific vendor BLAS may be used instead • External to LFC: BLACS, vendor BLAS, MPI

  10. Software technology used by LFC • Client: Python (sockets); MATLAB (embedded Java, Jython for reuse of Python code) • C/C++ code: fork(), execvp() • Shell (implicit): mpirun, mpiexec • Server libraries: MPI, BLAS, BLACS, LAPACK, ScaLAPACK • Languages: Python; Pyrex (C wrappers); PLW (Python compiler, translation to C)

  11. Contact: Piotr Luszczek, The MathWorks, Inc. • (508) 647-6767 • luszczek@eecs.utk.edu • http://icl.cs.utk.edu/lfc/
