
JAVA AND MATRIX COMPUTATION





  1. JAVA AND MATRIX COMPUTATION. Geir Gundersen, Department of Informatics, University of Bergen, Norway. Joint work with Trond Steihaug.

  2. Java Grande: Several extensions to the Java language have been proposed but NOT integrated or considered by Sun Microsystems. Our vision: What does the Java language, as is, have to offer in the field of numerical computation? Does Java have something to offer in the field of Sparse Matrix Computations?

  3. Objectives • Java and Scientific Computing: • Java will be used for (limited) numerical computations. • Jagged Arrays: • Static and Dynamic operations. • Challenges: • Dense Matrix Computations. • C#. • Future Topics.

  4. Benchmarking Java against C and FORTRAN for Scientific Applications • Java will lose in general for most kernels, by a factor of 2-4, but some benchmarking shows that Java can compete and even win for other kernels (on some platforms). • JVM and compiler optimization are still progressing; the gap between FORTRAN/C and Java may narrow in the future(?) • Benchmarking results are important but not within the scope of this work.

  5. Java Arrays • Java arrays are objects. • Thus creating an array is object creation. • The objects of an array of objects are not necessarily stored contiguously. • An array of objects stores references to the actual objects. • The primitive elements of an array are most likely stored contiguously. • An array of primitive elements holds the actual values of those elements. • We exploit the facts that Java arrays need not be rectangular (each inner array can have its own size) and that array aliasing is allowed, as sketched below.
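A minimal sketch of these two properties, using nothing beyond standard Java: a jagged 2D array whose rows have different lengths, and array aliasing where two outer entries reference the same inner array. The class and variable names are illustrative.

```java
// A jagged array: each row is an independent object with its own length,
// and two rows may alias the same underlying array.
public class JaggedArrayDemo {
    public static void main(String[] args) {
        double[][] jagged = new double[3][];   // outer array of row references
        jagged[0] = new double[]{1.0, 2.0, 3.0};
        jagged[1] = new double[]{4.0};         // rows may differ in length
        jagged[2] = jagged[0];                 // array aliasing: shared row

        jagged[0][1] = 9.0;                    // visible through both aliases
        System.out.println(jagged[2][1]);      // prints 9.0
    }
}
```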

  6. Java Arrays: Matrix Examples

  7. Sparse Matrices • A sparse matrix is usually defined as a matrix where "many" of its elements are equal to zero. • We benefit both in time and space by working only on the nonzero data structure. • Sparse matrices have a wide variety of structures, which motivate several different data structures. • The figures show Sherman (banded), Simplex (unsymmetric), symmetric, and pentadiagonal matrices.

  8. Sparse Matrices: Examples

  9. Compressed Row Storage • The most commonly used storage schemes for large sparse matrices: • Compressed Row/Column Storage. • These storage schemes have enjoyed several decades of research. • The compressed storage schemes have minimal memory requirements for storing a general sparse matrix; a sketch follows below.
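A minimal Compressed Row Storage (CRS) sketch, assuming the standard three-array layout (values, column indices, row pointers); the field names are illustrative, not taken from the slides.

```java
// CRS for the 3x3 matrix
// [ 1 0 2 ]
// [ 0 3 0 ]
// [ 4 0 5 ]
// value/colIndex hold the nonzeros row by row; rowPtr[i]..rowPtr[i+1]-1
// delimits row i.
public class CrsDemo {
    public static void main(String[] args) {
        double[] value = {1.0, 2.0, 3.0, 4.0, 5.0};
        int[] colIndex = {0, 2, 1, 0, 2};
        int[] rowPtr   = {0, 2, 3, 5};

        // Matrix-vector product y = Ax over the nonzeros only.
        double[] x = {1.0, 1.0, 1.0};
        double[] y = new double[3];
        for (int i = 0; i < 3; i++)
            for (int k = rowPtr[i]; k < rowPtr[i + 1]; k++)
                y[i] += value[k] * x[colIndex[k]];
        System.out.println(java.util.Arrays.toString(y)); // [3.0, 3.0, 9.0]
    }
}
```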

  10. Java Sparse Array • Java Sparse Array (JSA) is a new concept for storing sparse matrices, made possible with Java. • One array stores the references to the value arrays and one stores the references to the index arrays (see the sketch below). • There is no need for an enclosing object around the arrays.
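A minimal JSA sketch for the same 3x3 matrix as the CRS example: one jagged array holding each row's nonzero values and one holding the matching column indices. The variable names are illustrative.

```java
// JSA: two jagged arrays, no enclosing matrix object needed.
public class JsaDemo {
    public static void main(String[] args) {
        double[][] value = { {1.0, 2.0}, {3.0}, {4.0, 5.0} };
        int[][]    index = { {0, 2},     {1},   {0, 2}     };

        // Each row is an independent pair of arrays, so a single row can be
        // grown or replaced without touching the rest of the matrix.
        value[1] = new double[]{3.0, 7.0};
        index[1] = new int[]{1, 2};
        System.out.println(value[1].length + " nonzeros in row 1");
    }
}
```

This per-row independence is the "independent row updating" flexibility referred to in the concluding remarks.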

  11. Matrix Vector and Vector Matrix • Static operations. • Traverses only the data structures involved; only the result vector c = Ab is created (see the sketch below). • Numerical results indicate no significant loss in efficiency when traversing a 2D jagged array compared to a 1D array.
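A hedged sketch of matrix-vector multiplication c = Ab on a JSA-stored matrix, following the structure described on the slide: only the nonzeros are traversed and only the result vector is allocated. The names are ours.

```java
public final class JsaMatVec {
    // c = A*b for A stored as JSA (value/index jagged arrays).
    static double[] multiply(double[][] value, int[][] index, double[] b) {
        double[] c = new double[value.length];        // only c is created
        for (int i = 0; i < value.length; i++) {
            double sum = 0.0;
            for (int k = 0; k < value[i].length; k++)
                sum += value[i][k] * b[index[i][k]];  // nonzeros only
            c[i] = sum;
        }
        return c;
    }

    public static void main(String[] args) {
        double[][] value = { {1.0, 2.0}, {3.0}, {4.0, 5.0} };
        int[][]    index = { {0, 2},     {1},   {0, 2}     };
        double[] c = multiply(value, index, new double[]{1, 1, 1});
        System.out.println(java.util.Arrays.toString(c)); // [3.0, 3.0, 9.0]
    }
}
```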

  12. Sparse Matrix Multiplication • Dynamic operations. • Creates each row of the resulting matrix C = AB on the fly. • Numerical results indicate no significant loss in efficiency using jagged arrays in dynamic operations. • Symbolic phase and numerical phase. • Jagged arrays: one merged phase, which leads to locality. • CRS: two separate phases. • A sketch of the row-wise update follows the next slide.

  13. The Update Algorithm
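A hedged sketch of a one-phase, row-wise update for C = AB on JSA storage: each row of C is accumulated in a dense work array and then compressed, so the symbolic and numerical work happens in a single pass per row. This is our illustration of the scheme described on the previous two slides; all names are ours, and n is the number of columns of B.

```java
public final class JsaMultiply {
    static void multiply(double[][] aVal, int[][] aIdx,
                         double[][] bVal, int[][] bIdx,
                         double[][] cVal, int[][] cIdx, int n) {
        double[] work = new double[n];      // dense accumulator for one row of C
        boolean[] seen = new boolean[n];
        int[] cols = new int[n];
        for (int i = 0; i < aVal.length; i++) {
            int nnz = 0;
            for (int k = 0; k < aVal[i].length; k++) {
                int row = aIdx[i][k];       // a(i,row) scales row 'row' of B
                for (int j = 0; j < bVal[row].length; j++) {
                    int col = bIdx[row][j];
                    if (!seen[col]) { seen[col] = true; cols[nnz++] = col; }
                    work[col] += aVal[i][k] * bVal[row][j];
                }
            }
            cVal[i] = new double[nnz];      // each row of C is created on the fly
            cIdx[i] = new int[nnz];
            for (int p = 0; p < nnz; p++) {
                cIdx[i][p] = cols[p];
                cVal[i][p] = work[cols[p]];
                work[cols[p]] = 0.0;        // reset the accumulator for reuse
                seen[cols[p]] = false;
            }
        }
    }

    public static void main(String[] args) {
        // A = [[0,2],[3,0]], B = [[1,4],[0,5]]  =>  C = [[0,10],[3,12]]
        double[][] aVal = { {2.0}, {3.0} };      int[][] aIdx = { {1}, {0} };
        double[][] bVal = { {1.0, 4.0}, {5.0} }; int[][] bIdx = { {0, 1}, {1} };
        double[][] cVal = new double[2][];       int[][] cIdx = new int[2][];
        multiply(aVal, aIdx, bVal, bIdx, cVal, cIdx, 2);
        System.out.println(java.util.Arrays.toString(cVal[0])); // [10.0]
    }
}
```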

  14. Jagged Variable Band Storage • In this data structure, all matrix elements from the first nonzero in each row to the last nonzero in that row are explicitly stored (see the sketch below). • No loss of efficiency for matrix-vector and vector-matrix products when comparing JVBS with traditional data structures. • No loss of efficiency for sparse matrix multiplication when comparing JVBS with traditional data structures. • JSA traverses only nonzero elements, yet JVBS can compete with JSA in performance. • Above a certain density JVBS is more efficient than JSA, since JVBS needs no indirect index addressing.
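A hedged sketch of JVBS: for each row we store the column of its first nonzero plus every element from the first to the last nonzero, zeros included, so element access needs no indirect index array. The representation details and names are illustrative.

```java
public final class JvbsDemo {
    public static void main(String[] args) {
        // Matrix: [ 1 2 0 ]
        //         [ 0 3 0 ]
        //         [ 0 4 5 ]
        int[] first = {0, 1, 1};                    // first nonzero column per row
        double[][] band = { {1, 2}, {3}, {4, 5} };  // contiguous row bands

        // y = A*x touches only the stored bands; column j = first[i] + k,
        // so no index array lookup is needed.
        double[] x = {1, 1, 1}, y = new double[3];
        for (int i = 0; i < band.length; i++)
            for (int k = 0; k < band[i].length; k++)
                y[i] += band[i][k] * x[first[i] + k];
        System.out.println(java.util.Arrays.toString(y)); // [3.0, 3.0, 9.0]
    }
}
```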

  15. VBS: Storing tridiagonal matrices • For tridiagonal matrices (ui = li), the rows i = 1, 2, ..., m-2 have the same upper and lower bandwidth, so only one bandwidth array is stored; this is accomplished by using array aliasing to store references to this shared array from the respective row positions (see the sketch below). • Algorithms that work statically on such a structure need no modification compared with the original JVBS. • For more dynamic operations the algorithms need to be modified.
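A hedged sketch of the aliasing idea: all interior rows of a tridiagonal matrix have lower and upper bandwidth 1, so a single bandwidth array can be shared among them instead of allocating one per row. The representation is our illustration, not the authors' exact layout.

```java
public final class TridiagAliasDemo {
    public static void main(String[] args) {
        int m = 6;
        int[][] bandwidth = new int[m][];
        bandwidth[0]     = new int[]{0, 1};   // first row: no subdiagonal
        bandwidth[m - 1] = new int[]{1, 0};   // last row: no superdiagonal
        int[] interior   = {1, 1};            // shared {lower, upper} bandwidths
        for (int i = 1; i < m - 1; i++)
            bandwidth[i] = interior;          // aliasing: one array, many rows
        System.out.println(bandwidth[2] == bandwidth[3]); // true: same array
    }
}
```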

  16. Dense Matrices: Row versus Column Traversing
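A minimal sketch of the effect this slide illustrates, with an ad-hoc timing harness of our own: row-wise loops visit each inner array sequentially, while column-wise loops jump between inner arrays on every step, which is typically slower for large n.

```java
public final class TraversalDemo {
    public static void main(String[] args) {
        int n = 2000;
        double[][] a = new double[n][n];
        double s = 0;
        long t0 = System.nanoTime();
        for (int i = 0; i < n; i++)          // row-wise: one inner array at a time
            for (int j = 0; j < n; j++) s += a[i][j];
        long t1 = System.nanoTime();
        for (int j = 0; j < n; j++)          // column-wise: crosses rows each step
            for (int i = 0; i < n; i++) s += a[i][j];
        long t2 = System.nanoTime();
        System.out.println("row: " + (t1 - t0) + " ns, col: " + (t2 - t1)
                + " ns (s=" + s + ")");
    }
}
```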

  17. C# Timings • Jagged arrays (Java-like), for example double[m][n]: not necessarily stored contiguously in memory. • Multidimensional arrays (double[m,n]): a contiguously stored block in memory. • Row-oriented. • On a multidimensional array (m = n), row traversal is on average 3.54 times faster than column traversal. • On a jagged array (m = n), the difference between row and column traversal is on average a factor of 5.71.

  18. C# Timings • Row traversal: numerical results show that jagged arrays are more efficient than multidimensional arrays, by an average factor of 1.65. • Column traversal: numerical results show that jagged arrays are slightly more efficient than multidimensional arrays. • From a high-performance computing view, jagged arrays rather than multidimensional arrays seem appropriate.

  19. C# Timings

  20. The Impact of Java and C# • The future will be Java and C# for commercial and educational use. • Commercial applications written in Java and C# will include scientific applications. • Java: Portability is especially important for high-performance applications, where the hardware architecture has a shorter lifespan than the application software. • C#/.NET: A wide range of features is promised, but unfortunately it is still a one-platform show. • C# versus Java: Is C# a better alternative than Java? • Java Grande Forum: C# might be the answer for some needs, such as parallel programming.

  21. Future Topics • Solving large sparse linear systems of equations: • Sparse Gaussian elimination with partial pivoting. • Multidimensional matrices for tensor methods: • Tensor methods give 3D structures where sparsity is an important issue. • Parallel Java: Threads • Threads allow multiple activities to proceed concurrently in the same program (see the sketch below). • True parallelism can only be achieved on a multiprocessor platform. • Suggested extensions to the Java language are OpenMP and MPI.
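A minimal sketch of the threads bullet above, applied to a matrix operation: each thread scales a disjoint block of rows concurrently. This is our illustration using standard java.lang.Thread, not code from the slides.

```java
public final class ParallelScale {
    public static void main(String[] args) throws InterruptedException {
        final double[][] a = new double[1000][1000];
        int nThreads = 4;
        Thread[] workers = new Thread[nThreads];
        for (int t = 0; t < nThreads; t++) {
            final int lo = t * a.length / nThreads;        // block boundaries
            final int hi = (t + 1) * a.length / nThreads;
            workers[t] = new Thread(() -> {
                for (int i = lo; i < hi; i++)              // disjoint row block
                    for (int j = 0; j < a[i].length; j++)
                        a[i][j] *= 2.0;
            });
            workers[t].start();
        }
        for (Thread w : workers) w.join();                 // wait for all blocks
        System.out.println("done");
    }
}
```

Because the rows of a jagged array are independent objects, the threads never write to the same inner array, so no synchronization is needed here.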

  22. Concluding Remarks • We have shown that for basic data structures there is a lot to gain from utilizing Java's flexibility (independent row updating). • Challenges remain, such as row versus column traversal of 2D square matrices. • This is just the beginning: • Java threads lead to parallel computing, but only on Java's premises. • Other data structures must be investigated. • Graphs. • Applications: • Optimization and numerical solution of PDEs. • People will use Java (and C#) for numerical computations; it may therefore be useful to invest time and resources in finding out how to use Java for numerical computation.
