
CS 420



Presentation Transcript


  1. CS 420 Overview of MPI MPI Concepts Basic MPI Functions

  2. MPI Overview • MPI = Message Passing Interface • A mechanism/protocol/standard for developing and implementing parallel programs

  3. MPI • MPI evolved from other message passing systems • Other message passing systems include: • PICL, Chameleon, Zipcode, TCGMSG • Notably PVM (Parallel Virtual Machine) • MPI was intended as the successor to PVM

  4. MPI • MPI is not a language • MPI is a library • Can be called from and used by C, C++ and Fortran

  5. MPI • Some advantages of MPI • Standard • MPI implementations tend to stick to the standard • Portable • Versatile • Performance • Small core set of functions

  6. MPI • MPI has a number of flavors (implementations) • MPICH • LAM/MPI • Chimp • MPICH-G (Globus) • MPICH-MX (Myricom Express) • MVICH

  7. MPI • A number of process managers • mpirun • mpiexec • mpd • lam • mpe

  8. MPI Concepts • Messages (packets of data) may be scalars, arrays, strings, or structures • No guarantee on the order in which messages from different sources arrive: arrival is nondeterministic • Processes vs. Processors • Process IDs • Ranks • Blocking and Non-blocking

  9. Five-Processor DMC (diagram): a distributed-memory computer with five ranks, R1 through R5, each holding its own private local variables (e.g., array(1), array(2), … on one rank; x, y, z, temp, … on another)

  10. MPI Concepts • Source and Destination • Can use wildcards (MPI_ANY_SOURCE, MPI_ANY_TAG) • Tags • Arbitrary integer message IDs … but • Be careful: wildcards and reused tags make it nondeterministic which message a receive matches

  11. MPI Concepts • Groups • Communicators • MPI_COMM_WORLD, the global communicator • MPI Data Types • MPI_INT

  12. Basic MPI functions • … but first include… • #include "mpi.h"

  13. Basic MPI Functions • int MPI_Init(int* argc, char*** argv) • Initializes MPI in the application • MPI_Init(&argc, &argv); • int MPI_Finalize(void) • Shuts down MPI for the application • MPI_Finalize();

  14. Basic MPI Functions • int MPI_Comm_size(MPI_Comm comm, int* size) • Returns the size of the communicator's group in size • MPI_Comm_size(MPI_COMM_WORLD, &nprocs); • int MPI_Comm_rank(MPI_Comm comm, int* rank) • Returns the rank (ID) of this process in rank • MPI_Comm_rank(MPI_COMM_WORLD, &myrank);

  15. Basic MPI Functions … So a program might look like this…

  #include <stdio.h>
  #include "mpi.h"

  int main(int argc, char* argv[]) {
      int size;
      int myrank;
      MPI_Init(&argc, &argv);
      MPI_Comm_size(MPI_COMM_WORLD, &size);
      MPI_Comm_rank(MPI_COMM_WORLD, &myrank);
      /* other stuff here */
      MPI_Finalize();
      return 0;
  }

  16. Other useful functions • double MPI_Wtime(void) • Returns the wall-clock time in seconds since some arbitrary point in the past • double starttime = MPI_Wtime(); • double MPI_Wtick(void) • Returns the resolution of the timer in seconds • double precision = MPI_Wtick(); • Note: clocks on different processes are independent (not synchronized by default)

  17. Other MPI information • LLNL Tutorial • http://www.llnl.gov/computing/tutorials/mpi/ • Argonne National Laboratory • http://www-unix.mcs.anl.gov/mpi/ • The MPI Forum • http://www.mpi-forum.org/ • MPI Book • http://www.netlib.org/utk/papers/mpi-book/mpi-book.html • Real books: Pacheco; Gropp et al.
