

Then run it with:

$ mpirun -np 2 -m mpihosts ./pi3f90

The C++ program hello++.cc is a parallel processing version of the traditional “Hello, World” program. Notice that this version makes use of the external C bindings of the MPI functions if the C++ bindings are not present.
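The listing below is not the hello++.cc that ships with InfiniPath MPI; it is a minimal sketch of the same idea that calls only the MPI C bindings from C++, which every MPI implementation provides, so it builds even when the optional C++ bindings are absent:

// hello.cc -- minimal MPI "Hello, World" sketch (not the shipped hello++.cc)
#include <mpi.h>
#include <iostream>

int main(int argc, char *argv[])
{
    MPI_Init(&argc, &argv);               /* start the MPI runtime */

    int rank = 0, size = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* rank of this process */
    MPI_Comm_size(MPI_COMM_WORLD, &size); /* total number of processes */

    std::cout << "Hello World! I am " << rank << " of " << size << std::endl;

    MPI_Finalize();                       /* shut down MPI */
    return 0;
}

Because all of the ranks run concurrently, their output lines can arrive at the console in any order, as the sample run below shows.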

Compile it:

$ mpicxx -o hello hello++.cc

and run it:

$ mpirun -np 10 -m mpihosts ./hello
Hello World! I am 9 of 10

Hello World! I am 2 of 10

Hello World! I am 4 of 10

Hello World! I am 1 of 10

Hello World! I am 7 of 10

Hello World! I am 6 of 10

Hello World! I am 3 of 10

Hello World! I am 0 of 10

Hello World! I am 5 of 10

Hello World! I am 8 of 10

Each of these scripts invokes the PathScale compiler for the respective language, as well as the linker. See section 3.5.3 for an example of how to use the gcc compiler. The use of mpirun is the same for programs in all languages.

3.4 Configuring MPI Programs for InfiniPath MPI

When configuring an MPI program (generating header files and/or Makefiles) for InfiniPath MPI, you will usually need to specify mpicc, mpif90, etc. as the compiler, rather than pathcc, pathf90, etc.

Typically this is done with commands similar to these (this assumes you are using sh or bash as your shell):

$ export CC=mpicc
$ export CXX=mpicxx
$ export F77=mpif77
$ export F90=mpif90
$ export F95=mpif95

The shell variables will vary with the program being configured, but these examples show frequently used variable names. Users of csh would instead use commands similar to:

$ setenv CC mpicc
