[mpich-discuss] Parallel Jobs

Gus Correa gus at ldeo.columbia.edu
Thu Dec 16 15:36:11 CST 2010


Talla wrote:
> Hello;
> I built a cluster using Rocks/CentOS and I need to run a
> parallel job on it, therefore I am looking for complete instructions on
> how to set up MPICH to get it running correctly, as I have been
> struggling with it for a while.
> 
> Your help is really appreciated, Cordially.
> 
Hi

I may have answered you in the Rocks list already.
Anyway, here it goes again.

0) READ the MPICH2 Installation Guide and User Guide first.
There are complete instructions there.
Did you read them?

1) Install one (ONLY ONE) of the resource manager rolls: SGE or Torque.
(I use and prefer Torque, which is simpler, but other people love SGE.)

2) Download the latest greatest MPICH2 tarball to 
/share/apps/mydownloads, and untar it (tar -zxvf mpich2-... ).
Configure and install MPICH2 on /share/apps/mpich2.
You can use gcc, g++, and gfortran compilers for this,
something like this:

./configure --prefix=/share/apps/mpich2 CC=gcc CXX=g++ \
    F77=gfortran F90=gfortran
make
make install

3) Set the environment variables in your .cshrc or .bashrc file
(as a regular user, NOT root), so that:

/share/apps/mpich2/bin is first in your PATH,
/share/apps/mpich2/share/man is first in your MANPATH,
/share/apps/mpich2/lib is first in your LD_LIBRARY_PATH.
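For bash, the three settings above would look something like the
following additions to ~/.bashrc (a sketch, assuming the
/share/apps/mpich2 install prefix from step 2; csh users would use
setenv in .cshrc instead):

```shell
# Put the MPICH2 install tree first in each search path.
export PATH=/share/apps/mpich2/bin:$PATH
export MANPATH=/share/apps/mpich2/share/man:$MANPATH
export LD_LIBRARY_PATH=/share/apps/mpich2/lib:$LD_LIBRARY_PATH
```

After sourcing the file (or logging in again), "which mpicc" should
report /share/apps/mpich2/bin/mpicc.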

4) Compile the cpi.c example in the "examples"
directory in the MPICH2 *distribution* directory
(i.e. /share/apps/mydownloads/mpich2-.../examples)
using mpicc

mpicc -o cpi cpi.c

5) Write a job submission script for it.
Here is one for Torque:

#!/bin/bash
#PBS -N cpi
#PBS -q default
# Example: ask for 2 nodes with 4 cores each
#PBS -l nodes=2:ppn=4

# Torque starts jobs in $HOME; go back to the submission directory
# so the relative path "cpi" resolves.
cd $PBS_O_WORKDIR
mpiexec.hydra -f $PBS_NODEFILE -np 8 cpi

Submit the script with qsub.

6) Alternatively, if you don't want to use Torque or SGE, create a
"mynodefile" in the machinefile format described in the MPICH2 User
Guide, and launch the job
directly on the command line:

mpiexec.hydra -f mynodefile -np 8 cpi
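For reference, a machinefile for the Hydra launcher lists one host per
line, optionally followed by a colon and the number of processes to
start there. The compute-node names below are hypothetical; use the
names of your own Rocks nodes:

```text
# mynodefile: hypothetical example, 2 nodes x 4 processes
compute-0-0:4
compute-0-1:4
```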

Also

mpiexec.hydra --help

will give you a lot of information about it.

Good luck
Gus Correa


