[mpich-discuss] Problems with mpiexec.hydra in mpich2-1.2

Bryan Putnam bfp at purdue.edu
Mon Oct 19 10:43:33 CDT 2009


On Mon, 19 Oct 2009, Pavan Balaji wrote:

> 
> > mpiexec.hydra -rmk ./hellof
> > 
> > does appear to work correctly if I do something like
> > 
> > qsub -I -l select=2:ncpus=1:mpiprocs=1,place=scatter
> > mpiexec.hydra -rmk ./hellof
> > 
> > In that case, since there is only 1 processor per node, each node name 
> > only occurs once in the PBS_NODEFILE.
> 
> Can you send me the PBS_NODEFILE for the case where it doesn't work?
> Also, it'll be great if you can create a ticket here for it:
> https://trac.mcs.anl.gov/projects/mpich2/newticket

I'll go ahead and create a ticket.

Here's an example of a PBS_NODEFILE for which it fails. In this case I 
requested 2 nodes with 8 processors each, so each node name occurs 8 
times in the PBS_NODEFILE:

coates-adm 1073% qsub -I -l select=2:ncpus=8:mpiprocs=8,place=scatter
qsub: waiting for job 80604.coates-adm.rcac.purdue.edu to start
qsub: job 80604.coates-adm.rcac.purdue.edu ready

coates-a002 1001% cat $PBS_NODEFILE
coates-a002
coates-a002
coates-a002
coates-a002
coates-a002
coates-a002
coates-a002
coates-a002
coates-a003
coates-a003
coates-a003
coates-a003
coates-a003
coates-a003
coates-a003
coates-a003

Here's an example that does work correctly:

coates-adm 1075% qsub -I -l select=16:ncpus=1:mpiprocs=1,place=scatter
qsub: waiting for job 80631.coates-adm.rcac.purdue.edu to start
qsub: job 80631.coates-adm.rcac.purdue.edu ready

coates-a002 1001% cat $PBS_NODEFILE
coates-a002
coates-a003
coates-a055
coates-a055
coates-a057
coates-a058
coates-a059
coates-a060
coates-a061
coates-a062
coates-a063
coates-a064
coates-a065
coates-a066
coates-a067
coates-a068

Here I've only requested one processor per node, so each node name 
appears only once in the PBS_NODEFILE.
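As a stopgap while this is sorted out, one workaround I can imagine (not 
something suggested in this thread, just a sketch) would be to collapse 
the repeated node names out of the PBS_NODEFILE and hand the deduplicated 
list to mpiexec.hydra via -f. The filenames here are made up for 
illustration; the awk one-liner keeps the first occurrence of each name 
in order:

```shell
#!/bin/sh
# Hypothetical workaround sketch: deduplicate a PBS_NODEFILE-style list.
# Sample input standing in for $PBS_NODEFILE (names from the failing case):
printf 'coates-a002\ncoates-a002\ncoates-a003\ncoates-a003\n' > nodefile

# awk '!seen[$0]++' prints each line only the first time it is seen,
# preserving the original node order (unlike sort -u).
awk '!seen[$0]++' nodefile > uniq_nodes

cat uniq_nodes
# coates-a002
# coates-a003

# Then, hypothetically:
#   mpiexec.hydra -f uniq_nodes ./hellof
```

Whether losing the per-node process counts matters would depend on how 
many ranks per node you actually want, so this is at best a diagnostic 
aid, not a fix.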

Bryan

> 
> Thanks,
> 
>  -- Pavan
> 
> -- 
> Pavan Balaji
> http://www.mcs.anl.gov/~balaji
> _______________________________________________
> mpich-discuss mailing list
> mpich-discuss at mcs.anl.gov
> https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss
> 