[mpich-discuss] Problems with mpiexec.hydra in mpich2-1.2

Bryan Putnam bfp at purdue.edu
Mon Oct 19 12:49:55 CDT 2009


On Mon, 19 Oct 2009, Bryan Putnam wrote:

> On Mon, 19 Oct 2009, Pavan Balaji wrote:
> 
> > Bryan,
> > 
> > > I'll go ahead and create a ticket.
> > 
> > Thanks.
> > 
> > > Here's an example of a PBS_NODEFILE for which it fails.
> > > Here, for example, I requested 2 nodes and 8 processors per node, so
> > > each node name occurs 8 times in the PBS_NODEFILE.
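> > > 
> > > For illustration (hypothetical node names; the real ones differ), the
> > > file is just 8 lines of one hostname followed by 8 lines of the other:
> > > 
> > >   node-a001
> > >   node-a001
> > >   (...6 more node-a001 lines...)
> > >   node-a002
> > >   node-a002
> > >   (...6 more node-a002 lines...)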
> > 
> > This seems very much like the ticket I had mentioned. Can you try out
> > the latest nightly snapshot of Hydra
> > (http://www.mcs.anl.gov/research/projects/mpich2/downloads/tarballs/nightly/hydra)
> > to see if this has been fixed?
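> > 
> > (Building the snapshot should be the usual configure/make routine;
> > roughly, with the actual tarball name and install prefix filled in:
> > 
> >   tar xzf hydra-<rev>.tar.gz
> >   cd hydra-<rev>
> >   ./configure --prefix=<install-dir>
> >   make && make install)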
> 
> Pavan,
> 
> I built a new mpiexec and pmi_proxy using the latest tarball. Now I'm 
> seeing
> 
> coates-a001 1058% /home/ba01/u100/bfp/hydra-r5484/64/nemesis-intel-11.1.038/bin/mpiexec -rmk pbs ./hellof
> Floating point exception

I should add that hydra-r5484 actually did fix my original problem with

mpiexec.hydra -f $PBS_NODEFILE -np 4 ./hellof

not working (that case is fine now); however,

mpiexec.hydra -rmk pbs ./hellof

is now giving the Floating point exception.
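
For completeness, hellof is just a trivial MPI "hello world" test; in C
terms it amounts to:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        printf("Hello from rank %d of %d\n", rank, size);
        MPI_Finalize();
        return 0;
    }

so the exception presumably comes from mpiexec itself rather than from
the application.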

Thanks,
Bryan

> 
> for both cases.
> 
> > 
> >  -- Pavan
> > 
> > -- 
> > Pavan Balaji
> > http://www.mcs.anl.gov/~balaji
> > 
> 
> 
> 



