[mpich-discuss] Not able to run MPI program in parallel...

Albert Spade albert.spade at gmail.com
Tue May 1 14:01:09 CDT 2012


Hi Pavan,

Thanks a lot for your help.
Can you please tell me how I can recompile with this new mpiexec? In the ui
folder I found mpiexec.c and mpiexec.h.
Do I first have to compile mpiexec.c to create mpiexec, and then use that
mpiexec to compile my cpi.c program?
Sorry for the simple question; I am new to clusters and MPI.
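
In case it helps me ask the right question: my guess is that I should not
compile mpiexec.c by hand at all, but build the whole hydra tarball with the
usual configure/make sequence and then use the mpiexec it installs. Something
like this (the install prefix below is just an example; please correct me if
this is wrong):

    tar xzf hydra-1.5b1.tar.gz
    cd hydra-1.5b1
    ./configure --prefix=/opt/hydra-1.5b1   # example prefix, any writable path
    make
    make install
    # the new launcher should then be under /opt/hydra-1.5b1/bin/
    # (mpiexec / mpiexec.hydra)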

Thanks again.


On Wed, May 2, 2012 at 12:21 AM, Pavan Balaji <balaji at mcs.anl.gov> wrote:

>
> On 05/01/2012 01:39 PM, Albert Spade wrote:
>
>> [mpiexec at beowulf.master] Launch arguments:
>> /opt/mpich2-1.4.1p1/bin/hydra_pmi_proxy --control-port
>> beowulf.master:60190 --debug --rmk user --launcher ssh --demux poll
>> --pgid 0 --retries 10 --proxy-id 0
>> [mpiexec at beowulf.master] Launch arguments:
>> /opt/mpich2-1.4.1p1/bin/hydra_pmi_proxy --control-port
>> beowulf.master:60190 --debug --rmk user --launcher ssh --demux poll
>> --pgid 0 --retries 10 --proxy-id 1
>> [mpiexec at beowulf.master] Launch arguments:
>> /opt/mpich2-1.4.1p1/bin/hydra_pmi_proxy --control-port
>> beowulf.master:60190 --debug --rmk user --launcher ssh --demux poll
>> --pgid 0 --retries 10 --proxy-id 2
>> [mpiexec at beowulf.master] Launch arguments:
>> /opt/mpich2-1.4.1p1/bin/hydra_pmi_proxy --control-port
>> beowulf.master:60190 --debug --rmk user --launcher ssh --demux poll
>> --pgid 0 --retries 10 --proxy-id 3
>>
>
> Thanks.  It looks like mpiexec thinks that all four nodes are local nodes,
> which is a bug that was fixed after 1.4.1p1 was released.  Can you try the
> mpiexec from 1.5b1:
>
> http://www.mcs.anl.gov/research/projects/mpich2/downloads/tarballs/1.5b1/hydra-1.5b1.tar.gz
>
> You don't need to recompile the application.  Just run it with this new
> mpiexec.
>
>  -- Pavan
>
>
> --
> Pavan Balaji
> http://www.mcs.anl.gov/~balaji
>
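
If I understand your reply correctly, I do not recompile cpi at all; I just
launch the existing binary with the newly built mpiexec, for example (again
with /opt/hydra-1.5b1 as an example install prefix, and machinefile being a
plain text file listing my node hostnames, one per line):

    /opt/hydra-1.5b1/bin/mpiexec -f machinefile -n 4 ./cpi

Is that the right way to do it?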