[mpich-discuss] Not able to run MPI program in parallel...
Pavan Balaji
balaji at mcs.anl.gov
Tue May 1 18:15:32 CDT 2012
Please add the path to your .bashrc. You'll find the instructions for
this in the MPICH2 README.
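For reference, a minimal sketch of what that looks like, assuming the
/opt/mpich2-1.4.1p1/bin/bin directory from your transcript (adjust the path
if your install prefix differs):

  # Append to ~/.bashrc so new shells, including the non-interactive bash
  # shells that hydra starts over ssh, can find mpiexec and hydra_pmi_proxy.
  echo 'export PATH=/opt/mpich2-1.4.1p1/bin/bin:$PATH' >> ~/.bashrc
  source ~/.bashrc     # pick up the change in the current shell
  which mpiexec        # should print /opt/mpich2-1.4.1p1/bin/bin/mpiexec

There is also a note on the hydra_pmi_proxy errors after your quoted message
below.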
-- Pavan
On 05/01/2012 02:37 PM, Albert Spade wrote:
> Hi Pavan,
>
> As you suggested, I installed the new mpiexec, but now I am getting the
> error below. Do I have to make the same mpiexec change on all of the
> remaining four nodes as well?
> Here is part of my terminal session:
> -----------------------------------
> [root@beowulf bin]# cd bin/
> [root@beowulf bin]# ls
> hydra_nameserver hydra_pmi_proxy mpiexec.hydra
> hydra_persist mpiexec mpirun
> [root@beowulf bin]# pwd
> /opt/mpich2-1.4.1p1/bin/bin
> [root@beowulf bin]# mpiexec -n 4 /opt/mpich2-1.4.1p1/examples/./cpi
> -bash: mpiexec: command not found
> [root@beowulf bin]# ls
> hydra_nameserver hydra_pmi_proxy mpiexec.hydra
> hydra_persist mpiexec mpirun
> [root@beowulf bin]# mpiexec -n 4 /opt/mpich2-1.4.1p1/examples/./cpi
> -bash: mpiexec: command not found
> [root@beowulf bin]# PATH=/opt/mpich2-1.4.1p1/bin/bin/:$PATH
> [root@beowulf bin]# export PATH
> [root@beowulf bin]# which mpiexec
> /opt/mpich2-1.4.1p1/bin/bin/mpiexec
> [root@beowulf bin]# mpiexec -n 4 /opt/mpich2-1.4.1p1/examples/./cpi
> Process 0 of 4 is on beowulf.master
> Process 1 of 4 is on beowulf.master
> Process 2 of 4 is on beowulf.master
> Process 3 of 4 is on beowulf.master
> pi is approximately 3.1415926544231239, Error is 0.0000000008333307
> wall clock time = 0.000260
> [root@beowulf bin]# mpiexec -f /root/hosts -n 4 /opt/mpich2-1.4.1p1/examples/./cpi
> bash: /opt/mpich2-1.4.1p1/bin/bin//hydra_pmi_proxy: No such file or directory
> bash: /opt/mpich2-1.4.1p1/bin/bin//hydra_pmi_proxy: No such file or directory
> bash: /opt/mpich2-1.4.1p1/bin/bin//hydra_pmi_proxy: No such file or directory
> ^C[mpiexec@beowulf.master] Sending Ctrl-C to processes as requested
> [mpiexec@beowulf.master] Press Ctrl-C again to force abort
> [mpiexec@beowulf.master] HYD_pmcd_pmiserv_send_signal (./pm/pmiserv/pmiserv_cb.c:166): assert (!closed) failed
> [mpiexec@beowulf.master] ui_cmd_cb (./pm/pmiserv/pmiserv_pmci.c:79): unable to send signal downstream
> [mpiexec@beowulf.master] HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:77): callback returned error status
> [mpiexec@beowulf.master] HYD_pmci_wait_for_completion (./pm/pmiserv/pmiserv_pmci.c:205): error waiting for event
> [mpiexec@beowulf.master] main (./ui/mpich/mpiexec.c:437): process manager error waiting for completion
> [root@beowulf bin]#
>
>
> On Wed, May 2, 2012 at 12:32 AM, Pavan Balaji <balaji at mcs.anl.gov> wrote:
>
>
> On 05/01/2012 02:01 PM, Albert Spade wrote:
>
> Thanks a lot for your help.
> Can you please tell me how I can recompile with this new mpiexec? In the
> ui folder I found mpiexec.c and mpiexec.h.
> So should I first compile mpiexec.c to create mpiexec, and then use this
> mpiexec to compile my cpi.c program?
> Sorry for the simple query; I am new to clusters and MPI.
>
>
> ./configure --prefix=`pwd`/install && make && make install
>
> You'll find the new mpiexec in `pwd`/install/bin
>
>
> -- Pavan
>
> --
> Pavan Balaji
> http://www.mcs.anl.gov/~balaji
>
>
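The "hydra_pmi_proxy: No such file or directory" lines in the quoted run
above come from the other hosts in /root/hosts: hydra starts its proxy over
ssh using the same path it found on the head node, so each node needs the
same MPICH2 install (and PATH setting) in the same location. A quick check,
assuming passwordless ssh and a hypothetical host name node1 taken from your
hosts file:

  # node1 is a placeholder; substitute each entry from /root/hosts.
  ssh node1 ls /opt/mpich2-1.4.1p1/bin/bin/hydra_pmi_proxy
  ssh node1 which mpiexec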
--
Pavan Balaji
http://www.mcs.anl.gov/~balaji