<div dir="ltr">Hi,<br>I tried even mapping the drive as Jayesh mentioned, but the problem is still the same.<br>If I run the programme only in the master node, then it will run. Otherwise if I use other nodes including master node to run the programme, the programme give the output but it won't exit (mpi finalize does not work or called)<br>
Please help me to overcome this issue.

Regards,
Waruna Ranasinghe

2008/7/25 Jayesh Krishna <jayesh@mcs.anl.gov>:
<blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
<div>
<div dir="ltr" align="left"><font color="#0000ff" size="2" face="Arial"><span>Hi,</span></font></div>
<div dir="ltr" align="left"><font color="#0000ff" size="2" face="Arial"><span> You should be able to use all the nodes (with
MPICH2 installed) for running your job (i.e., You should be able to use the main
node to run your MPI processes).</span></font></div>
<div dir="ltr" align="left"><font color="#0000ff" size="2" face="Arial"><span> If you are using a shared drive to run your
program you should map the drive on all the nodes using the "-map" option of
mpiexec (see the windows developer's guide, available at <a href="http://www.mcs.anl.gov/research/projects/mpich2/documentation/index.php?s=docs" target="_blank">http://www.mcs.anl.gov/research/projects/mpich2/documentation/index.php?s=docs</a>,
for details)</span></font></div>
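For example, the launch command might look something like the sketch below (here Z: is just an arbitrary unused drive letter, and the share path and program name are taken from your earlier command; please check the developer's guide for the exact -map syntax on your installation):

    mpiexec -channel ssm -n 3 -exitcodes -machinefile "c:\Program Files\MPICH2\bin\hosts.txt" -map Z:\\10.8.102.27\ClusterShared -wdir Z:\ Z:\GBMTest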
<div dir="ltr" align="left"><font color="#0000ff" size="2" face="Arial"><span></span></font> </div>
<div dir="ltr" align="left"><font color="#0000ff" size="2" face="Arial"><span>Regards,</span></font></div>
<div dir="ltr" align="left"><font color="#0000ff" size="2" face="Arial"><span>Jayesh</span></font></div><br>
<div dir="ltr" align="left" lang="en-us">
<hr>
<font size="2" face="Tahoma"><b>From:</b> <a href="mailto:owner-mpich-discuss@mcs.anl.gov" target="_blank">owner-mpich-discuss@mcs.anl.gov</a>
[mailto:<a href="mailto:owner-mpich-discuss@mcs.anl.gov" target="_blank">owner-mpich-discuss@mcs.anl.gov</a>] <b>On Behalf Of </b>Waruna
Ranasinghe<br><b>Sent:</b> Friday, July 25, 2008 3:06 AM<br><b>To:</b>
<a href="mailto:mpich-discuss@mcs.anl.gov" target="_blank">mpich-discuss@mcs.anl.gov</a><br><b>Subject:</b> [mpich-discuss] Cannot use the main
node to run a process of the programme<br></font><br></div><div><div></div><div class="Wj3C7c">
<div></div>
<div dir="ltr">Hi all,<br><br>I'm using MPICH2 in Windows.<br>I can run my
programme without errors if I don't use the machine in which I execute the
command (Main node).<br><br>mpiexec -channel ssm -n 3 -exitcodes -machinefile
"c:\Program Files\MPICH2\bin\hosts.txt" -wdir //<a href="http://10.8.102.27/ClusterShared" target="_blank">10.8.102.27/ClusterShared</a>
GBMTest<br><br>If I use the main node also to execute one of the 3 processes,
then it gives the error below. But it prints the output I wanted too. then it
gives the error.<br>I wanted to know whether this is an issue with my
programme(GBMTest) or I cant use the main node to run the process.<br>In the
machinefile I have included three machines. <br><a href="http://10.8.102.28" target="_blank">10.8.102.28</a><br><a href="http://10.8.102.30" target="_blank">10.8.102.30</a><br><a href="http://10.8.102.27" target="_blank">10.8.102.27</a> (main node)<br>
<br>This works fine if
I remove the main node and add another node instead.<br><br>this is the
////////////////////////////////////////////////////////////////////////////////////
Fatal error in MPI_Finalize: Other MPI error, error stack:
MPI_Finalize(255)............: MPI_Finalize failed
MPI_Finalize(154)............:
MPID_Finalize(94)............:
MPI_Barrier(406).............: MPI_Barrier(comm=0x44000002) failed
MPIR_Barrier(77).............:
MPIC_Sendrecv(120)...........:
MPID_Isend(103)..............: failure occurred while attempting to send an eager message
MPIDI_CH3_iSend(168).........:
MPIDI_CH3I_Sock_connect(1191): [ch3:sock] rank 1 unable to connect to rank 2 using business card <port=1179 description=cse-365237834578 ifname=10.8.102.27 shm_host=cse-365237834578 shm_queue=376D692D-A683-4917-BF58-13BD35D071E8 shm_pid=2840 >
MPIDU_Sock_post_connect(1228): unable to connect to cse-365237834578 on port 1179, exhausted all endpoints (errno -1)
MPIDU_Sock_post_connect(1244): gethostbyname failed, The requested name is valid and was found in the database, but it does not have the correct associated data being resolved for. (errno 11004)
job aborted:
rank: node: exit code[: error message]
0: 10.8.102.28: 1
1: 10.8.102.30: 1: Fatal error in MPI_Finalize: Other MPI error, error stack:
MPI_Finalize(255)............: MPI_Finalize failed
MPI_Finalize(154)............:
MPID_Finalize(94)............:
MPI_Barrier(406).............: MPI_Barrier(comm=0x44000002) failed
MPIR_Barrier(77).............:
MPIC_Sendrecv(120)...........:
MPID_Isend(103)..............: failure occurred while attempting to send an eager message
MPIDI_CH3_iSend(168).........:
MPIDI_CH3I_Sock_connect(1191): [ch3:sock] rank 1 unable to connect to rank 2 using business card <port=1179 description=cse-365237834578 ifname=10.8.102.27 shm_host=cse-365237834578 shm_queue=376D692D-A683-4917-BF58-13BD35D071E8 shm_pid=2840 >
MPIDU_Sock_post_connect(1228): unable to connect to cse-365237834578 on port 1179, exhausted all endpoints (errno -1)
MPIDU_Sock_post_connect(1244): gethostbyname failed, The requested name is valid and was found in the database, but it does not have the correct associated data being resolved for. (errno 11004)
2: 10.8.102.27: 1
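To narrow down whether the failure is in GBMTest or in the cluster setup (the error stack above points at name resolution of the main node, cse-365237834578), one thing that could be tried is a minimal MPI program that only initializes, hits a barrier, and finalizes. The sketch below is purely illustrative and is not part of GBMTest; if it shows the same MPI_Finalize failure when the main node is included in the machinefile, the problem is in the configuration rather than in the programme.

#include <stdio.h>
#include <mpi.h>

/* Minimal check, launched with the same mpiexec command as GBMTest.
   MPI_Barrier is the same collective that fails in the error stack above. */
int main(int argc, char *argv[])
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("Hello from rank %d of %d\n", rank, size);
    MPI_Barrier(MPI_COMM_WORLD);
    MPI_Finalize();
    return 0;
}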