<html>
<head>
<style>
.hmmessage P
{
margin:0px;
padding:0px
}
body.hmmessage
{
FONT-SIZE: 10pt;
FONT-FAMILY:Tahoma
}
</style>
</head>
<body class='hmmessage'><div style="text-align: left;">Jayesh,<br><br>Thanks a lot for your info.<br><br>After a lot of tries, I was finally able to run cpi.exe across multiple hosts. It seems to have something to do with the settings in my machine file. The details follow.<br><br>I use the following command to run cpi.exe:<br><br>mpiexec -n 2 -machinefile config.txt -channel ssm (or others) cpi.exe<br><br>a) If I have a config.txt file like the following:<br>&nbsp;&nbsp;&nbsp;host1name:1 -ifhn host1_ipaddress<br>&nbsp;&nbsp;&nbsp;host2name:2 -ifhn host2_ipaddress<br>&nbsp;&nbsp;&nbsp;everything works fine (for all channels).<br>b) If I have a config.txt like the following:<br>&nbsp;&nbsp;&nbsp;host1name:1<br>&nbsp;&nbsp;&nbsp;host2name:2<br>&nbsp;&nbsp;&nbsp;then, for the sock channel, it hangs at MPI_Bcast. For auto and ssm, I got the following error message:<br>&nbsp;<br><div><font face="Arial" size="2">C:\public\bin&gt;mpiexec -n 2 -machinefile Config.txt -channel ssm (or auto) C:\public\bin\cpi.exe<br>Enter the number of intervals: (0 quits) 100</font></div>
<div>&nbsp;</div>
<div><font face="Arial" size="2">job aborted:<br>rank: node: exit code[: error message]<br>0: B0016350B383E: 1: Fatal error in MPI_Bcast: Other MPI error, error stack:<br>MPI_Bcast(784).................: MPI_Bcast(buf=0012FE88, count=1, MPI_INT, root=0, MPI_COMM_WORLD) failed<br>MPIR_Bcast(230)................:<br>MPIC_Send(36)..................:<br>MPIDI_EagerContigSend(146).....: failure occurred while attempting to send an eager message<br>MPIDI_CH3_iStartMsgv(224)......:<br>MPIDI_CH3I_VC_post_connect(555): [ch3:sock] rank 0 unable to connect to rank 1 using business card &lt;port=3872 description=B001279FD7C60.corp.bankofamerica.com ifname=171.188.32.154 shm_host=B001279FD7C60.corp.bankofamerica.com shm_queue=39E4F281-FCC0-4f4a-B540-EDC8D517F065 shm_pid=2484 &gt;<br>MPIDU_Sock_post_connect(1228)..: unable to connect to B001279FD7C60.corp.bankofamerica.com on port 3872, exhausted all endpoints (errno -1)<br>MPIDU_Sock_post_connect(1244)..: gethostbyname failed, The requested name is valid and was found in the database, but it does not have the correct associated data being resolved for. (errno 11004)<br>1: B001279FD7C60: 1</font></div><br>I know this has something to do with my network settings, but I just can't figure out why.<br><br>Any ideas?<br><br>Thanks<br><br>Richard<br><br></div><br><br><br><blockquote><hr id="EC_stopSpelling">From: jayesh@mcs.anl.gov<br>To: xs_li@hotmail.com<br>CC: mpich-discuss@mcs.anl.gov<br>Subject: RE: [MPICH] MPI_Bcast hangs in Windows XP<br>Date: Thu, 6 Sep 2007 09:25:36 -0500<br><br>
<style>
.ExternalClass .EC_hmmessage P
{padding-right:0px;padding-left:0px;padding-bottom:0px;padding-top:0px;}
.ExternalClass EC_BODY.hmmessage
{font-size:10pt;font-family:Tahoma;}
</style>
<div dir="ltr" align="left"><font color="#0000ff" face="Arial"><span class="EC_837321314-06092007">Hi,</span></font></div>
<div dir="ltr" align="left"><font color="#0000ff" face="Arial"><span class="EC_837321314-06092007"> The process manager (smpd) is responsible for
launching the MPI processes on the various machines and providing an MPI
processes information on how to communicate with other MPI
processes.</span></font></div>
<div dir="ltr" align="left"><font color="#0000ff" face="Arial"><span class="EC_837321314-06092007"> The SMPD process manager listens (default
case) on port 8676 and then asks the client PM to connect to a new port. So
you should allow SMPD process manager (smpd.exe --- installed as a service in
windows) to communicate at all ports (This is the easiest way. However you can
also restrict the port range used by SMPD. Refer to the windows devloper's guide
available at <a href="http://www-unix.mcs.anl.gov/mpi/mpich/" target="_blank">http://www-unix.mcs.anl.gov/mpi/mpich/</a> for
details.)</span></font></div>
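If opening all ports is not an option, the port range can also be pinned down; a minimal sketch, assuming MPICH2 1.0.x on Windows (the variable and option names here should be verified against the developer's guide for your version, so treat them as illustrative):

```bat
REM Sketch only -- verify names against the MPICH2 Windows developer's guide.
REM Restrict the TCP ports the MPI processes may use for their connections:
set MPICH_PORT_RANGE=50000:51000

REM smpd can also be told explicitly which port to listen on (8676 is the default):
smpd -port 8676
```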
<div><font color="#0000ff" face="Arial"><span class="EC_837321314-06092007"> Make
sure that no firewall (1. Running on the individual machines 2. OR
on the network, filtering the traffic btw the machines) is preventing the
process managers & the MPI procs on the individual machines from
contacting each other.</span></font></div>
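On the Windows XP built-in firewall, the per-machine part of this can be scripted; a sketch from an administrator command prompt, assuming the default MPICH2 install path and the cpi.exe location from this thread (adjust the paths to your installation):

```bat
REM Let the process manager, the launcher, and the MPI application itself
REM accept incoming connections through the XP firewall:
netsh firewall add allowedprogram program="C:\Program Files\MPICH2\bin\smpd.exe" name="MPICH2 smpd" mode=ENABLE
netsh firewall add allowedprogram program="C:\Program Files\MPICH2\bin\mpiexec.exe" name="MPICH2 mpiexec" mode=ENABLE
netsh firewall add allowedprogram program="C:\public\bin\cpi.exe" name="cpi" mode=ENABLE
```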
<div><font color="#0000ff" face="Arial"><span class="EC_837321314-06092007"></span></font> </div>
<div><font color="#0000ff" face="Arial"><span class="EC_837321314-06092007">(Note: Since
you do not know what changed in your network, it might help if you try analyzing
the network packets sent btw the machines using a packet sniffer like
Ethereal.)</span></font></div>
<div><font color="#0000ff" face="Arial"><span class="EC_837321314-06092007"></span></font> </div>
<div><span class="EC_837321314-06092007"><font color="#0000ff" face="Arial">Regards,</font></span></div>
<div><span class="EC_837321314-06092007"><font color="#0000ff" face="Arial">Jayesh</font></span></div><br>
<div class="EC_OutlookMessageHeader" dir="ltr" align="left" lang="en-us">
<hr>
<font face="Tahoma"><b>From:</b> owner-mpich-discuss@mcs.anl.gov
[mailto:owner-mpich-discuss@mcs.anl.gov] <b>On Behalf Of </b>Richard
Li<br><b>Sent:</b> Wednesday, September 05, 2007 8:21 PM<br><b>To:</b>
mpich-discuss@mcs.anl.gov<br><b>Subject:</b> [MPICH] MPI_Bcast hangs in Windows
XP<br></font><br></div>
<div></div>
<div style="text-align: left;"><font face="Arial">
<div><span class="EC_EC_618441420-17082007"><font face="Arial">Hi
there,</font></span></div>
<div><span class="EC_EC_618441420-17082007"></span> </div>
<div><span class="EC_EC_618441420-17082007"><font face="Arial">I am writing an
application in Windows XP/VC8 and am having problem with MPI_Bcast(). I am
working in corporate environment and suspect it may have something to do with
our security policies, however, I don't know exact which low-level operations
failed . </font></span></div>
<div><span class="EC_EC_618441420-17082007"></span> </div>
<div><span class="EC_EC_618441420-17082007"><font face="Arial">Here is the symptom: my
application (as well as cpi.exe example) works fine as long as there is only one
machine in the machine file, whether its local machine or remote does not
matter. It hangs at MPI_Bcast() when I have more than one machine in
MPI_COMM_WORLD. </font></span><font face="Arial"><span class="EC_EC_618441420-17082007"><font face="Arial">I am using
</font></span><font face="Arial">mpich2-1.0.5p2-win32-ia32.msi.</font></font></div>
<div><span class="EC_EC_618441420-17082007"></span> </div>
<div><span class="EC_EC_618441420-17082007"><font face="Arial">The same application
worked perfectly a year ago and there have been many security policy changes
since that time(as usual, all policies reduce our freedom). My question is that
what's the communication mechanism used in inter-node communication. I tried
nothing, auto, sock, ssm as communication channels and had no
luck.</font></span></div>
<div><span class="EC_EC_618441420-17082007"></span> </div>
<div><span class="EC_EC_618441420-17082007"><font face="Arial">Thanks for your
help.<br><br>Richard<br></font></span></div></font></div><br>
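A minimal reproducer, independent of cpi.exe, can help separate connection-setup problems from application bugs; a sketch using only standard MPI calls, to be built with an MPI compiler wrapper or VC8 against the MPICH2 headers:

```c
/* bcast_test.c -- minimal MPI_Bcast reproducer (sketch).
 * Build:  mpicc bcast_test.c -o bcast_test   (or the VC8 equivalent)
 * Run:    mpiexec -n 2 -machinefile config.txt bcast_test
 */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
    int rank, value = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0)
        value = 42;          /* root rank supplies the value to broadcast */

    /* Every rank must reach this call; a connection failure between any
     * pair of hosts shows up here as a hang or an aborted job.           */
    MPI_Bcast(&value, 1, MPI_INT, 0, MPI_COMM_WORLD);

    printf("rank %d received %d\n", rank, value);
    MPI_Finalize();
    return 0;
}
```

If this hangs or aborts with two hosts in the machine file but runs with one, the failure is in the channel's connection setup rather than in the application code.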
</blockquote></body>
</html>