You need to use the mpicc and mpiexec from the MPICH2 installation that was built to use MPD.
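For example, something along these lines (just a sketch; the install prefix and source file name below are only placeholders, adjust them to wherever your MPD-built MPICH2 lives):

    # configure and install a separate MPICH2 tree that uses the MPD process manager
    ./configure --prefix=$HOME/mpich2-mpd --with-pm=mpd
    make && make install

    # compile and run with the tools from that installation, not the SLURM-built one
    # (mpd needs a ~/.mpd.conf file; see the MPICH2 Installation Guide)
    $HOME/mpich2-mpd/bin/mpicc -o helloworld.mympi helloworld.c
    $HOME/mpich2-mpd/bin/mpd &
    $HOME/mpich2-mpd/bin/mpiexec -n 2 ./helloworld.mympi

The SLURM-configured installation stays as it is for your bsub/srun jobs.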
Rajeev
----------------------------------------
From: mpich-discuss-bounces@mcs.anl.gov [mailto:mpich-discuss-bounces@mcs.anl.gov] On Behalf Of Gauri Kulkarni
Sent: Wednesday, April 01, 2009 8:56 AM
To: mpich-discuss@mcs.anl.gov
Subject: [mpich-discuss] What do these errors mean?
Hi,

I am using MPICH2-1.0.7 (I cannot go to 1.0.8 right now), which is configured to be used with SLURM. That is, the process manager is SLURM and NOT mpd. When I submit my job through bsub (bsub [options] srun ./helloworld.mympi), it works perfectly. I cannot use mpiexec, as it is not the one spawning the jobs; I must use srun. My question is: can I still use mpiexec from the command line? Well... I tried. Here is the output:

mpiexec -n 2 ./helloworld.mympi
mpiexec_n53: cannot connect to local mpd (/tmp/mpd2.console_cgaurik); possible causes:
  1. no mpd is running on this host
  2. an mpd is running but was started without a "console" (-n option)
In case 1, you can start an mpd on this host with:
    mpd &
and you will be able to run jobs just on this host.
For more details on starting mpds on a set of hosts, see
the MPICH2 Installation Guide.

Then:

mpd &
mpiexec -n 2 ./helloworld.mympi

Hello world! I'm 0 of 2 on n53
Fatal error in MPI_Finalize: Other MPI error, error stack:
MPI_Finalize(255)...................: MPI_Finalize failed
MPI_Finalize(154)...................:
MPID_Finalize(94)...................:
MPI_Barrier(406)....................: MPI_Barrier(comm=0x44000002) failed
MPIR_Barrier(77)....................:
MPIC_Sendrecv(120)..................:
MPID_Isend(103).....................: failure occurred while attempting to send an eager message
MPIDI_CH3_iSend(172)................:
MPIDI_CH3I_VC_post_sockconnect(1090):
MPIDI_PG_SetConnInfo(615)...........: PMI_KVS_Get failedFatal error in MPI_Finalize: Other MPI error, error stack:
MPI_Finalize(255)...................: MPI_Finalize failed
MPI_Finalize(154)...................:
MPID_Finalize(94)...................:
MPI_Barrier(406)....................: MPI_Barrier(comm=0x44000002) failed
MPIR_Barrier(77)....................:
MPIC_Sendrecv(120)..................:
MPID_Isend(103).....................: failure occurred while attempting to send an eager message
MPHello world! I'm 1 of 2 on n53
IDI_CH3_iSend(172)................:
MPIDI_CH3I_VC_post_sockconnect(1090):
MPIDI_PG_SetConnInfo(615)...........: PMI_KVS_Get failed

The "Hello world!" lines show that the job gets executed, but there is a lot of other garbage. It seems to me that I can either configure MPICH2 to be used with the cluster job scheduler or to be used from the command line; I cannot have both.

Am I right?

-Gauri.
----------