==================================================================================================
mpiexec options:
----------------
  Base path: /usr/local/mpich2/bin/
  Launcher: (null)
  Debug level: 1
  Enable X: -1

  Global environment:
  -------------------
    USER=andy_holland
    LOGNAME=andy_holland
    HOME=/home/andy_holland
    PATH=/opt/intel/idb/10.0.023/bin:/opt/intel/fc/10.0.023/bin:/usr/local/ncarg/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/vis_tools/pave:.:/usr/local/lib/ioapi-3.1/Linux2_x86_64ifort:/usr/local/IDV_2.3:/vis_tools/verdi_1.31_beta:/usr/local/mpich2/bin:/models/mims/spatial
    MAIL=/var/spool/mail/andy_holland
    SHELL=/bin/tcsh
    SSH_CLIENT=10.202.70.73 3157 22
    SSH_CONNECTION=10.202.70.73 3157 10.51.10.40 22
    SSH_TTY=/dev/pts/2
    TERM=vt100
    DISPLAY=localhost:10.0
    SSH_AUTH_SOCK=/tmp/ssh-ATmHoz3419/agent.3419
    HOSTTYPE=i386-linux
    VENDOR=intel
    OSTYPE=linux
    MACHTYPE=i386
    SHLVL=1
    PWD=/mnt/vg01/CMAQ/mpich_test
    GROUP=users
    HOST=s051rhlapp01
    REMOTEHOST=10.202.70.73
    LS_COLORS=no
    G_BROKEN_FILENAMES=1
    SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
    LANG=en_US.UTF-8
    LESSOPEN=|/usr/bin/lesspipe.sh %s
    HOSTNAME=s051rhlapp01
    INPUTRC=/etc/inputrc
    EDSS_BINDIR=/vis_tools/pave/Linux2_x86/bin/OPTIMIZE
    PAVE_COORD=2 33 45 -97 -97 40
    F_UFMTENDIAN=big
    NCARG_ROOT=/usr/local/ncarg
    RIP_ROOT=/models/MM5V3/RIP
    SMK_HOME=/models/smoke
    SMOKE_EXE=Linux2_x86ifc
    PROJ_LIB=/models/mims/spatial/src/PROJ4.5/local/share/proj
    LD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib
    DYLD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib
    MANPATH=/opt/intel/idb/10.0.023/man:/opt/intel/fc/10.0.023/man:/opt/intel/fc/10.0.023/man:/usr/local/ncarg/man:/usr/kerberos/man:/usr/local/share/man:/usr/share/man/en:/usr/share/man:/usr/X11R6/man:/usr/man
    INTEL_LICENSE_FILE=/opt/intel/fc/10.0.023/licenses:/opt/intel/licenses:/home/andy_holland/intel/licenses:/Users/Shared/Library/Application Support/Intel/Licenses
    M3HOME=/mnt/vg01/CMAQ
    M3DATA=/mnt/vg01/CMAQ/data
    M3MODEL=/mnt/vg01/CMAQ/models
    M3LIB=/mnt/vg01/CMAQ/lib
    M3TOOLS=/mnt/vg01/CMAQ/tools

  Hydra internal environment:
  ---------------------------
    GFORTRAN_UNBUFFERED_PRECONNECTED=y

  Proxy information:
  *********************
    Proxy ID: 1
    -----------------
      Proxy name: s051rhlapp01
      Process count: 2
      Start PID: 0
      Proxy exec list:
      ....................
        Exec: simple_test; Process count: 2

    Proxy ID: 2
    -----------------
      Proxy name: s051rhlapp02
      Process count: 2
      Start PID: 2
      Proxy exec list:
      ....................
        Exec: simple_test; Process count: 2

==================================================================================================

[mpiexec@s051rhlapp01] Timeout set to -1 (-1 means infinite)
[mpiexec@s051rhlapp01] Got a control port string of s051rhlapp01:39707

Proxy launch args: /usr/local/mpich2/bin/hydra_pmi_proxy --control-port s051rhlapp01:39707 --debug --demux poll --pgid 0 --proxy-id

[mpiexec@s051rhlapp01] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1

Arguments being passed to proxy 0:
--version 1.3.2p1 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname s051rhlapp01 --global-core-count 4 --global-process-count 4 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_7826_0 --pmi-process-mapping (vector,(0,2,2)) --ckpoint-num -1 --global-inherited-env 45 'USER=andy_holland' 'LOGNAME=andy_holland' 'HOME=/home/andy_holland' 'PATH=/opt/intel/idb/10.0.023/bin:/opt/intel/fc/10.0.023/bin:/usr/local/ncarg/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/vis_tools/pave:.:/usr/local/lib/ioapi-3.1/Linux2_x86_64ifort:/usr/local/IDV_2.3:/vis_tools/verdi_1.31_beta:/usr/local/mpich2/bin:/models/mims/spatial' 'MAIL=/var/spool/mail/andy_holland' 'SHELL=/bin/tcsh' 'SSH_CLIENT=10.202.70.73 3157 22' 'SSH_CONNECTION=10.202.70.73 3157 10.51.10.40 22' 'SSH_TTY=/dev/pts/2' 'TERM=vt100' 'DISPLAY=localhost:10.0' 'SSH_AUTH_SOCK=/tmp/ssh-ATmHoz3419/agent.3419' 'HOSTTYPE=i386-linux' 'VENDOR=intel' 'OSTYPE=linux' 'MACHTYPE=i386' 'SHLVL=1' 'PWD=/mnt/vg01/CMAQ/mpich_test' 'GROUP=users' 'HOST=s051rhlapp01' 'REMOTEHOST=10.202.70.73' 'LS_COLORS=no' 'G_BROKEN_FILENAMES=1' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'LANG=en_US.UTF-8' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'HOSTNAME=s051rhlapp01' 'INPUTRC=/etc/inputrc' 'EDSS_BINDIR=/vis_tools/pave/Linux2_x86/bin/OPTIMIZE' 'PAVE_COORD=2 33 45 -97 -97 40' 'F_UFMTENDIAN=big' 'NCARG_ROOT=/usr/local/ncarg' 'RIP_ROOT=/models/MM5V3/RIP' 'SMK_HOME=/models/smoke' 'SMOKE_EXE=Linux2_x86ifc' 'PROJ_LIB=/models/mims/spatial/src/PROJ4.5/local/share/proj' 'LD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib' 'DYLD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib' 'MANPATH=/opt/intel/idb/10.0.023/man:/opt/intel/fc/10.0.023/man:/opt/intel/fc/10.0.023/man:/usr/local/ncarg/man:/usr/kerberos/man:/usr/local/share/man:/usr/share/man/en:/usr/share/man:/usr/X11R6/man:/usr/man' 'INTEL_LICENSE_FILE=/opt/intel/fc/10.0.023/licenses:/opt/intel/licenses:/home/andy_holland/intel/licenses:/Users/Shared/Library/Application Support/Intel/Licenses' 'M3HOME=/mnt/vg01/CMAQ' 'M3DATA=/mnt/vg01/CMAQ/data' 'M3MODEL=/mnt/vg01/CMAQ/models' 'M3LIB=/mnt/vg01/CMAQ/lib' 'M3TOOLS=/mnt/vg01/CMAQ/tools' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --start-pid 0 --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 --exec-wdir /mnt/vg01/CMAQ/mpich_test --exec-args 1 simple_test

[mpiexec@s051rhlapp01] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1

Arguments being passed to proxy 1:
--version 1.3.2p1 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname s051rhlapp02 --global-core-count 4 --global-process-count 4 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_7826_0 --pmi-process-mapping (vector,(0,2,2)) --ckpoint-num -1 --global-inherited-env 45 'USER=andy_holland' 'LOGNAME=andy_holland' 'HOME=/home/andy_holland' 'PATH=/opt/intel/idb/10.0.023/bin:/opt/intel/fc/10.0.023/bin:/usr/local/ncarg/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/vis_tools/pave:.:/usr/local/lib/ioapi-3.1/Linux2_x86_64ifort:/usr/local/IDV_2.3:/vis_tools/verdi_1.31_beta:/usr/local/mpich2/bin:/models/mims/spatial' 'MAIL=/var/spool/mail/andy_holland' 'SHELL=/bin/tcsh' 'SSH_CLIENT=10.202.70.73 3157 22' 'SSH_CONNECTION=10.202.70.73 3157 10.51.10.40 22' 'SSH_TTY=/dev/pts/2' 'TERM=vt100' 'DISPLAY=localhost:10.0' 'SSH_AUTH_SOCK=/tmp/ssh-ATmHoz3419/agent.3419' 'HOSTTYPE=i386-linux' 'VENDOR=intel' 'OSTYPE=linux' 'MACHTYPE=i386' 'SHLVL=1' 'PWD=/mnt/vg01/CMAQ/mpich_test' 'GROUP=users' 'HOST=s051rhlapp01' 'REMOTEHOST=10.202.70.73' 'LS_COLORS=no' 'G_BROKEN_FILENAMES=1' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'LANG=en_US.UTF-8' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'HOSTNAME=s051rhlapp01' 'INPUTRC=/etc/inputrc' 'EDSS_BINDIR=/vis_tools/pave/Linux2_x86/bin/OPTIMIZE' 'PAVE_COORD=2 33 45 -97 -97 40' 'F_UFMTENDIAN=big' 'NCARG_ROOT=/usr/local/ncarg' 'RIP_ROOT=/models/MM5V3/RIP' 'SMK_HOME=/models/smoke' 'SMOKE_EXE=Linux2_x86ifc' 'PROJ_LIB=/models/mims/spatial/src/PROJ4.5/local/share/proj' 'LD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib' 'DYLD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib' 'MANPATH=/opt/intel/idb/10.0.023/man:/opt/intel/fc/10.0.023/man:/opt/intel/fc/10.0.023/man:/usr/local/ncarg/man:/usr/kerberos/man:/usr/local/share/man:/usr/share/man/en:/usr/share/man:/usr/X11R6/man:/usr/man' 'INTEL_LICENSE_FILE=/opt/intel/fc/10.0.023/licenses:/opt/intel/licenses:/home/andy_holland/intel/licenses:/Users/Shared/Library/Application Support/Intel/Licenses' 'M3HOME=/mnt/vg01/CMAQ' 'M3DATA=/mnt/vg01/CMAQ/data' 'M3MODEL=/mnt/vg01/CMAQ/models' 'M3LIB=/mnt/vg01/CMAQ/lib' 'M3TOOLS=/mnt/vg01/CMAQ/tools' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --start-pid 2 --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 --exec-wdir /mnt/vg01/CMAQ/mpich_test --exec-args 1 simple_test

[mpiexec@s051rhlapp01] Launch arguments: /usr/local/mpich2/bin/hydra_pmi_proxy --control-port s051rhlapp01:39707 --debug --demux poll --pgid 0 --proxy-id 0
[mpiexec@s051rhlapp01] Launch arguments: /usr/bin/ssh -x s051rhlapp02 "/usr/local/mpich2/bin/hydra_pmi_proxy" --control-port s051rhlapp01:39707 --debug --demux poll --pgid 0 --proxy-id 1
[proxy:0:0@s051rhlapp01] got pmi command (from 0): init pmi_version=1 pmi_subversion=1
[proxy:0:0@s051rhlapp01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@s051rhlapp01] got pmi command (from 0): get_maxes
[proxy:0:0@s051rhlapp01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@s051rhlapp01] got pmi command (from 0): get_appnum
[proxy:0:0@s051rhlapp01] PMI response: cmd=appnum appnum=0
[proxy:0:0@s051rhlapp01] got pmi command (from 0): get_my_kvsname
[proxy:0:0@s051rhlapp01] PMI response: cmd=my_kvsname kvsname=kvs_7826_0
[proxy:0:0@s051rhlapp01] got pmi command (from 0): get_my_kvsname
[proxy:0:0@s051rhlapp01] PMI response: cmd=my_kvsname kvsname=kvs_7826_0
[proxy:0:0@s051rhlapp01] got pmi command (from 0): get kvsname=kvs_7826_0 key=PMI_process_mapping
[proxy:0:0@s051rhlapp01] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,2,2))
[proxy:0:0@s051rhlapp01] got pmi command (from 0): put kvsname=kvs_7826_0 key=sharedFilename[0] value=/dev/shm/mpich_shar_tmp5U8PR7
[proxy:0:0@s051rhlapp01] we don't understand this command put; forwarding upstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_7826_0 key=sharedFilename[0] value=/dev/shm/mpich_shar_tmp5U8PR7
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 0: cmd=put_result rc=0 msg=success
[proxy:0:0@s051rhlapp01] we don't understand the response put_result; forwarding downstream
[proxy:0:0@s051rhlapp01] got pmi command (from 0): barrier_in
[proxy:0:0@s051rhlapp01] got pmi command (from 6): init pmi_version=1 pmi_subversion=1
[proxy:0:0@s051rhlapp01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get_maxes
[proxy:0:0@s051rhlapp01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get_appnum
[proxy:0:0@s051rhlapp01] PMI response: cmd=appnum appnum=0
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get_my_kvsname
[proxy:0:0@s051rhlapp01] PMI response: cmd=my_kvsname kvsname=kvs_7826_0
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get_my_kvsname
[proxy:0:0@s051rhlapp01] PMI response: cmd=my_kvsname kvsname=kvs_7826_0
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get kvsname=kvs_7826_0 key=PMI_process_mapping
[proxy:0:0@s051rhlapp01] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,2,2))
[proxy:0:0@s051rhlapp01] got pmi command (from 6): barrier_in
[proxy:0:0@s051rhlapp01] forwarding command (cmd=barrier_in) upstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=barrier_in
[proxy:0:1@s051rhlapp02] got pmi command (from 4): init pmi_version=1 pmi_subversion=1
[proxy:0:1@s051rhlapp02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@s051rhlapp02] got pmi command (from 5): init pmi_version=1 pmi_subversion=1
[proxy:0:1@s051rhlapp02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@s051rhlapp02] got pmi command (from 4): get_maxes
[proxy:0:1@s051rhlapp02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get_maxes
[proxy:0:1@s051rhlapp02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@s051rhlapp02] got pmi command (from 4): get_appnum
[proxy:0:1@s051rhlapp02] PMI response: cmd=appnum appnum=0
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get_appnum
[proxy:0:1@s051rhlapp02] PMI response: cmd=appnum appnum=0
[proxy:0:1@s051rhlapp02] got pmi command (from 4): get_my_kvsname
[proxy:0:1@s051rhlapp02] PMI response: cmd=my_kvsname kvsname=kvs_7826_0
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get_my_kvsname
[proxy:0:1@s051rhlapp02] PMI response: cmd=my_kvsname kvsname=kvs_7826_0
[proxy:0:1@s051rhlapp02] got pmi command (from 4): get_my_kvsname
[proxy:0:1@s051rhlapp02] PMI response: cmd=my_kvsname kvsname=kvs_7826_0
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get_my_kvsname
[proxy:0:1@s051rhlapp02] PMI response: cmd=my_kvsname kvsname=kvs_7826_0
[proxy:0:1@s051rhlapp02] got pmi command (from 4): get kvsname=kvs_7826_0 key=PMI_process_mapping
[proxy:0:1@s051rhlapp02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,2,2))
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get kvsname=kvs_7826_0 key=PMI_process_mapping
[proxy:0:1@s051rhlapp02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,2,2))
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_7826_0 key=sharedFilename[2] value=/dev/shm/mpich_shar_tmpU9aE1X
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 4: cmd=put_result rc=0 msg=success
[proxy:0:1@s051rhlapp02] got pmi command (from 5): barrier_in
[proxy:0:1@s051rhlapp02] got pmi command (from 4): put kvsname=kvs_7826_0 key=sharedFilename[2] value=/dev/shm/mpich_shar_tmpU9aE1X
[proxy:0:1@s051rhlapp02] we don't understand this command put; forwarding upstream
[proxy:0:1@s051rhlapp02] we don't understand the response put_result; forwarding downstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 4: cmd=barrier_out
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 4: cmd=barrier_out
[proxy:0:0@s051rhlapp01] PMI response: cmd=barrier_out
[proxy:0:0@s051rhlapp01] PMI response: cmd=barrier_out
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get kvsname=kvs_7826_0 key=sharedFilename[0]
[proxy:0:0@s051rhlapp01] forwarding command (cmd=get kvsname=kvs_7826_0 key=sharedFilename[0]) upstream
[proxy:0:1@s051rhlapp02] got pmi command (from 4): barrier_in
[proxy:0:1@s051rhlapp02] forwarding command (cmd=barrier_in) upstream
[proxy:0:1@s051rhlapp02] PMI response: cmd=barrier_out
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_7826_0 key=sharedFilename[0]
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 6: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmp5U8PR7
[proxy:0:1@s051rhlapp02] PMI response: cmd=barrier_out
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get kvsname=kvs_7826_0 key=sharedFilename[2]
[proxy:0:1@s051rhlapp02] forwarding command (cmd=get kvsname=kvs_7826_0 key=sharedFilename[2]) upstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_7826_0 key=sharedFilename[2]
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 5: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpU9aE1X
[proxy:0:0@s051rhlapp01] we don't understand the response get_result; forwarding downstream
[proxy:0:1@s051rhlapp02] we don't understand the response get_result; forwarding downstream
[proxy:0:0@s051rhlapp01] got pmi command (from 0): put kvsname=kvs_7826_0 key=P0-businesscard value=description#s051rhlapp01$port#57416$ifname#127.0.0.1$
[proxy:0:0@s051rhlapp01] we don't understand this command put; forwarding upstream
[proxy:0:0@s051rhlapp01] got pmi command (from 6): put kvsname=kvs_7826_0 key=P1-businesscard value=description#s051rhlapp01$port#45998$ifname#127.0.0.1$
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_7826_0 key=P0-businesscard value=description#s051rhlapp01$port#57416$ifname#127.0.0.1$
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 0: cmd=put_result rc=0 msg=success
[proxy:0:0@s051rhlapp01] we don't understand this command put; forwarding upstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_7826_0 key=P1-businesscard value=description#s051rhlapp01$port#45998$ifname#127.0.0.1$
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 6: cmd=put_result rc=0 msg=success
[proxy:0:0@s051rhlapp01] we don't understand the response put_result; forwarding downstream
[proxy:0:0@s051rhlapp01] we don't understand the response put_result; forwarding downstream
[proxy:0:0@s051rhlapp01] got pmi command (from 0): barrier_in
[proxy:0:0@s051rhlapp01] got pmi command (from 6): barrier_in
[proxy:0:0@s051rhlapp01] forwarding command (cmd=barrier_in) upstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_7826_0 key=P2-businesscard value=description#s051rhlapp02$port#56787$ifname#127.0.0.1$
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 4: cmd=put_result rc=0 msg=success
[proxy:0:1@s051rhlapp02] got pmi command (from 4): put kvsname=kvs_7826_0 key=P2-businesscard value=description#s051rhlapp02$port#56787$ifname#127.0.0.1$
[proxy:0:1@s051rhlapp02] we don't understand this command put; forwarding upstream
[proxy:0:1@s051rhlapp02] got pmi command (from 5): put kvsname=kvs_7826_0 key=P3-businesscard value=description#s051rhlapp02$port#46260$ifname#127.0.0.1$
[proxy:0:1@s051rhlapp02] we don't understand this command put; forwarding upstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_7826_0 key=P3-businesscard value=description#s051rhlapp02$port#46260$ifname#127.0.0.1$
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 5: cmd=put_result rc=0 msg=success
[proxy:0:1@s051rhlapp02] we don't understand the response put_result; forwarding downstream
[proxy:0:1@s051rhlapp02] got pmi command (from 4): barrier_in
[proxy:0:1@s051rhlapp02] we don't understand the response put_result; forwarding downstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 5: cmd=barrier_out
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 5: cmd=barrier_out
[proxy:0:0@s051rhlapp01] PMI response: cmd=barrier_out
[proxy:0:0@s051rhlapp01] PMI response: cmd=barrier_out
[proxy:0:1@s051rhlapp02] got pmi command (from 5): barrier_in
[proxy:0:1@s051rhlapp02] forwarding command (cmd=barrier_in) upstream
[proxy:0:1@s051rhlapp02] PMI response: cmd=barrier_out
[0] recv from 1
[1] send to 0
[1] recv from 0
[2] send to 0
[proxy:0:1@s051rhlapp02] PMI response: cmd=barrier_out
[proxy:0:1@s051rhlapp02] got pmi command (from 4): get kvsname=kvs_7826_0 key=P0-businesscard
[proxy:0:1@s051rhlapp02] forwarding command (cmd=get kvsname=kvs_7826_0 key=P0-businesscard) upstream
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get kvsname=kvs_7826_0 key=P0-businesscard
[proxy:0:1@s051rhlapp02] forwarding command (cmd=get kvsname=kvs_7826_0 key=P0-businesscard) upstream
[0] recv from 2
[3] send to 0
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_7826_0 key=P0-businesscard
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 4: cmd=get_result rc=0 msg=success value=description#s051rhlapp01$port#57416$ifname#127.0.0.1$
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_7826_0 key=P0-businesscard
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 5: cmd=get_result rc=0 msg=success value=description#s051rhlapp01$port#57416$ifname#127.0.0.1$
[proxy:0:1@s051rhlapp02] we don't understand the response get_result; forwarding downstream
[proxy:0:1@s051rhlapp02] we don't understand the response get_result; forwarding downstream
[2] Fatal error in MPI_Send: Other MPI error, error stack:
[2] MPI_Send(173)..............: MPI_Send(buf=(nil), count=0, MPI_INT, dest=0, tag=0, MPI_COMM_WORLD) failed
[2] MPID_nem_tcp_connpoll(1811): Communication error with rank 0:
[3] Fatal error in MPI_Send: Other MPI error, error stack:
[3] MPI_Send(173)..............: MPI_Send(buf=(nil), count=0, MPI_INT, dest=0, tag=0, MPI_COMM_WORLD) failed
[3] MPID_nem_tcp_connpoll(1811): Communication error with rank 0:
[mpiexec@s051rhlapp01] ONE OF THE PROCESSES TERMINATED BADLY: CLEANING UP
[proxy:0:0@s051rhlapp01] HYD_pmcd_pmip_control_cmd_cb (/usr/local/mpich2-1.3.2p1/src/pm/hydra/pm/pmiserv/pmip_cb.c:868): assert (!closed) failed
[proxy:0:0@s051rhlapp01] HYDT_dmxu_poll_wait_for_event (/usr/local/mpich2-1.3.2p1/src/pm/hydra/tools/demux/demux_poll.c:77): callback returned error status
[proxy:0:0@s051rhlapp01] main (/usr/local/mpich2-1.3.2p1/src/pm/hydra/pm/pmiserv/pmip.c:208): demux engine error waiting for event
[mpiexec@s051rhlapp01] HYDT_bscu_wait_for_completion (/usr/local/mpich2-1.3.2p1/src/pm/hydra/tools/bootstrap/utils/bscu_wait.c:70): one of the processes terminated badly; aborting
[mpiexec@s051rhlapp01] HYDT_bsci_wait_for_completion (/usr/local/mpich2-1.3.2p1/src/pm/hydra/tools/bootstrap/src/bsci_wait.c:18): launcher returned error waiting for completion
[mpiexec@s051rhlapp01] HYD_pmci_wait_for_completion (/usr/local/mpich2-1.3.2p1/src/pm/hydra/pm/pmiserv/pmiserv_pmci.c:216): launcher returned error waiting for completion
[mpiexec@s051rhlapp01] main (/usr/local/mpich2-1.3.2p1/src/pm/hydra/ui/mpich/mpiexec.c:404): process manager error waiting for completion
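
Note on the application output in the log above: the interleaved "[N] send to 0" / "[0] recv from N" lines come from the test program itself, and the error stack shows the failing call was MPI_Send(buf=(nil), count=0, MPI_INT, dest=0, tag=0, MPI_COMM_WORLD), i.e. a zero-length message from ranks 2 and 3 (running on s051rhlapp02) to rank 0 (on s051rhlapp01). Every business card published through PMI advertises ifname#127.0.0.1$, the loopback address, which is consistent with the MPID_nem_tcp_connpoll "Communication error with rank 0" failures on the off-node ranks, while rank 1 on the same node as rank 0 completes its exchange. The sketch below is only a guess at what a test like simple_test might look like; it is not the actual source, but it reproduces the observed print/send/recv pattern and would typically be built with mpicc and launched with something like "mpiexec -verbose -f <hostfile> -np 4 ./simple_test".

    /* Hypothetical reconstruction of a simple_test-style program:
     * every rank > 0 sends an empty message to rank 0 and waits for an
     * empty reply; rank 0 loops over the other ranks.  The printf calls
     * match the "[N] send to 0" / "[0] recv from N" lines in the log. */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, peer;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        if (rank == 0) {
            for (peer = 1; peer < size; peer++) {
                printf("[%d] recv from %d\n", rank, peer);
                /* zero-length receive: buf may be NULL when count == 0 */
                MPI_Recv(NULL, 0, MPI_INT, peer, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
                MPI_Send(NULL, 0, MPI_INT, peer, 0, MPI_COMM_WORLD);
            }
        } else {
            printf("[%d] send to %d\n", rank, 0);
            /* this is the call that fails on ranks 2 and 3 in the log */
            MPI_Send(NULL, 0, MPI_INT, 0, 0, MPI_COMM_WORLD);
            printf("[%d] recv from %d\n", rank, 0);
            MPI_Recv(NULL, 0, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        }

        MPI_Finalize();
        return 0;
    }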