/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/mpiexec
==================================================================================================
mpiexec options:
----------------
  Base path: /opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/
  Bootstrap server: (null)
  Debug level: 1
  Enable X: -1

  Global environment:
  -------------------
    MODULE_VERSION_STACK=3.2.6
    MANPATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/man:/opt/compilers/gcc-4.4.0/man:/usr/man
    G77=gfortran
    HOSTNAME=hpn01
    SHELL=/bin/sh
    HISTSIZE=1000
    PBS_JOBNAME=pacetest
    PBS_ENVIRONMENT=PBS_BATCH
    OLDPWD=/home/acsl/jevans
    PBS_O_WORKDIR=/home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep
    USER=jevans
    PBS_TASKNUM=1
    LD_LIBRARY_PATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/lib:/opt/compilers/gcc-4.4.0/lib:/opt/compilers/gcc-4.4.0/lib64:/opt/compilers/gmp-4.3.1/lib:/opt/compilers/mpfr-2.4.1/lib
    LS_COLORS=(null)
    PBS_O_HOME=/home/acsl/jevans
    PBS_MOMPORT=15003
    PBS_O_QUEUE=acslq
    MODULE_VERSION=3.2.6
    PATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin:/opt/compilers/gcc-4.4.0/bin:/usr/kerberos/bin:/bin:/usr/bin
    PBS_O_LOGNAME=jevans
    MAIL=/var/spool/mail/jevans
    PBS_O_LANG=en_US.UTF-8
    PBS_JOBCOOKIE=3BEBCAEDC95513D1710BDC2CE19CC088
    F90=mpif90
    PWD=/home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep
    INPUTRC=/etc/inputrc
    _LMFILES_=/opt/admintools/Modules/3.2.6/modulefiles/gcc/4.4.0:/opt/admintools/Modules/3.2.6/modulefiles/mpich2-1.3.2/64/nemesis-gcc-4.4.0/4.4.0
    LANG=en_US.UTF-8
    PBS_NODENUM=0
    MODULEPATH=/opt/admintools/Modules/versions:/opt/admintools/Modules/$MODULE_VERSION/modulefiles:/opt/admintools/Modules/modulefiles:
    LOADEDMODULES=gcc/4.4.0:mpich2-1.3.2/64/nemesis-gcc-4.4.0/4.4.0
    F77=mpif77
    PBS_O_SHELL=/bin/bash
    PBS_JOBID=3773.acslhpc.acsl
    CXX=mpicxx
    ENVIRONMENT=BATCH
    HOME=/home/acsl/jevans
    SHLVL=2
    PBS_O_HOST=acslhpc.acsl
    GCC=/opt/compilers/gcc-4.4.0/bin/gcc
    FC=mpif77
    PBS_VNODENUM=0
    LOGNAME=jevans
    PBS_QUEUE=acslq
    MPI_HOME=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0
    MODULESHOME=/opt/admintools/Modules/3.2.6
    PBS_O_MAIL=/var/spool/mail/jevans
    LESSOPEN=|/usr/bin/lesspipe.sh %s
    CC=mpicc
    PBS_NODEFILE=/var/spool/torque/aux//3773.acslhpc.acsl
    G_BROKEN_FILENAMES=1
    PBS_O_PATH=/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/opt/c3-4/:/home/acsl/jevans/bin:/opt/tools/hwloc-1.0.2/bin:/opt/admintools/torque/bin:/opt/admintools/maui/bin:/opt/admintools/c3-4:
    _=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/mpiexec

  Proxy information:
  *********************
    Proxy ID: 1
    -----------------
      Proxy name: hpn01
      Process count: 7
      Start PID: 0

      Proxy exec list:
      ....................
        Exec: ./pace; Process count: 7

    Proxy ID: 2
    -----------------
      Proxy name: hpn02
      Process count: 8
      Start PID: 7

      Proxy exec list:
      ....................
        Exec: ./pace; Process count: 8

    Proxy ID: 3
    -----------------
      Proxy name: hpn03
      Process count: 8
      Start PID: 15

      Proxy exec list:
      ....................
        Exec: ./pace; Process count: 8

    Proxy ID: 4
    -----------------
      Proxy name: hpn04
      Process count: 7
      Start PID: 23

      Proxy exec list:
      ....................
Exec: ./pace; Process count: 7 ================================================================================================== [mpiexec@hpn01] Timeout set to -1 (-1 means infinite) [mpiexec@hpn01] Got a control port string of hpn01:40335 Proxy launch args: /opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/hydra_pmi_proxy --control-port hpn01:40335 --debug --demux poll --pgid 0 --enable-stdin 1 --proxy-id [mpiexec@hpn01] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1 Arguments being passed to proxy 0: --version 1.3 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname hpn01 --global-core-count 30 --global-process-count 30 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_22216_0 --pmi-process-mapping (vector,(0,1,7),(1,2,8),(2,1,7)) --local-binding user:1,2,3,4,5,6,7 --ckpoint-num -1 --global-inherited-env 53 'MODULE_VERSION_STACK=3.2.6' 'MANPATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/man:/opt/compilers/gcc-4.4.0/man:/usr/man' 'G77=gfortran' 'HOSTNAME=hpn01' 'SHELL=/bin/sh' 'HISTSIZE=1000' 'PBS_JOBNAME=pacetest' 'PBS_ENVIRONMENT=PBS_BATCH' 'OLDPWD=/home/acsl/jevans' 'PBS_O_WORKDIR=/home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep' 'USER=jevans' 'PBS_TASKNUM=1' 'LD_LIBRARY_PATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/lib:/opt/compilers/gcc-4.4.0/lib:/opt/compilers/gcc-4.4.0/lib64:/opt/compilers/gmp-4.3.1/lib:/opt/compilers/mpfr-2.4.1/lib' 'LS_COLORS=' 'PBS_O_HOME=/home/acsl/jevans' 'PBS_MOMPORT=15003' 'PBS_O_QUEUE=acslq' 'MODULE_VERSION=3.2.6' 'PATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin:/opt/compilers/gcc-4.4.0/bin:/usr/kerberos/bin:/bin:/usr/bin' 'PBS_O_LOGNAME=jevans' 'MAIL=/var/spool/mail/jevans' 'PBS_O_LANG=en_US.UTF-8' 'PBS_JOBCOOKIE=3BEBCAEDC95513D1710BDC2CE19CC088' 'F90=mpif90' 'PWD=/home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep' 'INPUTRC=/etc/inputrc' '_LMFILES_=/opt/admintools/Modules/3.2.6/modulefiles/gcc/4.4.0:/opt/admintools/Modules/3.2.6/modulefiles/mpich2-1.3.2/64/nemesis-gcc-4.4.0/4.4.0' 'LANG=en_US.UTF-8' 'PBS_NODENUM=0' 'MODULEPATH=/opt/admintools/Modules/versions:/opt/admintools/Modules/$MODULE_VERSION/modulefiles:/opt/admintools/Modules/modulefiles:' 'LOADEDMODULES=gcc/4.4.0:mpich2-1.3.2/64/nemesis-gcc-4.4.0/4.4.0' 'F77=mpif77' 'PBS_O_SHELL=/bin/bash' 'PBS_JOBID=3773.acslhpc.acsl' 'CXX=mpicxx' 'ENVIRONMENT=BATCH' 'HOME=/home/acsl/jevans' 'SHLVL=2' 'PBS_O_HOST=acslhpc.acsl' 'GCC=/opt/compilers/gcc-4.4.0/bin/gcc' 'FC=mpif77' 'PBS_VNODENUM=0' 'LOGNAME=jevans' 'PBS_QUEUE=acslq' 'MPI_HOME=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0' 'MODULESHOME=/opt/admintools/Modules/3.2.6' 'PBS_O_MAIL=/var/spool/mail/jevans' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'CC=mpicc' 'PBS_NODEFILE=/var/spool/torque/aux//3773.acslhpc.acsl' 'G_BROKEN_FILENAMES=1' 'PBS_O_PATH=/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/opt/c3-4/:/home/acsl/jevans/bin:/opt/tools/hwloc-1.0.2/bin:/opt/admintools/torque/bin:/opt/admintools/maui/bin:/opt/admintools/c3-4:' '_=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/mpiexec' --global-user-env 0 --global-system-env 0 --start-pid 0 --proxy-core-count 7 --exec --exec-appnum 0 --exec-proc-count 7 --exec-local-env 0 --exec-wdir /home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep --exec-args 1 ./pace [mpiexec@hpn01] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1 Arguments being passed to proxy 1: --version 1.3 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname hpn02 --global-core-count 30 --global-process-count 30 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_22216_0 --pmi-process-mapping (vector,(0,1,7),(1,2,8),(2,1,7)) --local-binding 
user:0,1,2,3,4,5,6,7 --ckpoint-num -1 --global-inherited-env 53 'MODULE_VERSION_STACK=3.2.6' 'MANPATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/man:/opt/compilers/gcc-4.4.0/man:/usr/man' 'G77=gfortran' 'HOSTNAME=hpn01' 'SHELL=/bin/sh' 'HISTSIZE=1000' 'PBS_JOBNAME=pacetest' 'PBS_ENVIRONMENT=PBS_BATCH' 'OLDPWD=/home/acsl/jevans' 'PBS_O_WORKDIR=/home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep' 'USER=jevans' 'PBS_TASKNUM=1' 'LD_LIBRARY_PATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/lib:/opt/compilers/gcc-4.4.0/lib:/opt/compilers/gcc-4.4.0/lib64:/opt/compilers/gmp-4.3.1/lib:/opt/compilers/mpfr-2.4.1/lib' 'LS_COLORS=' 'PBS_O_HOME=/home/acsl/jevans' 'PBS_MOMPORT=15003' 'PBS_O_QUEUE=acslq' 'MODULE_VERSION=3.2.6' 'PATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin:/opt/compilers/gcc-4.4.0/bin:/usr/kerberos/bin:/bin:/usr/bin' 'PBS_O_LOGNAME=jevans' 'MAIL=/var/spool/mail/jevans' 'PBS_O_LANG=en_US.UTF-8' 'PBS_JOBCOOKIE=3BEBCAEDC95513D1710BDC2CE19CC088' 'F90=mpif90' 'PWD=/home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep' 'INPUTRC=/etc/inputrc' '_LMFILES_=/opt/admintools/Modules/3.2.6/modulefiles/gcc/4.4.0:/opt/admintools/Modules/3.2.6/modulefiles/mpich2-1.3.2/64/nemesis-gcc-4.4.0/4.4.0' 'LANG=en_US.UTF-8' 'PBS_NODENUM=0' 'MODULEPATH=/opt/admintools/Modules/versions:/opt/admintools/Modules/$MODULE_VERSION/modulefiles:/opt/admintools/Modules/modulefiles:' 'LOADEDMODULES=gcc/4.4.0:mpich2-1.3.2/64/nemesis-gcc-4.4.0/4.4.0' 'F77=mpif77' 'PBS_O_SHELL=/bin/bash' 'PBS_JOBID=3773.acslhpc.acsl' 'CXX=mpicxx' 'ENVIRONMENT=BATCH' 'HOME=/home/acsl/jevans' 'SHLVL=2' 'PBS_O_HOST=acslhpc.acsl' 'GCC=/opt/compilers/gcc-4.4.0/bin/gcc' 'FC=mpif77' 'PBS_VNODENUM=0' 'LOGNAME=jevans' 'PBS_QUEUE=acslq' 'MPI_HOME=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0' 'MODULESHOME=/opt/admintools/Modules/3.2.6' 'PBS_O_MAIL=/var/spool/mail/jevans' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'CC=mpicc' 'PBS_NODEFILE=/var/spool/torque/aux//3773.acslhpc.acsl' 'G_BROKEN_FILENAMES=1' 'PBS_O_PATH=/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/opt/c3-4/:/home/acsl/jevans/bin:/opt/tools/hwloc-1.0.2/bin:/opt/admintools/torque/bin:/opt/admintools/maui/bin:/opt/admintools/c3-4:' '_=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/mpiexec' --global-user-env 0 --global-system-env 0 --start-pid 7 --proxy-core-count 8 --exec --exec-appnum 0 --exec-proc-count 8 --exec-local-env 0 --exec-wdir /home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep --exec-args 1 ./pace [mpiexec@hpn01] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1 Arguments being passed to proxy 2: --version 1.3 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname hpn03 --global-core-count 30 --global-process-count 30 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_22216_0 --pmi-process-mapping (vector,(0,1,7),(1,2,8),(2,1,7)) --local-binding user:0,1,2,3,4,5,6,7 --ckpoint-num -1 --global-inherited-env 53 'MODULE_VERSION_STACK=3.2.6' 'MANPATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/man:/opt/compilers/gcc-4.4.0/man:/usr/man' 'G77=gfortran' 'HOSTNAME=hpn01' 'SHELL=/bin/sh' 'HISTSIZE=1000' 'PBS_JOBNAME=pacetest' 'PBS_ENVIRONMENT=PBS_BATCH' 'OLDPWD=/home/acsl/jevans' 'PBS_O_WORKDIR=/home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep' 'USER=jevans' 'PBS_TASKNUM=1' 'LD_LIBRARY_PATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/lib:/opt/compilers/gcc-4.4.0/lib:/opt/compilers/gcc-4.4.0/lib64:/opt/compilers/gmp-4.3.1/lib:/opt/compilers/mpfr-2.4.1/lib' 'LS_COLORS=' 'PBS_O_HOME=/home/acsl/jevans' 'PBS_MOMPORT=15003' 'PBS_O_QUEUE=acslq' 'MODULE_VERSION=3.2.6' 
'PATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin:/opt/compilers/gcc-4.4.0/bin:/usr/kerberos/bin:/bin:/usr/bin' 'PBS_O_LOGNAME=jevans' 'MAIL=/var/spool/mail/jevans' 'PBS_O_LANG=en_US.UTF-8' 'PBS_JOBCOOKIE=3BEBCAEDC95513D1710BDC2CE19CC088' 'F90=mpif90' 'PWD=/home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep' 'INPUTRC=/etc/inputrc' '_LMFILES_=/opt/admintools/Modules/3.2.6/modulefiles/gcc/4.4.0:/opt/admintools/Modules/3.2.6/modulefiles/mpich2-1.3.2/64/nemesis-gcc-4.4.0/4.4.0' 'LANG=en_US.UTF-8' 'PBS_NODENUM=0' 'MODULEPATH=/opt/admintools/Modules/versions:/opt/admintools/Modules/$MODULE_VERSION/modulefiles:/opt/admintools/Modules/modulefiles:' 'LOADEDMODULES=gcc/4.4.0:mpich2-1.3.2/64/nemesis-gcc-4.4.0/4.4.0' 'F77=mpif77' 'PBS_O_SHELL=/bin/bash' 'PBS_JOBID=3773.acslhpc.acsl' 'CXX=mpicxx' 'ENVIRONMENT=BATCH' 'HOME=/home/acsl/jevans' 'SHLVL=2' 'PBS_O_HOST=acslhpc.acsl' 'GCC=/opt/compilers/gcc-4.4.0/bin/gcc' 'FC=mpif77' 'PBS_VNODENUM=0' 'LOGNAME=jevans' 'PBS_QUEUE=acslq' 'MPI_HOME=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0' 'MODULESHOME=/opt/admintools/Modules/3.2.6' 'PBS_O_MAIL=/var/spool/mail/jevans' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'CC=mpicc' 'PBS_NODEFILE=/var/spool/torque/aux//3773.acslhpc.acsl' 'G_BROKEN_FILENAMES=1' 'PBS_O_PATH=/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/opt/c3-4/:/home/acsl/jevans/bin:/opt/tools/hwloc-1.0.2/bin:/opt/admintools/torque/bin:/opt/admintools/maui/bin:/opt/admintools/c3-4:' '_=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/mpiexec' --global-user-env 0 --global-system-env 0 --start-pid 15 --proxy-core-count 8 --exec --exec-appnum 0 --exec-proc-count 8 --exec-local-env 0 --exec-wdir /home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep --exec-args 1 ./pace [mpiexec@hpn01] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1 Arguments being passed to proxy 3: --version 1.3 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname hpn04 --global-core-count 30 --global-process-count 30 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_22216_0 --pmi-process-mapping (vector,(0,1,7),(1,2,8),(2,1,7)) --local-binding user:0,1,2,3,4,5,6 --ckpoint-num -1 --global-inherited-env 53 'MODULE_VERSION_STACK=3.2.6' 'MANPATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/man:/opt/compilers/gcc-4.4.0/man:/usr/man' 'G77=gfortran' 'HOSTNAME=hpn01' 'SHELL=/bin/sh' 'HISTSIZE=1000' 'PBS_JOBNAME=pacetest' 'PBS_ENVIRONMENT=PBS_BATCH' 'OLDPWD=/home/acsl/jevans' 'PBS_O_WORKDIR=/home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep' 'USER=jevans' 'PBS_TASKNUM=1' 'LD_LIBRARY_PATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/lib:/opt/compilers/gcc-4.4.0/lib:/opt/compilers/gcc-4.4.0/lib64:/opt/compilers/gmp-4.3.1/lib:/opt/compilers/mpfr-2.4.1/lib' 'LS_COLORS=' 'PBS_O_HOME=/home/acsl/jevans' 'PBS_MOMPORT=15003' 'PBS_O_QUEUE=acslq' 'MODULE_VERSION=3.2.6' 'PATH=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin:/opt/compilers/gcc-4.4.0/bin:/usr/kerberos/bin:/bin:/usr/bin' 'PBS_O_LOGNAME=jevans' 'MAIL=/var/spool/mail/jevans' 'PBS_O_LANG=en_US.UTF-8' 'PBS_JOBCOOKIE=3BEBCAEDC95513D1710BDC2CE19CC088' 'F90=mpif90' 'PWD=/home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep' 'INPUTRC=/etc/inputrc' '_LMFILES_=/opt/admintools/Modules/3.2.6/modulefiles/gcc/4.4.0:/opt/admintools/Modules/3.2.6/modulefiles/mpich2-1.3.2/64/nemesis-gcc-4.4.0/4.4.0' 'LANG=en_US.UTF-8' 'PBS_NODENUM=0' 'MODULEPATH=/opt/admintools/Modules/versions:/opt/admintools/Modules/$MODULE_VERSION/modulefiles:/opt/admintools/Modules/modulefiles:' 'LOADEDMODULES=gcc/4.4.0:mpich2-1.3.2/64/nemesis-gcc-4.4.0/4.4.0' 'F77=mpif77' 'PBS_O_SHELL=/bin/bash' 
'PBS_JOBID=3773.acslhpc.acsl' 'CXX=mpicxx' 'ENVIRONMENT=BATCH' 'HOME=/home/acsl/jevans' 'SHLVL=2' 'PBS_O_HOST=acslhpc.acsl' 'GCC=/opt/compilers/gcc-4.4.0/bin/gcc' 'FC=mpif77' 'PBS_VNODENUM=0' 'LOGNAME=jevans' 'PBS_QUEUE=acslq' 'MPI_HOME=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0' 'MODULESHOME=/opt/admintools/Modules/3.2.6' 'PBS_O_MAIL=/var/spool/mail/jevans' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'CC=mpicc' 'PBS_NODEFILE=/var/spool/torque/aux//3773.acslhpc.acsl' 'G_BROKEN_FILENAMES=1' 'PBS_O_PATH=/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/opt/c3-4/:/home/acsl/jevans/bin:/opt/tools/hwloc-1.0.2/bin:/opt/admintools/torque/bin:/opt/admintools/maui/bin:/opt/admintools/c3-4:' '_=/opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/mpiexec' --global-user-env 0 --global-system-env 0 --start-pid 23 --proxy-core-count 7 --exec --exec-appnum 0 --exec-proc-count 7 --exec-local-env 0 --exec-wdir /home/acsl/jevans/pace2-0test/mpich2-1.3.2/nas/ep --exec-args 1 ./pace [mpiexec@hpn01] Launch arguments: /opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/hydra_pmi_proxy --control-port hpn01:40335 --debug --demux poll --pgid 0 --enable-stdin 1 --proxy-id 0 [mpiexec@hpn01] Launch arguments: /usr/bin/ssh -x hpn02 /opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/hydra_pmi_proxy --control-port hpn01:40335 --debug --demux poll --pgid 0 --enable-stdin 1 --proxy-id 1 [mpiexec@hpn01] Launch arguments: /usr/bin/ssh -x hpn03 /opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/hydra_pmi_proxy --control-port hpn01:40335 --debug --demux poll --pgid 0 --enable-stdin 1 --proxy-id 2 [mpiexec@hpn01] Launch arguments: /usr/bin/ssh -x hpn04 /opt/mpi/mpich2-1.3.2/64/nemesis-gcc-4.4.0/bin/hydra_pmi_proxy --control-port hpn01:40335 --debug --demux poll --pgid 0 --enable-stdin 1 --proxy-id 3 [proxy:0:0@hpn01] got pmi command (from 9): init pmi_version=1 pmi_subversion=1 [proxy:0:0@hpn01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@hpn01] got pmi command (from 10): init pmi_version=1 pmi_subversion=1 [proxy:0:0@hpn01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@hpn01] got pmi command (from 9): get_maxes [proxy:0:0@hpn01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@hpn01] got pmi command (from 9): get_appnum [proxy:0:0@hpn01] PMI response: cmd=appnum appnum=0 [proxy:0:0@hpn01] got pmi command (from 10): get_maxes [proxy:0:0@hpn01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@hpn01] got pmi command (from 9): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 10): get_appnum [proxy:0:0@hpn01] PMI response: cmd=appnum appnum=0 [proxy:0:0@hpn01] got pmi command (from 9): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 10): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 9): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:0@hpn01] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:0@hpn01] got pmi command (from 10): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 9): put kvsname=kvs_22216_0 key=sharedFilename[0] value=/dev/shm/mpich_shar_tmpadAPvK [proxy:0:0@hpn01] we don't understand this command put; forwarding upstream [mpiexec@hpn01] [pgid: 0] got PMI command: 
cmd=put kvsname=kvs_22216_0 key=sharedFilename[0] value=/dev/shm/mpich_shar_tmpadAPvK [mpiexec@hpn01] PMI response to fd 6 pid 9: cmd=put_result rc=0 msg=success [proxy:0:0@hpn01] got pmi command (from 10): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:0@hpn01] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:0@hpn01] we don't understand the response put_result; forwarding downstream [proxy:0:0@hpn01] got pmi command (from 10): barrier_in [proxy:0:0@hpn01] got pmi command (from 9): barrier_in [proxy:0:0@hpn01] got pmi command (from 19): init pmi_version=1 pmi_subversion=1 [proxy:0:0@hpn01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@hpn01] got pmi command (from 12): init pmi_version=1 pmi_subversion=1 [proxy:0:0@hpn01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@hpn01] got pmi command (from 22): init pmi_version=1 pmi_subversion=1 [proxy:0:0@hpn01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@hpn01] got pmi command (from 25): init pmi_version=1 pmi_subversion=1 [proxy:0:0@hpn01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@hpn01] got pmi command (from 28): init pmi_version=1 pmi_subversion=1 [proxy:0:0@hpn01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@hpn01] got pmi command (from 12): get_maxes [proxy:0:0@hpn01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@hpn01] got pmi command (from 19): get_maxes [proxy:0:0@hpn01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@hpn01] got pmi command (from 22): get_maxes [proxy:0:0@hpn01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@hpn01] got pmi command (from 12): get_appnum [proxy:0:0@hpn01] PMI response: cmd=appnum appnum=0 [proxy:0:0@hpn01] got pmi command (from 19): get_appnum [proxy:0:0@hpn01] PMI response: cmd=appnum appnum=0 [proxy:0:0@hpn01] got pmi command (from 25): get_maxes [proxy:0:0@hpn01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@hpn01] got pmi command (from 28): get_maxes [proxy:0:0@hpn01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@hpn01] got pmi command (from 12): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 19): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 22): get_appnum [proxy:0:0@hpn01] PMI response: cmd=appnum appnum=0 [proxy:0:0@hpn01] got pmi command (from 25): get_appnum [proxy:0:0@hpn01] PMI response: cmd=appnum appnum=0 [proxy:0:0@hpn01] got pmi command (from 12): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 19): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 22): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 28): get_appnum [proxy:0:0@hpn01] PMI response: cmd=appnum appnum=0 [proxy:0:0@hpn01] got pmi command (from 12): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:0@hpn01] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:0@hpn01] got pmi command (from 19): get 
kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:0@hpn01] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:0@hpn01] got pmi command (from 25): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 12): barrier_in [proxy:0:0@hpn01] got pmi command (from 19): barrier_in [proxy:0:0@hpn01] got pmi command (from 22): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 28): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 22): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:0@hpn01] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:0@hpn01] got pmi command (from 25): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 25): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:0@hpn01] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:0@hpn01] got pmi command (from 22): barrier_in [proxy:0:0@hpn01] got pmi command (from 28): get_my_kvsname [proxy:0:0@hpn01] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:0@hpn01] got pmi command (from 25): barrier_in [proxy:0:0@hpn01] got pmi command (from 28): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:0@hpn01] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:0@hpn01] got pmi command (from 28): barrier_in [proxy:0:0@hpn01] forwarding command (cmd=barrier_in) upstream [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:2@hpn03] got pmi command (from 8): init pmi_version=1 pmi_subversion=1 [proxy:0:2@hpn03] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:2@hpn03] got pmi command (from 5): init pmi_version=1 pmi_subversion=1 [proxy:0:2@hpn03] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:2@hpn03] got pmi command (from 6): init pmi_version=1 pmi_subversion=1 [proxy:0:2@hpn03] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:2@hpn03] got pmi command (from 11): init pmi_version=1 pmi_subversion=1 [proxy:0:2@hpn03] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:2@hpn03] got pmi command (from 14): init pmi_version=1 pmi_subversion=1 [proxy:0:2@hpn03] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:2@hpn03] got pmi command (from 17): init pmi_version=1 pmi_subversion=1 [proxy:0:2@hpn03] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:2@hpn03] got pmi command (from 6): get_maxes [proxy:0:2@hpn03] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:2@hpn03] got pmi command (from 8): get_maxes [proxy:0:2@hpn03] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:2@hpn03] got pmi command (from 11): get_maxes [proxy:0:2@hpn03] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:2@hpn03] got pmi command (from 23): init pmi_version=1 pmi_subversion=1 [proxy:0:2@hpn03] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:2@hpn03] got pmi command (from 6): get_appnum [proxy:0:2@hpn03] PMI response: cmd=appnum appnum=0 [proxy:0:2@hpn03] got pmi command (from 8): get_appnum 
[proxy:0:2@hpn03] PMI response: cmd=appnum appnum=0 [proxy:0:2@hpn03] got pmi command (from 11): get_appnum [proxy:0:2@hpn03] PMI response: cmd=appnum appnum=0 [proxy:0:2@hpn03] got pmi command (from 14): get_maxes [proxy:0:2@hpn03] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:2@hpn03] got pmi command (from 17): get_maxes [proxy:0:2@hpn03] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:2@hpn03] got pmi command (from 20): init pmi_version=1 pmi_subversion=1 [proxy:0:2@hpn03] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:2@hpn03] got pmi command (from 6): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 8): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 11): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 14): get_appnum [proxy:0:2@hpn03] PMI response: cmd=appnum appnum=0 [proxy:0:2@hpn03] got pmi command (from 17): get_appnum [proxy:0:2@hpn03] PMI response: cmd=appnum appnum=0 [proxy:0:2@hpn03] got pmi command (from 23): get_maxes [proxy:0:2@hpn03] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:2@hpn03] got pmi command (from 6): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 8): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 11): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 14): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 17): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 20): get_maxes [proxy:0:2@hpn03] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:2@hpn03] got pmi command (from 6): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:2@hpn03] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:2@hpn03] got pmi command (from 8): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:2@hpn03] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:2@hpn03] got pmi command (from 11): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:2@hpn03] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:2@hpn03] got pmi command (from 23): get_appnum [proxy:0:2@hpn03] PMI response: cmd=appnum appnum=0 [proxy:0:2@hpn03] got pmi command (from 6): barrier_in [proxy:0:2@hpn03] got pmi command (from 8): barrier_in [proxy:0:2@hpn03] got pmi command (from 11): barrier_in [proxy:0:2@hpn03] got pmi command (from 14): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 17): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 20): get_appnum [proxy:0:2@hpn03] PMI response: cmd=appnum appnum=0 [proxy:0:2@hpn03] got pmi command (from 5): get_maxes [proxy:0:2@hpn03] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:2@hpn03] got pmi command (from 14): get 
kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:2@hpn03] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:2@hpn03] got pmi command (from 17): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:2@hpn03] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:2@hpn03] got pmi command (from 23): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 20): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 14): barrier_in [proxy:0:2@hpn03] got pmi command (from 17): barrier_in [proxy:0:2@hpn03] got pmi command (from 23): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 20): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 23): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:2@hpn03] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:2@hpn03] got pmi command (from 20): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:2@hpn03] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:2@hpn03] got pmi command (from 23): barrier_in [proxy:0:2@hpn03] got pmi command (from 20): barrier_in [proxy:0:2@hpn03] got pmi command (from 5): get_appnum [proxy:0:2@hpn03] PMI response: cmd=appnum appnum=0 [proxy:0:2@hpn03] got pmi command (from 5): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 5): get_my_kvsname [proxy:0:2@hpn03] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:2@hpn03] got pmi command (from 5): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:2@hpn03] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_22216_0 key=sharedFilename[15] value=/dev/shm/mpich_shar_tmpvNqnvC [mpiexec@hpn01] PMI response to fd 0 pid 5: cmd=put_result rc=0 msg=success [proxy:0:2@hpn03] got pmi command (from 5): put kvsname=kvs_22216_0 key=sharedFilename[15] value=/dev/shm/mpich_shar_tmpvNqnvC [proxy:0:2@hpn03] we don't understand this command put; forwarding upstream [proxy:0:2@hpn03] we don't understand the response put_result; forwarding downstream [proxy:0:2@hpn03] got pmi command (from 5): barrier_in [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:2@hpn03] forwarding command (cmd=barrier_in) upstream [proxy:0:1@hpn02] got pmi command (from 14): init pmi_version=1 pmi_subversion=1 [proxy:0:1@hpn02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@hpn02] got pmi command (from 14): get_maxes [proxy:0:1@hpn02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@hpn02] got pmi command (from 14): get_appnum [proxy:0:1@hpn02] PMI response: cmd=appnum appnum=0 [proxy:0:1@hpn02] got pmi command (from 14): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 5): init pmi_version=1 pmi_subversion=1 [proxy:0:1@hpn02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@hpn02] got pmi command (from 6): init pmi_version=1 pmi_subversion=1 [proxy:0:1@hpn02] PMI response: cmd=response_to_init pmi_version=1 
pmi_subversion=1 rc=0 [proxy:0:1@hpn02] got pmi command (from 11): init pmi_version=1 pmi_subversion=1 [proxy:0:1@hpn02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@hpn02] got pmi command (from 6): get_maxes [proxy:0:1@hpn02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@hpn02] got pmi command (from 11): get_maxes [proxy:0:1@hpn02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@hpn02] got pmi command (from 14): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 6): get_appnum [proxy:0:1@hpn02] PMI response: cmd=appnum appnum=0 [proxy:0:1@hpn02] got pmi command (from 11): get_appnum [proxy:0:1@hpn02] PMI response: cmd=appnum appnum=0 [proxy:0:1@hpn02] got pmi command (from 6): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 11): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 14): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:1@hpn02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:1@hpn02] got pmi command (from 6): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 11): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 20): init pmi_version=1 pmi_subversion=1 [proxy:0:1@hpn02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@hpn02] got pmi command (from 6): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:1@hpn02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:1@hpn02] got pmi command (from 11): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:1@hpn02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:1@hpn02] got pmi command (from 14): barrier_in [proxy:0:1@hpn02] got pmi command (from 17): init pmi_version=1 pmi_subversion=1 [proxy:0:1@hpn02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@hpn02] got pmi command (from 6): barrier_in [proxy:0:1@hpn02] got pmi command (from 8): init pmi_version=1 pmi_subversion=1 [proxy:0:1@hpn02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@hpn02] got pmi command (from 11): barrier_in [proxy:0:1@hpn02] got pmi command (from 20): get_maxes [proxy:0:1@hpn02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@hpn02] got pmi command (from 23): init pmi_version=1 pmi_subversion=1 [proxy:0:1@hpn02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@hpn02] got pmi command (from 8): get_maxes [proxy:0:1@hpn02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@hpn02] got pmi command (from 17): get_maxes [proxy:0:1@hpn02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@hpn02] got pmi command (from 20): get_appnum [proxy:0:1@hpn02] PMI response: cmd=appnum appnum=0 [proxy:0:1@hpn02] got pmi command (from 8): get_appnum [proxy:0:1@hpn02] PMI response: cmd=appnum appnum=0 [proxy:0:1@hpn02] got pmi command (from 17): get_appnum [proxy:0:1@hpn02] PMI response: cmd=appnum appnum=0 [proxy:0:1@hpn02] got pmi command (from 23): 
get_maxes [proxy:0:1@hpn02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@hpn02] got pmi command (from 8): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 17): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 20): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 23): get_appnum [proxy:0:1@hpn02] PMI response: cmd=appnum appnum=0 [proxy:0:1@hpn02] got pmi command (from 8): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 17): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 8): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:1@hpn02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:1@hpn02] got pmi command (from 20): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 23): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 17): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:1@hpn02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:1@hpn02] got pmi command (from 20): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:1@hpn02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:1@hpn02] got pmi command (from 8): barrier_in [proxy:0:1@hpn02] got pmi command (from 23): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 17): barrier_in [proxy:0:1@hpn02] got pmi command (from 20): barrier_in [proxy:0:1@hpn02] got pmi command (from 23): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:1@hpn02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:1@hpn02] got pmi command (from 23): barrier_in [proxy:0:1@hpn02] got pmi command (from 5): get_maxes [proxy:0:1@hpn02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@hpn02] got pmi command (from 5): get_appnum [proxy:0:1@hpn02] PMI response: cmd=appnum appnum=0 [proxy:0:1@hpn02] got pmi command (from 5): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 5): get_my_kvsname [proxy:0:1@hpn02] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:1@hpn02] got pmi command (from 5): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:1@hpn02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_22216_0 key=sharedFilename[7] value=/dev/shm/mpich_shar_tmpdDVdOK [mpiexec@hpn01] PMI response to fd 7 pid 5: cmd=put_result rc=0 msg=success [proxy:0:1@hpn02] got pmi command (from 5): put kvsname=kvs_22216_0 key=sharedFilename[7] value=/dev/shm/mpich_shar_tmpdDVdOK [proxy:0:1@hpn02] we don't understand this command put; forwarding upstream [proxy:0:1@hpn02] we don't understand the response put_result; forwarding downstream [proxy:0:1@hpn02] got pmi command (from 5): barrier_in [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:1@hpn02] 
forwarding command (cmd=barrier_in) upstream [proxy:0:3@hpn04] got pmi command (from 5): init pmi_version=1 pmi_subversion=1 [proxy:0:3@hpn04] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:3@hpn04] got pmi command (from 6): init pmi_version=1 pmi_subversion=1 [proxy:0:3@hpn04] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:3@hpn04] got pmi command (from 11): init pmi_version=1 pmi_subversion=1 [proxy:0:3@hpn04] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:3@hpn04] got pmi command (from 5): get_maxes [proxy:0:3@hpn04] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:3@hpn04] got pmi command (from 6): get_maxes [proxy:0:3@hpn04] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:3@hpn04] got pmi command (from 8): init pmi_version=1 pmi_subversion=1 [proxy:0:3@hpn04] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:3@hpn04] got pmi command (from 14): init pmi_version=1 pmi_subversion=1 [proxy:0:3@hpn04] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:3@hpn04] got pmi command (from 17): init pmi_version=1 pmi_subversion=1 [proxy:0:3@hpn04] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:3@hpn04] got pmi command (from 20): init pmi_version=1 pmi_subversion=1 [proxy:0:3@hpn04] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:3@hpn04] got pmi command (from 5): get_appnum [proxy:0:3@hpn04] PMI response: cmd=appnum appnum=0 [proxy:0:3@hpn04] got pmi command (from 6): get_appnum [proxy:0:3@hpn04] PMI response: cmd=appnum appnum=0 [proxy:0:3@hpn04] got pmi command (from 8): get_maxes [proxy:0:3@hpn04] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:3@hpn04] got pmi command (from 11): get_maxes [proxy:0:3@hpn04] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:3@hpn04] got pmi command (from 14): get_maxes [proxy:0:3@hpn04] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:3@hpn04] got pmi command (from 17): get_maxes [proxy:0:3@hpn04] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:3@hpn04] got pmi command (from 5): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 6): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 8): get_appnum [proxy:0:3@hpn04] PMI response: cmd=appnum appnum=0 [proxy:0:3@hpn04] got pmi command (from 11): get_appnum [proxy:0:3@hpn04] PMI response: cmd=appnum appnum=0 [proxy:0:3@hpn04] got pmi command (from 14): get_appnum [proxy:0:3@hpn04] PMI response: cmd=appnum appnum=0 [proxy:0:3@hpn04] got pmi command (from 20): get_maxes [proxy:0:3@hpn04] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:3@hpn04] got pmi command (from 5): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 6): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 8): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 11): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi 
command (from 14): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 17): get_appnum [proxy:0:3@hpn04] PMI response: cmd=appnum appnum=0 [proxy:0:3@hpn04] got pmi command (from 5): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:3@hpn04] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:3@hpn04] got pmi command (from 6): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:3@hpn04] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:3@hpn04] got pmi command (from 8): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 11): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 14): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 20): get_appnum [proxy:0:3@hpn04] PMI response: cmd=appnum appnum=0 [proxy:0:3@hpn04] got pmi command (from 5): barrier_in [proxy:0:3@hpn04] got pmi command (from 6): barrier_in [proxy:0:3@hpn04] got pmi command (from 8): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:3@hpn04] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:3@hpn04] got pmi command (from 11): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:3@hpn04] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:3@hpn04] got pmi command (from 14): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:3@hpn04] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:3@hpn04] got pmi command (from 17): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 8): barrier_in [proxy:0:3@hpn04] got pmi command (from 11): barrier_in [proxy:0:3@hpn04] got pmi command (from 14): barrier_in [proxy:0:3@hpn04] got pmi command (from 20): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 17): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 20): get_my_kvsname [proxy:0:3@hpn04] PMI response: cmd=my_kvsname kvsname=kvs_22216_0 [proxy:0:3@hpn04] got pmi command (from 17): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:3@hpn04] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:3@hpn04] got pmi command (from 20): get kvsname=kvs_22216_0 key=PMI_process_mapping [proxy:0:3@hpn04] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,7),(1,2,8),(2,1,7)) [proxy:0:3@hpn04] got pmi command (from 17): barrier_in [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@hpn01] PMI response to fd 6 pid 20: cmd=barrier_out [mpiexec@hpn01] PMI response to fd 7 pid 20: cmd=barrier_out [mpiexec@hpn01] PMI response to fd 0 pid 20: cmd=barrier_out [mpiexec@hpn01] PMI response to fd 9 pid 20: cmd=barrier_out [proxy:0:0@hpn01] PMI response: cmd=barrier_out [proxy:0:0@hpn01] PMI response: cmd=barrier_out [proxy:0:0@hpn01] PMI response: cmd=barrier_out [proxy:0:0@hpn01] PMI response: cmd=barrier_out [proxy:0:0@hpn01] PMI response: cmd=barrier_out [proxy:0:0@hpn01] PMI response: cmd=barrier_out [proxy:0:0@hpn01] PMI response: cmd=barrier_out 
[proxy:0:0@hpn01] got pmi command (from 10): get kvsname=kvs_22216_0 key=sharedFilename[0] [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[0] [mpiexec@hpn01] PMI response to fd 6 pid 10: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpadAPvK [proxy:0:0@hpn01] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[0]) upstream [proxy:0:0@hpn01] got pmi command (from 12): get kvsname=kvs_22216_0 key=sharedFilename[0] [proxy:0:0@hpn01] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[0]) upstream [proxy:0:0@hpn01] got pmi command (from 19): get kvsname=kvs_22216_0 key=sharedFilename[0] [proxy:0:0@hpn01] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[0]) upstream [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[0] [mpiexec@hpn01] PMI response to fd 6 pid 12: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpadAPvK [proxy:0:0@hpn01] we don't understand the response get_result; forwarding downstream [proxy:0:0@hpn01] got pmi command (from 22): get kvsname=kvs_22216_0 key=sharedFilename[0] [proxy:0:0@hpn01] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[0]) upstream [proxy:0:0@hpn01] got pmi command (from 25): get kvsname=kvs_22216_0 key=sharedFilename[0] [proxy:0:0@hpn01] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[0]) upstream [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[0] [mpiexec@hpn01] PMI response to fd 6 pid 19: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpadAPvK [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[15] [mpiexec@hpn01] PMI response to fd 9 pid 6: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpvNqnvC [proxy:0:0@hpn01] got pmi command (from 28): get kvsname=kvs_22216_0 key=sharedFilename[0] [proxy:0:0@hpn01] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[0]) upstream [proxy:0:0@hpn01] we don't understand the response get_result; forwarding downstream [proxy:0:0@hpn01] we don't understand the response get_result; forwarding downstream [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[0] [mpiexec@hpn01] PMI response to fd 6 pid 22: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpadAPvK [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[15] [mpiexec@hpn01] PMI response to fd 9 pid 8: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpvNqnvC [proxy:0:0@hpn01] we don't understand the response get_result; forwarding downstream [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[0] [mpiexec@hpn01] PMI response to fd 6 pid 25: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpadAPvK [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[15] [mpiexec@hpn01] PMI response to fd 9 pid 11: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpvNqnvC [proxy:0:0@hpn01] we don't understand the response get_result; forwarding downstream [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[0] [mpiexec@hpn01] PMI response to fd 6 pid 28: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpadAPvK [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[15] [mpiexec@hpn01] PMI response to fd 9 pid 14: 
cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpvNqnvC [proxy:0:0@hpn01] we don't understand the response get_result; forwarding downstream [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[15] [mpiexec@hpn01] PMI response to fd 9 pid 17: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpvNqnvC [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[15] [mpiexec@hpn01] PMI response to fd 9 pid 20: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpvNqnvC [proxy:0:2@hpn03] PMI response: cmd=barrier_out [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[15] [mpiexec@hpn01] PMI response to fd 0 pid 6: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpvNqnvC [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[7] [mpiexec@hpn01] PMI response to fd 7 pid 6: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpdDVdOK [proxy:0:1@hpn02] PMI response: cmd=barrier_out [proxy:0:2@hpn03] PMI response: cmd=barrier_out [proxy:0:2@hpn03] PMI response: cmd=barrier_out [proxy:0:2@hpn03] PMI response: cmd=barrier_out [proxy:0:2@hpn03] PMI response: cmd=barrier_out [proxy:0:2@hpn03] PMI response: cmd=barrier_out [proxy:0:2@hpn03] PMI response: cmd=barrier_out [proxy:0:2@hpn03] PMI response: cmd=barrier_out [proxy:0:2@hpn03] got pmi command (from 6): get kvsname=kvs_22216_0 key=sharedFilename[15] [mpiexec@hpn01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_22216_0 key=sharedFilename[15] [mpiexec@hpn01] PMI response to fd 9 pid 5: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpvNqnvC [proxy:0:1@hpn02] PMI response: cmd=barrier_out [proxy:0:1@hpn02] PMI response: cmd=barrier_out [proxy:0:1@hpn02] PMI response: cmd=barrier_out [proxy:0:1@hpn02] PMI response: cmd=barrier_out [proxy:0:1@hpn02] PMI response: cmd=barrier_out [proxy:0:1@hpn02] PMI response: cmd=barrier_out [proxy:0:1@hpn02] PMI response: cmd=barrier_out [proxy:0:1@hpn02] got pmi command (from 6): get kvsname=kvs_22216_0 key=sharedFilename[7] [proxy:0:1@hpn02] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[7]) upstream [proxy:0:3@hpn04] got pmi command (from 20): barrier_in [proxy:0:3@hpn04] forwarding command (cmd=barrier_in) upstream [proxy:0:3@hpn04] PMI response: cmd=barrier_out [proxy:0:3@hpn04] PMI response: cmd=barrier_out [proxy:0:3@hpn04] PMI response: cmd=barrier_out [proxy:0:3@hpn04] PMI response: cmd=barrier_out [proxy:0:3@hpn04] PMI response: cmd=barrier_out [proxy:0:3@hpn04] PMI response: cmd=barrier_out [proxy:0:3@hpn04] PMI response: cmd=barrier_out [proxy:0:3@hpn04] got pmi command (from 6): get kvsname=kvs_22216_0 key=sharedFilename[15] [proxy:0:3@hpn04] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[15]) upstream [proxy:0:3@hpn04] got pmi command (from 8): get kvsname=kvs_22216_0 key=sharedFilename[15] [proxy:0:3@hpn04] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[15]) upstream [proxy:0:3@hpn04] got pmi command (from 11): get kvsname=kvs_22216_0 key=sharedFilename[15] [proxy:0:3@hpn04] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[15]) upstream [proxy:0:3@hpn04] got pmi command (from 14): get kvsname=kvs_22216_0 key=sharedFilename[15] [proxy:0:3@hpn04] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[15]) upstream [proxy:0:3@hpn04] got pmi command (from 17): get kvsname=kvs_22216_0 key=sharedFilename[15] 
[proxy:0:3@hpn04] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[15]) upstream
[proxy:0:3@hpn04] got pmi command (from 20): get kvsname=kvs_22216_0 key=sharedFilename[15]
[proxy:0:3@hpn04] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[15]) upstream
[proxy:0:3@hpn04] we don't understand the response get_result; forwarding downstream
[proxy:0:3@hpn04] we don't understand the response get_result; forwarding downstream
[proxy:0:3@hpn04] got pmi command (from 5): get kvsname=kvs_22216_0 key=sharedFilename[15]
[proxy:0:3@hpn04] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[15]) upstream
[proxy:0:1@hpn02] got pmi command (from 8): get kvsname=kvs_22216_0 key=sharedFilename[7]
[proxy:0:1@hpn02] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[7]) upstream
[proxy:0:1@hpn02] send_cmd_upstream (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/pm/pmiserv/pmip_pmi_v1.c:56): assert (!closed) failed
[proxy:0:1@hpn02] fn_get (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/pm/pmiserv/pmip_pmi_v1.c:364): error sending command upstream
[proxy:0:1@hpn02] pmi_cb (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/pm/pmiserv/pmip_cb.c:331): PMI handler returned error
[proxy:0:1@hpn02] HYDT_dmxu_poll_wait_for_event (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/tools/demux/demux_poll.c:76): callback returned error status
[proxy:0:1@hpn02] main (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/pm/pmiserv/pmip.c:221): demux engine error waiting for event
[proxy:0:2@hpn03] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[15]) upstream
[proxy:0:2@hpn03] got pmi command (from 8): get kvsname=kvs_22216_0 key=sharedFilename[15]
[proxy:0:2@hpn03] forwarding command (cmd=get kvsname=kvs_22216_0 key=sharedFilename[15]) upstream
[proxy:0:2@hpn03] send_cmd_upstream (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/pm/pmiserv/pmip_pmi_v1.c:56): assert (!closed) failed
[proxy:0:2@hpn03] fn_get (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/pm/pmiserv/pmip_pmi_v1.c:364): error sending command upstream
[proxy:0:2@hpn03] pmi_cb (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/pm/pmiserv/pmip_cb.c:331): PMI handler returned error
[proxy:0:2@hpn03] HYDT_dmxu_poll_wait_for_event (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/tools/demux/demux_poll.c:76): callback returned error status
[proxy:0:2@hpn03] main (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/pm/pmiserv/pmip.c:221): demux engine error waiting for event
[proxy:0:3@hpn04] we don't understand the response get_result; forwarding downstream
[proxy:0:3@hpn04] we don't understand the response get_result; forwarding downstream
[proxy:0:3@hpn04] we don't understand the response get_result; forwarding downstream
[proxy:0:3@hpn04] we don't understand the response get_result; forwarding downstream
[proxy:0:3@hpn04] we don't understand the response get_result; forwarding downstream
Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(385).................:
MPID_Init(135)........................: channel initialization failed
MPIDI_CH3_Init(38)....................:
MPID_nem_init(196)....................:
MPIDI_CH3I_Seg_commit(366)............:
MPIU_SHMW_Hnd_deserialize(324)........:
MPIU_SHMW_Seg_open(863)...............:
MPIU_SHMW_Seg_create_attach_templ(637): open failed - No such file or directory
[proxy:0:3@hpn04] HYDT_dmxu_poll_wait_for_event (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/tools/demux/demux_poll.c:70): assert (!(pollfds[i].revents & ~POLLIN & ~POLLOUT & ~POLLHUP)) failed
[proxy:0:3@hpn04] main (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/pm/pmiserv/pmip.c:221): demux engine error waiting for event
[mpiexec@hpn01] HYDT_bscu_wait_for_completion (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/tools/bootstrap/utils/bscu_wait.c:99): one of the processes terminated badly; aborting
[mpiexec@hpn01] HYDT_bsci_wait_for_completion (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/tools/bootstrap/src/bsci_wait.c:18): bootstrap device returned error waiting for completion
[mpiexec@hpn01] HYD_pmci_wait_for_completion (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/pm/pmiserv/pmiserv_pmci.c:352): bootstrap server returned error waiting for completion
[mpiexec@hpn01] main (/scratch/program_tarballs/mpich/mpich2-1.3/src/pm/hydra/ui/mpich/mpiexec.c:294): process manager error waiting for completion
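
For context, the run above was started from inside Torque/PBS job 3773.acslhpc.acsl (queue acslq, job name pacetest), launching 30 ranks of ./pace across hpn01-hpn04. The actual mpiexec command line is not part of the log; a minimal batch script of the following shape is only a hedged reconstruction from the PBS_* variables and proxy counts shown above, assuming the host list is passed via $PBS_NODEFILE and the debug trace comes from Hydra's -verbose flag. The --local-binding values in the proxy arguments suggest an explicit CPU-binding option was also given on the real command line; it is left out of this sketch.

    #!/bin/sh
    #PBS -N pacetest
    #PBS -q acslq
    # Hypothetical reconstruction -- the real submission script is not shown in the log.
    cd $PBS_O_WORKDIR
    # 30 ranks of ./pace over the nodes Torque assigned to the job;
    # -verbose asks Hydra's mpiexec for the debug/PMI trace seen above.
    mpiexec -verbose -f $PBS_NODEFILE -n 30 ./pace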