==================================================================================================
mpiexec options:
----------------
  Base path: /home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin/
  Launcher: (null)
  Debug level: 1
  Enable X: -1

  Global environment:
  -------------------
    LANG=en_US.iso885915
    USER=atmos
    LOGNAME=atmos
    HOME=/home/atmos
    PATH=/home/atmos/Utility/ncview/bin:/home/atmos/Compiler/netcdf-3.6.2/bin:/home/atmos/Compiler/ncarg-4.4.2/bin:/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin:/home/atmos/Compiler/pgi/linux86-64/7.1-1/bin:/opt/openmpi/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/opt/bio/ncbi/bin:/opt/bio/mpiblast/bin/:/opt/bio/EMBOSS/bin:/opt/bio/clustalw/bin:/opt/bio/t_coffee/bin:/opt/bio/phylip/exe:/opt/bio/mrbayes:/opt/bio/fasta:/opt/bio/glimmer/bin:/opt/bio/glimmer/scripts:/opt/bio/gmap/bin:/opt/bio/gromacs/bin:/opt/bio/autodocksuite/bin:/opt/bio/wgs/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/rocks/bin:/opt/rocks/sbin:/opt/gridengine/bin/lx26-amd64
    MAIL=/var/spool/mail/atmos
    SHELL=/bin/csh
    SSH_CLIENT=192.168.1.1 53550 22
    SSH_CONNECTION=192.168.1.1 53550 192.168.1.105 22
    SSH_TTY=/dev/pts/2
    TERM=xterm
    DISPLAY=localhost:10.0
    HOSTTYPE=x86_64-linux
    VENDOR=unknown
    OSTYPE=linux
    MACHTYPE=x86_64
    SHLVL=1
    PWD=/home/atmos/Model/WRFV3.3/WRFV3/run
    GROUP=atmos
    HOST=atmos.cmu
    REMOTEHOST=192.168.1.1
    HOSTNAME=atmos.cmu
    INPUTRC=/etc/inputrc
    ANT_HOME=/opt/rocks
    BIOROLL=/opt/bio
    BLASTDB=/home/atmos/bio/ncbi/db
    BLASTMAT=/opt/bio/ncbi/data
    HMMER_DB=/home/atmos/bio/hmmer/db
    LS_COLORS=no
    ECLIPSE_HOME=/opt/eclipse
    G_BROKEN_FILENAMES=1
    SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
    JAVA_HOME=/usr/java/latest
    LESSOPEN=|/usr/bin/lesspipe.sh %s
    LD_LIBRARY_PATH=/home/atmos/Compiler/pgi/linux86-64/7.1-1/libso
    ROCKS_ROOT=/opt/rocks
    ROCKSROOT=/opt/rocks/share/devel
    ROLLSROOT=/opt/rocks/share/devel/src/roll
    SGE_ROOT=/opt/gridengine
    SGE_ARCH=lx26-amd64
    SGE_CELL=default
    SGE_QMASTER_PORT=536
    SGE_EXECD_PORT=537
    MPICH_PROCESS_GROUP=no
    PGI=/home/atmos/Compiler/pgi
    PGRSH=ssh
    HYDRA_HOST_FILE=/home/atmos/hosts
    HYDRA_ENV=all
    MPIEXEC_TIMEOUT=120000
    MPIEXEC_PORT_RANGE=40000:60000
    NCARG_ROOT=/home/atmos/Compiler/ncarg-4.4.2
    NCARG_BIN=/home/atmos/Compiler/ncarg-4.4.2/bin
    NCARG_LIB=/home/atmos/Compiler/ncarg-4.4.2/lib
    NCARG_INCLUDE=/home/atmos/Compiler/ncarg-4.4.2/include
    NCARG_MAN=/home/atmos/Compiler/ncarg-4.4.2/man
    NETCDF=/home/atmos/Compiler/netcdf-3.6.2
    NETCDF_LIB=/home/atmos/Compiler/netcdf-3.6.2/lib
    NETCDF_BIN=/home/atmos/Compiler/netcdf-3.6.2/bin
    NETCDF_INCLUDE=/home/atmos/Compiler/netcdf-3.6.2/include
    NETCDF_MAN=/home/atmos/Compiler/netcdf-3.6.2/man
    WRF_EM_CORE=1
    WRF_CHEM=1
    WRF_KPP=1
    WRFIO_NCD_LARGE_FILE_SUPPORT=1
    CC=pgcc
    CFLAGS=-O2 -Msignextend -V -fPIC
    FFLAGS=-O2 -w -V
    CXX=pgCC
    CPPFLAGS=-DNDEBUG -DpgiFortran
    FLEX_LIB_DIR=/home/atmos/Compiler/flex/lib
    YACC=/home/atmos/Compiler/yacc/bin/yacc –d

  Hydra internal environment:
  ---------------------------
    GFORTRAN_UNBUFFERED_PRECONNECTED=y

  Proxy information:
  *********************
    Proxy ID: 1
    -----------------
      Proxy name: atmos
      Process count: 2
      Start PID: 0
      Proxy exec list:
      ....................
        Exec: ./wrf.exe; Process count: 2

    Proxy ID: 2
    -----------------
      Proxy name: compute-0-0
      Process count: 2
      Start PID: 2
      Proxy exec list:
      ....................
        Exec: ./wrf.exe; Process count: 2

    Proxy ID: 3
    -----------------
      Proxy name: compute-0-1
      Process count: 2
      Start PID: 4
      Proxy exec list:
      ....................
        Exec: ./wrf.exe; Process count: 2

    Proxy ID: 4
    -----------------
      Proxy name: compute-0-2
      Process count: 2
      Start PID: 6
      Proxy exec list:
      ....................
        Exec: ./wrf.exe; Process count: 2

    Proxy ID: 5
    -----------------
      Proxy name: compute-0-3
      Process count: 2
      Start PID: 8
      Proxy exec list:
      ....................
        Exec: ./wrf.exe; Process count: 2

    Proxy ID: 6
    -----------------
      Proxy name: compute-0-4
      Process count: 2
      Start PID: 10
      Proxy exec list:
      ....................
        Exec: ./wrf.exe; Process count: 2

==================================================================================================
[mpiexec@atmos.cmu] Timeout set to 120000 (-1 means infinite)
[mpiexec@atmos.cmu] Got a control port string of atmos:40001
Proxy launch args: /home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin/hydra_pmi_proxy --control-port atmos:40001 --debug --demux poll --pgid 0 --proxy-id
[mpiexec@atmos.cmu] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1
Arguments being passed to proxy 0: --version 1.3.2p1 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname atmos --global-core-count 12 --global-process-count 12 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_4367_0 --pmi-process-mapping (vector,(0,6,2)) --ckpoint-num -1 --global-inherited-env 71 'LANG=en_US.iso885915' 'USER=atmos' 'LOGNAME=atmos' 'HOME=/home/atmos' 'PATH=/home/atmos/Utility/ncview/bin:/home/atmos/Compiler/netcdf-3.6.2/bin:/home/atmos/Compiler/ncarg-4.4.2/bin:/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin:/home/atmos/Compiler/pgi/linux86-64/7.1-1/bin:/opt/openmpi/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/opt/bio/ncbi/bin:/opt/bio/mpiblast/bin/:/opt/bio/EMBOSS/bin:/opt/bio/clustalw/bin:/opt/bio/t_coffee/bin:/opt/bio/phylip/exe:/opt/bio/mrbayes:/opt/bio/fasta:/opt/bio/glimmer/bin:/opt/bio/glimmer/scripts:/opt/bio/gmap/bin:/opt/bio/gromacs/bin:/opt/bio/autodocksuite/bin:/opt/bio/wgs/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/rocks/bin:/opt/rocks/sbin:/opt/gridengine/bin/lx26-amd64' 'MAIL=/var/spool/mail/atmos' 'SHELL=/bin/csh' 'SSH_CLIENT=192.168.1.1 53550 22' 'SSH_CONNECTION=192.168.1.1 53550 192.168.1.105 22' 'SSH_TTY=/dev/pts/2' 'TERM=xterm' 'DISPLAY=localhost:10.0' 'HOSTTYPE=x86_64-linux' 'VENDOR=unknown' 'OSTYPE=linux' 'MACHTYPE=x86_64' 'SHLVL=1' 'PWD=/home/atmos/Model/WRFV3.3/WRFV3/run' 'GROUP=atmos' 'HOST=atmos.cmu' 'REMOTEHOST=192.168.1.1' 'HOSTNAME=atmos.cmu' 'INPUTRC=/etc/inputrc' 'ANT_HOME=/opt/rocks' 'BIOROLL=/opt/bio' 'BLASTDB=/home/atmos/bio/ncbi/db' 'BLASTMAT=/opt/bio/ncbi/data' 'HMMER_DB=/home/atmos/bio/hmmer/db' 'LS_COLORS=no' 'ECLIPSE_HOME=/opt/eclipse' 'G_BROKEN_FILENAMES=1' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'JAVA_HOME=/usr/java/latest' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'LD_LIBRARY_PATH=/home/atmos/Compiler/pgi/linux86-64/7.1-1/libso' 'ROCKS_ROOT=/opt/rocks' 'ROCKSROOT=/opt/rocks/share/devel' 'ROLLSROOT=/opt/rocks/share/devel/src/roll' 'SGE_ROOT=/opt/gridengine' 'SGE_ARCH=lx26-amd64' 'SGE_CELL=default' 'SGE_QMASTER_PORT=536' 'SGE_EXECD_PORT=537' 'MPICH_PROCESS_GROUP=no' 'PGI=/home/atmos/Compiler/pgi' 'PGRSH=ssh' 'HYDRA_HOST_FILE=/home/atmos/hosts' 'HYDRA_ENV=all' 'MPIEXEC_TIMEOUT=120000' 'MPIEXEC_PORT_RANGE=40000:60000' 'NCARG_ROOT=/home/atmos/Compiler/ncarg-4.4.2' 'NCARG_BIN=/home/atmos/Compiler/ncarg-4.4.2/bin' 'NCARG_LIB=/home/atmos/Compiler/ncarg-4.4.2/lib' 'NCARG_INCLUDE=/home/atmos/Compiler/ncarg-4.4.2/include' 'NCARG_MAN=/home/atmos/Compiler/ncarg-4.4.2/man' 'NETCDF=/home/atmos/Compiler/netcdf-3.6.2'
'NETCDF_LIB=/home/atmos/Compiler/netcdf-3.6.2/lib' 'NETCDF_BIN=/home/atmos/Compiler/netcdf-3.6.2/bin' 'NETCDF_INCLUDE=/home/atmos/Compiler/netcdf-3.6.2/include' 'NETCDF_MAN=/home/atmos/Compiler/netcdf-3.6.2/man' 'WRF_EM_CORE=1' 'WRF_CHEM=1' 'WRF_KPP=1' 'WRFIO_NCD_LARGE_FILE_SUPPORT=1' 'CC=pgcc' 'CFLAGS=-O2 -Msignextend -V -fPIC' 'FFLAGS=-O2 -w -V' 'CXX=pgCC' 'CPPFLAGS=-DNDEBUG -DpgiFortran' 'FLEX_LIB_DIR=/home/atmos/Compiler/flex/lib' 'YACC=/home/atmos/Compiler/yacc/bin/yacc –d' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop all --start-pid 0 --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 --exec-wdir /home/atmos/Model/WRFV3.3/WRFV3/run --exec-args 1 ./wrf.exe [mpiexec@atmos.cmu] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1 Arguments being passed to proxy 1: --version 1.3.2p1 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname compute-0-0 --global-core-count 12 --global-process-count 12 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_4367_0 --pmi-process-mapping (vector,(0,6,2)) --ckpoint-num -1 --global-inherited-env 71 'LANG=en_US.iso885915' 'USER=atmos' 'LOGNAME=atmos' 'HOME=/home/atmos' 'PATH=/home/atmos/Utility/ncview/bin:/home/atmos/Compiler/netcdf-3.6.2/bin:/home/atmos/Compiler/ncarg-4.4.2/bin:/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin:/home/atmos/Compiler/pgi/linux86-64/7.1-1/bin:/opt/openmpi/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/opt/bio/ncbi/bin:/opt/bio/mpiblast/bin/:/opt/bio/EMBOSS/bin:/opt/bio/clustalw/bin:/opt/bio/t_coffee/bin:/opt/bio/phylip/exe:/opt/bio/mrbayes:/opt/bio/fasta:/opt/bio/glimmer/bin:/opt/bio/glimmer/scripts:/opt/bio/gmap/bin:/opt/bio/gromacs/bin:/opt/bio/autodocksuite/bin:/opt/bio/wgs/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/rocks/bin:/opt/rocks/sbin:/opt/gridengine/bin/lx26-amd64' 'MAIL=/var/spool/mail/atmos' 'SHELL=/bin/csh' 'SSH_CLIENT=192.168.1.1 53550 22' 'SSH_CONNECTION=192.168.1.1 53550 192.168.1.105 22' 'SSH_TTY=/dev/pts/2' 'TERM=xterm' 'DISPLAY=localhost:10.0' 'HOSTTYPE=x86_64-linux' 'VENDOR=unknown' 'OSTYPE=linux' 'MACHTYPE=x86_64' 'SHLVL=1' 'PWD=/home/atmos/Model/WRFV3.3/WRFV3/run' 'GROUP=atmos' 'HOST=atmos.cmu' 'REMOTEHOST=192.168.1.1' 'HOSTNAME=atmos.cmu' 'INPUTRC=/etc/inputrc' 'ANT_HOME=/opt/rocks' 'BIOROLL=/opt/bio' 'BLASTDB=/home/atmos/bio/ncbi/db' 'BLASTMAT=/opt/bio/ncbi/data' 'HMMER_DB=/home/atmos/bio/hmmer/db' 'LS_COLORS=no' 'ECLIPSE_HOME=/opt/eclipse' 'G_BROKEN_FILENAMES=1' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'JAVA_HOME=/usr/java/latest' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'LD_LIBRARY_PATH=/home/atmos/Compiler/pgi/linux86-64/7.1-1/libso' 'ROCKS_ROOT=/opt/rocks' 'ROCKSROOT=/opt/rocks/share/devel' 'ROLLSROOT=/opt/rocks/share/devel/src/roll' 'SGE_ROOT=/opt/gridengine' 'SGE_ARCH=lx26-amd64' 'SGE_CELL=default' 'SGE_QMASTER_PORT=536' 'SGE_EXECD_PORT=537' 'MPICH_PROCESS_GROUP=no' 'PGI=/home/atmos/Compiler/pgi' 'PGRSH=ssh' 'HYDRA_HOST_FILE=/home/atmos/hosts' 'HYDRA_ENV=all' 'MPIEXEC_TIMEOUT=120000' 'MPIEXEC_PORT_RANGE=40000:60000' 'NCARG_ROOT=/home/atmos/Compiler/ncarg-4.4.2' 'NCARG_BIN=/home/atmos/Compiler/ncarg-4.4.2/bin' 'NCARG_LIB=/home/atmos/Compiler/ncarg-4.4.2/lib' 'NCARG_INCLUDE=/home/atmos/Compiler/ncarg-4.4.2/include' 'NCARG_MAN=/home/atmos/Compiler/ncarg-4.4.2/man' 'NETCDF=/home/atmos/Compiler/netcdf-3.6.2' 'NETCDF_LIB=/home/atmos/Compiler/netcdf-3.6.2/lib' 'NETCDF_BIN=/home/atmos/Compiler/netcdf-3.6.2/bin' 
'NETCDF_INCLUDE=/home/atmos/Compiler/netcdf-3.6.2/include' 'NETCDF_MAN=/home/atmos/Compiler/netcdf-3.6.2/man' 'WRF_EM_CORE=1' 'WRF_CHEM=1' 'WRF_KPP=1' 'WRFIO_NCD_LARGE_FILE_SUPPORT=1' 'CC=pgcc' 'CFLAGS=-O2 -Msignextend -V -fPIC' 'FFLAGS=-O2 -w -V' 'CXX=pgCC' 'CPPFLAGS=-DNDEBUG -DpgiFortran' 'FLEX_LIB_DIR=/home/atmos/Compiler/flex/lib' 'YACC=/home/atmos/Compiler/yacc/bin/yacc –d' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop all --start-pid 2 --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 --exec-wdir /home/atmos/Model/WRFV3.3/WRFV3/run --exec-args 1 ./wrf.exe [mpiexec@atmos.cmu] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1 Arguments being passed to proxy 2: --version 1.3.2p1 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname compute-0-1 --global-core-count 12 --global-process-count 12 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_4367_0 --pmi-process-mapping (vector,(0,6,2)) --ckpoint-num -1 --global-inherited-env 71 'LANG=en_US.iso885915' 'USER=atmos' 'LOGNAME=atmos' 'HOME=/home/atmos' 'PATH=/home/atmos/Utility/ncview/bin:/home/atmos/Compiler/netcdf-3.6.2/bin:/home/atmos/Compiler/ncarg-4.4.2/bin:/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin:/home/atmos/Compiler/pgi/linux86-64/7.1-1/bin:/opt/openmpi/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/opt/bio/ncbi/bin:/opt/bio/mpiblast/bin/:/opt/bio/EMBOSS/bin:/opt/bio/clustalw/bin:/opt/bio/t_coffee/bin:/opt/bio/phylip/exe:/opt/bio/mrbayes:/opt/bio/fasta:/opt/bio/glimmer/bin:/opt/bio/glimmer/scripts:/opt/bio/gmap/bin:/opt/bio/gromacs/bin:/opt/bio/autodocksuite/bin:/opt/bio/wgs/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/rocks/bin:/opt/rocks/sbin:/opt/gridengine/bin/lx26-amd64' 'MAIL=/var/spool/mail/atmos' 'SHELL=/bin/csh' 'SSH_CLIENT=192.168.1.1 53550 22' 'SSH_CONNECTION=192.168.1.1 53550 192.168.1.105 22' 'SSH_TTY=/dev/pts/2' 'TERM=xterm' 'DISPLAY=localhost:10.0' 'HOSTTYPE=x86_64-linux' 'VENDOR=unknown' 'OSTYPE=linux' 'MACHTYPE=x86_64' 'SHLVL=1' 'PWD=/home/atmos/Model/WRFV3.3/WRFV3/run' 'GROUP=atmos' 'HOST=atmos.cmu' 'REMOTEHOST=192.168.1.1' 'HOSTNAME=atmos.cmu' 'INPUTRC=/etc/inputrc' 'ANT_HOME=/opt/rocks' 'BIOROLL=/opt/bio' 'BLASTDB=/home/atmos/bio/ncbi/db' 'BLASTMAT=/opt/bio/ncbi/data' 'HMMER_DB=/home/atmos/bio/hmmer/db' 'LS_COLORS=no' 'ECLIPSE_HOME=/opt/eclipse' 'G_BROKEN_FILENAMES=1' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'JAVA_HOME=/usr/java/latest' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'LD_LIBRARY_PATH=/home/atmos/Compiler/pgi/linux86-64/7.1-1/libso' 'ROCKS_ROOT=/opt/rocks' 'ROCKSROOT=/opt/rocks/share/devel' 'ROLLSROOT=/opt/rocks/share/devel/src/roll' 'SGE_ROOT=/opt/gridengine' 'SGE_ARCH=lx26-amd64' 'SGE_CELL=default' 'SGE_QMASTER_PORT=536' 'SGE_EXECD_PORT=537' 'MPICH_PROCESS_GROUP=no' 'PGI=/home/atmos/Compiler/pgi' 'PGRSH=ssh' 'HYDRA_HOST_FILE=/home/atmos/hosts' 'HYDRA_ENV=all' 'MPIEXEC_TIMEOUT=120000' 'MPIEXEC_PORT_RANGE=40000:60000' 'NCARG_ROOT=/home/atmos/Compiler/ncarg-4.4.2' 'NCARG_BIN=/home/atmos/Compiler/ncarg-4.4.2/bin' 'NCARG_LIB=/home/atmos/Compiler/ncarg-4.4.2/lib' 'NCARG_INCLUDE=/home/atmos/Compiler/ncarg-4.4.2/include' 'NCARG_MAN=/home/atmos/Compiler/ncarg-4.4.2/man' 'NETCDF=/home/atmos/Compiler/netcdf-3.6.2' 'NETCDF_LIB=/home/atmos/Compiler/netcdf-3.6.2/lib' 'NETCDF_BIN=/home/atmos/Compiler/netcdf-3.6.2/bin' 'NETCDF_INCLUDE=/home/atmos/Compiler/netcdf-3.6.2/include' 'NETCDF_MAN=/home/atmos/Compiler/netcdf-3.6.2/man' 'WRF_EM_CORE=1' 'WRF_CHEM=1' 'WRF_KPP=1' 
'WRFIO_NCD_LARGE_FILE_SUPPORT=1' 'CC=pgcc' 'CFLAGS=-O2 -Msignextend -V -fPIC' 'FFLAGS=-O2 -w -V' 'CXX=pgCC' 'CPPFLAGS=-DNDEBUG -DpgiFortran' 'FLEX_LIB_DIR=/home/atmos/Compiler/flex/lib' 'YACC=/home/atmos/Compiler/yacc/bin/yacc –d' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop all --start-pid 4 --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 --exec-wdir /home/atmos/Model/WRFV3.3/WRFV3/run --exec-args 1 ./wrf.exe [mpiexec@atmos.cmu] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1 Arguments being passed to proxy 3: --version 1.3.2p1 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname compute-0-2 --global-core-count 12 --global-process-count 12 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_4367_0 --pmi-process-mapping (vector,(0,6,2)) --ckpoint-num -1 --global-inherited-env 71 'LANG=en_US.iso885915' 'USER=atmos' 'LOGNAME=atmos' 'HOME=/home/atmos' 'PATH=/home/atmos/Utility/ncview/bin:/home/atmos/Compiler/netcdf-3.6.2/bin:/home/atmos/Compiler/ncarg-4.4.2/bin:/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin:/home/atmos/Compiler/pgi/linux86-64/7.1-1/bin:/opt/openmpi/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/opt/bio/ncbi/bin:/opt/bio/mpiblast/bin/:/opt/bio/EMBOSS/bin:/opt/bio/clustalw/bin:/opt/bio/t_coffee/bin:/opt/bio/phylip/exe:/opt/bio/mrbayes:/opt/bio/fasta:/opt/bio/glimmer/bin:/opt/bio/glimmer/scripts:/opt/bio/gmap/bin:/opt/bio/gromacs/bin:/opt/bio/autodocksuite/bin:/opt/bio/wgs/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/rocks/bin:/opt/rocks/sbin:/opt/gridengine/bin/lx26-amd64' 'MAIL=/var/spool/mail/atmos' 'SHELL=/bin/csh' 'SSH_CLIENT=192.168.1.1 53550 22' 'SSH_CONNECTION=192.168.1.1 53550 192.168.1.105 22' 'SSH_TTY=/dev/pts/2' 'TERM=xterm' 'DISPLAY=localhost:10.0' 'HOSTTYPE=x86_64-linux' 'VENDOR=unknown' 'OSTYPE=linux' 'MACHTYPE=x86_64' 'SHLVL=1' 'PWD=/home/atmos/Model/WRFV3.3/WRFV3/run' 'GROUP=atmos' 'HOST=atmos.cmu' 'REMOTEHOST=192.168.1.1' 'HOSTNAME=atmos.cmu' 'INPUTRC=/etc/inputrc' 'ANT_HOME=/opt/rocks' 'BIOROLL=/opt/bio' 'BLASTDB=/home/atmos/bio/ncbi/db' 'BLASTMAT=/opt/bio/ncbi/data' 'HMMER_DB=/home/atmos/bio/hmmer/db' 'LS_COLORS=no' 'ECLIPSE_HOME=/opt/eclipse' 'G_BROKEN_FILENAMES=1' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'JAVA_HOME=/usr/java/latest' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'LD_LIBRARY_PATH=/home/atmos/Compiler/pgi/linux86-64/7.1-1/libso' 'ROCKS_ROOT=/opt/rocks' 'ROCKSROOT=/opt/rocks/share/devel' 'ROLLSROOT=/opt/rocks/share/devel/src/roll' 'SGE_ROOT=/opt/gridengine' 'SGE_ARCH=lx26-amd64' 'SGE_CELL=default' 'SGE_QMASTER_PORT=536' 'SGE_EXECD_PORT=537' 'MPICH_PROCESS_GROUP=no' 'PGI=/home/atmos/Compiler/pgi' 'PGRSH=ssh' 'HYDRA_HOST_FILE=/home/atmos/hosts' 'HYDRA_ENV=all' 'MPIEXEC_TIMEOUT=120000' 'MPIEXEC_PORT_RANGE=40000:60000' 'NCARG_ROOT=/home/atmos/Compiler/ncarg-4.4.2' 'NCARG_BIN=/home/atmos/Compiler/ncarg-4.4.2/bin' 'NCARG_LIB=/home/atmos/Compiler/ncarg-4.4.2/lib' 'NCARG_INCLUDE=/home/atmos/Compiler/ncarg-4.4.2/include' 'NCARG_MAN=/home/atmos/Compiler/ncarg-4.4.2/man' 'NETCDF=/home/atmos/Compiler/netcdf-3.6.2' 'NETCDF_LIB=/home/atmos/Compiler/netcdf-3.6.2/lib' 'NETCDF_BIN=/home/atmos/Compiler/netcdf-3.6.2/bin' 'NETCDF_INCLUDE=/home/atmos/Compiler/netcdf-3.6.2/include' 'NETCDF_MAN=/home/atmos/Compiler/netcdf-3.6.2/man' 'WRF_EM_CORE=1' 'WRF_CHEM=1' 'WRF_KPP=1' 'WRFIO_NCD_LARGE_FILE_SUPPORT=1' 'CC=pgcc' 'CFLAGS=-O2 -Msignextend -V -fPIC' 'FFLAGS=-O2 -w -V' 'CXX=pgCC' 'CPPFLAGS=-DNDEBUG -DpgiFortran' 
'FLEX_LIB_DIR=/home/atmos/Compiler/flex/lib' 'YACC=/home/atmos/Compiler/yacc/bin/yacc –d' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop all --start-pid 6 --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 --exec-wdir /home/atmos/Model/WRFV3.3/WRFV3/run --exec-args 1 ./wrf.exe [mpiexec@atmos.cmu] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1 Arguments being passed to proxy 4: --version 1.3.2p1 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname compute-0-3 --global-core-count 12 --global-process-count 12 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_4367_0 --pmi-process-mapping (vector,(0,6,2)) --ckpoint-num -1 --global-inherited-env 71 'LANG=en_US.iso885915' 'USER=atmos' 'LOGNAME=atmos' 'HOME=/home/atmos' 'PATH=/home/atmos/Utility/ncview/bin:/home/atmos/Compiler/netcdf-3.6.2/bin:/home/atmos/Compiler/ncarg-4.4.2/bin:/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin:/home/atmos/Compiler/pgi/linux86-64/7.1-1/bin:/opt/openmpi/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/opt/bio/ncbi/bin:/opt/bio/mpiblast/bin/:/opt/bio/EMBOSS/bin:/opt/bio/clustalw/bin:/opt/bio/t_coffee/bin:/opt/bio/phylip/exe:/opt/bio/mrbayes:/opt/bio/fasta:/opt/bio/glimmer/bin:/opt/bio/glimmer/scripts:/opt/bio/gmap/bin:/opt/bio/gromacs/bin:/opt/bio/autodocksuite/bin:/opt/bio/wgs/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/rocks/bin:/opt/rocks/sbin:/opt/gridengine/bin/lx26-amd64' 'MAIL=/var/spool/mail/atmos' 'SHELL=/bin/csh' 'SSH_CLIENT=192.168.1.1 53550 22' 'SSH_CONNECTION=192.168.1.1 53550 192.168.1.105 22' 'SSH_TTY=/dev/pts/2' 'TERM=xterm' 'DISPLAY=localhost:10.0' 'HOSTTYPE=x86_64-linux' 'VENDOR=unknown' 'OSTYPE=linux' 'MACHTYPE=x86_64' 'SHLVL=1' 'PWD=/home/atmos/Model/WRFV3.3/WRFV3/run' 'GROUP=atmos' 'HOST=atmos.cmu' 'REMOTEHOST=192.168.1.1' 'HOSTNAME=atmos.cmu' 'INPUTRC=/etc/inputrc' 'ANT_HOME=/opt/rocks' 'BIOROLL=/opt/bio' 'BLASTDB=/home/atmos/bio/ncbi/db' 'BLASTMAT=/opt/bio/ncbi/data' 'HMMER_DB=/home/atmos/bio/hmmer/db' 'LS_COLORS=no' 'ECLIPSE_HOME=/opt/eclipse' 'G_BROKEN_FILENAMES=1' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'JAVA_HOME=/usr/java/latest' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'LD_LIBRARY_PATH=/home/atmos/Compiler/pgi/linux86-64/7.1-1/libso' 'ROCKS_ROOT=/opt/rocks' 'ROCKSROOT=/opt/rocks/share/devel' 'ROLLSROOT=/opt/rocks/share/devel/src/roll' 'SGE_ROOT=/opt/gridengine' 'SGE_ARCH=lx26-amd64' 'SGE_CELL=default' 'SGE_QMASTER_PORT=536' 'SGE_EXECD_PORT=537' 'MPICH_PROCESS_GROUP=no' 'PGI=/home/atmos/Compiler/pgi' 'PGRSH=ssh' 'HYDRA_HOST_FILE=/home/atmos/hosts' 'HYDRA_ENV=all' 'MPIEXEC_TIMEOUT=120000' 'MPIEXEC_PORT_RANGE=40000:60000' 'NCARG_ROOT=/home/atmos/Compiler/ncarg-4.4.2' 'NCARG_BIN=/home/atmos/Compiler/ncarg-4.4.2/bin' 'NCARG_LIB=/home/atmos/Compiler/ncarg-4.4.2/lib' 'NCARG_INCLUDE=/home/atmos/Compiler/ncarg-4.4.2/include' 'NCARG_MAN=/home/atmos/Compiler/ncarg-4.4.2/man' 'NETCDF=/home/atmos/Compiler/netcdf-3.6.2' 'NETCDF_LIB=/home/atmos/Compiler/netcdf-3.6.2/lib' 'NETCDF_BIN=/home/atmos/Compiler/netcdf-3.6.2/bin' 'NETCDF_INCLUDE=/home/atmos/Compiler/netcdf-3.6.2/include' 'NETCDF_MAN=/home/atmos/Compiler/netcdf-3.6.2/man' 'WRF_EM_CORE=1' 'WRF_CHEM=1' 'WRF_KPP=1' 'WRFIO_NCD_LARGE_FILE_SUPPORT=1' 'CC=pgcc' 'CFLAGS=-O2 -Msignextend -V -fPIC' 'FFLAGS=-O2 -w -V' 'CXX=pgCC' 'CPPFLAGS=-DNDEBUG -DpgiFortran' 'FLEX_LIB_DIR=/home/atmos/Compiler/flex/lib' 'YACC=/home/atmos/Compiler/yacc/bin/yacc –d' --global-user-env 0 --global-system-env 1 
'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop all --start-pid 8 --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 --exec-wdir /home/atmos/Model/WRFV3.3/WRFV3/run --exec-args 1 ./wrf.exe [mpiexec@atmos.cmu] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1 Arguments being passed to proxy 5: --version 1.3.2p1 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname compute-0-4 --global-core-count 12 --global-process-count 12 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_4367_0 --pmi-process-mapping (vector,(0,6,2)) --ckpoint-num -1 --global-inherited-env 71 'LANG=en_US.iso885915' 'USER=atmos' 'LOGNAME=atmos' 'HOME=/home/atmos' 'PATH=/home/atmos/Utility/ncview/bin:/home/atmos/Compiler/netcdf-3.6.2/bin:/home/atmos/Compiler/ncarg-4.4.2/bin:/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin:/home/atmos/Compiler/pgi/linux86-64/7.1-1/bin:/opt/openmpi/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/opt/bio/ncbi/bin:/opt/bio/mpiblast/bin/:/opt/bio/EMBOSS/bin:/opt/bio/clustalw/bin:/opt/bio/t_coffee/bin:/opt/bio/phylip/exe:/opt/bio/mrbayes:/opt/bio/fasta:/opt/bio/glimmer/bin:/opt/bio/glimmer/scripts:/opt/bio/gmap/bin:/opt/bio/gromacs/bin:/opt/bio/autodocksuite/bin:/opt/bio/wgs/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/rocks/bin:/opt/rocks/sbin:/opt/gridengine/bin/lx26-amd64' 'MAIL=/var/spool/mail/atmos' 'SHELL=/bin/csh' 'SSH_CLIENT=192.168.1.1 53550 22' 'SSH_CONNECTION=192.168.1.1 53550 192.168.1.105 22' 'SSH_TTY=/dev/pts/2' 'TERM=xterm' 'DISPLAY=localhost:10.0' 'HOSTTYPE=x86_64-linux' 'VENDOR=unknown' 'OSTYPE=linux' 'MACHTYPE=x86_64' 'SHLVL=1' 'PWD=/home/atmos/Model/WRFV3.3/WRFV3/run' 'GROUP=atmos' 'HOST=atmos.cmu' 'REMOTEHOST=192.168.1.1' 'HOSTNAME=atmos.cmu' 'INPUTRC=/etc/inputrc' 'ANT_HOME=/opt/rocks' 'BIOROLL=/opt/bio' 'BLASTDB=/home/atmos/bio/ncbi/db' 'BLASTMAT=/opt/bio/ncbi/data' 'HMMER_DB=/home/atmos/bio/hmmer/db' 'LS_COLORS=no' 'ECLIPSE_HOME=/opt/eclipse' 'G_BROKEN_FILENAMES=1' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'JAVA_HOME=/usr/java/latest' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'LD_LIBRARY_PATH=/home/atmos/Compiler/pgi/linux86-64/7.1-1/libso' 'ROCKS_ROOT=/opt/rocks' 'ROCKSROOT=/opt/rocks/share/devel' 'ROLLSROOT=/opt/rocks/share/devel/src/roll' 'SGE_ROOT=/opt/gridengine' 'SGE_ARCH=lx26-amd64' 'SGE_CELL=default' 'SGE_QMASTER_PORT=536' 'SGE_EXECD_PORT=537' 'MPICH_PROCESS_GROUP=no' 'PGI=/home/atmos/Compiler/pgi' 'PGRSH=ssh' 'HYDRA_HOST_FILE=/home/atmos/hosts' 'HYDRA_ENV=all' 'MPIEXEC_TIMEOUT=120000' 'MPIEXEC_PORT_RANGE=40000:60000' 'NCARG_ROOT=/home/atmos/Compiler/ncarg-4.4.2' 'NCARG_BIN=/home/atmos/Compiler/ncarg-4.4.2/bin' 'NCARG_LIB=/home/atmos/Compiler/ncarg-4.4.2/lib' 'NCARG_INCLUDE=/home/atmos/Compiler/ncarg-4.4.2/include' 'NCARG_MAN=/home/atmos/Compiler/ncarg-4.4.2/man' 'NETCDF=/home/atmos/Compiler/netcdf-3.6.2' 'NETCDF_LIB=/home/atmos/Compiler/netcdf-3.6.2/lib' 'NETCDF_BIN=/home/atmos/Compiler/netcdf-3.6.2/bin' 'NETCDF_INCLUDE=/home/atmos/Compiler/netcdf-3.6.2/include' 'NETCDF_MAN=/home/atmos/Compiler/netcdf-3.6.2/man' 'WRF_EM_CORE=1' 'WRF_CHEM=1' 'WRF_KPP=1' 'WRFIO_NCD_LARGE_FILE_SUPPORT=1' 'CC=pgcc' 'CFLAGS=-O2 -Msignextend -V -fPIC' 'FFLAGS=-O2 -w -V' 'CXX=pgCC' 'CPPFLAGS=-DNDEBUG -DpgiFortran' 'FLEX_LIB_DIR=/home/atmos/Compiler/flex/lib' 'YACC=/home/atmos/Compiler/yacc/bin/yacc –d' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop all --start-pid 10 --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 
--exec-wdir /home/atmos/Model/WRFV3.3/WRFV3/run --exec-args 1 ./wrf.exe [mpiexec@atmos.cmu] Launch arguments: /home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin/hydra_pmi_proxy --control-port atmos:40001 --debug --demux poll --pgid 0 --proxy-id 0 [mpiexec@atmos.cmu] Launch arguments: /usr/bin/ssh -x compute-0-0 "/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin/hydra_pmi_proxy" --control-port atmos:40001 --debug --demux poll --pgid 0 --proxy-id 1 [mpiexec@atmos.cmu] Launch arguments: /usr/bin/ssh -x compute-0-1 "/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin/hydra_pmi_proxy" --control-port atmos:40001 --debug --demux poll --pgid 0 --proxy-id 2 [mpiexec@atmos.cmu] Launch arguments: /usr/bin/ssh -x compute-0-2 "/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin/hydra_pmi_proxy" --control-port atmos:40001 --debug --demux poll --pgid 0 --proxy-id 3 [mpiexec@atmos.cmu] Launch arguments: /usr/bin/ssh -x compute-0-3 "/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin/hydra_pmi_proxy" --control-port atmos:40001 --debug --demux poll --pgid 0 --proxy-id 4 [mpiexec@atmos.cmu] Launch arguments: /usr/bin/ssh -x compute-0-4 "/home/atmos/Complier/pgi/linux86-64/7.1/mpi/mpich/bin/hydra_pmi_proxy" --control-port atmos:40001 --debug --demux poll --pgid 0 --proxy-id 5 [proxy:0:0@atmos.cmu] got pmi command (from 0): init pmi_version=1 pmi_subversion=1 [proxy:0:0@atmos.cmu] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@atmos.cmu] got pmi command (from 0): get_maxes [proxy:0:0@atmos.cmu] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@atmos.cmu] got pmi command (from 0): get_appnum [proxy:0:0@atmos.cmu] PMI response: cmd=appnum appnum=0 [proxy:0:0@atmos.cmu] got pmi command (from 0): get_my_kvsname [proxy:0:0@atmos.cmu] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:0@atmos.cmu] got pmi command (from 0): get_my_kvsname [proxy:0:0@atmos.cmu] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:0@atmos.cmu] got pmi command (from 0): get kvsname=kvs_4367_0 key=PMI_process_mapping [proxy:0:0@atmos.cmu] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:0@atmos.cmu] got pmi command (from 0): put kvsname=kvs_4367_0 key=sharedFilename[0] value=/dev/shm/mpich_shar_tmpHTpsRd [proxy:0:0@atmos.cmu] we don't understand this command put; forwarding upstream [proxy:0:0@atmos.cmu] got pmi command (from 6): init pmi_version=1 pmi_subversion=1 [proxy:0:0@atmos.cmu] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@atmos.cmu] got pmi command (from 6): get_maxes [proxy:0:0@atmos.cmu] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@atmos.cmu] got pmi command (from 6): get_appnum [proxy:0:0@atmos.cmu] PMI response: cmd=appnum appnum=0 [proxy:0:0@atmos.cmu] got pmi command (from 6): get_my_kvsname [proxy:0:0@atmos.cmu] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:0@atmos.cmu] [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=sharedFilename[0] value=/dev/shm/mpich_shar_tmpHTpsRd [mpiexec@atmos.cmu] PMI response to fd 6 pid 0: cmd=put_result rc=0 msg=success got pmi command (from 6): get_my_kvsname [proxy:0:0@atmos.cmu] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:0@atmos.cmu] got pmi command (from 6): get kvsname=kvs_4367_0 key=PMI_process_mapping [proxy:0:0@atmos.cmu] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:0@atmos.cmu] got pmi command (from 6): 
barrier_in [proxy:0:0@atmos.cmu] we don't understand the response put_result; forwarding downstream [proxy:0:0@atmos.cmu] got pmi command (from 0): barrier_in [proxy:0:0@atmos.cmu] forwarding command (cmd=barrier_in) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:1@compute-0-0.local] got pmi command (from 4): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-0.local] got pmi command (from 5): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-0.local] got pmi command (from 4): get_maxes [proxy:0:1@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-0.local] got pmi command (from 5): get_maxes [proxy:0:1@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-0.local] got pmi command (from 4): get_appnum [proxy:0:1@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-0.local] got pmi command (from 5): get_appnum [proxy:0:1@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-0.local] got pmi command (from 4): get_my_kvsname [proxy:0:1@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:1@compute-0-0.local] got pmi command (from 5): get_my_kvsname [proxy:0:1@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:1@compute-0-0.local] got pmi command (from 4): get_my_kvsname [proxy:0:1@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:1@compute-0-0.local] got pmi command (from 5): get_my_kvsname [proxy:0:1@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:1@compute-0-0.local] got pmi command (from 4): get kvsname=kvs_4367_0 key=PMI_process_mapping [proxy:0:1@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:1@compute-0-0.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=PMI_process_mapping [proxy:0:1@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:1@compute-0-0.local] got pmi command (from 5): barrier_in [proxy:0:1@compute-0-0.local] got pmi command (from 4): put kvsname=kvs_4367_0 key=sharedFilename[2] value=/dev/shm/mpich_shar_tmpZ9gXZT [proxy:0:1@compute-0-0.local] we don't understand this command put; forwarding upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=sharedFilename[2] value=/dev/shm/mpich_shar_tmpZ9gXZT [mpiexec@atmos.cmu] PMI response to fd 7 pid 4: cmd=put_result rc=0 msg=success [proxy:0:1@compute-0-0.local] we don't understand the response put_result; forwarding downstream [proxy:0:1@compute-0-0.local] got pmi command (from 4): barrier_in [proxy:0:1@compute-0-0.local] forwarding command (cmd=barrier_in) upstream [proxy:0:2@compute-0-1.local] got pmi command (from 4): init pmi_version=1 pmi_subversion=1 [proxy:0:2@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:2@compute-0-1.local] got pmi command (from 5): init pmi_version=1 pmi_subversion=1 [proxy:0:2@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:2@compute-0-1.local] got pmi command (from 4): get_maxes [proxy:0:2@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 
vallen_max=1024 [proxy:0:2@compute-0-1.local] got pmi command (from 5): get_maxes [proxy:0:2@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:2@compute-0-1.local] got pmi command (from 4): get_appnum [proxy:0:2@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:2@compute-0-1.local] got pmi command (from 5): get_appnum [proxy:0:2@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:2@compute-0-1.local] got pmi command (from 4): get_my_kvsname [proxy:0:2@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:2@compute-0-1.local] got pmi command (from 5): get_my_kvsname [proxy:0:2@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:2@compute-0-1.local] got pmi command (from 4): get_my_kvsname [proxy:0:2@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:2@compute-0-1.local] got pmi command (from 5): get_my_kvsname [proxy:0:2@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:2@compute-0-1.local] got pmi command (from 4): get kvsname=kvs_4367_0 key=PMI_process_mapping [proxy:0:2@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:2@compute-0-1.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=PMI_process_mapping [proxy:0:2@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:2@compute-0-1.local] got pmi command (from 4): put kvsname=kvs_4367_0 key=sharedFilename[4] value=/dev/shm/mpich_shar_tmpKbrVeP [proxy:0:2@compute-0-1.local] we don't understand this command put; forwarding upstream [proxy:0:2@compute-0-1.local] got pmi command (from 5): barrier_in [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=sharedFilename[4] value=/dev/shm/mpich_shar_tmpKbrVeP [mpiexec@atmos.cmu] PMI response to fd 24 pid 4: cmd=put_result rc=0 msg=success [proxy:0:2@compute-0-1.local] we don't understand the response put_result; forwarding downstream [proxy:0:2@compute-0-1.local] got pmi command (from 4): barrier_in [proxy:0:2@compute-0-1.local] forwarding command (cmd=barrier_in) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:3@compute-0-2.local] got pmi command (from 4): init pmi_version=1 pmi_subversion=1 [proxy:0:3@compute-0-2.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:3@compute-0-2.local] got pmi command (from 4): get_maxes [proxy:0:3@compute-0-2.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:3@compute-0-2.local] got pmi command (from 4): get_appnum [proxy:0:3@compute-0-2.local] PMI response: cmd=appnum appnum=0 [proxy:0:3@compute-0-2.local] got pmi command (from 5): init pmi_version=1 pmi_subversion=1 [proxy:0:3@compute-0-2.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:3@compute-0-2.local] got pmi command (from 4): get_my_kvsname [proxy:0:3@compute-0-2.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:3@compute-0-2.local] got pmi command (from 5): get_maxes [proxy:0:3@compute-0-2.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:3@compute-0-2.local] got pmi command (from 4): get_my_kvsname [proxy:0:3@compute-0-2.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:3@compute-0-2.local] got pmi command (from 5): get_appnum 
[proxy:0:3@compute-0-2.local] PMI response: cmd=appnum appnum=0 [proxy:0:3@compute-0-2.local] got pmi command (from 4): get kvsname=kvs_4367_0 key=PMI_process_mapping [proxy:0:3@compute-0-2.local] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:3@compute-0-2.local] got pmi command (from 5): get_my_kvsname [proxy:0:3@compute-0-2.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:3@compute-0-2.local] got pmi command (from 4): put kvsname=kvs_4367_0 key=sharedFilename[6] value=/dev/shm/mpich_shar_tmplRntkZ [proxy:0:3@compute-0-2.local] we don't understand this command put; forwarding upstream [proxy:0:3@compute-0-2.local] got pmi command (from 5): get_my_kvsname [proxy:0:3@compute-0-2.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:3@compute-0-2.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=PMI_process_mapping [proxy:0:3@compute-0-2.local] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:3@compute-0-2.local] got pmi command (from 5): barrier_in [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=sharedFilename[6] value=/dev/shm/mpich_shar_tmplRntkZ [mpiexec@atmos.cmu] PMI response to fd 26 pid 4: cmd=put_result rc=0 msg=success [proxy:0:3@compute-0-2.local] we don't understand the response put_result; forwarding downstream [proxy:0:3@compute-0-2.local] [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in got pmi command (from 4): barrier_in [proxy:0:3@compute-0-2.local] forwarding command (cmd=barrier_in) upstream [proxy:0:4@compute-0-3.local] got pmi command (from 4): init pmi_version=1 pmi_subversion=1 [proxy:0:4@compute-0-3.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:4@compute-0-3.local] got pmi command (from 5): init pmi_version=1 pmi_subversion=1 [proxy:0:4@compute-0-3.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:4@compute-0-3.local] got pmi command (from 4): get_maxes [proxy:0:4@compute-0-3.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:4@compute-0-3.local] got pmi command (from 5): get_maxes [proxy:0:4@compute-0-3.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:4@compute-0-3.local] got pmi command (from 4): get_appnum [proxy:0:4@compute-0-3.local] PMI response: cmd=appnum appnum=0 [proxy:0:4@compute-0-3.local] got pmi command (from 5): get_appnum [proxy:0:4@compute-0-3.local] PMI response: cmd=appnum appnum=0 [proxy:0:4@compute-0-3.local] got pmi command (from 4): get_my_kvsname [proxy:0:4@compute-0-3.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:4@compute-0-3.local] got pmi command (from 5): get_my_kvsname [proxy:0:4@compute-0-3.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:4@compute-0-3.local] got pmi command (from 4): get_my_kvsname [proxy:0:4@compute-0-3.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:4@compute-0-3.local] got pmi command (from 5): get_my_kvsname [proxy:0:4@compute-0-3.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:4@compute-0-3.local] got pmi command (from 4): get kvsname=kvs_4367_0 key=PMI_process_mapping [proxy:0:4@compute-0-3.local] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:4@compute-0-3.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=PMI_process_mapping [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=sharedFilename[8] 
value=/dev/shm/mpich_shar_tmpR6UpAr [mpiexec@atmos.cmu] PMI response to fd 27 pid 4: cmd=put_result rc=0 msg=success [proxy:0:4@compute-0-3.local] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:4@compute-0-3.local] got pmi command (from 4): put kvsname=kvs_4367_0 key=sharedFilename[8] value=/dev/shm/mpich_shar_tmpR6UpAr [proxy:0:4@compute-0-3.local] we don't understand this command put; forwarding upstream [proxy:0:4@compute-0-3.local] got pmi command (from 5): barrier_in [proxy:0:4@compute-0-3.local] we don't understand the response put_result; forwarding downstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:4@compute-0-3.local] got pmi command (from 4): barrier_in [proxy:0:4@compute-0-3.local] forwarding command (cmd=barrier_in) upstream [proxy:0:5@compute-0-4.local] got pmi command (from 4): init pmi_version=1 pmi_subversion=1 [proxy:0:5@compute-0-4.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:5@compute-0-4.local] got pmi command (from 5): init pmi_version=1 pmi_subversion=1 [proxy:0:5@compute-0-4.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:5@compute-0-4.local] got pmi command (from 4): get_maxes [proxy:0:5@compute-0-4.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:5@compute-0-4.local] got pmi command (from 5): get_maxes [proxy:0:5@compute-0-4.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:5@compute-0-4.local] got pmi command (from 4): get_appnum [proxy:0:5@compute-0-4.local] PMI response: cmd=appnum appnum=0 [proxy:0:5@compute-0-4.local] got pmi command (from 5): get_appnum [proxy:0:5@compute-0-4.local] PMI response: cmd=appnum appnum=0 [proxy:0:5@compute-0-4.local] got pmi command (from 4): get_my_kvsname [proxy:0:5@compute-0-4.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:5@compute-0-4.local] got pmi command (from 5): get_my_kvsname [proxy:0:5@compute-0-4.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:5@compute-0-4.local] got pmi command (from 4): get_my_kvsname [proxy:0:5@compute-0-4.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:5@compute-0-4.local] got pmi command (from 5): get_my_kvsname [proxy:0:5@compute-0-4.local] PMI response: cmd=my_kvsname kvsname=kvs_4367_0 [proxy:0:5@compute-0-4.local] got pmi command (from 4): get kvsname=kvs_4367_0 key=PMI_process_mapping [proxy:0:5@compute-0-4.local] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:5@compute-0-4.local] [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=sharedFilename[10] value=/dev/shm/mpich_shar_tmp7eEWNr [mpiexec@atmos.cmu] PMI response to fd 28 pid 4: cmd=put_result rc=0 msg=success [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@atmos.cmu] PMI response to fd 6 pid 4: cmd=barrier_out [mpiexec@atmos.cmu] PMI response to fd 7 pid 4: cmd=barrier_out [mpiexec@atmos.cmu] PMI response to fd 24 pid 4: cmd=barrier_out [mpiexec@atmos.cmu] PMI response to fd 26 pid 4: cmd=barrier_out [mpiexec@atmos.cmu] PMI response to fd 27 pid 4: cmd=barrier_out [mpiexec@atmos.cmu] PMI response to fd 28 pid 4: cmd=barrier_out [proxy:0:0@atmos.cmu] PMI response: cmd=barrier_out [proxy:0:0@atmos.cmu] PMI response: cmd=barrier_out [proxy:0:0@atmos.cmu] got pmi command (from 6): get kvsname=kvs_4367_0 key=sharedFilename[0] [proxy:0:0@atmos.cmu] forwarding command (cmd=get kvsname=kvs_4367_0 
key=sharedFilename[0]) upstream [proxy:0:1@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-0.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=sharedFilename[2] [proxy:0:1@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_4367_0 key=sharedFilename[2]) upstream [proxy:0:2@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:2@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:2@compute-0-1.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=sharedFilename[4] [proxy:0:2@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_4367_0 key=sharedFilename[4]) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=sharedFilename[0] [mpiexec@atmos.cmu] PMI response to fd 6 pid 6: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpHTpsRd [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=sharedFilename[2] [mpiexec@atmos.cmu] PMI response to fd 7 pid 5: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpZ9gXZT [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=sharedFilename[4] [mpiexec@atmos.cmu] PMI response to fd 24 pid 5: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpKbrVeP [proxy:0:0@atmos.cmu] we don't understand the response get_result; forwarding downstream [proxy:0:3@compute-0-2.local] PMI response: cmd=barrier_out [proxy:0:3@compute-0-2.local] PMI response: cmd=barrier_out [proxy:0:3@compute-0-2.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=sharedFilename[6] [proxy:0:3@compute-0-2.local] forwarding command (cmd=get kvsname=kvs_4367_0 key=sharedFilename[6]) upstream [proxy:0:4@compute-0-3.local] PMI response: cmd=barrier_out [proxy:0:4@compute-0-3.local] PMI response: cmd=barrier_out [proxy:0:4@compute-0-3.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=sharedFilename[8] [proxy:0:4@compute-0-3.local] forwarding command (cmd=get kvsname=kvs_4367_0 key=sharedFilename[8]) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=sharedFilename[6] [mpiexec@atmos.cmu] PMI response to fd 26 pid 5: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmplRntkZ [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=sharedFilename[8] [mpiexec@atmos.cmu] PMI response to fd 27 pid 5: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpR6UpAr [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=sharedFilename[10] [mpiexec@atmos.cmu] PMI response to fd 28 pid 5: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmp7eEWNr [proxy:0:1@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:2@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:3@compute-0-2.local] we don't understand the response get_result; forwarding downstream [proxy:0:4@compute-0-3.local] we don't understand the response get_result; forwarding downstream got pmi command (from 5): get kvsname=kvs_4367_0 key=PMI_process_mapping [proxy:0:5@compute-0-4.local] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,6,2)) [proxy:0:5@compute-0-4.local] got pmi command (from 4): put kvsname=kvs_4367_0 key=sharedFilename[10] value=/dev/shm/mpich_shar_tmp7eEWNr [proxy:0:5@compute-0-4.local] we don't understand this command put; forwarding upstream [proxy:0:5@compute-0-4.local] got pmi command 
(from 5): barrier_in [proxy:0:5@compute-0-4.local] we don't understand the response put_result; forwarding downstream [proxy:0:5@compute-0-4.local] got pmi command (from 4): barrier_in [proxy:0:5@compute-0-4.local] forwarding command (cmd=barrier_in) upstream [proxy:0:5@compute-0-4.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-0.local] got pmi command (from 4): put kvsname=kvs_4367_0 key=P2-businesscard value=description#compute-0-0$port#51172$ifname#10.255.255.254$ [proxy:0:1@compute-0-0.local] we don't understand this command put; forwarding upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=P2-businesscard value=description#compute-0-0$port#51172$ifname#10.255.255.254$ [mpiexec@atmos.cmu] PMI response to fd 7 pid 4: cmd=put_result rc=0 msg=success [proxy:0:1@compute-0-0.local] got pmi command (from 5): put kvsname=kvs_4367_0 key=P3-businesscard value=description#compute-0-0$port#33976$ifname#10.255.255.254$ [proxy:0:1@compute-0-0.local] we don't understand this command put; forwarding upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=P3-businesscard value=description#compute-0-0$port#33976$ifname#10.255.255.254$ [mpiexec@atmos.cmu] PMI response to fd 7 pid 5: cmd=put_result rc=0 msg=success [proxy:0:1@compute-0-0.local] we don't understand the response put_result; forwarding downstream [proxy:0:1@compute-0-0.local] got pmi command (from 4): barrier_in [proxy:0:1@compute-0-0.local] we don't understand the response put_result; forwarding downstream [proxy:0:1@compute-0-0.local] got pmi command (from 5): barrier_in [proxy:0:1@compute-0-0.local] forwarding command (cmd=barrier_in) upstream [proxy:0:2@compute-0-1.local] got pmi command (from 5): put kvsname=kvs_4367_0 key=P5-businesscard value=description#compute-0-1$port#57226$ifname#10.255.255.253$ [proxy:0:2@compute-0-1.local] we don't understand this command put; forwarding upstream [proxy:0:2@compute-0-1.local] got pmi command (from 4): put kvsname=kvs_4367_0 key=P4-businesscard value=description#compute-0-1$port#54640$ifname#10.255.255.253$ [proxy:0:2@compute-0-1.local] we don't understand this command put; forwarding upstream [proxy:0:3@compute-0-2.local] got pmi command (from 5): put kvsname=kvs_4367_0 key=P7-businesscard value=description#compute-0-2$port#50515$ifname#10.255.255.252$ [proxy:0:3@compute-0-2.local] we don't understand this command put; forwarding upstream [proxy:0:3@compute-0-2.local] got pmi command (from 4): put kvsname=kvs_4367_0 key=P6-businesscard value=description#compute-0-2$port#49076$ifname#10.255.255.252$ [proxy:0:3@compute-0-2.local] we don't understand this command put; forwarding upstream [proxy:0:4@compute-0-3.local] got pmi command (from 5): put kvsname=kvs_4367_0 key=P9-businesscard value=description#compute-0-3$port#33588$ifname#10.255.255.251$ [proxy:0:4@compute-0-3.local] we don't understand this command put; forwarding upstream [proxy:0:4@compute-0-3.local] got pmi command (from 4): put kvsname=kvs_4367_0 key=P8-businesscard value=description#compute-0-3$port#34511$ifname#10.255.255.251$ [proxy:0:4@compute-0-3.local] we don't understand this command put; forwarding upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=P5-businesscard value=description#compute-0-1$port#57226$ifname#10.255.255.253$ [mpiexec@atmos.cmu] PMI response to fd 24 pid 5: cmd=put_result rc=0 msg=success [mpiexec@atmos.cmu] [pgid: 0] got PMI command: 
cmd=put kvsname=kvs_4367_0 key=P7-businesscard value=description#compute-0-2$port#50515$ifname#10.255.255.252$ [mpiexec@atmos.cmu] PMI response to fd 26 pid 5: cmd=put_result rc=0 msg=success [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=P9-businesscard value=description#compute-0-3$port#33588$ifname#10.255.255.251$ [mpiexec@atmos.cmu] PMI response to fd 27 pid 5: cmd=put_result rc=0 msg=success [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=P11-businesscard value=description#compute-0-4$port#43783$ifname#10.255.255.250$ [mpiexec@atmos.cmu] PMI response to fd 28 pid 5: cmd=put_result rc=0 msg=success [proxy:0:2@compute-0-1.local] we don't understand the response put_result; forwarding downstream [proxy:0:2@compute-0-1.local] got pmi command (from 5): barrier_in [proxy:0:3@compute-0-2.local] we don't understand the response put_result; forwarding downstream [proxy:0:3@compute-0-2.local] got pmi command (from 5): barrier_in [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=P4-businesscard value=description#compute-0-1$port#54640$ifname#10.255.255.253$ [mpiexec@atmos.cmu] PMI response to fd 24 pid 4: cmd=put_result rc=0 msg=success [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=P6-businesscard value=description#compute-0-2$port#49076$ifname#10.255.255.252$ [mpiexec@atmos.cmu] PMI response to fd 26 pid 4: cmd=put_result rc=0 msg=success [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=P8-businesscard value=description#compute-0-3$port#34511$ifname#10.255.255.251$ [mpiexec@atmos.cmu] PMI response to fd 27 pid 4: cmd=put_result rc=0 msg=success [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=P10-businesscard value=description#compute-0-4$port#46633$ifname#10.255.255.250$ [mpiexec@atmos.cmu] PMI response to fd 28 pid 4: cmd=put_result rc=0 msg=success [proxy:0:2@compute-0-1.local] we don't understand the response put_result; forwarding downstream [proxy:0:2@compute-0-1.local] got pmi command (from 4): barrier_in [proxy:0:2@compute-0-1.local] forwarding command (cmd=barrier_in) upstream [proxy:0:3@compute-0-2.local] we don't understand the response put_result; forwarding downstream [proxy:0:3@compute-0-2.local] got pmi command (from 4): barrier_in [proxy:0:3@compute-0-2.local] forwarding command (cmd=barrier_in) upstream [proxy:0:4@compute-0-3.local] we don't understand the response put_result; forwarding downstream [proxy:0:4@compute-0-3.local] got pmi command (from 5): barrier_in [proxy:0:4@compute-0-3.local] we don't understand the response put_result; forwarding downstream [proxy:0:4@compute-0-3.local] got pmi command (from 4): barrier_in [proxy:0:4@compute-0-3.local] forwarding command (cmd=barrier_in) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:5@compute-0-4.local] PMI response: cmd=barrier_out [proxy:0:5@compute-0-4.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=sharedFilename[10] [proxy:0:5@compute-0-4.local] forwarding command (cmd=get kvsname=kvs_4367_0 key=sharedFilename[10]) upstream [proxy:0:5@compute-0-4.local] we don't understand the response get_result; forwarding downstream [proxy:0:5@compute-0-4.local] got pmi command (from 5): put [proxy:0:0@atmos.cmu] got pmi command 
(from 0): put kvsname=kvs_4367_0 key=P0-businesscard value=description#atmos$port#45625$ifname#10.0.1.1$ [proxy:0:0@atmos.cmu] we don't understand this command put; forwarding upstream [proxy:0:0@atmos.cmu] got pmi command (from 6): put [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=P0-businesscard value=description#atmos$port#45625$ifname#10.0.1.1$ [mpiexec@atmos.cmu] PMI response to fd 6 pid 0: cmd=put_result rc=0 msg=success kvsname=kvs_4367_0 key=P1-businesscard value=description#atmos$port#52270$ifname#10.0.1.1$ [proxy:0:0@atmos.cmu] we don't understand this command put; forwarding upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4367_0 key=P1-businesscard value=description#atmos$port#52270$ifname#10.0.1.1$ [mpiexec@atmos.cmu] PMI response to fd 6 pid 6: cmd=put_result rc=0 msg=success [proxy:0:0@atmos.cmu] we don't understand the response put_result; forwarding downstream [proxy:0:0@atmos.cmu] we don't understand the response put_result; forwarding downstream [proxy:0:0@atmos.cmu] got pmi command (from 0): barrier_in [proxy:0:0@atmos.cmu] got pmi command (from 6): barrier_in [proxy:0:0@atmos.cmu] forwarding command (cmd=barrier_in) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@atmos.cmu] PMI response to fd 6 pid 6: cmd=barrier_out [mpiexec@atmos.cmu] PMI response to fd 7 pid 6: cmd=barrier_out [mpiexec@atmos.cmu] PMI response to fd 24 pid 6: cmd=barrier_out [mpiexec@atmos.cmu] PMI response to fd 26 pid 6: cmd=barrier_out [mpiexec@atmos.cmu] PMI response to fd 27 pid 6: cmd=barrier_out [mpiexec@atmos.cmu] PMI response to fd 28 pid 6: cmd=barrier_out [proxy:0:0@atmos.cmu] PMI response: cmd=barrier_out [proxy:0:0@atmos.cmu] PMI response: cmd=barrier_out [proxy:0:1@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:2@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:2@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:4@compute-0-3.local] PMI response: cmd=barrier_out [proxy:0:4@compute-0-3.local] PMI response: cmd=barrier_out [proxy:0:3@compute-0-2.local] PMI response: cmd=barrier_out [proxy:0:3@compute-0-2.local] PMI response: cmd=barrier_out kvsname=kvs_4367_0 key=P11-businesscard value=description#compute-0-4$port#43783$ifname#10.255.255.250$ [proxy:0:5@compute-0-4.local] we don't understand this command put; forwarding upstream [proxy:0:5@compute-0-4.local] got pmi command (from 4): put kvsname=kvs_4367_0 key=P10-businesscard value=description#compute-0-4$port#46633$ifname#10.255.255.250$ [proxy:0:5@compute-0-4.local] we don't understand this command put; forwarding upstream [proxy:0:5@compute-0-4.local] we don't understand the response put_result; forwarding downstream [proxy:0:5@compute-0-4.local] got pmi command (from 5): barrier_in [proxy:0:5@compute-0-4.local] we don't understand the response put_result; forwarding downstream [proxy:0:5@compute-0-4.local] got pmi command (from 4): barrier_in [proxy:0:5@compute-0-4.local] forwarding command (cmd=barrier_in) upstream [proxy:0:5@compute-0-4.local] PMI response: cmd=barrier_out [proxy:0:5@compute-0-4.local] PMI response: cmd=barrier_out [proxy:0:0@atmos.cmu] got pmi command (from 0): get kvsname=kvs_4367_0 key=P8-businesscard [proxy:0:0@atmos.cmu] forwarding command (cmd=get kvsname=kvs_4367_0 key=P8-businesscard) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=P8-businesscard [mpiexec@atmos.cmu] PMI response to fd 6 pid 
0: cmd=get_result rc=0 msg=success value=description#compute-0-3$port#34511$ifname#10.255.255.251$ [proxy:0:0@atmos.cmu] we don't understand the response get_result; forwarding downstream [proxy:0:0@atmos.cmu] got pmi command (from 0): get kvsname=kvs_4367_0 key=P4-businesscard [proxy:0:0@atmos.cmu] forwarding command (cmd=get kvsname=kvs_4367_0 key=P4-businesscard) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=P4-businesscard [mpiexec@atmos.cmu] PMI response to fd 6 pid 0: cmd=get_result rc=0 msg=success value=description#compute-0-1$port#54640$ifname#10.255.255.253$ [proxy:0:0@atmos.cmu] we don't understand the response get_result; forwarding downstream [proxy:0:4@compute-0-3.local] got pmi command (from 4): get kvsname=kvs_4367_0 key=P10-businesscard [proxy:0:4@compute-0-3.local] forwarding command (cmd=get kvsname=kvs_4367_0 key=P10-businesscard) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=P10-businesscard [mpiexec@atmos.cmu] PMI response to fd 27 pid 4: cmd=get_result rc=0 msg=success value=description#compute-0-4$port#46633$ifname#10.255.255.250$ [proxy:0:0@atmos.cmu] got pmi command (from 0): get kvsname=kvs_4367_0 key=P2-businesscard [proxy:0:0@atmos.cmu] forwarding command (cmd=get kvsname=kvs_4367_0 key=P2-businesscard) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=P2-businesscard [mpiexec@atmos.cmu] PMI response to fd 6 pid 0: cmd=get_result rc=0 msg=success value=description#compute-0-0$port#51172$ifname#10.255.255.254$ [proxy:0:0@atmos.cmu] we don't understand the response get_result; forwarding downstream [proxy:0:2@compute-0-1.local] got pmi command (from 4): get kvsname=kvs_4367_0 key=P6-businesscard [proxy:0:2@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_4367_0 key=P6-businesscard) upstream [proxy:0:4@compute-0-3.local] we don't understand the response get_result; forwarding downstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=P6-businesscard [mpiexec@atmos.cmu] PMI response to fd 24 pid 4: cmd=get_result rc=0 msg=success value=description#compute-0-2$port#49076$ifname#10.255.255.252$ [proxy:0:2@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@atmos.cmu] got pmi command (from 6): get kvsname=kvs_4367_0 key=P2-businesscard [proxy:0:0@atmos.cmu] forwarding command (cmd=get kvsname=kvs_4367_0 key=P2-businesscard) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=P2-businesscard [mpiexec@atmos.cmu] PMI response to fd 6 pid 6: cmd=get_result rc=0 msg=success value=description#compute-0-0$port#51172$ifname#10.255.255.254$ [proxy:0:0@atmos.cmu] we don't understand the response get_result; forwarding downstream [proxy:0:1@compute-0-0.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=P4-businesscard [proxy:0:1@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_4367_0 key=P4-businesscard) upstream [proxy:0:2@compute-0-1.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=P6-businesscard [proxy:0:2@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_4367_0 key=P6-businesscard) upstream [proxy:0:4@compute-0-3.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=P10-businesscard [proxy:0:4@compute-0-3.local] forwarding command (cmd=get kvsname=kvs_4367_0 key=P10-businesscard) upstream [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=P4-businesscard 
[mpiexec@atmos.cmu] PMI response to fd 7 pid 5: cmd=get_result rc=0 msg=success value=description#compute-0-1$port#54640$ifname#10.255.255.253$ [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=P6-businesscard [mpiexec@atmos.cmu] PMI response to fd 24 pid 5: cmd=get_result rc=0 msg=success value=description#compute-0-2$port#49076$ifname#10.255.255.252$ [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=P10-businesscard [mpiexec@atmos.cmu] PMI response to fd 27 pid 5: cmd=get_result rc=0 msg=success value=description#compute-0-4$port#46633$ifname#10.255.255.250$ [mpiexec@atmos.cmu] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4367_0 key=P0-businesscard [mpiexec@atmos.cmu] PMI response to fd 28 pid 5: cmd=get_result rc=0 msg=success value=description#atmos$port#45625$ifname#10.0.1.1$ [proxy:0:1@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:2@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:4@compute-0-3.local] we don't understand the response get_result; forwarding downstream [proxy:0:5@compute-0-4.local] got pmi command (from 5): get kvsname=kvs_4367_0 key=P0-businesscard [proxy:0:5@compute-0-4.local] forwarding command (cmd=get kvsname=kvs_4367_0 key=P0-businesscard) upstream [proxy:0:5@compute-0-4.local] we don't understand the response get_result; forwarding downstream [mpiexec@atmos.cmu] ONE OF THE PROCESSES TERMINATED BADLY: CLEANING UP