==================================================================================================
mpiexec options:
----------------
  Base path: /m/raid2/thejna/mpich3_nemesis/bin/
  Bootstrap server: (null)
  Debug level: 1
  Enable X: -1

  Global environment:
  -------------------
    HOSTNAME=k1
    TERM=xterm
    SHELL=/bin/bash
    HISTSIZE=1000
    KDE_NO_IPV6=1
    SSH_CLIENT=134.102.241.45 54184 22
    SSH_TTY=/dev/pts/2
    MPICH_SOCKET_BUFFER_SIZE=131072
    USER=thejna
    LD_LIBRARY_PATH=/m/raid2/thejna/mpich3_nemesis/lib/trace_rlog:
    LS_COLORS=no
    HYDRA_HOST_FILE=/m/raid2/thejna/hosts_hydra
    KDEDIR=/usr
    PGI=/llocal1/pgi/725/
    MAIL=/var/spool/mail/thejna
    PATH=/llocal1/SL64/netcdf-4.0/bin/:/m/raid2/thejna/mpich3_nemesis/bin:/llocal1/pgi/725/linux86-64/7.2-5/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin
    INPUTRC=/etc/inputrc
    PWD=/m/raid2/thejna/mpich3_nemesis/examples
    LANG=en_US.UTF-8
    KDE_IS_PRELINKED=1
    MODULEPATH=/usr/share/Modules/modulefiles:/etc/modulefiles
    LOADEDMODULES=(null)
    LM_LICENSE_FILE=/llocal1/pgi/725/license.dat
    SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
    SHLVL=1
    HOME=/v1/thejna
    LOGNAME=thejna
    CVS_RSH=ssh
    SSH_CONNECTION=134.102.241.45 54184 134.102.241.191 22
    MODULESHOME=/usr/share/Modules
    LESSOPEN=|/usr/bin/lesspipe.sh %s
    G_BROKEN_FILENAMES=1
    Work=/m/raid2/thejna/
    modelroot=/m/raid2/thejna/ccsm3_0/
    module=() { eval `/usr/bin/modulecmd bash $*` }
    OLDPWD=/v1/thejna
    _=/m/raid2/thejna/mpich3_nemesis/bin/mpiexec

  Proxy information:
  *********************
    Proxy ID:  1
    -----------------
      Proxy name: 192.168.2.51
      Process count: 1
      Start PID: 0

      Proxy exec list:
      ....................
        Exec: ./cpi; Process count: 1
        Exec: ./cpi; Process count: 1
        Exec: ./cpi; Process count: 1
        Exec: ./cpi; Process count: 1
        Exec: ./cpi; Process count: 1
        Exec: ./cpi; Process count: 1
        Exec: ./cpi; Process count: 1

==================================================================================================

[mpiexec@k1] Timeout set to -1 (-1 means infinite)
[mpiexec@k1] Got a control port string of k1:49479

Proxy launch args: /m/raid2/thejna/mpich3_nemesis/bin/hydra_pmi_proxy --control-port k1:49479 --debug --demux poll --pgid 0 --enable-stdin 1 --proxy-id

[mpiexec@k1] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1

Arguments being passed to proxy 0:
--version 1.3b1 --interface-env-name MPICH_INTERFACE_NAME --hostname 192.168.2.51 --global-core-count 1 --global-process-count 7 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_11165_0 --pmi-process-mapping (vector,(0,1,1)) --global-inherited-env 37 'HOSTNAME=k1' 'TERM=xterm' 'SHELL=/bin/bash' 'HISTSIZE=1000' 'KDE_NO_IPV6=1' 'SSH_CLIENT=134.102.241.45 54184 22' 'SSH_TTY=/dev/pts/2' 'MPICH_SOCKET_BUFFER_SIZE=131072' 'USER=thejna' 'LD_LIBRARY_PATH=/m/raid2/thejna/mpich3_nemesis/lib/trace_rlog:' 'LS_COLORS=no' 'HYDRA_HOST_FILE=/m/raid2/thejna/hosts_hydra' 'KDEDIR=/usr' 'PGI=/llocal1/pgi/725/' 'MAIL=/var/spool/mail/thejna' 'PATH=/llocal1/SL64/netcdf-4.0/bin/:/m/raid2/thejna/mpich3_nemesis/bin:/llocal1/pgi/725/linux86-64/7.2-5/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin' 'INPUTRC=/etc/inputrc' 'PWD=/m/raid2/thejna/mpich3_nemesis/examples' 'LANG=en_US.UTF-8' 'KDE_IS_PRELINKED=1' 'MODULEPATH=/usr/share/Modules/modulefiles:/etc/modulefiles' 'LOADEDMODULES=' 'LM_LICENSE_FILE=/llocal1/pgi/725/license.dat' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'SHLVL=1' 'HOME=/v1/thejna' 'LOGNAME=thejna' 'CVS_RSH=ssh' 'SSH_CONNECTION=134.102.241.45 54184 134.102.241.191 22' 'MODULESHOME=/usr/share/Modules' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'G_BROKEN_FILENAMES=1' 'Work=/m/raid2/thejna/' 'modelroot=/m/raid2/thejna/ccsm3_0/' 'module=() { eval `/usr/bin/modulecmd bash $*` }' 'OLDPWD=/v1/thejna' '_=/m/raid2/thejna/mpich3_nemesis/bin/mpiexec' --global-user-env 0 --global-system-env 0 --start-pid 0 --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 0 --exec-wdir /m/raid2/thejna/mpich3_nemesis/examples --exec-args 1 ./cpi --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 0 --exec-wdir /m/raid2/thejna/mpich3_nemesis/examples --exec-args 1 ./cpi --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 0 --exec-wdir /m/raid2/thejna/mpich3_nemesis/examples --exec-args 1 ./cpi --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 0 --exec-wdir /m/raid2/thejna/mpich3_nemesis/examples --exec-args 1 ./cpi --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 0 --exec-wdir /m/raid2/thejna/mpich3_nemesis/examples --exec-args 1 ./cpi --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 0 --exec-wdir /m/raid2/thejna/mpich3_nemesis/examples --exec-args 1 ./cpi --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 0 --exec-wdir /m/raid2/thejna/mpich3_nemesis/examples --exec-args 1 ./cpi

[mpiexec@k1] Launch arguments: /usr/bin/ssh -x 192.168.2.51 /m/raid2/thejna/mpich3_nemesis/bin/hydra_pmi_proxy --control-port k1:49479 --debug --demux poll --pgid 0 --enable-stdin 1 --proxy-id 0
[mpiexec@k1] [pgid: 0] got PMI command: cmd=put kvsname=kvs_11165_0 key=sharedFilename[0] value=/dev/shm/mpich_shar_tmpwIUT1l
[mpiexec@k1] PMI response to fd 6 pid 4: cmd=put_result rc=0 msg=success
[mpiexec@k1] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@k1] PMI response to fd 6 pid 17: cmd=barrier_out
[mpiexec@k1] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]
[mpiexec@k1] PMI response to fd 6 pid 5: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpwIUT1l
[mpiexec@k1] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]
[mpiexec@k1] PMI response to fd 6 pid 6: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpwIUT1l
[mpiexec@k1] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]
[mpiexec@k1] PMI response to fd 6 pid 11: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpwIUT1l
[mpiexec@k1] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]
[mpiexec@k1] PMI response to fd 6 pid 14: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpwIUT1l
[mpiexec@k1] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]
[mpiexec@k1] PMI response to fd 6 pid 17: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpwIUT1l
[mpiexec@k1] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]
[mpiexec@k1] PMI response to fd 6 pid 20: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpwIUT1l
[mpiexec@k1] [pgid: 0] got PMI command: cmd=put kvsname=kvs_11165_0 key=P0-businesscard value=description#k1$port#41142$ifname#134.102.241.191$
[mpiexec@k1] PMI response to fd 6 pid 4: cmd=put_result rc=0 msg=success
[mpiexec@k1] [pgid: 0] got PMI command: cmd=put kvsname=kvs_11165_0 key=P2-businesscard value=description#k1$port#56498$ifname#134.102.241.191$
[mpiexec@k1] PMI response to fd 6 pid 6: cmd=put_result rc=0 msg=success
[mpiexec@k1] [pgid: 0] got PMI command: cmd=put kvsname=kvs_11165_0 key=P3-businesscard value=description#k1$port#60939$ifname#134.102.241.191$
[mpiexec@k1] PMI response to fd 6 pid 11: cmd=put_result rc=0 msg=success
[mpiexec@k1] [pgid: 0] got PMI command: cmd=put kvsname=kvs_11165_0 key=P1-businesscard value=description#k1$port#52810$ifname#134.102.241.191$
[mpiexec@k1] PMI response to fd 6 pid 5: cmd=put_result rc=0 msg=success
[mpiexec@k1] [pgid: 0] got PMI command: cmd=put kvsname=kvs_11165_0 key=P5-businesscard value=description#k1$port#50672$ifname#134.102.241.191$
[mpiexec@k1] PMI response to fd 6 pid 17: cmd=put_result rc=0 msg=success
[mpiexec@k1] [pgid: 0] got PMI command: cmd=put kvsname=kvs_11165_0 key=P6-businesscard value=description#k1$port#60357$ifname#134.102.241.191$
[mpiexec@k1] PMI response to fd 6 pid 20: cmd=put_result rc=0 msg=success
[mpiexec@k1] [pgid: 0] got PMI command: cmd=put kvsname=kvs_11165_0 key=P4-businesscard value=description#k1$port#38687$ifname#134.102.241.191$
[mpiexec@k1] PMI response to fd 6 pid 14: cmd=put_result rc=0 msg=success
[mpiexec@k1] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@k1] PMI response to fd 6 pid 14: cmd=barrier_out
[proxy:0:0@k1] got pmi command (from 4): init pmi_version=1 pmi_subversion=1
[proxy:0:0@k1] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@k1] got pmi command (from 4): get_maxes
[proxy:0:0@k1] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@k1] got pmi command (from 4): get_appnum
[proxy:0:0@k1] PMI response: cmd=appnum appnum=0
[proxy:0:0@k1] got pmi command (from 4): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 4): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 4): get kvsname=kvs_11165_0 key=PMI_process_mapping
[proxy:0:0@k1] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,1))
[proxy:0:0@k1] got pmi command (from 4): put kvsname=kvs_11165_0 key=sharedFilename[0] value=/dev/shm/mpich_shar_tmpwIUT1l
[proxy:0:0@k1] we don't understand this command put; forwarding upstream
[proxy:0:0@k1] we don't understand the response put_result; forwarding downstream
[proxy:0:0@k1] got pmi command (from 4): barrier_in
[proxy:0:0@k1] got pmi command (from 5): init pmi_version=1 pmi_subversion=1
[proxy:0:0@k1] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@k1] got pmi command (from 5): get_maxes
[proxy:0:0@k1] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@k1] got pmi command (from 5): get_appnum
[proxy:0:0@k1] PMI response: cmd=appnum appnum=0
[proxy:0:0@k1] got pmi command (from 5): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 5): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 5): get kvsname=kvs_11165_0 key=PMI_process_mapping
[proxy:0:0@k1] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,1))
[proxy:0:0@k1] got pmi command (from 5): barrier_in
[proxy:0:0@k1] got pmi command (from 6): init pmi_version=1 pmi_subversion=1
[proxy:0:0@k1] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@k1] got pmi command (from 6): get_maxes
[proxy:0:0@k1] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@k1] got pmi command (from 6): get_appnum
[proxy:0:0@k1] PMI response: cmd=appnum appnum=0
[proxy:0:0@k1] got pmi command (from 6): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 6): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 6): get kvsname=kvs_11165_0 key=PMI_process_mapping
[proxy:0:0@k1] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,1))
[proxy:0:0@k1] got pmi command (from 6): barrier_in
[proxy:0:0@k1] got pmi command (from 11): init pmi_version=1 pmi_subversion=1
[proxy:0:0@k1] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@k1] got pmi command (from 11): get_maxes
[proxy:0:0@k1] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@k1] got pmi command (from 11): get_appnum
[proxy:0:0@k1] PMI response: cmd=appnum appnum=0
[proxy:0:0@k1] got pmi command (from 11): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 11): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 20): init pmi_version=1 pmi_subversion=1
[proxy:0:0@k1] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@k1] got pmi command (from 11): get kvsname=kvs_11165_0 key=PMI_process_mapping
[proxy:0:0@k1] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,1))
[proxy:0:0@k1] got pmi command (from 14): init pmi_version=1 pmi_subversion=1
[proxy:0:0@k1] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@k1] got pmi command (from 20): get_maxes
[proxy:0:0@k1] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@k1] got pmi command (from 11): barrier_in
[proxy:0:0@k1] got pmi command (from 14): get_maxes
[proxy:0:0@k1] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@k1] got pmi command (from 20): get_appnum
[proxy:0:0@k1] PMI response: cmd=appnum appnum=0
[proxy:0:0@k1] got pmi command (from 14): get_appnum
[proxy:0:0@k1] PMI response: cmd=appnum appnum=0
[proxy:0:0@k1] got pmi command (from 20): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 14): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 20): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 14): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 20): get kvsname=kvs_11165_0 key=PMI_process_mapping
[proxy:0:0@k1] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,1))
[proxy:0:0@k1] got pmi command (from 14): get kvsname=kvs_11165_0 key=PMI_process_mapping
[proxy:0:0@k1] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,1))
[proxy:0:0@k1] got pmi command (from 20): barrier_in
[proxy:0:0@k1] got pmi command (from 14): barrier_in
[proxy:0:0@k1] got pmi command (from 17): init pmi_version=1 pmi_subversion=1
[proxy:0:0@k1] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@k1] got pmi command (from 17): get_maxes
[proxy:0:0@k1] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@k1] got pmi command (from 17): get_appnum
[proxy:0:0@k1] PMI response: cmd=appnum appnum=0
[proxy:0:0@k1] got pmi command (from 17): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 17): get_my_kvsname
[proxy:0:0@k1] PMI response: cmd=my_kvsname kvsname=kvs_11165_0
[proxy:0:0@k1] got pmi command (from 17): get kvsname=kvs_11165_0 key=PMI_process_mapping
[proxy:0:0@k1] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,1))
[proxy:0:0@k1] got pmi command (from 17): barrier_in
[proxy:0:0@k1] forwarding command (cmd=barrier_in) upstream
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] got pmi command (from 5): get kvsname=kvs_11165_0 key=sharedFilename[0]
[proxy:0:0@k1] forwarding command (cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]) upstream
[proxy:0:0@k1] got pmi command (from 6): get kvsname=kvs_11165_0 key=sharedFilename[0]
[proxy:0:0@k1] forwarding command (cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]) upstream
[proxy:0:0@k1] got pmi command (from 11): get kvsname=kvs_11165_0 key=sharedFilename[0]
[proxy:0:0@k1] forwarding command (cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]) upstream
[proxy:0:0@k1] we don't understand the response get_result; forwarding downstream
[proxy:0:0@k1] got pmi command (from 14): get kvsname=kvs_11165_0 key=sharedFilename[0]
[proxy:0:0@k1] forwarding command (cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]) upstream
[proxy:0:0@k1] got pmi command (from 17): get kvsname=kvs_11165_0 key=sharedFilename[0]
[proxy:0:0@k1] forwarding command (cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]) upstream
[proxy:0:0@k1] got pmi command (from 20): get kvsname=kvs_11165_0 key=sharedFilename[0]
[proxy:0:0@k1] forwarding command (cmd=get kvsname=kvs_11165_0 key=sharedFilename[0]) upstream
[proxy:0:0@k1] we don't understand the response get_result; forwarding downstream
[proxy:0:0@k1] we don't understand the response get_result; forwarding downstream
[proxy:0:0@k1] we don't understand the response get_result; forwarding downstream
[proxy:0:0@k1] we don't understand the response get_result; forwarding downstream
[proxy:0:0@k1] we don't understand the response get_result; forwarding downstream
[proxy:0:0@k1] got pmi command (from 4): put kvsname=kvs_11165_0 key=P0-businesscard value=description#k1$port#41142$ifname#134.102.241.191$
[proxy:0:0@k1] we don't understand this command put; forwarding upstream
[proxy:0:0@k1] got pmi command (from 6): put kvsname=kvs_11165_0 key=P2-businesscard value=description#k1$port#56498$ifname#134.102.241.191$
[proxy:0:0@k1] we don't understand this command put; forwarding upstream
[proxy:0:0@k1] got pmi command (from 11): put kvsname=kvs_11165_0 key=P3-businesscard value=description#k1$port#60939$ifname#134.102.241.191$
[proxy:0:0@k1] we don't understand this command put; forwarding upstream
[proxy:0:0@k1] we don't understand the response put_result; forwarding downstream
[proxy:0:0@k1] got pmi command (from 5): put kvsname=kvs_11165_0 key=P1-businesscard value=description#k1$port#52810$ifname#134.102.241.191$
[proxy:0:0@k1] we don't understand this command put; forwarding upstream
[proxy:0:0@k1] got pmi command (from 17): put kvsname=kvs_11165_0 key=P5-businesscard value=description#k1$port#50672$ifname#134.102.241.191$
[proxy:0:0@k1] we don't understand this command put; forwarding upstream
[proxy:0:0@k1] got pmi command (from 20): put kvsname=kvs_11165_0 key=P6-businesscard value=description#k1$port#60357$ifname#134.102.241.191$
[proxy:0:0@k1] we don't understand this command put; forwarding upstream
[proxy:0:0@k1] we don't understand the response put_result; forwarding downstream
[proxy:0:0@k1] we don't understand the response put_result; forwarding downstream
[proxy:0:0@k1] we don't understand the response put_result; forwarding downstream
[proxy:0:0@k1] got pmi command (from 6): barrier_in
[proxy:0:0@k1] we don't understand the response put_result; forwarding downstream
[proxy:0:0@k1] got pmi command (from 5): barrier_in
[proxy:0:0@k1] got pmi command (from 11): barrier_in
[proxy:0:0@k1] we don't understand the response put_result; forwarding downstream
[proxy:0:0@k1] got pmi command (from 14): put kvsname=kvs_11165_0 key=P4-businesscard value=description#k1$port#38687$ifname#134.102.241.191$
[proxy:0:0@k1] we don't understand this command put; forwarding upstream
[proxy:0:0@k1] got pmi command (from 20): barrier_in
[proxy:0:0@k1] got pmi command (from 17): barrier_in
[proxy:0:0@k1] we don't understand the response put_result; forwarding downstream
[proxy:0:0@k1] got pmi command (from 4): barrier_in
[proxy:0:0@k1] got pmi command (from 14): barrier_in
[proxy:0:0@k1] forwarding command (cmd=barrier_in) upstream
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
[proxy:0:0@k1] PMI response: cmd=barrier_out
Process 0 of 7 is on k1
pi is approximately 3.1415926544231239, Error is 0.0000000008333307
wall clock time = 0.000111
Process 1 of 7 is on k1
Process 2 of 7 is on k1
Process 3 of 7 is on k1
Process 4 of 7 is on k1
Process 5 of 7 is on k1
Process 6 of 7 is on k1
APPLICATION TERMINATED WITH THE EXIT STRING: Terminated (signal 15)
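The PMI traffic in the trace above uses a simple text wire format: each command is a line of space-separated `key=value` pairs (e.g. `cmd=put kvsname=kvs_11165_0 key=P0-businesscard value=...`), which is why composite values such as the business cards use `#` and `$` rather than spaces as internal separators. A minimal parsing sketch, assuming only the space-separated `key=value` shape visible in this log (`parse_pmi_command` is an illustrative name, not an MPICH API):

```python
def parse_pmi_command(line: str) -> dict:
    """Split a PMI-v1-style command line into a dict of its key=value fields.

    Assumes fields are separated by single spaces and values contain no
    spaces, which holds for the commands shown in the trace above.
    """
    fields = {}
    for token in line.strip().split(" "):
        if "=" in token:
            key, _, value = token.partition("=")
            fields[key] = value
    return fields

msg = parse_pmi_command(
    "cmd=put kvsname=kvs_11165_0 key=P0-businesscard "
    "value=description#k1$port#41142$ifname#134.102.241.191$"
)
print(msg["cmd"], msg["key"])  # put P0-businesscard
```

This is also how the proxy can forward commands it does not itself handle ("we don't understand this command put; forwarding upstream"): it only needs the `cmd=` field to decide, and can pass the rest of the line through unchanged.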
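The "pi is approximately ..." line is the output of MPICH's `cpi` example, which approximates pi by midpoint-rule integration of 4/(1+x^2) over [0,1], with the loop split across ranks and the partial sums combined by MPI_Reduce. A serial sketch of the same computation (assuming the example's default of n=10000 intervals, which is consistent with the reported error of about 8.3e-10):

```python
import math

def cpi(n: int = 10000) -> float:
    """Midpoint-rule estimate of pi = integral of 4/(1+x^2) over [0,1].

    Serial version of the arithmetic in MPICH's cpi example; the MPI
    version has each rank sum every size-th term and reduces the sums.
    """
    h = 1.0 / n
    total = sum(4.0 / (1.0 + ((i - 0.5) * h) ** 2) for i in range(1, n + 1))
    return h * total

estimate = cpi()
print(estimate, abs(estimate - math.pi))
```

The midpoint rule's error scales as 1/n^2, so with n=10000 the estimate agrees with pi to roughly nine decimal places, matching the log.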