[mpich-discuss] Hydra, Runtime error

Paul Hart pfhart at ameritech.net
Wed Jan 26 07:44:51 CST 2011


I have been using mpd as the process manager and would like to change to
hydra, since mpd is being deprecated. I compiled MPICH2-1.3.1 and was able to
run the cpi example program. I then attempted to run another program and
received the following error (I ran it in verbose mode for more detail). I am
able to run the same program using mpd.
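
For context, the job launches five single-process instances of the FDS
executable on the local host. The launch command was along these lines
(reconstructed from the proxy exec list in the verbose output below; the
exact invocation may have used a config file rather than the colon syntax):

mpiexec -n 1 ./fds5_mpi_intel_linux_64_hydra hallways.fds \
      : -n 1 ./fds5_mpi_intel_linux_64_hydra hallways.fds \
      : -n 1 ./fds5_mpi_intel_linux_64_hydra hallways.fds \
      : -n 1 ./fds5_mpi_intel_linux_64_hydra hallways.fds \
      : -n 1 ./fds5_mpi_intel_linux_64_hydra hallways.fds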

 

To my knowledge, no one in the community that uses this program (Fire
Dynamics Simulator, an open-source CFD code tailored to fire, produced by a
community led by the National Institute of Standards and Technology) has
attempted to use hydra; they are all still running on mpd.

 

Thanks,

 

Paul

 

==================================================================================================
mpiexec options:
----------------
  Base path: /home/Paul/mpich2-1.3.1-install/bin/
  Bootstrap server: (null)
  Debug level: 1
  Enable X: -1

  Global environment:
  -------------------
    ORBIT_SOCKETDIR=/tmp/orbit-Paul
    SSH_AGENT_PID=1885
    HOSTNAME=T5400-2
    IMSETTINGS_INTEGRATE_DESKTOP=yes
    SHELL=/bin/bash
    TERM=xterm
    HISTSIZE=1000
    XDG_SESSION_COOKIE=a5ad468bf429b1016713ff6a4c1a7fbe-1295906950.782328-554747930
    GTK_RC_FILES=/etc/gtk/gtkrc:/home/Paul/.gtkrc-1.2-gnome2
    WINDOWID=67108867
    QTDIR=/usr/lib64/qt-3.3
    QTINC=/usr/lib64/qt-3.3/include
    IMSETTINGS_MODULE=none
    USER=Paul
    LS_COLORS=rs
    LD_LIBRARY_PATH=/opt/intel/Compiler/11.1/072/lib/intel64:
    SSH_AUTH_SOCK=/tmp/keyring-pXiYSA/socket.ssh
    GNOME_KEYRING_SOCKET=/tmp/keyring-pXiYSA/socket
    SESSION_MANAGER=local/unix:@/tmp/.ICE-unix/1884,unix/unix:/tmp/.ICE-unix/1884
    PATH=/home/Paul/mpich2-1.3.1-install/bin:/usr/lib64/qt-3.3/bin:/usr/kerberos/sbin:/usr/kerberos/bin:/usr/lib64/ccache:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/Paul/bin
    MAIL=/var/spool/mail/Paul
    QT_IM_MODULE=xim
    PWD=/FDS/FDS5_TestCases/Hallway
    XMODIFIERS=@im
    KDE_IS_PRELINKED=1
    LANG=en_US.UTF-8
    KDEDIRS=/usr
    HISTCONTROL=ignoreboth
    SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
    SHLVL=4
    HOME=/home/Paul
    KMIX_PULSEAUDIO_DISABLE=1
    GNOME_DESKTOP_SESSION_ID=this-is-deprecated
    LOGNAME=Paul
    CVS_RSH=ssh
    QTLIB=/usr/lib64/qt-3.3/lib
    DBUS_SESSION_BUS_ADDRESS=unix:abstract
    LESSOPEN=|/usr/bin/lesspipe.sh %s
    WINDOWPATH=7
    GTK_IM_MODULE=xim
    G_BROKEN_FILENAMES=1
    COLORTERM=gnome-terminal
    XAUTHORITY=/home/Paul/.Xauthority
    _=/home/Paul/mpich2-1.3.1-install/bin/mpiexec


    Proxy information:
    *********************
      Proxy ID:  1
      -----------------
        Proxy name: localhost
        Process count: 1
        Start PID: 0

        Proxy exec list:
        ....................
          Exec: /FDS/FDS5_TestCases/Hallway/fds5_mpi_intel_linux_64_hydra; Process count: 1
          Exec: /FDS/FDS5_TestCases/Hallway/fds5_mpi_intel_linux_64_hydra; Process count: 1
          Exec: /FDS/FDS5_TestCases/Hallway/fds5_mpi_intel_linux_64_hydra; Process count: 1
          Exec: /FDS/FDS5_TestCases/Hallway/fds5_mpi_intel_linux_64_hydra; Process count: 1
          Exec: /FDS/FDS5_TestCases/Hallway/fds5_mpi_intel_linux_64_hydra; Process count: 1

==================================================================================================

[mpiexec at T5400-2] Timeout set to -1 (-1 means infinite)
[mpiexec at T5400-2] Got a control port string of T5400-2:52405

Proxy launch args: /home/Paul/mpich2-1.3.1-install/bin/hydra_pmi_proxy
--control-port T5400-2:52405 --debug --demux poll --pgid 0 --enable-stdin 1
--proxy-id 

[mpiexec at T5400-2] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1
Arguments being passed to proxy 0:
--version 1.3.1 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname
localhost --global-core-count 1 --global-process-count 5 --auto-cleanup 1
--pmi-rank -1 --pmi-kvsname kvs_4399_0 --pmi-process-mapping
(vector,(0,1,1)) --ckpoint-num -1 --global-inherited-env 44
'ORBIT_SOCKETDIR=/tmp/orbit-Paul' 'SSH_AGENT_PID=1885' 'HOSTNAME=T5400-2'
'IMSETTINGS_INTEGRATE_DESKTOP=yes' 'SHELL=/bin/bash' 'TERM=xterm'
'HISTSIZE=1000'
'XDG_SESSION_COOKIE=a5ad468bf429b1016713ff6a4c1a7fbe-1295906950.782328-554747930'
'GTK_RC_FILES=/etc/gtk/gtkrc:/home/Paul/.gtkrc-1.2-gnome2'
'WINDOWID=67108867' 'QTDIR=/usr/lib64/qt-3.3'
'QTINC=/usr/lib64/qt-3.3/include' 'IMSETTINGS_MODULE=none' 'USER=Paul'
'LS_COLORS=rs' 'LD_LIBRARY_PATH=/opt/intel/Compiler/11.1/072/lib/intel64:'
'SSH_AUTH_SOCK=/tmp/keyring-pXiYSA/socket.ssh'
'GNOME_KEYRING_SOCKET=/tmp/keyring-pXiYSA/socket'
'SESSION_MANAGER=local/unix:@/tmp/.ICE-unix/1884,unix/unix:/tmp/.ICE-unix/1884'
'PATH=/home/Paul/mpich2-1.3.1-install/bin:/usr/lib64/qt-3.3/bin:/usr/kerberos/sbin:/usr/kerberos/bin:/usr/lib64/ccache:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/Paul/bin'
'MAIL=/var/spool/mail/Paul'
'QT_IM_MODULE=xim' 'PWD=/FDS/FDS5_TestCases/Hallway' 'XMODIFIERS=@im'
'KDE_IS_PRELINKED=1' 'LANG=en_US.UTF-8' 'KDEDIRS=/usr'
'HISTCONTROL=ignoreboth'
'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'SHLVL=4'
'HOME=/home/Paul' 'KMIX_PULSEAUDIO_DISABLE=1'
'GNOME_DESKTOP_SESSION_ID=this-is-deprecated' 'LOGNAME=Paul' 'CVS_RSH=ssh'
'QTLIB=/usr/lib64/qt-3.3/lib' 'DBUS_SESSION_BUS_ADDRESS=unix:abstract'
'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'WINDOWPATH=7' 'GTK_IM_MODULE=xim'
'G_BROKEN_FILENAMES=1' 'COLORTERM=gnome-terminal'
'XAUTHORITY=/home/Paul/.Xauthority'
'_=/home/Paul/mpich2-1.3.1-install/bin/mpiexec' --global-user-env 0
--global-system-env 0 --start-pid 0 --proxy-core-count 1 --exec
--exec-appnum 0 --exec-proc-count 1 --exec-local-env 0 --exec-wdir
/FDS/FDS5_TestCases/Hallway --exec-args 2
/FDS/FDS5_TestCases/Hallway/fds5_mpi_intel_linux_64_hydra
/FDS/FDS5_TestCases/Hallway/hallways.fds --exec --exec-appnum 0
--exec-proc-count 1 --exec-local-env 0 --exec-wdir
/FDS/FDS5_TestCases/Hallway --exec-args 2
/FDS/FDS5_TestCases/Hallway/fds5_mpi_intel_linux_64_hydra
/FDS/FDS5_TestCases/Hallway/hallways.fds --exec --exec-appnum 0
--exec-proc-count 1 --exec-local-env 0 --exec-wdir
/FDS/FDS5_TestCases/Hallway --exec-args 2
/FDS/FDS5_TestCases/Hallway/fds5_mpi_intel_linux_64_hydra
/FDS/FDS5_TestCases/Hallway/hallways.fds --exec --exec-appnum 0
--exec-proc-count 1 --exec-local-env 0 --exec-wdir
/FDS/FDS5_TestCases/Hallway --exec-args 2
/FDS/FDS5_TestCases/Hallway/fds5_mpi_intel_linux_64_hydra
/FDS/FDS5_TestCases/Hallway/hallways.fds --exec --exec-appnum 0
--exec-proc-count 1 --exec-local-env 0 --exec-wdir
/FDS/FDS5_TestCases/Hallway --exec-args 2
/FDS/FDS5_TestCases/Hallway/fds5_mpi_intel_linux_64_hydra
/FDS/FDS5_TestCases/Hallway/hallways.fds 

[mpiexec at T5400-2] Launch arguments:
/home/Paul/mpich2-1.3.1-install/bin/hydra_pmi_proxy --control-port
T5400-2:52405 --debug --demux poll --pgid 0 --enable-stdin 1 --proxy-id 0 
[proxy:0:0 at T5400-2] got pmi command (from 17): init
pmi_version=1 pmi_subversion=1 
[proxy:0:0 at T5400-2] PMI response: cmd=response_to_init pmi_version=1
pmi_subversion=1 rc=0
[proxy:0:0 at T5400-2] got pmi command (from 6): init
pmi_version=1 pmi_subversion=1 
[proxy:0:0 at T5400-2] PMI response: cmd=response_to_init pmi_version=1
pmi_subversion=1 rc=0
[proxy:0:0 at T5400-2] got pmi command (from 10): init
pmi_version=1 pmi_subversion=1 
[proxy:0:0 at T5400-2] PMI response: cmd=response_to_init pmi_version=1
pmi_subversion=1 rc=0
[proxy:0:0 at T5400-2] got pmi command (from 20): init
pmi_version=1 pmi_subversion=1 
[proxy:0:0 at T5400-2] PMI response: cmd=response_to_init pmi_version=1
pmi_subversion=1 rc=0
[proxy:0:0 at T5400-2] got pmi command (from 6): get_maxes

[proxy:0:0 at T5400-2] PMI response: cmd=maxes kvsname_max=256 keylen_max=64
vallen_max=1024
[proxy:0:0 at T5400-2] got pmi command (from 9): init
pmi_version=1 pmi_subversion=1 
[proxy:0:0 at T5400-2] PMI response: cmd=response_to_init pmi_version=1
pmi_subversion=1 rc=0
[proxy:0:0 at T5400-2] got pmi command (from 10): get_maxes

[proxy:0:0 at T5400-2] PMI response: cmd=maxes kvsname_max=256 keylen_max=64
vallen_max=1024
[proxy:0:0 at T5400-2] got pmi command (from 6): get_appnum

[proxy:0:0 at T5400-2] PMI response: cmd=appnum appnum=0
[proxy:0:0 at T5400-2] got pmi command (from 9): get_maxes

[proxy:0:0 at T5400-2] PMI response: cmd=maxes kvsname_max=256 keylen_max=64
vallen_max=1024
[proxy:0:0 at T5400-2] got pmi command (from 20): get_maxes

[proxy:0:0 at T5400-2] PMI response: cmd=maxes kvsname_max=256 keylen_max=64
vallen_max=1024
[proxy:0:0 at T5400-2] got pmi command (from 6): get_my_kvsname

[proxy:0:0 at T5400-2] PMI response: cmd=my_kvsname kvsname=kvs_4399_0
[proxy:0:0 at T5400-2] got pmi command (from 9): get_appnum

[proxy:0:0 at T5400-2] PMI response: cmd=appnum appnum=0
[proxy:0:0 at T5400-2] got pmi command (from 10): get_appnum

[proxy:0:0 at T5400-2] PMI response: cmd=appnum appnum=0
[proxy:0:0 at T5400-2] got pmi command (from 6): get_my_kvsname

[proxy:0:0 at T5400-2] PMI response: cmd=my_kvsname kvsname=kvs_4399_0
[proxy:0:0 at T5400-2] got pmi command (from 9): get_my_kvsname

[proxy:0:0 at T5400-2] PMI response: cmd=my_kvsname kvsname=kvs_4399_0
[proxy:0:0 at T5400-2] got pmi command (from 20): get_appnum

[proxy:0:0 at T5400-2] PMI response: cmd=appnum appnum=0
[proxy:0:0 at T5400-2] got pmi command (from 6): get
kvsname=kvs_4399_0 key=PMI_process_mapping 
[proxy:0:0 at T5400-2] PMI response: cmd=get_result rc=0 msg=success
value=(vector,(0,1,1))
[proxy:0:0 at T5400-2] got pmi command (from 10): get_my_kvsname

[proxy:0:0 at T5400-2] PMI response: cmd=my_kvsname kvsname=kvs_4399_0
[proxy:0:0 at T5400-2] got pmi command (from 17): get_maxes

[proxy:0:0 at T5400-2] PMI response: cmd=maxes kvsname_max=256 keylen_max=64
vallen_max=1024
[proxy:0:0 at T5400-2] got pmi command (from 9): get_my_kvsname

[proxy:0:0 at T5400-2] PMI response: cmd=my_kvsname kvsname=kvs_4399_0
[proxy:0:0 at T5400-2] got pmi command (from 20): get_my_kvsname

[proxy:0:0 at T5400-2] PMI response: cmd=my_kvsname kvsname=kvs_4399_0
[proxy:0:0 at T5400-2] got pmi command (from 10): get_my_kvsname

[proxy:0:0 at T5400-2] PMI response: cmd=my_kvsname kvsname=kvs_4399_0
[proxy:0:0 at T5400-2] got pmi command (from 17): get_appnum

[proxy:0:0 at T5400-2] PMI response: cmd=appnum appnum=0
[proxy:0:0 at T5400-2] got pmi command (from 9): get
kvsname=kvs_4399_0 key=PMI_process_mapping 
[proxy:0:0 at T5400-2] PMI response: cmd=get_result rc=0 msg=success
value=(vector,(0,1,1))
[proxy:0:0 at T5400-2] got pmi command (from 20): get_my_kvsname

[proxy:0:0 at T5400-2] PMI response: cmd=my_kvsname kvsname=kvs_4399_0
[proxy:0:0 at T5400-2] got pmi command (from 10): get
kvsname=kvs_4399_0 key=PMI_process_mapping 
[proxy:0:0 at T5400-2] PMI response: cmd=get_result rc=0 msg=success
value=(vector,(0,1,1))
[proxy:0:0 at T5400-2] got pmi command (from 17): get_my_kvsname

[proxy:0:0 at T5400-2] PMI response: cmd=my_kvsname kvsname=kvs_4399_0
[proxy:0:0 at T5400-2] got pmi command (from 9): barrier_in

[proxy:0:0 at T5400-2] got pmi command (from 20): get
kvsname=kvs_4399_0 key=PMI_process_mapping 
[proxy:0:0 at T5400-2] PMI response: cmd=get_result rc=0 msg=success
value=(vector,(0,1,1))
[proxy:0:0 at T5400-2] got pmi command (from 10): barrier_in

[proxy:0:0 at T5400-2] got pmi command (from 17): get_my_kvsname

[proxy:0:0 at T5400-2] PMI response: cmd=my_kvsname kvsname=kvs_4399_0
[proxy:0:0 at T5400-2] got pmi command (from 6): put
kvsname=kvs_4399_0 key=sharedFilename[0] value=/dev/shm/mpich_shar_tmpj4s40m

[proxy:0:0 at T5400-2] we don't understand this command put; forwarding
upstream
[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4399_0
key=sharedFilename[0] value=/dev/shm/mpich_shar_tmpj4s40m
[mpiexec at T5400-2] PMI response to fd 6 pid 6: cmd=put_result rc=0
msg=success
[proxy:0:0 at T5400-2] got pmi command (from 17): get
kvsname=kvs_4399_0 key=PMI_process_mapping 
[proxy:0:0 at T5400-2] PMI response: cmd=get_result rc=0 msg=success
value=(vector,(0,1,1))
[proxy:0:0 at T5400-2] got pmi command (from 20): barrier_in

[proxy:0:0 at T5400-2] we don't understand the response put_result; forwarding
downstream
[proxy:0:0 at T5400-2] got pmi command (from 17): barrier_in

[proxy:0:0 at T5400-2] got pmi command (from 6): barrier_in

[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec at T5400-2] PMI response to fd 6 pid 6: cmd=barrier_out
[proxy:0:0 at T5400-2] forwarding command (cmd=barrier_in) upstream
[proxy:0:0 at T5400-2] PMI response: cmd=barrier_out
[proxy:0:0 at T5400-2] PMI response: cmd=barrier_out
[proxy:0:0 at T5400-2] PMI response: cmd=barrier_out
[proxy:0:0 at T5400-2] PMI response: cmd=barrier_out
[proxy:0:0 at T5400-2] PMI response: cmd=barrier_out
[proxy:0:0 at T5400-2] got pmi command (from 10): get
kvsname=kvs_4399_0 key=sharedFilename[0] 
[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4399_0
key=sharedFilename[0]
[mpiexec at T5400-2] PMI response to fd 6 pid 10: cmd=get_result rc=0
msg=success value=/dev/shm/mpich_shar_tmpj4s40m
[proxy:0:0 at T5400-2] forwarding command (cmd=get kvsname=kvs_4399_0
key=sharedFilename[0]) upstream
[proxy:0:0 at T5400-2] got pmi command (from 20): get
kvsname=kvs_4399_0 key=sharedFilename[0] 
[proxy:0:0 at T5400-2] forwarding command (cmd=get kvsname=kvs_4399_0
key=sharedFilename[0]) upstream
[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4399_0
key=sharedFilename[0]
[mpiexec at T5400-2] PMI response to fd 6 pid 20: cmd=get_result rc=0
msg=success value=/dev/shm/mpich_shar_tmpj4s40m
[proxy:0:0 at T5400-2] we don't understand the response get_result; forwarding
downstream
[proxy:0:0 at T5400-2] we don't understand the response get_result; forwarding
downstream
[proxy:0:0 at T5400-2] got pmi command (from 9): get
kvsname=kvs_4399_0 key=sharedFilename[0] 
[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4399_0
key=sharedFilename[0]
[mpiexec at T5400-2] PMI response to fd 6 pid 9: cmd=get_result rc=0
msg=success value=/dev/shm/mpich_shar_tmpj4s40m
[proxy:0:0 at T5400-2] forwarding command (cmd=get kvsname=kvs_4399_0
key=sharedFilename[0]) upstream
[proxy:0:0 at T5400-2] got pmi command (from 17): get
kvsname=kvs_4399_0 key=sharedFilename[0] 
[proxy:0:0 at T5400-2] forwarding command (cmd=get kvsname=kvs_4399_0
key=sharedFilename[0]) upstream
[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4399_0
key=sharedFilename[0]
[mpiexec at T5400-2] PMI response to fd 6 pid 17: cmd=get_result rc=0
msg=success value=/dev/shm/mpich_shar_tmpj4s40m
[proxy:0:0 at T5400-2] we don't understand the response get_result; forwarding
downstream
[proxy:0:0 at T5400-2] we don't understand the response get_result; forwarding
downstream
[proxy:0:0 at T5400-2] got pmi command (from 6): put
kvsname=kvs_4399_0 key=P0-businesscard
value=description#localhost$port#34688$ifname#127.0.0.1$ 
[proxy:0:0 at T5400-2] we don't understand this command put; forwarding
upstream
[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4399_0
key=P0-businesscard value=description#localhost$port#34688$ifname#127.0.0.1$
[mpiexec at T5400-2] PMI response to fd 6 pid 6: cmd=put_result rc=0
msg=success
[proxy:0:0 at T5400-2] got pmi command (from 9): put
kvsname=kvs_4399_0 key=P1-businesscard
value=description#localhost$port#59785$ifname#127.0.0.1$ 
[proxy:0:0 at T5400-2] we don't understand this command put; forwarding
upstream
[proxy:0:0 at T5400-2] got pmi command (from 10): put
kvsname=kvs_4399_0 key=P2-businesscard
value=description#localhost$port#41396$ifname#127.0.0.1$ 
[proxy:0:0 at T5400-2] we don't understand this command put; forwarding
upstream
[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4399_0
key=P1-businesscard value=description#localhost$port#59785$ifname#127.0.0.1$
[mpiexec at T5400-2] PMI response to fd 6 pid 9: cmd=put_result rc=0
msg=success
[proxy:0:0 at T5400-2] got pmi command (from 17): put
kvsname=kvs_4399_0 key=P3-businesscard
value=description#localhost$port#58059$ifname#127.0.0.1$ 
[proxy:0:0 at T5400-2] we don't understand this command put; forwarding
upstream
[proxy:0:0 at T5400-2] got pmi command (from 20): put
kvsname=kvs_4399_0 key=P4-businesscard
value=description#localhost$port#50928$ifname#127.0.0.1$ 
[proxy:0:0 at T5400-2] we don't understand this command put; forwarding
upstream
[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4399_0
key=P2-businesscard value=description#localhost$port#41396$ifname#127.0.0.1$
[mpiexec at T5400-2] PMI response to fd 6 pid 10: cmd=put_result rc=0
msg=success
[proxy:0:0 at T5400-2] we don't understand the response put_result; forwarding
downstream
[proxy:0:0 at T5400-2] we don't understand the response put_result; forwarding
downstream
[proxy:0:0 at T5400-2] got pmi command (from 9): barrier_in

[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4399_0
key=P3-businesscard value=description#localhost$port#58059$ifname#127.0.0.1$
[mpiexec at T5400-2] PMI response to fd 6 pid 17: cmd=put_result rc=0
msg=success
[proxy:0:0 at T5400-2] we don't understand the response put_result; forwarding
downstream
[proxy:0:0 at T5400-2] got pmi command (from 6): barrier_in

[proxy:0:0 at T5400-2] got pmi command (from 10): barrier_in

[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4399_0
key=P4-businesscard value=description#localhost$port#50928$ifname#127.0.0.1$
[mpiexec at T5400-2] PMI response to fd 6 pid 20: cmd=put_result rc=0
msg=success
[proxy:0:0 at T5400-2] we don't understand the response put_result; forwarding
downstream
[proxy:0:0 at T5400-2] we don't understand the response put_result; forwarding
downstream
[proxy:0:0 at T5400-2] got pmi command (from 17): barrier_in

[proxy:0:0 at T5400-2] got pmi command (from 20): barrier_in

[proxy:0:0 at T5400-2] forwarding command (cmd=barrier_in) upstream
[mpiexec at T5400-2] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec at T5400-2] PMI response to fd 6 pid 20: cmd=barrier_out
[proxy:0:0 at T5400-2] PMI response: cmd=barrier_out
[proxy:0:0 at T5400-2] PMI response: cmd=barrier_out
[proxy:0:0 at T5400-2] PMI response: cmd=barrier_out
[proxy:0:0 at T5400-2] PMI response: cmd=barrier_out
[proxy:0:0 at T5400-2] PMI response: cmd=barrier_out
Process   2 of   4 is running on T5400-2
Process   1 of   4 is running on T5400-2
Process   3 of   4 is running on T5400-2
Process   4 of   4 is running on T5400-2
Process   0 of   4 is running on T5400-2
Mesh   1 is assigned to Process   0
Mesh   2 is assigned to Process   1
Mesh   3 is assigned to Process   2
Mesh   4 is assigned to Process   3
Mesh   5 is assigned to Process   4

 Fire Dynamics Simulator

 Compilation Date : Fri, 29 Oct 2010

 Version: 5.5.3; MPI Enabled; OpenMP Disabled
 SVN Revision No. : 7031

 Job TITLE        : Silly Multi-Mesh Test, SVN $Revision: 6486 $
 Job ID string    : hallways

Fatal error in PMPI_Gatherv: Internal MPI error!, error stack:
PMPI_Gatherv(376).....: MPI_Gatherv failed(sbuf=0x27a6c40, scount=1,
MPI_DOUBLE_PRECISION, rbuf=0x27a6c40, rcnts=0x25b9670, displs=0x25b96f0,
MPI_DOUBLE_PRECISION, root=0, MPI_COMM_WORLD) failed
MPIR_Gatherv_impl(189): 
MPIR_Gatherv(102).....: 
MPIR_Localcopy(346)...: memcpy arguments alias each other, dst=0x27a6c40
src=0x27a6c40 len=8
APPLICATION TERMINATED WITH THE EXIT STRING: Hangup (signal 1)
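
The final error appears to come from the MPI library's argument checking
rather than from the process manager itself: the stack shows that the send
and receive buffers passed to MPI_Gatherv on the root are the same address
(sbuf and rbuf are both 0x27a6c40), and MPICH2 1.3.1 rejects that aliasing.
The sketch below illustrates that pattern in C, along with the MPI_IN_PLACE
form the MPI standard prescribes for it; it is not the FDS source (FDS is
Fortran), and the buffer names are made up.

/*
 * Minimal sketch (not FDS code) of the buffer-aliasing pattern that
 * triggers "memcpy arguments alias each other" in MPI_Gatherv, and
 * the MPI_IN_PLACE form that avoids it.
 */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double *buf    = malloc(size * sizeof(double));
    int    *counts = malloc(size * sizeof(int));
    int    *displs = malloc(size * sizeof(int));
    for (int i = 0; i < size; i++) { counts[i] = 1; displs[i] = i; }
    buf[rank] = (double) rank;

    /* Problematic pattern: on root 0, &buf[rank] and buf are the same
     * address, so sendbuf and recvbuf alias (compare sbuf/rbuf in the
     * error stack above).  The MPI standard forbids this.
     *
     * MPI_Gatherv(&buf[rank], 1, MPI_DOUBLE,
     *             buf, counts, displs, MPI_DOUBLE, 0, MPI_COMM_WORLD);
     */

    /* Conforming form: the root passes MPI_IN_PLACE as the send buffer;
     * the receive arguments are ignored on non-root ranks. */
    if (rank == 0)
        MPI_Gatherv(MPI_IN_PLACE, 0, MPI_DOUBLE,
                    buf, counts, displs, MPI_DOUBLE, 0, MPI_COMM_WORLD);
    else
        MPI_Gatherv(&buf[rank], 1, MPI_DOUBLE,
                    NULL, NULL, NULL, MPI_DOUBLE, 0, MPI_COMM_WORLD);

    printf("rank %d done\n", rank);
    free(buf); free(counts); free(displs);
    MPI_Finalize();
    return 0;
}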

 
