[mpich-discuss] Need help launching a Fortran executable and Python with mpiexec on a Linux cluster

Truong V. Khiem truongvk50 at yahoo.fr
Tue Jul 26 14:02:03 CDT 2011


Hello,

    I am having trouble launching a Fortran executable and a Python script together with mpiexec on a Linux cluster: the
executable is built with ifort by linking the MSC Marc code with a Fortran user subroutine, and the Python code uses mpi4py
(which links Python with MPI).
    The output on the Linux cluster is the following (complete output at the end):

Fatal error in MPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(394).....: Initialization failed
MPID_Init(118)............: channel initialization failed
MPIDI_CH3_Init(43)........:
MPID_nem_init(202)........:
MPIDI_CH3I_Seg_commit(363): PMI_KVS_Get returned -1
Killed (signal 9)

    Let me explain more about the codes used:

  (1) the MSC Marc code is linked against the MPI shipped by MSC Software, which is essentially Intel MPI;

  (2) the Python code uses MPICH, which I built myself. According to the MPICH2 FAQ, it should work with code based on Intel
MPI. I deliberately took an old version of MPICH, version 1.2, to avoid incompatibilities with Intel MPI (the respective
mpif.h headers are compatible); a sketch of the kind of check my small Python test performs is given below.
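
    For reference, the small Python test amounts to something like the following (a sketch only; the actual contents of
small-python-code.py are not reproduced here, so the printed fields are illustrative):

  # small mpi4py sanity check (sketch): report the rank/size and MPI version
  # seen by the Python half of the MPMD job
  from mpi4py import MPI

  comm = MPI.COMM_WORLD
  rank = comm.Get_rank()           # rank of this process in MPI_COMM_WORLD
  size = comm.Get_size()           # total processes across both executables
  node = MPI.Get_processor_name()  # host this process landed on
  major, minor = MPI.Get_version() # MPI standard version reported by the library

  print("python side: rank %d of %d on %s, MPI %d.%d" % (rank, size, node, major, minor))

    In the working test (1) below, the Python half reports a world size of 2, i.e. both executables have joined a single
MPI_COMM_WORLD.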

  
    The above procedure worked on an HP cluster; on that machine the same MPI, namely HP-MPI, is used for both MSC Marc and
the Python code. Some tests have been made on the installation of the new Linux cluster:

 (1) the following command works:
  mpiexec -np 1 small-fortran (executable) : -np 1 python small-python-code.py

 (2) same for:
  mpiexec -np 1 script.marc (executable of MSC Marc)

  (3) same for:
  mpiexec -np 1 script.marc : -np 1 script.marc

  (4) same for:
  mpiexec -np 1 small-fortran : -np 1 small-fortran

    So what is wrong with

  mpiexec -np 1 script.marc : -np 1 python2.7-mpi MasterMarc3D_simple.py

(the mixed Marc/Python case shown in the complete output below)?

     If somebody has any idea, I would be much indebted to hear about it.

    Recently, I sent an e-mail to the Intel MPI forum and got an answer from Intel that the Hydra (process manager)
communication was modified, and that it is therefore likely incompatible with MPICH2. What can I do to get my computation
running on the Linux cluster?
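
    In case it helps to pin down the PMI problem, one check I could add at the top of the Python script, before mpi4py is
imported, is to dump the process-manager environment that Hydra hands to the process (a sketch only; which variable names
actually appear depends on the MPI library and process manager, so the prefixes below are assumptions):

  # dump process-manager related environment variables before MPI_Init runs,
  # to compare what the Marc (Intel MPI) and Python (MPICH) halves receive
  import os

  prefixes = ("PMI_", "MPI", "I_MPI_", "HYDRA_", "MPD_")  # assumed/likely prefixes
  for key in sorted(os.environ):
      if key.startswith(prefixes):
          print("%s=%s" % (key, os.environ[key]))

    Comparing this with the environment seen by script.marc would show whether the two halves receive the same PMI
connection information.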

    Regards,

    Helires


======================== Complete output

mpiexec  --verbose -genv I_MPI_FALLBACK_DEVICE 0 -np 1 /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm 
-dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes : -np 1  
/wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py
[mpiexec at bigblue]
[mpiexec at bigblue] =================================================[mpiexec at bigblue] ==========================================
=======[mpiexec at bigblue]
[mpiexec at bigblue] mpiexec options:
[mpiexec at bigblue] ----------------
[mpiexec at bigblue]   Base path: /wrk3/helires/bin/mpich2-install/bin/
[mpiexec at bigblue]   Proxy port: 9899
[mpiexec at bigblue]   Bootstrap server: ssh
[mpiexec at bigblue]   Debug level: 1
[mpiexec at bigblue]   Enable X: -1
[mpiexec at bigblue]   Working dir: /wrk3/helires/CFD_ADM
[mpiexec at bigblue]   Host file: HYDRA_USE_LOCALHOST
[mpiexec at bigblue]
[mpiexec at bigblue]   Global environment:
[mpiexec at bigblue]   -------------------
[mpiexec at bigblue]     REMOTEHOST=nanopus.frlab
[mpiexec at bigblue]     HOSTNAME=bigblue
[mpiexec at bigblue]     MSC_LICENSE_NOQUEUE=yes
[mpiexec at bigblue]     HOST=bigblue
[mpiexec at bigblue]     TERM=dtterm
[mpiexec at bigblue]     SHELL=/bin/csh
[mpiexec at bigblue]     SSH_CLIENT=125.1.5.218 62405 22
[mpiexec at bigblue]     MSC_LICENCE_FILE=1700 at adriatic
[mpiexec at bigblue]     QTDIR=/usr/lib64/qt-3.3
[mpiexec at bigblue]     QTINC=/usr/lib64/qt-3.3/include
[mpiexec at bigblue]     SSH_TTY=/dev/pts/4
[mpiexec at bigblue]     GROUP=DADS
[mpiexec at bigblue]     USER=helires
[mpiexec at bigblue]     LD_LIBRARY_PATH=/wrk3/helires/bin/mpich2-install/lib:/opt/intel/cce/11.1/073/lib/intel64:/opt/intel/fce/10.1.026/lib:/wrk3/helires/bin/marc/marc2010.2/intelmpi/linux64/lib64/:/wrk3/helires/bin/marc/marc2010.2/lib/linux64i8:/wrk3/helires/bin/marc/marc2010.2/lib_shared/linux64:/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/lib64:/usr/lib64/:
[mpiexec at bigblue]     LS_COLORS=no
[mpiexec at bigblue]     HOSTTYPE=x86_64-linux
[mpiexec at bigblue]     KDEDIR=/usr
[mpiexec at bigblue]     MAIL=/var/spool/mail/helires
[mpiexec at bigblue]     PATH=.:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/Python-2.7_install/bin:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/util:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin
[mpiexec at bigblue]     INPUTRC=/etc/inputrc
[mpiexec at bigblue]     PWD=/wrk3/helires/CFD_ADM
[mpiexec at bigblue]     LANG=fr_FR.UTF-8
[mpiexec at bigblue]     KDE_IS_PRELINKED=1
[mpiexec at bigblue]     PS1=`hostname`>>
[mpiexec at bigblue]     LM_LICENSE_FILE=1700 at adriatic
[mpiexec at bigblue]     SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
[mpiexec at bigblue]     SHLVL=3
[mpiexec at bigblue]     HOME=/wrk3/helires
[mpiexec at bigblue]     OSTYPE=linux
[mpiexec at bigblue]     CFLAGS=-fPIC
[mpiexec at bigblue]     VENDOR=unknown
[mpiexec at bigblue]     PYTHONPATH=/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/plat-linux2:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-tk:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-dynload:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/site-packages
[mpiexec at bigblue]     MACHTYPE=x86_64
[mpiexec at bigblue]     LOGNAME=helires
[mpiexec at bigblue]     QTLIB=/usr/lib64/qt-3.3/lib
[mpiexec at bigblue]     SSH_CONNECTION=125.1.5.218 62405 125.1.6.45 22
[mpiexec at bigblue]     LESSOPEN=|/usr/bin/lesspipe.sh %s
[mpiexec at bigblue]     DISPLAY=hudson:0
[mpiexec at bigblue]     G_BROKEN_FILENAMES=1
[mpiexec at bigblue]     OLDPWD=/wrk3/helires
[mpiexec at bigblue]     _=/bin/csh
[mpiexec at bigblue]
[mpiexec at bigblue]   User set environment:
[mpiexec at bigblue]   ---------------------
[mpiexec at bigblue]     I_MPI_FALLBACK_DEVICE=0
[mpiexec at bigblue]

[mpiexec at bigblue]     Executable information:
[mpiexec at bigblue]     **********************
[mpiexec at bigblue]       Executable ID:  1
[mpiexec at bigblue]       -----------------
[mpiexec at bigblue]         Process count: 1
[mpiexec at bigblue]         Executable: /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm -dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes
[mpiexec at bigblue]
[mpiexec at bigblue]       Executable ID:  2
[mpiexec at bigblue]       -----------------
[mpiexec at bigblue]         Process count: 1
[mpiexec at bigblue]         Executable: /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py
[mpiexec at bigblue]
[mpiexec at bigblue]     Partition information:
[mpiexec at bigblue]     *********************
[mpiexec at bigblue]       Partition ID:  1
[mpiexec at bigblue]       -----------------
[mpiexec at bigblue]         Partition name: localhost
[mpiexec at bigblue]         Process count: 1
[mpiexec at bigblue]
[mpiexec at bigblue]         Partition segment list:
[mpiexec at bigblue]         .......................
[mpiexec at bigblue]           Start PID: 0; Process count: 1
[mpiexec at bigblue]
[mpiexec at bigblue]         Partition exec list:
[mpiexec at bigblue]         ....................
[mpiexec at bigblue]           Exec: /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc; Process count: 1
[mpiexec at bigblue]           Exec: /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi; Process count: 1
[mpiexec at bigblue]
[mpiexec at bigblue] =================================================[mpiexec at bigblue] =================================================[mpiexec at bigblue]

[mpiexec at bigblue] Timeout set to -1 (-1 means infinite)
[mpiexec at bigblue] Got a PMI port string of bigblue:39414
[mpiexec at bigblue] Got a proxy port string of bigblue:51839
Arguments being passed to proxy 0:
--global-core-count 1 --wdir /wrk3/helires/CFD_ADM --pmi-port-str bigblue:39414 --binding HYDRA_NULL HYDRA_NULL --bindlib plpa --ckpointlib none --ckpoint-prefix HYDRA_NULL --global-inherited-env 41 'REMOTEHOST=nanopus.frlab' 'HOSTNAME=bigblue' 'MSC_LICENSE_NOQUEUE=yes' 'HOST=bigblue' 'TERM=dtterm' 'SHELL=/bin/csh' 'SSH_CLIENT=125.1.5.218 62405 22' 'MSC_LICENCE_FILE=1700 at adriatic' 'QTDIR=/usr/lib64/qt-3.3' 'QTINC=/usr/lib64/qt-3.3/include' 'SSH_TTY=/dev/pts/4' 'GROUP=DADS' 'USER=helires' 'LD_LIBRARY_PATH=/wrk3/helires/bin/mpich2-install/lib:/opt/intel/cce/11.1/073/lib/intel64:/opt/intel/fce/10.1.026/lib:/wrk3/helires/bin/marc/marc2010.2/intelmpi/linux64/lib64/:/wrk3/helires/bin/marc/marc2010.2/lib/linux64i8:/wrk3/helires/bin/marc/marc2010.2/lib_shared/linux64:/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/lib64:/usr/lib64/:' 'LS_COLORS=no' 'HOSTTYPE=x86_64-linux' 'KDEDIR=/usr' 'MAIL=/var/spool/mail/helires'
 'PATH=.:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/Python-2.7_install/bin:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/util:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin' 'INPUTRC=/etc/inputrc' 'PWD=/wrk3/helires/CFD_ADM' 'LANG=fr_FR.UTF-8' 'KDE_IS_PRELINKED=1' 'PS1=`hostname`>> ' 'LM_LICENSE_FILE=1700 at adriatic' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'SHLVL=3' 'HOME=/wrk3/helires' 'OSTYPE=linux' 'CFLAGS=-fPIC' 'VENDOR=unknown' 'PYTHONPATH=/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/plat-linux2:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-tk:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-dynload:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/site-packages' 'MACHTYPE=x86_64' 'LOGNAME=helires' 'QTLIB=/usr/lib64/qt-3.3/lib' 'SSH_CONNECTION=125.1.5.218 62405 125.1.6.45 22' 'LESSOPEN=|/usr/bin/lesspipe.sh
 %s' 'DISPLAY=hudson:0' 'G_BROKEN_FILENAMES=1' 'OLDPWD=/wrk3/helires' '_=/bin/csh' --global-user-env 1 'I_MPI_FALLBACK_DEVICE=0' --global-system-env 0 --genv-prop 1 --segment --segment-start-pid 0 --segment-proc-count 1 --exec --exec-proc-count 1 --exec-local-env 0 --exec-env-prop 0 /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm -dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes --exec --exec-proc-count 1 --exec-local-env 0 --exec-env-prop 0 /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py

[mpiexec at bigblue] Launching process: /usr/bin/ssh -x localhost /wrk3/helires/bin/mpich2-install/bin/pmi_proxy --launch-mode 1 --proxy-port bigblue:51839 --debug --bootstrap ssh --partition-id 0
helires at localhost's password:
Marc mod4_rotor_adm begins execution

     (c) COPYRIGHT 2011 MSC.Software Corporation, all rights reserved                                                                                               


VERSION: Marc,  Version, Build, Date                                                                                                                                     



     Date: Fri Jul 22 15:02:20 2011

                              Marc execution begins
Date:          Fri Jul 22 15:02:21 2011
MSC Id:        0017a4770030 (ethernet) (Linux)
Hostname:      bigblue (user helires, display )
License files: 1700 at adriatic
CEID:          77F66039-BC7GFC45
User:          helires
Display:       
LAPI Version:  LAPI 8.3.1-2041 (FLEXlm 10.8.6.0)
Acquired 160 licenses for Group CAMPUS (Marc) from license server on host adriatic


             general memory initially set to =        25 MByte

             maximum available memory set to =      5000 MByte

             general memory increasing from      25 MByte to     106 MByte

   MSC Customer Entitlement ID
        77F66039-BC7GFC45

             wall time =           2.52

             wall time =           3.60

             general memory increasing from     106 MByte to     552 MByte
Appel SB UBGINC!
flag= F ;ierr=                     0
Fatal error in MPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(394).....: Initialization failed
MPID_Init(118)............: channel initialization failed
MPIDI_CH3_Init(43)........:
MPID_nem_init(202)........:
MPIDI_CH3I_Seg_commit(363): PMI_KVS_Get returned -1
Killed (signal 9)
[helires at bigblue ~/CFD_ADM]$ mpiexec  --verbose -np 1 /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm -dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes : -np 1  /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py
[mpiexec at bigblue]
[mpiexec at bigblue] =================================================[mpiexec at bigblue] =================================================[mpiexec at bigblue]
[mpiexec at bigblue] mpiexec options:
[mpiexec at bigblue] ----------------
[mpiexec at bigblue]   Base path: /wrk3/helires/bin/mpich2-install/bin/
[mpiexec at bigblue]   Proxy port: 9899
[mpiexec at bigblue]   Bootstrap server: ssh
[mpiexec at bigblue]   Debug level: 1
[mpiexec at bigblue]   Enable X: -1
[mpiexec at bigblue]   Working dir: /wrk3/helires/CFD_ADM
[mpiexec at bigblue]   Host file: HYDRA_USE_LOCALHOST
[mpiexec at bigblue]
[mpiexec at bigblue]   Global environment:
[mpiexec at bigblue]   -------------------
[mpiexec at bigblue]     REMOTEHOST=nanopus.frlab
[mpiexec at bigblue]     HOSTNAME=bigblue
[mpiexec at bigblue]     MSC_LICENSE_NOQUEUE=yes
[mpiexec at bigblue]     HOST=bigblue
[mpiexec at bigblue]     TERM=dtterm
[mpiexec at bigblue]     SHELL=/bin/csh
[mpiexec at bigblue]     SSH_CLIENT=125.1.5.218 62405 22
[mpiexec at bigblue]     MSC_LICENCE_FILE=1700 at adriatic
[mpiexec at bigblue]     QTDIR=/usr/lib64/qt-3.3
[mpiexec at bigblue]     QTINC=/usr/lib64/qt-3.3/include
[mpiexec at bigblue]     SSH_TTY=/dev/pts/4
[mpiexec at bigblue]     GROUP=DADS
[mpiexec at bigblue]     USER=helires
[mpiexec at bigblue]     LD_LIBRARY_PATH=/wrk3/helires/bin/mpich2-install/lib:/opt/intel/cce/11.1/073/lib/intel64:/opt/intel/fce/10.1.026/lib:/wrk3/helires/bin/marc/marc2010.2/intelmpi/linux64/lib64/:/wrk3/helires/bin/marc/marc2010.2/lib/linux64i8:/wrk3/helires/bin/marc/marc2010.2/lib_shared/linux64:/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/lib64:/usr/lib64/:
[mpiexec at bigblue]     LS_COLORS=no
[mpiexec at bigblue]     HOSTTYPE=x86_64-linux
[mpiexec at bigblue]     KDEDIR=/usr
[mpiexec at bigblue]     MAIL=/var/spool/mail/helires
[mpiexec at bigblue]     PATH=.:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/Python-2.7_install/bin:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/util:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin
[mpiexec at bigblue]     INPUTRC=/etc/inputrc
[mpiexec at bigblue]     PWD=/wrk3/helires/CFD_ADM
[mpiexec at bigblue]     LANG=fr_FR.UTF-8
[mpiexec at bigblue]     KDE_IS_PRELINKED=1
[mpiexec at bigblue]     PS1=`hostname`>>
[mpiexec at bigblue]     LM_LICENSE_FILE=1700 at adriatic
[mpiexec at bigblue]     SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
[mpiexec at bigblue]     SHLVL=3
[mpiexec at bigblue]     HOME=/wrk3/helires
[mpiexec at bigblue]     OSTYPE=linux
[mpiexec at bigblue]     CFLAGS=-fPIC
[mpiexec at bigblue]     VENDOR=unknown
[mpiexec at bigblue]     PYTHONPATH=/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/plat-linux2:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-tk:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-dynload:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/site-packages
[mpiexec at bigblue]     MACHTYPE=x86_64
[mpiexec at bigblue]     LOGNAME=helires
[mpiexec at bigblue]     QTLIB=/usr/lib64/qt-3.3/lib
[mpiexec at bigblue]     SSH_CONNECTION=125.1.5.218 62405 125.1.6.45 22
[mpiexec at bigblue]     LESSOPEN=|/usr/bin/lesspipe.sh %s
[mpiexec at bigblue]     DISPLAY=hudson:0
[mpiexec at bigblue]     G_BROKEN_FILENAMES=1
[mpiexec at bigblue]     OLDPWD=/wrk3/helires
[mpiexec at bigblue]     _=/bin/csh
[mpiexec at bigblue]

[mpiexec at bigblue]     Executable information:
[mpiexec at bigblue]     **********************
[mpiexec at bigblue]       Executable ID:  1
[mpiexec at bigblue]       -----------------
[mpiexec at bigblue]         Process count: 1
[mpiexec at bigblue]         Executable: /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm -dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes
[mpiexec at bigblue]
[mpiexec at bigblue]       Executable ID:  2
[mpiexec at bigblue]       -----------------
[mpiexec at bigblue]         Process count: 1
[mpiexec at bigblue]         Executable: /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py
[mpiexec at bigblue]
[mpiexec at bigblue]     Partition information:
[mpiexec at bigblue]     *********************
[mpiexec at bigblue]       Partition ID:  1
[mpiexec at bigblue]       -----------------
[mpiexec at bigblue]         Partition name: localhost
[mpiexec at bigblue]         Process count: 1
[mpiexec at bigblue]
[mpiexec at bigblue]         Partition segment list:
[mpiexec at bigblue]         .......................
[mpiexec at bigblue]           Start PID: 0; Process count: 1
[mpiexec at bigblue]
[mpiexec at bigblue]         Partition exec list:
[mpiexec at bigblue]         ....................
[mpiexec at bigblue]           Exec: /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc; Process count: 1
[mpiexec at bigblue]           Exec: /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi; Process count: 1
[mpiexec at bigblue]
[mpiexec at bigblue] =================================================[mpiexec at bigblue] =================================================[mpiexec at bigblue]

[mpiexec at bigblue] Timeout set to -1 (-1 means infinite)
[mpiexec at bigblue] Got a PMI port string of bigblue:33253
[mpiexec at bigblue] Got a proxy port string of bigblue:55923
Arguments being passed to proxy 0:
--global-core-count 1 --wdir /wrk3/helires/CFD_ADM --pmi-port-str bigblue:33253 --binding HYDRA_NULL HYDRA_NULL --bindlib plpa --ckpointlib none --ckpoint-prefix HYDRA_NULL --global-inherited-env 41 'REMOTEHOST=nanopus.frlab' 'HOSTNAME=bigblue' 'MSC_LICENSE_NOQUEUE=yes' 'HOST=bigblue' 'TERM=dtterm' 'SHELL=/bin/csh' 'SSH_CLIENT=125.1.5.218 62405 22' 'MSC_LICENCE_FILE=1700 at adriatic' 'QTDIR=/usr/lib64/qt-3.3' 'QTINC=/usr/lib64/qt-3.3/include' 'SSH_TTY=/dev/pts/4' 'GROUP=DADS' 'USER=helires' 'LD_LIBRARY_PATH=/wrk3/helires/bin/mpich2-install/lib:/opt/intel/cce/11.1/073/lib/intel64:/opt/intel/fce/10.1.026/lib:/wrk3/helires/bin/marc/marc2010.2/intelmpi/linux64/lib64/:/wrk3/helires/bin/marc/marc2010.2/lib/linux64i8:/wrk3/helires/bin/marc/marc2010.2/lib_shared/linux64:/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/lib64:/usr/lib64/:' 'LS_COLORS=no' 'HOSTTYPE=x86_64-linux' 'KDEDIR=/usr' 'MAIL=/var/spool/mail/helires'
 'PATH=.:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/Python-2.7_install/bin:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/util:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin' 'INPUTRC=/etc/inputrc' 'PWD=/wrk3/helires/CFD_ADM' 'LANG=fr_FR.UTF-8' 'KDE_IS_PRELINKED=1' 'PS1=`hostname`>> ' 'LM_LICENSE_FILE=1700 at adriatic' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'SHLVL=3' 'HOME=/wrk3/helires' 'OSTYPE=linux' 'CFLAGS=-fPIC' 'VENDOR=unknown' 'PYTHONPATH=/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/plat-linux2:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-tk:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-dynload:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/site-packages' 'MACHTYPE=x86_64' 'LOGNAME=helires' 'QTLIB=/usr/lib64/qt-3.3/lib' 'SSH_CONNECTION=125.1.5.218 62405 125.1.6.45 22' 'LESSOPEN=|/usr/bin/lesspipe.sh
 %s' 'DISPLAY=hudson:0' 'G_BROKEN_FILENAMES=1' 'OLDPWD=/wrk3/helires' '_=/bin/csh' --global-user-env 0 --global-system-env 0 --genv-prop 1 --segment --segment-start-pid 0 --segment-proc-count 1 --exec --exec-proc-count 1 --exec-local-env 0 --exec-env-prop 0 /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm -dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes --exec --exec-proc-count 1 --exec-local-env 0 --exec-env-prop 0 /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py

[mpiexec at bigblue] Launching process: /usr/bin/ssh -x localhost /wrk3/helires/bin/mpich2-install/bin/pmi_proxy --launch-mode 1 --proxy-port bigblue:55923 --debug --bootstrap ssh --partition-id 0
helires at localhost's password:
Marc mod4_rotor_adm begins execution

     (c) COPYRIGHT 2011 MSC.Software Corporation, all rights reserved                                                                                               


VERSION: Marc,  Version, Build, Date                                                                                                                                     



     Date: Fri Jul 22 15:02:39 2011

                              Marc execution begins
Date:          Fri Jul 22 15:02:39 2011
MSC Id:        0017a4770030 (ethernet) (Linux)
Hostname:      bigblue (user helires, display )
License files: 1700 at adriatic
CEID:          77F66039-BC7GFC45
User:          helires
Display:       
LAPI Version:  LAPI 8.3.1-2041 (FLEXlm 10.8.6.0)
Acquired 160 licenses for Group CAMPUS (Marc) from license server on host adriatic


             general memory initially set to =        25 MByte

             maximum available memory set to =      5000 MByte

             general memory increasing from      25 MByte to     106 MByte

   MSC Customer Entitlement ID
        77F66039-BC7GFC45

             wall time =           2.60

             wall time =           3.67

             general memory increasing from     106 MByte to     552 MByte
Appel SB UBGINC!
flag= F ;ierr=                     0
Fatal error in MPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(394).....: Initialization failed
MPID_Init(118)............: channel initialization failed
MPIDI_CH3_Init(43)........:
MPID_nem_init(202)........:
MPIDI_CH3I_Seg_commit(363): PMI_KVS_Get returned -1
Killed (signal 9)

