CTM_PROGNAME=CCTM_e3a_Linux2_i686

ls -l /mnt/vg01/CMAQ/scripts/cctm/CCTM_e3a_Linux2_i686
-rwxrwx--- 1 andy_holland users 8535467 Mar 25 05:23 /mnt/vg01/CMAQ/scripts/cctm/CCTM_e3a_Linux2_i686

size /mnt/vg01/CMAQ/scripts/cctm/CCTM_e3a_Linux2_i686
   text    data     bss      dec    hex filename
3324059 4122200 4489660 11935919 b620af /mnt/vg01/CMAQ/scripts/cctm/CCTM_e3a_Linux2_i686

set MPIRUN = /usr/local/mpich2/bin/mpirun
set TASKMAP = /mnt/vg01/CMAQ/scripts/cctm/machines8

cat /mnt/vg01/CMAQ/scripts/cctm/machines8
s051rhlapp01:2
s051rhlapp02:2

/usr/local/mpich2/bin/mpirun -v -machinefile /mnt/vg01/CMAQ/scripts/cctm/machines8 -np 4 /mnt/vg01/CMAQ/scripts/cctm/CCTM_e3a_Linux2_i686

==================================================================================================
mpiexec options:
----------------
  Base path: /usr/local/mpich2/bin/
  Launcher: (null)
  Debug level: 1
  Enable X: -1

  Global environment:
  -------------------
    USER=andy_holland
    LOGNAME=andy_holland
    HOME=/home/andy_holland
    PATH=/opt/intel/idb/10.0.023/bin:/opt/intel/fc/10.0.023/bin:/usr/local/ncarg/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/vis_tools/pave:.:/usr/local/lib/ioapi-3.1/Linux2_x86_64ifort:/usr/local/IDV_2.3:/vis_tools/verdi_1.31_beta:/usr/local/mpich2/bin:/models/mims/spatial
    MAIL=/var/spool/mail/andy_holland
    SHELL=/bin/tcsh
    SSH_CLIENT=10.202.70.73 3157 22
    SSH_CONNECTION=10.202.70.73 3157 10.51.10.40 22
    SSH_TTY=/dev/pts/2
    TERM=vt100
    DISPLAY=localhost:10.0
    SSH_AUTH_SOCK=/tmp/ssh-ATmHoz3419/agent.3419
    HOSTTYPE=i386-linux
    VENDOR=intel
    OSTYPE=linux
    MACHTYPE=i386
    SHLVL=2
    PWD=/mnt/vg01/CMAQ/scripts/cctm
    GROUP=users
    HOST=s051rhlapp01
    REMOTEHOST=10.202.70.73
    LS_COLORS=no
    G_BROKEN_FILENAMES=1
    SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
    LANG=en_US.UTF-8
    LESSOPEN=|/usr/bin/lesspipe.sh %s
    HOSTNAME=s051rhlapp01
    INPUTRC=/etc/inputrc
    EDSS_BINDIR=/vis_tools/pave/Linux2_x86/bin/OPTIMIZE
    PAVE_COORD=2 33 45 -97 -97 40
    F_UFMTENDIAN=big
    NCARG_ROOT=/usr/local/ncarg
    RIP_ROOT=/models/MM5V3/RIP
    SMK_HOME=/models/smoke
    SMOKE_EXE=Linux2_x86ifc
    PROJ_LIB=/models/mims/spatial/src/PROJ4.5/local/share/proj
    LD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib
    DYLD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib
    MANPATH=/opt/intel/idb/10.0.023/man:/opt/intel/fc/10.0.023/man:/opt/intel/fc/10.0.023/man:/usr/local/ncarg/man:/usr/kerberos/man:/usr/local/share/man:/usr/share/man/en:/usr/share/man:/usr/X11R6/man:/usr/man
    INTEL_LICENSE_FILE=/opt/intel/fc/10.0.023/licenses:/opt/intel/licenses:/home/andy_holland/intel/licenses:/Users/Shared/Library/Application Support/Intel/Licenses
    M3HOME=/mnt/vg01/CMAQ
    M3DATA=/mnt/vg01/CMAQ/data
    M3MODEL=/mnt/vg01/CMAQ/models
    M3LIB=/mnt/vg01/CMAQ/lib
    M3TOOLS=/mnt/vg01/CMAQ/tools
    NPCOL_NPROW=2 2
    IOAPI_LOG_WRITE=F
    FL_ERR_STOP=F
    CTM_APPL=benchmark
    FLOOR_FILE=/mnt/vg01/CMAQ/scripts/cctm/FLOOR_benchmark
    GRIDDESC=../GRIDDESC1
    GRID_NAME=M_36_2001
    AVG_CONC_SPCS=O3 NO CO NO2 ASO4I ASO4J NH3
    ACONC_BLEV_ELEV= 1 1
    DEPV_TRAC_1=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO2D_010722
    OCEAN_1=/mnt/vg01/CMAQ/data/emis/2001/us36_surf.40x44.ncf
    EMIS_1=/mnt/vg01/CMAQ/data/emis/2001/emis3d.20010722.US36_40X44.ncf
    INIT_GASC_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile
    BNDY_GASC_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile
    INIT_AERO_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile
    BNDY_AERO_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile
    INIT_NONR_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile
    BNDY_NONR_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile
    INIT_TRAC_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile
    BNDY_TRAC_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile
    GRID_DOT_2D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/GRIDDOT2D_010722
    GRID_CRO_2D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/GRIDCRO2D_010722
    MET_CRO_2D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO2D_010722
    MET_DOT_3D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METDOT3D_010722
    MET_CRO_3D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO3D_010722
    MET_BDY_3D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METBDY3D_010722
    LAYER_FILE=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO3D_010722
    XJ_DATA=/mnt/vg01/CMAQ/data/jproc/JTABLE_2001203
    CTM_CONC_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.CONC.2001203.ncf -v
    A_CONC_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.ACONC.2001203.ncf -v
    S_CGRID=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.CGRID.2001203.ncf -v
    CTM_DRY_DEP_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.DRYDEP.2001203.ncf -v
    CTM_WET_DEP_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.WETDEP1.2001203.ncf -v
    CTM_WET_DEP_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.WETDEP2.2001203.ncf -v
    CTM_SSEMIS_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.SSEMIS1.2001203.ncf -v
    CTM_VIS_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.AEROVIS.2001203.ncf -v
    CTM_DIAM_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.AERODIAM.2001203.ncf -v
    CTM_IPR_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.PA_1.2001203.ncf -v
    CTM_IPR_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.PA_2.2001203.ncf -v
    CTM_IPR_3=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.PA_3.2001203.ncf -v
    CTM_IRR_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.IRR_1.2001203.ncf -v
    CTM_IRR_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.IRR_2.2001203.ncf -v
    CTM_IRR_3=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.IRR_3.2001203.ncf -v
    CTM_RJ_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.RJ_1.2001203.ncf -v
    CTM_RJ_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.RJ_2.2001203.ncf -v
    CTM_STDATE=2001203
    CTM_STTIME=000000
    CTM_RUNLEN=240000
    CTM_TSTEP=010000
    CTM_PROGNAME=CCTM_e3a_Linux2_i686

  Hydra internal environment:
  ---------------------------
    GFORTRAN_UNBUFFERED_PRECONNECTED=y


  Proxy information:
  *********************
    Proxy ID: 1
    -----------------
      Proxy name: s051rhlapp01
      Process count: 2
      Start PID: 0
      Proxy exec list:
      ....................
        Exec: /mnt/vg01/CMAQ/scripts/cctm/CCTM_e3a_Linux2_i686; Process count: 2

    Proxy ID: 2
    -----------------
      Proxy name: s051rhlapp02
      Process count: 2
      Start PID: 2

      Proxy exec list:
      ....................
        Exec: /mnt/vg01/CMAQ/scripts/cctm/CCTM_e3a_Linux2_i686; Process count: 2

==================================================================================================

[mpiexec@s051rhlapp01] Timeout set to -1 (-1 means infinite)
[mpiexec@s051rhlapp01] Got a control port string of s051rhlapp01:53313

Proxy launch args: /usr/local/mpich2/bin/hydra_pmi_proxy --control-port s051rhlapp01:53313 --debug --demux poll --pgid 0 --proxy-id

[mpiexec@s051rhlapp01] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1

Arguments being passed to proxy 0:
--version 1.3.2p1 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname s051rhlapp01 --global-core-count 4 --global-process-count 4 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_4180_0 --pmi-process-mapping (vector,(0,2,2)) --ckpoint-num -1 --global-inherited-env 95 'USER=andy_holland' 'LOGNAME=andy_holland' 'HOME=/home/andy_holland' 'PATH=/opt/intel/idb/10.0.023/bin:/opt/intel/fc/10.0.023/bin:/usr/local/ncarg/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/vis_tools/pave:.:/usr/local/lib/ioapi-3.1/Linux2_x86_64ifort:/usr/local/IDV_2.3:/vis_tools/verdi_1.31_beta:/usr/local/mpich2/bin:/models/mims/spatial' 'MAIL=/var/spool/mail/andy_holland' 'SHELL=/bin/tcsh' 'SSH_CLIENT=10.202.70.73 3157 22' 'SSH_CONNECTION=10.202.70.73 3157 10.51.10.40 22' 'SSH_TTY=/dev/pts/2' 'TERM=vt100' 'DISPLAY=localhost:10.0' 'SSH_AUTH_SOCK=/tmp/ssh-ATmHoz3419/agent.3419' 'HOSTTYPE=i386-linux' 'VENDOR=intel' 'OSTYPE=linux' 'MACHTYPE=i386' 'SHLVL=2' 'PWD=/mnt/vg01/CMAQ/scripts/cctm' 'GROUP=users' 'HOST=s051rhlapp01' 'REMOTEHOST=10.202.70.73' 'LS_COLORS=no' 'G_BROKEN_FILENAMES=1' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'LANG=en_US.UTF-8' 'LESSOPEN=|/usr/bin/lesspipe.sh %s'
'HOSTNAME=s051rhlapp01' 'INPUTRC=/etc/inputrc' 'EDSS_BINDIR=/vis_tools/pave/Linux2_x86/bin/OPTIMIZE' 'PAVE_COORD=2 33 45 -97 -97 40' 'F_UFMTENDIAN=big' 'NCARG_ROOT=/usr/local/ncarg' 'RIP_ROOT=/models/MM5V3/RIP' 'SMK_HOME=/models/smoke' 'SMOKE_EXE=Linux2_x86ifc' 'PROJ_LIB=/models/mims/spatial/src/PROJ4.5/local/share/proj' 'LD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib' 'DYLD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib' 'MANPATH=/opt/intel/idb/10.0.023/man:/opt/intel/fc/10.0.023/man:/opt/intel/fc/10.0.023/man:/usr/local/ncarg/man:/usr/kerberos/man:/usr/local/share/man:/usr/share/man/en:/usr/share/man:/usr/X11R6/man:/usr/man' 'INTEL_LICENSE_FILE=/opt/intel/fc/10.0.023/licenses:/opt/intel/licenses:/home/andy_holland/intel/licenses:/Users/Shared/Library/Application Support/Intel/Licenses' 'M3HOME=/mnt/vg01/CMAQ' 'M3DATA=/mnt/vg01/CMAQ/data' 'M3MODEL=/mnt/vg01/CMAQ/models' 'M3LIB=/mnt/vg01/CMAQ/lib' 'M3TOOLS=/mnt/vg01/CMAQ/tools' 'NPCOL_NPROW=2 2' 'IOAPI_LOG_WRITE=F' 'FL_ERR_STOP=F' 'CTM_APPL=benchmark' 'FLOOR_FILE=/mnt/vg01/CMAQ/scripts/cctm/FLOOR_benchmark' 'GRIDDESC=../GRIDDESC1' 'GRID_NAME=M_36_2001' 'AVG_CONC_SPCS=O3 NO CO NO2 ASO4I ASO4J NH3' 'ACONC_BLEV_ELEV= 1 1' 'DEPV_TRAC_1=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO2D_010722' 'OCEAN_1=/mnt/vg01/CMAQ/data/emis/2001/us36_surf.40x44.ncf' 'EMIS_1=/mnt/vg01/CMAQ/data/emis/2001/emis3d.20010722.US36_40X44.ncf' 'INIT_GASC_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile' 'BNDY_GASC_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile' 'INIT_AERO_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile' 'BNDY_AERO_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile' 'INIT_NONR_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile' 'BNDY_NONR_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile' 'INIT_TRAC_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile' 'BNDY_TRAC_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile' 'GRID_DOT_2D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/GRIDDOT2D_010722' 
'GRID_CRO_2D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/GRIDCRO2D_010722' 'MET_CRO_2D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO2D_010722' 'MET_DOT_3D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METDOT3D_010722' 'MET_CRO_3D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO3D_010722' 'MET_BDY_3D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METBDY3D_010722' 'LAYER_FILE=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO3D_010722' 'XJ_DATA=/mnt/vg01/CMAQ/data/jproc/JTABLE_2001203' 'CTM_CONC_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.CONC.2001203.ncf -v' 'A_CONC_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.ACONC.2001203.ncf -v' 'S_CGRID=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.CGRID.2001203.ncf -v' 'CTM_DRY_DEP_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.DRYDEP.2001203.ncf -v' 'CTM_WET_DEP_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.WETDEP1.2001203.ncf -v' 'CTM_WET_DEP_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.WETDEP2.2001203.ncf -v' 'CTM_SSEMIS_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.SSEMIS1.2001203.ncf -v' 'CTM_VIS_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.AEROVIS.2001203.ncf -v' 'CTM_DIAM_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.AERODIAM.2001203.ncf -v' 'CTM_IPR_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.PA_1.2001203.ncf -v' 'CTM_IPR_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.PA_2.2001203.ncf -v' 'CTM_IPR_3=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.PA_3.2001203.ncf -v' 'CTM_IRR_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.IRR_1.2001203.ncf -v' 'CTM_IRR_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.IRR_2.2001203.ncf -v' 'CTM_IRR_3=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.IRR_3.2001203.ncf -v' 'CTM_RJ_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.RJ_1.2001203.ncf -v' 'CTM_RJ_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.RJ_2.2001203.ncf -v' 'CTM_STDATE=2001203' 
'CTM_STTIME=000000' 'CTM_RUNLEN=240000' 'CTM_TSTEP=010000' 'CTM_PROGNAME=CCTM_e3a_Linux2_i686' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --start-pid 0 --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 --exec-wdir /mnt/vg01/CMAQ/scripts/cctm --exec-args 1 /mnt/vg01/CMAQ/scripts/cctm/CCTM_e3a_Linux2_i686

[mpiexec@s051rhlapp01] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1

Arguments being passed to proxy 1:
--version 1.3.2p1 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname s051rhlapp02 --global-core-count 4 --global-process-count 4 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_4180_0 --pmi-process-mapping (vector,(0,2,2)) --ckpoint-num -1 --global-inherited-env 95 'USER=andy_holland' 'LOGNAME=andy_holland' 'HOME=/home/andy_holland' 'PATH=/opt/intel/idb/10.0.023/bin:/opt/intel/fc/10.0.023/bin:/usr/local/ncarg/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/vis_tools/pave:.:/usr/local/lib/ioapi-3.1/Linux2_x86_64ifort:/usr/local/IDV_2.3:/vis_tools/verdi_1.31_beta:/usr/local/mpich2/bin:/models/mims/spatial' 'MAIL=/var/spool/mail/andy_holland' 'SHELL=/bin/tcsh' 'SSH_CLIENT=10.202.70.73 3157 22' 'SSH_CONNECTION=10.202.70.73 3157 10.51.10.40 22' 'SSH_TTY=/dev/pts/2' 'TERM=vt100' 'DISPLAY=localhost:10.0' 'SSH_AUTH_SOCK=/tmp/ssh-ATmHoz3419/agent.3419' 'HOSTTYPE=i386-linux' 'VENDOR=intel' 'OSTYPE=linux' 'MACHTYPE=i386' 'SHLVL=2' 'PWD=/mnt/vg01/CMAQ/scripts/cctm' 'GROUP=users' 'HOST=s051rhlapp01' 'REMOTEHOST=10.202.70.73' 'LS_COLORS=no' 'G_BROKEN_FILENAMES=1' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'LANG=en_US.UTF-8' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'HOSTNAME=s051rhlapp01' 'INPUTRC=/etc/inputrc' 'EDSS_BINDIR=/vis_tools/pave/Linux2_x86/bin/OPTIMIZE' 'PAVE_COORD=2 33 45 -97 -97 40' 'F_UFMTENDIAN=big' 'NCARG_ROOT=/usr/local/ncarg' 'RIP_ROOT=/models/MM5V3/RIP' 'SMK_HOME=/models/smoke' 'SMOKE_EXE=Linux2_x86ifc'
'PROJ_LIB=/models/mims/spatial/src/PROJ4.5/local/share/proj' 'LD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib' 'DYLD_LIBRARY_PATH=/opt/intel/fc/10.0.023/lib' 'MANPATH=/opt/intel/idb/10.0.023/man:/opt/intel/fc/10.0.023/man:/opt/intel/fc/10.0.023/man:/usr/local/ncarg/man:/usr/kerberos/man:/usr/local/share/man:/usr/share/man/en:/usr/share/man:/usr/X11R6/man:/usr/man' 'INTEL_LICENSE_FILE=/opt/intel/fc/10.0.023/licenses:/opt/intel/licenses:/home/andy_holland/intel/licenses:/Users/Shared/Library/Application Support/Intel/Licenses' 'M3HOME=/mnt/vg01/CMAQ' 'M3DATA=/mnt/vg01/CMAQ/data' 'M3MODEL=/mnt/vg01/CMAQ/models' 'M3LIB=/mnt/vg01/CMAQ/lib' 'M3TOOLS=/mnt/vg01/CMAQ/tools' 'NPCOL_NPROW=2 2' 'IOAPI_LOG_WRITE=F' 'FL_ERR_STOP=F' 'CTM_APPL=benchmark' 'FLOOR_FILE=/mnt/vg01/CMAQ/scripts/cctm/FLOOR_benchmark' 'GRIDDESC=../GRIDDESC1' 'GRID_NAME=M_36_2001' 'AVG_CONC_SPCS=O3 NO CO NO2 ASO4I ASO4J NH3' 'ACONC_BLEV_ELEV= 1 1' 'DEPV_TRAC_1=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO2D_010722' 'OCEAN_1=/mnt/vg01/CMAQ/data/emis/2001/us36_surf.40x44.ncf' 'EMIS_1=/mnt/vg01/CMAQ/data/emis/2001/emis3d.20010722.US36_40X44.ncf' 'INIT_GASC_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile' 'BNDY_GASC_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile' 'INIT_AERO_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile' 'BNDY_AERO_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile' 'INIT_NONR_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile' 'BNDY_NONR_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile' 'INIT_TRAC_1=/mnt/vg01/CMAQ/data/icon/ICON_cb05cl_M_36_2001_profile' 'BNDY_TRAC_1=/mnt/vg01/CMAQ/data/bcon/BCON_cb05cl_M_36_2001_profile' 'GRID_DOT_2D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/GRIDDOT2D_010722' 'GRID_CRO_2D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/GRIDCRO2D_010722' 'MET_CRO_2D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO2D_010722' 'MET_DOT_3D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METDOT3D_010722' 'MET_CRO_3D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO3D_010722' 
'MET_BDY_3D=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METBDY3D_010722' 'LAYER_FILE=/mnt/vg01/CMAQ/data/mcip3/M_36_2001/METCRO3D_010722' 'XJ_DATA=/mnt/vg01/CMAQ/data/jproc/JTABLE_2001203' 'CTM_CONC_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.CONC.2001203.ncf -v' 'A_CONC_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.ACONC.2001203.ncf -v' 'S_CGRID=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.CGRID.2001203.ncf -v' 'CTM_DRY_DEP_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.DRYDEP.2001203.ncf -v' 'CTM_WET_DEP_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.WETDEP1.2001203.ncf -v' 'CTM_WET_DEP_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.WETDEP2.2001203.ncf -v' 'CTM_SSEMIS_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.SSEMIS1.2001203.ncf -v' 'CTM_VIS_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.AEROVIS.2001203.ncf -v' 'CTM_DIAM_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.AERODIAM.2001203.ncf -v' 'CTM_IPR_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.PA_1.2001203.ncf -v' 'CTM_IPR_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.PA_2.2001203.ncf -v' 'CTM_IPR_3=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.PA_3.2001203.ncf -v' 'CTM_IRR_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.IRR_1.2001203.ncf -v' 'CTM_IRR_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.IRR_2.2001203.ncf -v' 'CTM_IRR_3=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.IRR_3.2001203.ncf -v' 'CTM_RJ_1=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.RJ_1.2001203.ncf -v' 'CTM_RJ_2=/mnt/vg01/CMAQ/data/cctm/CCTM_e3a_Linux2_i686.benchmark.RJ_2.2001203.ncf -v' 'CTM_STDATE=2001203' 'CTM_STTIME=000000' 'CTM_RUNLEN=240000' 'CTM_TSTEP=010000' 'CTM_PROGNAME=CCTM_e3a_Linux2_i686' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --start-pid 2 --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 
--exec-local-env 0 --exec-wdir /mnt/vg01/CMAQ/scripts/cctm --exec-args 1 /mnt/vg01/CMAQ/scripts/cctm/CCTM_e3a_Linux2_i686

[mpiexec@s051rhlapp01] Launch arguments: /usr/local/mpich2/bin/hydra_pmi_proxy --control-port s051rhlapp01:53313 --debug --demux poll --pgid 0 --proxy-id 0
[mpiexec@s051rhlapp01] Launch arguments: /usr/bin/ssh -x s051rhlapp02 "/usr/local/mpich2/bin/hydra_pmi_proxy" --control-port s051rhlapp01:53313 --debug --demux poll --pgid 0 --proxy-id 1
[proxy:0:0@s051rhlapp01] got pmi command (from 0): init pmi_version=1 pmi_subversion=1
[proxy:0:0@s051rhlapp01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@s051rhlapp01] got pmi command (from 0): get_maxes
[proxy:0:0@s051rhlapp01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@s051rhlapp01] got pmi command (from 0): get_appnum
[proxy:0:0@s051rhlapp01] PMI response: cmd=appnum appnum=0
[proxy:0:0@s051rhlapp01] got pmi command (from 0): get_my_kvsname
[proxy:0:0@s051rhlapp01] PMI response: cmd=my_kvsname kvsname=kvs_4180_0
[proxy:0:0@s051rhlapp01] got pmi command (from 0): get_my_kvsname
[proxy:0:0@s051rhlapp01] PMI response: cmd=my_kvsname kvsname=kvs_4180_0
[proxy:0:0@s051rhlapp01] got pmi command (from 0): get kvsname=kvs_4180_0 key=PMI_process_mapping
[proxy:0:0@s051rhlapp01] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,2,2))
[proxy:0:0@s051rhlapp01] got pmi command (from 0): put kvsname=kvs_4180_0 key=sharedFilename[0] value=/dev/shm/mpich_shar_tmptwaYml
[proxy:0:0@s051rhlapp01] we don't understand this command put; forwarding upstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4180_0 key=sharedFilename[0] value=/dev/shm/mpich_shar_tmptwaYml
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 0: cmd=put_result rc=0 msg=success
[proxy:0:0@s051rhlapp01] we don't understand the response put_result; forwarding downstream
[proxy:0:0@s051rhlapp01] got pmi command (from 0): barrier_in
[proxy:0:0@s051rhlapp01] got pmi command (from 6): init pmi_version=1 pmi_subversion=1
[proxy:0:0@s051rhlapp01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get_maxes
[proxy:0:0@s051rhlapp01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get_appnum
[proxy:0:0@s051rhlapp01] PMI response: cmd=appnum appnum=0
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get_my_kvsname
[proxy:0:0@s051rhlapp01] PMI response: cmd=my_kvsname kvsname=kvs_4180_0
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get_my_kvsname
[proxy:0:0@s051rhlapp01] PMI response: cmd=my_kvsname kvsname=kvs_4180_0
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get kvsname=kvs_4180_0 key=PMI_process_mapping
[proxy:0:0@s051rhlapp01] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,2,2))
[proxy:0:0@s051rhlapp01] got pmi command (from 6): barrier_in
[proxy:0:0@s051rhlapp01] forwarding command (cmd=barrier_in) upstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=barrier_in
[proxy:0:1@s051rhlapp02] got pmi command (from 4): init pmi_version=1 pmi_subversion=1
[proxy:0:1@s051rhlapp02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@s051rhlapp02] got pmi command (from 4): get_maxes
[proxy:0:1@s051rhlapp02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@s051rhlapp02] got pmi command (from 4): get_appnum
[proxy:0:1@s051rhlapp02] PMI response: cmd=appnum appnum=0
[proxy:0:1@s051rhlapp02] got pmi command (from 4): get_my_kvsname
[proxy:0:1@s051rhlapp02] PMI response: cmd=my_kvsname kvsname=kvs_4180_0
[proxy:0:1@s051rhlapp02] got pmi command (from 4): get_my_kvsname
[proxy:0:1@s051rhlapp02] PMI response: cmd=my_kvsname kvsname=kvs_4180_0
[proxy:0:1@s051rhlapp02] got pmi command (from 4): get kvsname=kvs_4180_0 key=PMI_process_mapping
[proxy:0:1@s051rhlapp02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,2,2))
[proxy:0:1@s051rhlapp02] got pmi command (from 5): init pmi_version=1 pmi_subversion=1
[proxy:0:1@s051rhlapp02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@s051rhlapp02] got pmi command (from 4): put kvsname=kvs_4180_0 key=sharedFilename[2] value=/dev/shm/mpich_shar_tmpUmaL3v
[proxy:0:1@s051rhlapp02] we don't understand this command put; forwarding upstream
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get_maxes
[proxy:0:1@s051rhlapp02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get_appnum
[proxy:0:1@s051rhlapp02] PMI response: cmd=appnum appnum=0
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get_my_kvsname
[proxy:0:1@s051rhlapp02] PMI response: cmd=my_kvsname kvsname=kvs_4180_0
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get_my_kvsname
[proxy:0:1@s051rhlapp02] PMI response: cmd=my_kvsname kvsname=kvs_4180_0
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get kvsname=kvs_4180_0 key=PMI_process_mapping
[proxy:0:1@s051rhlapp02] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,2,2))
[proxy:0:1@s051rhlapp02] got pmi command (from 5): barrier_in
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4180_0 key=sharedFilename[2] value=/dev/shm/mpich_shar_tmpUmaL3v
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 4: cmd=put_result rc=0 msg=success
[proxy:0:1@s051rhlapp02] we don't understand the response put_result; forwarding downstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 4: cmd=barrier_out
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 4: cmd=barrier_out
[proxy:0:0@s051rhlapp01] PMI response: cmd=barrier_out
[proxy:0:0@s051rhlapp01] PMI response: cmd=barrier_out
[proxy:0:1@s051rhlapp02] got pmi command (from 4): barrier_in
[proxy:0:1@s051rhlapp02] forwarding command (cmd=barrier_in) upstream
[proxy:0:0@s051rhlapp01] got pmi command (from 6): get kvsname=kvs_4180_0 key=sharedFilename[0]
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4180_0 key=sharedFilename[0]
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 6: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmptwaYml
[proxy:0:0@s051rhlapp01] forwarding command (cmd=get kvsname=kvs_4180_0 key=sharedFilename[0]) upstream
[proxy:0:0@s051rhlapp01] we don't understand the response get_result; forwarding downstream
[proxy:0:1@s051rhlapp02] PMI response: cmd=barrier_out
[proxy:0:1@s051rhlapp02] PMI response: cmd=barrier_out
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4180_0 key=sharedFilename[2]
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 5: cmd=get_result rc=0 msg=success value=/dev/shm/mpich_shar_tmpUmaL3v
[proxy:0:1@s051rhlapp02] got pmi command (from 5): get kvsname=kvs_4180_0 key=sharedFilename[2]
[proxy:0:1@s051rhlapp02] forwarding command (cmd=get kvsname=kvs_4180_0 key=sharedFilename[2]) upstream
[proxy:0:1@s051rhlapp02] we don't understand the response get_result; forwarding downstream
[proxy:0:0@s051rhlapp01] got pmi command (from 0): put kvsname=kvs_4180_0 key=P0-businesscard value=description#s051rhlapp01$port#56211$ifname#127.0.0.1$
[proxy:0:0@s051rhlapp01] we don't understand this command put; forwarding upstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4180_0 key=P0-businesscard value=description#s051rhlapp01$port#56211$ifname#127.0.0.1$
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 0: cmd=put_result rc=0 msg=success
[proxy:0:0@s051rhlapp01] got pmi command (from 6): put kvsname=kvs_4180_0 key=P1-businesscard value=description#s051rhlapp01$port#59656$ifname#127.0.0.1$
[proxy:0:0@s051rhlapp01] we don't understand this command put; forwarding upstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4180_0 key=P1-businesscard value=description#s051rhlapp01$port#59656$ifname#127.0.0.1$
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 6: cmd=put_result rc=0 msg=success
[proxy:0:0@s051rhlapp01] we don't understand the response put_result; forwarding downstream
[proxy:0:0@s051rhlapp01] got pmi command (from 0): barrier_in
[proxy:0:0@s051rhlapp01] we don't understand the response put_result; forwarding downstream
[proxy:0:0@s051rhlapp01] got pmi command (from 6): barrier_in
[proxy:0:0@s051rhlapp01] forwarding command (cmd=barrier_in) upstream
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4180_0 key=P2-businesscard value=description#s051rhlapp02$port#49318$ifname#127.0.0.1$
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 4: cmd=put_result rc=0 msg=success
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_4180_0 key=P3-businesscard value=description#s051rhlapp02$port#59274$ifname#127.0.0.1$
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 5: cmd=put_result rc=0 msg=success
[proxy:0:1@s051rhlapp02] got pmi command (from 4): put kvsname=kvs_4180_0 key=P2-businesscard value=description#s051rhlapp02$port#49318$ifname#127.0.0.1$
[proxy:0:1@s051rhlapp02] we don't understand this command put; forwarding upstream
[proxy:0:1@s051rhlapp02] got pmi command (from 5): put kvsname=kvs_4180_0 key=P3-businesscard value=description#s051rhlapp02$port#59274$ifname#127.0.0.1$
[proxy:0:1@s051rhlapp02] we don't understand this command put; forwarding upstream
[proxy:0:1@s051rhlapp02] we don't understand the response put_result; forwarding downstream
[proxy:0:1@s051rhlapp02] got pmi command (from 4): barrier_in
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 5: cmd=barrier_out
[mpiexec@s051rhlapp01] PMI response to fd 7 pid 5: cmd=barrier_out
[proxy:0:0@s051rhlapp01] PMI response: cmd=barrier_out
[proxy:0:0@s051rhlapp01] PMI response: cmd=barrier_out
[proxy:0:1@s051rhlapp02] we don't understand the response put_result; forwarding downstream
[proxy:0:1@s051rhlapp02] got pmi command (from 5): barrier_in
[proxy:0:1@s051rhlapp02] forwarding command (cmd=barrier_in) upstream
[proxy:0:1@s051rhlapp02] PMI response: cmd=barrier_out
[proxy:0:1@s051rhlapp02] PMI response: cmd=barrier_out
[proxy:0:0@s051rhlapp01] got pmi command (from 0): get kvsname=kvs_4180_0 key=P2-businesscard
[mpiexec@s051rhlapp01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_4180_0 key=P2-businesscard
[mpiexec@s051rhlapp01] PMI response to fd 6 pid 0: cmd=get_result rc=0 msg=success value=description#s051rhlapp02$port#49318$ifname#127.0.0.1$
[proxy:0:0@s051rhlapp01] forwarding command (cmd=get kvsname=kvs_4180_0 key=P2-businesscard) upstream
[proxy:0:0@s051rhlapp01] we don't understand the response get_result; forwarding downstream

Fatal error in PMPI_Bcast: Other MPI error, error stack:
PMPI_Bcast(1430)......................: MPI_Bcast(buf=0xbffb375c, count=5000, MPI_CHAR, root=0, MPI_COMM_WORLD) failed
MPIR_Bcast_impl(1273).................:
MPIR_Bcast_intra(1071)................:
MPIR_Bcast_scatter_ring_allgather(914):
MPIR_Bcast_binomial(202)..............: Failure during collective
MPIR_Bcast_scatter_ring_allgather(907):
MPIR_Bcast_binomial(178)..............:
MPIC_Send(63).........................:
MPIDI_EagerContigSend(186)............: failure occurred while attempting to send an eager message
MPIDI_CH3_iStartMsgv(44)..............: Communication error with rank 2

[mpiexec@s051rhlapp01] ONE OF THE PROCESSES TERMINATED BADLY: CLEANING UP
[proxy:0:1@s051rhlapp02] HYD_pmcd_pmip_control_cmd_cb (/usr/local/mpich2-1.3.2p1/src/pm/hydra/pm/pmiserv/pmip_cb.c:868): assert (!closed) failed
[proxy:0:1@s051rhlapp02] HYDT_dmxu_poll_wait_for_event (/usr/local/mpich2-1.3.2p1/src/pm/hydra/tools/demux/demux_poll.c:77): callback returned error status
[proxy:0:1@s051rhlapp02] main (/usr/local/mpich2-1.3.2p1/src/pm/hydra/pm/pmiserv/pmip.c:208): demux engine error waiting for event
[mpiexec@s051rhlapp01] HYDT_bscu_wait_for_completion (/usr/local/mpich2-1.3.2p1/src/pm/hydra/tools/bootstrap/utils/bscu_wait.c:70): one of the processes terminated badly; aborting
[mpiexec@s051rhlapp01] HYDT_bsci_wait_for_completion (/usr/local/mpich2-1.3.2p1/src/pm/hydra/tools/bootstrap/src/bsci_wait.c:18): launcher returned error waiting for completion
[mpiexec@s051rhlapp01] HYD_pmci_wait_for_completion (/usr/local/mpich2-1.3.2p1/src/pm/hydra/pm/pmiserv/pmiserv_pmci.c:216): launcher returned error waiting for completion
[mpiexec@s051rhlapp01] main (/usr/local/mpich2-1.3.2p1/src/pm/hydra/ui/mpich/mpiexec.c:404): process manager error waiting for completion

0.038u 0.020s 0:00.84 5.9% 0+0k 0+0io 0pf+0w

date
Mon Apr 25 12:25:25 MDT 2011
exit
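One detail worth noting in the log above: every rank's business card is advertised with ifname#127.0.0.1, so when rank 0 on s051rhlapp01 tries to reach rank 2 on s051rhlapp02 during the MPI_Bcast, it connects to its own loopback interface and fails. A common cause is an /etc/hosts entry mapping each node's hostname to 127.0.0.1. A minimal diagnostic sketch (the hostnames come from the machinefile above; the IP address used below is an assumption, not taken from the log):

```shell
# Sketch: check what the cluster hostnames resolve to on each node; an
# /etc/hosts line like "127.0.0.1 s051rhlapp01" makes Hydra advertise the
# loopback address in its business card, so cross-node sends cannot connect.
getent hosts s051rhlapp01 s051rhlapp02 || true   # hostnames from machines8 above

# Hydra reads MPICH_INTERFACE_HOSTNAME (named in --interface-env-name above);
# setting it per node to a routable address is one way to override the
# loopback advertisement. The address here is an assumption for illustration.
export MPICH_INTERFACE_HOSTNAME=10.51.10.40      # tcsh: setenv MPICH_INTERFACE_HOSTNAME 10.51.10.40
```

Fixing the /etc/hosts entries so each hostname resolves to its real interface address (and rerunning) is the other route; either way the business cards should then advertise routable addresses instead of 127.0.0.1.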