[petsc-users] make check error

Barry Smith bsmith at mcs.anl.gov
Wed Mar 11 13:24:03 CDT 2015


  This is a problem with your MPI installation. You need to talk to a local systems person or check the documentation on how to set up your environment to run MPI jobs on that system.

  Barry
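[Editor's note: one way to confirm Barry's diagnosis — that the MPI installation itself, not PETSc, is broken — is to build and run a trivial MPI program with the same wrappers PETSc was configured with. This is a sketch; it assumes an `mpicc`/`mpirun` pair on the PATH, as in the logs below.]

```shell
# Minimal MPI smoke test, independent of PETSc
cat > mpi_hello.c <<'EOF'
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
EOF
# Build and run only if an MPI compiler wrapper is actually present
if command -v mpicc >/dev/null 2>&1; then
    mpicc -o mpi_hello mpi_hello.c
    mpirun -np 2 ./mpi_hello
fi
```

If this hello-world segfaults or prints the same librdmacm/libibverbs errors, the problem is in the MPI/InfiniBand stack and no amount of PETSc reconfiguration will fix it.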

> On Mar 11, 2015, at 1:20 PM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
> 
> On a different machine, but with the same version of the code, I am getting the following error. There is a complaint about being unable to open /dev/infiniband/rdma_cm.
> 
> Any pointers would be greatly appreciated. 
> 
> Thanks,
> Manav
> 
> 
> shadow-login[238] bhatia$ make check
> Running test examples to verify correct installation
> Using PETSC_DIR=/work/bhatia/codes/shadow/petsc/petsc-3.5.3 and PETSC_ARCH=arch-linux2-cxx-opt
> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> librdmacm: Warning: couldn't read ABI version.
> librdmacm: Warning: assuming: 4
> librdmacm: Fatal: unable to open /dev/infiniband/rdma_cm
> librdmacm: Fatal: unable to open /dev/infiniband/rdma_cm
> --------------------------------------------------------------------------
> WARNING: There are more than one active ports on host 'shadow-login', but the
> default subnet GID prefix was detected on more than one of these
> ports.  If these ports are connected to different physical IB
> networks, this configuration will fail in Open MPI.  This version of
> Open MPI requires that every physically separate IB subnet that is
> used between connected MPI processes must have different subnet ID
> values.
> 
> Please see this FAQ entry for more details:
> 
>   http://www.open-mpi.org/faq/?category=openfabrics#ofa-default-subnet-gid
> 
> NOTE: You can turn off this warning by setting the MCA parameter
>       btl_openib_warn_default_gid_prefix to 0.
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> No OpenFabrics connection schemes reported that they were able to be
> used on a specific port.  As such, the openib BTL (OpenFabrics
> support) will be disabled for this port.
> 
>   Local host:           shadow-login
>   Local device:         mlx4_1
>   Local port:           1
>   CPCs attempted:       udcm
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> WARNING: It appears that your OpenFabrics subsystem is configured to only
> allow registering part of your physical memory.  This can cause MPI jobs to
> run with erratic performance, hang, and/or crash.
> 
> This may be caused by your OpenFabrics vendor limiting the amount of
> physical memory that can be registered.  You should investigate the
> relevant Linux kernel module parameters that control how much physical
> memory can be registered, and increase them to allow registering all
> physical memory on your machine.
> 
> See this Open MPI FAQ item for more information on these Linux kernel module
> parameters:
> 
>     http://www.open-mpi.org/faq/?category=openfabrics#ib-locked-pages
> 
>   Local host:              shadow-login
>   Registerable memory:     24576 MiB
>   Total memory:            65457 MiB
> 
> Your MPI job will continue, but may be behave poorly and/or hang.
> --------------------------------------------------------------------------
> lid velocity = 0.0016, prandtl # = 1, grashof # = 1
> Number of SNES iterations = 2
> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> librdmacm: Warning: couldn't read ABI version.
> librdmacm: Warning: assuming: 4
> librdmacm: Fatal: unable to open /dev/infiniband/rdma_cm
> librdmacm: Fatal: unable to open /dev/infiniband/rdma_cm
> librdmacm: Warning: couldn't read ABI version.
> librdmacm: Warning: assuming: 4
> librdmacm: Fatal: unable to open /dev/infiniband/rdma_cm
> librdmacm: Fatal: unable to open /dev/infiniband/rdma_cm
> --------------------------------------------------------------------------
> WARNING: There are more than one active ports on host 'shadow-login', but the
> default subnet GID prefix was detected on more than one of these
> ports.  If these ports are connected to different physical IB
> networks, this configuration will fail in Open MPI.  This version of
> Open MPI requires that every physically separate IB subnet that is
> used between connected MPI processes must have different subnet ID
> values.
> 
> Please see this FAQ entry for more details:
> 
>   http://www.open-mpi.org/faq/?category=openfabrics#ofa-default-subnet-gid
> 
> NOTE: You can turn off this warning by setting the MCA parameter
>       btl_openib_warn_default_gid_prefix to 0.
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> No OpenFabrics connection schemes reported that they were able to be
> used on a specific port.  As such, the openib BTL (OpenFabrics
> support) will be disabled for this port.
> 
>   Local host:           shadow-login
>   Local device:         mlx4_1
>   Local port:           1
>   CPCs attempted:       udcm
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> WARNING: It appears that your OpenFabrics subsystem is configured to only
> allow registering part of your physical memory.  This can cause MPI jobs to
> run with erratic performance, hang, and/or crash.
> 
> This may be caused by your OpenFabrics vendor limiting the amount of
> physical memory that can be registered.  You should investigate the
> relevant Linux kernel module parameters that control how much physical
> memory can be registered, and increase them to allow registering all
> physical memory on your machine.
> 
> See this Open MPI FAQ item for more information on these Linux kernel module
> parameters:
> 
>     http://www.open-mpi.org/faq/?category=openfabrics#ib-locked-pages
> 
>   Local host:              shadow-login
>   Registerable memory:     24576 MiB
>   Total memory:            65457 MiB
> 
> Your MPI job will continue, but may be behave poorly and/or hang.
> --------------------------------------------------------------------------
> lid velocity = 0.0016, prandtl # = 1, grashof # = 1
> Number of SNES iterations = 2
> --------------------------------------------------------------------------
> mpirun noticed that process rank 1 with PID 196567 on node shadow-login exited on signal 11 (Segmentation fault).
> --------------------------------------------------------------------------
> [shadow-login:196564] 1 more process has sent help message help-mpi-btl-openib.txt / default subnet prefix
> [shadow-login:196564] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
> [shadow-login:196564] 1 more process has sent help message help-mpi-btl-openib-cpc-base.txt / no cpcs for port
> [shadow-login:196564] 1 more process has sent help message help-mpi-btl-openib.txt / reg mem limit low
> Possible error running Fortran example src/snes/examples/tutorials/ex5f with 1 MPI process
> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> librdmacm: Warning: couldn't read ABI version.
> librdmacm: Warning: assuming: 4
> librdmacm: Fatal: unable to open /dev/infiniband/rdma_cm
> librdmacm: Fatal: unable to open /dev/infiniband/rdma_cm
> --------------------------------------------------------------------------
> WARNING: There are more than one active ports on host 'shadow-login', but the
> default subnet GID prefix was detected on more than one of these
> ports.  If these ports are connected to different physical IB
> networks, this configuration will fail in Open MPI.  This version of
> Open MPI requires that every physically separate IB subnet that is
> used between connected MPI processes must have different subnet ID
> values.
> 
> Please see this FAQ entry for more details:
> 
>   http://www.open-mpi.org/faq/?category=openfabrics#ofa-default-subnet-gid
> 
> NOTE: You can turn off this warning by setting the MCA parameter
>       btl_openib_warn_default_gid_prefix to 0.
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> No OpenFabrics connection schemes reported that they were able to be
> used on a specific port.  As such, the openib BTL (OpenFabrics
> support) will be disabled for this port.
> 
>   Local host:           shadow-login
>   Local device:         mlx4_1
>   Local port:           1
>   CPCs attempted:       udcm
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> WARNING: It appears that your OpenFabrics subsystem is configured to only
> allow registering part of your physical memory.  This can cause MPI jobs to
> run with erratic performance, hang, and/or crash.
> 
> This may be caused by your OpenFabrics vendor limiting the amount of
> physical memory that can be registered.  You should investigate the
> relevant Linux kernel module parameters that control how much physical
> memory can be registered, and increase them to allow registering all
> physical memory on your machine.
> 
> See this Open MPI FAQ item for more information on these Linux kernel module
> parameters:
> 
>     http://www.open-mpi.org/faq/?category=openfabrics#ib-locked-pages
> 
>   Local host:              shadow-login
>   Registerable memory:     24576 MiB
>   Total memory:            65457 MiB
> 
> Your MPI job will continue, but may be behave poorly and/or hang.
> --------------------------------------------------------------------------
> Number of SNES iterations =     4
> Completed test examples
> =========================================
> Now to evaluate the computer systems you plan use - do:
> make PETSC_DIR=/work/bhatia/codes/shadow/petsc/petsc-3.5.3 PETSC_ARCH=arch-linux2-cxx-opt streams NPMAX=<number of MPI processes you intend to use>
> shadow-login[239] bhatia$ 
> 
> 
> 
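[Editor's note: all of the warnings in the transcript above come from Open MPI probing InfiniBand hardware on a login node. A common workaround — a sketch assuming the Open MPI 1.x MCA syntax matching the versions in these logs — is to exclude the openib BTL so the run falls back to shared memory/TCP:]

```shell
# Exclude the InfiniBand BTL for every launch in this session;
# Open MPI will use the sm/tcp transports instead.
export OMPI_MCA_btl='^openib'
echo "btl setting: $OMPI_MCA_btl"

# Equivalent per-run forms (not executed here, shown for reference):
#   mpirun -np 2 --mca btl ^openib ./ex19
# To silence only the subnet-GID warning quoted in the log:
#   mpirun -np 2 --mca btl_openib_warn_default_gid_prefix 0 ./ex19
```

On login nodes this is usually harmless, since the InfiniBand fabric is typically reserved for the compute nodes anyway.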
>> On Mar 11, 2015, at 1:12 PM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>> 
>> Greetings! 
>> 
>>    I have come across the following error while running make check. I am not sure where to begin to sort this out. Any pointers would be greatly appreciated. 
>> 
>> Talon-login[114] bhatia$ make check
>> gmake[1]: Entering directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3'
>> Running test examples to verify correct installation
>> Using PETSC_DIR=/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3 and PETSC_ARCH=arch-linux2-cxx-opt
>> gmake[2]: Entering directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> gmake[2]: Leaving directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> gmake[2]: Entering directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> *******************Error detected during compile or link!*******************
>> See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> /cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials ex19
>> *********************************************************************************
>> gmake[3]: Entering directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> gmake[4]: Entering directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> gmake[4]: Leaving directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> mpicxx -o ex19.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O      -I/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/include -I/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/arch-linux2-cxx-opt/include -I/usr/local/openssl/x86_64/include -I/usr/local/mpi/x86_64/openmpi/include -I/usr/local/mpi/x86_64/openmpi-1.4.2/include    `pwd`/ex19.c
>> mpicxx -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O   -o ex19  ex19.o -L/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/arch-linux2-cxx-opt/lib  -lpetsc -Wl,-rpath,/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/arch-linux2-cxx-opt/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack -lsuperlu_4.3 -lsuperlu_dist_3.3 -Wl,-rpath,/cavs/projects/sams/codes/raptor/lapack -L/cavs/projects/sams/codes/raptor/lapack -llapack -Wl,-rpath,/cavs/projects/sams/codes/raptor/blas -L/cavs/projects/sams/codes/raptor/blas -lblas -lparmetis -lmetis -lpthread -Wl,-rpath,/usr/local/openssl/x86_64/lib -L/usr/local/openssl/x86_64/lib -lssl -lcrypto -Wl,-rpath,/usr/local/mpi/x86_64/openmpi/lib -L/usr/local/mpi/x86_64/openmpi/lib -lmpi_cxx -lmpi -L/usr/local/mpi/x86_64/openmpi-1.4.2/lib -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib/gcc/x86_64-unknown-linux-gnu/4.8.3 -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib/gcc -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib64 -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib -lmpi_f90 -lmpi_f77 -lgfortran -lm -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -L/usr/local/mpi/x86_64/openmpi-1.4.2/lib -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib/gcc/x86_64-unknown-linux-gnu/4.8.3 -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib/gcc -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib64 -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib -ldl -lmpi -lopen-rte -lopen-pal -lnsl -lutil -lgcc_s -lpthread -ldl  
>> /usr/local/gnu/gcc-4.8.3/binutils/bin/ld: warning: libgfortran.so.1, needed by /usr/local/mpi/x86_64/openmpi/lib/libmpi_f90.so, may conflict with libgfortran.so.3
>> /usr/local/gnu/gcc-4.8.3/binutils/bin/ld: warning: libgfortran.so.1, needed by /usr/local/mpi/x86_64/openmpi/lib/libmpi_f90.so, may conflict with libgfortran.so.3
>> /usr/local/gnu/gcc-4.8.3/binutils/bin/ld: warning: libgfortran.so.1, needed by /usr/local/mpi/x86_64/openmpi/lib/libmpi_f90.so, may conflict with libgfortran.so.3
>> /bin/rm -f ex19.o
>> gmake[3]: Leaving directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
>> See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> libibverbs: Warning: RLIMIT_MEMLOCK is 32768 bytes.
>>     This will severely limit memory registrations.
>> libibverbs: Warning: couldn't open config directory '/etc/libibverbs.d'.
>> libibverbs: Warning: no userspace device-specific driver found for /sys/class/infiniband_verbs/uverbs0
>> CMA: unable to open /dev/infiniband/rdma_cm
>> libibverbs: Warning: RLIMIT_MEMLOCK is 32768 bytes.
>>     This will severely limit memory registrations.
>> libibverbs: Warning: couldn't open config directory '/etc/libibverbs.d'.
>> [Talon-login:18281] *** Process received signal ***
>> [Talon-login:18281] Signal: Segmentation fault (11)
>> [Talon-login:18281] Signal code: Address not mapped (1)
>> [Talon-login:18281] Failing at address: 0x100002990
>> [Talon-login:18281] [ 0] /lib64/libpthread.so.0 [0x2b8e9cbc6c00]
>> [Talon-login:18281] [ 1] /usr/ofed/lib64/libibverbs.so.1 [0x2b8e9fcad4ce]
>> [Talon-login:18281] [ 2] /usr/ofed/lib64/libibverbs.so.1(ibv_get_device_list+0x9e) [0x2b8e9fcac49e]
>> [Talon-login:18281] [ 3] /usr/local/mpi/x86_64/openmpi-1.4.2/lib/openmpi/mca_btl_openib.so [0x2b8e9fdbe15a]
>> [Talon-login:18281] [ 4] /usr/local/mpi/x86_64/openmpi/lib/libmpi.so.0(mca_btl_base_select+0x156) [0x2b8e9d223166]
>> [Talon-login:18281] [ 5] /usr/local/mpi/x86_64/openmpi-1.4.2/lib/openmpi/mca_bml_r2.so [0x2b8e9f997821]
>> [Talon-login:18281] [ 6] /usr/local/mpi/x86_64/openmpi/lib/libmpi.so.0(mca_bml_base_init+0x72) [0x2b8e9d222972]
>> [Talon-login:18281] [ 7] /usr/local/mpi/x86_64/openmpi-1.4.2/lib/openmpi/mca_pml_ob1.so [0x2b8e9f77fc8f]
>> [Talon-login:18281] [ 8] /usr/local/mpi/x86_64/openmpi/lib/libmpi.so.0(mca_pml_base_select+0x383) [0x2b8e9d22c743]
>> [Talon-login:18281] [ 9] /usr/local/mpi/x86_64/openmpi/lib/libmpi.so.0 [0x2b8e9d1f0050]
>> [Talon-login:18281] [10] /usr/local/mpi/x86_64/openmpi/lib/libmpi.so.0(PMPI_Init_thread+0xc1) [0x2b8e9d20db41]
>> [Talon-login:18281] [11] ./ex19(PetscInitialize+0x1bd) [0x4990fe]
>> [Talon-login:18281] [12] ./ex19(main+0x2a) [0x48471e]
>> [Talon-login:18281] [13] /lib64/libc.so.6(__libc_start_main+0xf4) [0x2b8e9e331164]
>> [Talon-login:18281] [14] ./ex19 [0x482639]
>> [Talon-login:18281] *** End of error message ***
>> --------------------------------------------------------------------------
>> mpirun noticed that process rank 0 with PID 18281 on node Talon-login.HPC.MsState.Edu exited on signal 11 (Segmentation fault).
>> --------------------------------------------------------------------------
>> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
>> See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> libibverbs: Warning: RLIMIT_MEMLOCK is 32768 bytes.
>>     This will severely limit memory registrations.
>> libibverbs: Warning: couldn't open config directory '/etc/libibverbs.d'.
>> libibverbs: Warning: no userspace device-specific driver found for /sys/class/infiniband_verbs/uverbs0
>> CMA: unable to open /dev/infiniband/rdma_cm
>> libibverbs: Warning: RLIMIT_MEMLOCK is 32768 bytes.
>>     This will severely limit memory registrations.
>> libibverbs: Warning: couldn't open config directory '/etc/libibverbs.d'.
>> [Talon-login:18286] *** Process received signal ***
>> [Talon-login:18286] Signal: Segmentation fault (11)
>> [Talon-login:18286] Signal code: Address not mapped (1)
>> [Talon-login:18286] Failing at address: 0x100002990
>> [Talon-login:18286] [ 0] /lib64/libpthread.so.0 [0x2acc0604ec00]
>> [Talon-login:18286] [ 1] /usr/ofed/lib64/libibverbs.so.1 [0x2acc091354ce]
>> [Talon-login:18286] [ 2] /usr/ofed/lib64/libibverbs.so.1(ibv_get_device_list+0x9e) [0x2acc0913449e]
>> [Talon-login:18286] [ 3] /usr/local/mpi/x86_64/openmpi-1.4.2/lib/openmpi/mca_btl_openib.so [0x2acc0924615a]
>> [Talon-login:18286] [ 4] /usr/local/mpi/x86_64/openmpi/lib/libmpi.so.0(mca_btl_base_select+0x156) [0x2acc066ab166]
>> [Talon-login:18286] [ 5] /usr/local/mpi/x86_64/openmpi-1.4.2/lib/openmpi/mca_bml_r2.so [0x2acc08e1f821]
>> [Talon-login:18286] [ 6] /usr/local/mpi/x86_64/openmpi/lib/libmpi.so.0(mca_bml_base_init+0x72) [0x2acc066aa972]
>> [Talon-login:18286] [ 7] /usr/local/mpi/x86_64/openmpi-1.4.2/lib/openmpi/mca_pml_ob1.so [0x2acc08c07c8f]
>> [Talon-login:18286] [ 8] /usr/local/mpi/x86_64/openmpi/lib/libmpi.so.0(mca_pml_base_select+0x383) [0x2acc066b4743]
>> [Talon-login:18286] [ 9] /usr/local/mpi/x86_64/openmpi/lib/libmpi.so.0 [0x2acc06678050]
>> [Talon-login:18286] [10] /usr/local/mpi/x86_64/openmpi/lib/libmpi.so.0(PMPI_Init_thread+0xc1) [0x2acc06695b41]
>> [Talon-login:18286] [11] ./ex19(PetscInitialize+0x1bd) [0x4990fe]
>> [Talon-login:18286] [12] ./ex19(main+0x2a) [0x48471e]
>> [Talon-login:18286] [13] /lib64/libc.so.6(__libc_start_main+0xf4) [0x2acc077b9164]
>> [Talon-login:18286] [14] ./ex19 [0x482639]
>> [Talon-login:18286] *** End of error message ***
>> --------------------------------------------------------------------------
>> mpirun noticed that process rank 0 with PID 18286 on node Talon-login.HPC.MsState.Edu exited on signal 11 (Segmentation fault).
>> --------------------------------------------------------------------------
>> gmake[3]: Entering directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> gmake[3]: Leaving directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> gmake[2]: Leaving directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> gmake[2]: Entering directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> *******************Error detected during compile or link!*******************
>> See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> /cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials ex5f
>> *********************************************************
>> gmake[3]: Entering directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> mpif90 -c  -Wall -Wno-unused-variable -ffree-line-length-0 -Wno-unused-dummy-argument -O   -I/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/include -I/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/arch-linux2-cxx-opt/include -I/usr/local/openssl/x86_64/include -I/usr/local/mpi/x86_64/openmpi/include -I/usr/local/mpi/x86_64/openmpi-1.4.2/include    -o ex5f.o ex5f.F
>> mpif90 -Wall -Wno-unused-variable -ffree-line-length-0 -Wno-unused-dummy-argument -O   -o ex5f ex5f.o -L/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/arch-linux2-cxx-opt/lib  -lpetsc -Wl,-rpath,/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/arch-linux2-cxx-opt/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack -lsuperlu_4.3 -lsuperlu_dist_3.3 -Wl,-rpath,/cavs/projects/sams/codes/raptor/lapack -L/cavs/projects/sams/codes/raptor/lapack -llapack -Wl,-rpath,/cavs/projects/sams/codes/raptor/blas -L/cavs/projects/sams/codes/raptor/blas -lblas -lparmetis -lmetis -lpthread -Wl,-rpath,/usr/local/openssl/x86_64/lib -L/usr/local/openssl/x86_64/lib -lssl -lcrypto -Wl,-rpath,/usr/local/mpi/x86_64/openmpi/lib -L/usr/local/mpi/x86_64/openmpi/lib -lmpi_cxx -lmpi -L/usr/local/mpi/x86_64/openmpi-1.4.2/lib -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib/gcc/x86_64-unknown-linux-gnu/4.8.3 -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib/gcc -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib64 -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib -lmpi_f90 -lmpi_f77 -lgfortran -lm -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -L/usr/local/mpi/x86_64/openmpi-1.4.2/lib -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib/gcc/x86_64-unknown-linux-gnu/4.8.3 -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib/gcc -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib64 -L/usr/local/server/x86_64-SuSE10/gnu/gcc-4.8.3/lib -ldl -lmpi -lopen-rte -lopen-pal -lnsl -lutil -lgcc_s -lpthread -ldl  
>> /usr/local/gnu/gcc-4.8.3/binutils/bin/ld: warning: libgfortran.so.1, needed by /usr/local/mpi/x86_64/openmpi/lib/libmpi_f90.so, may conflict with libgfortran.so.3
>> /usr/local/gnu/gcc-4.8.3/binutils/bin/ld: warning: libgfortran.so.1, needed by /usr/local/mpi/x86_64/openmpi/lib/libmpi_f90.so, may conflict with libgfortran.so.3
>> /usr/local/gnu/gcc-4.8.3/binutils/bin/ld: warning: libgfortran.so.1, needed by /usr/local/mpi/x86_64/openmpi/lib/libmpi_f90.so, may conflict with libgfortran.so.3
>> /usr/local/gnu/gcc-4.8.3/binutils/bin/ld: warning: libgfortran.so.1, needed by /usr/local/mpi/x86_64/openmpi/lib/libmpi_f90.so, may conflict with libgfortran.so.3
>> /usr/local/gnu/gcc-4.8.3/binutils/bin/ld: warning: libgfortran.so.1, needed by /usr/local/mpi/x86_64/openmpi/lib/libmpi_f90.so, may conflict with libgfortran.so.3
>> /usr/local/gnu/gcc-4.8.3/binutils/bin/ld: warning: libgfortran.so.1, needed by /usr/local/mpi/x86_64/openmpi/lib/libmpi_f90.so, may conflict with libgfortran.so.3
>> /bin/rm -f ex5f.o
>> gmake[3]: Leaving directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> Possible error running Fortran example src/snes/examples/tutorials/ex5f with 1 MPI process
>> See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> libibverbs: Warning: RLIMIT_MEMLOCK is 32768 bytes.
>>     This will severely limit memory registrations.
>> libibverbs: Warning: couldn't open config directory '/etc/libibverbs.d'.
>> libibverbs: Warning: no userspace device-specific driver found for /sys/class/infiniband_verbs/uverbs0
>> CMA: unable to open /dev/infiniband/rdma_cm
>> libibverbs: Warning: RLIMIT_MEMLOCK is 32768 bytes.
>>     This will severely limit memory registrations.
>> libibverbs: Warning: couldn't open config directory '/etc/libibverbs.d'.
>> 
>> Program received signal SIGSEGV: Segmentation fault - invalid memory reference.
>> 
>> Backtrace for this error:
>> #0  0x2B6CD4A6F4D7
>> #1  0x2B6CD4A6FADE
>> #2  0x2B6CD5805E1F
>> #3  0x2B6CD716F4CE
>> #4  0x2B6CD716E49D
>> #5  0x2B6CD7280159
>> #6  0x2B6CD46E5165
>> #7  0x2B6CD6E59820
>> #8  0x2B6CD46E4971
>> #9  0x2B6CD6C41C8E
>> #10  0x2B6CD46EE742
>> #11  0x2B6CD46B204F
>> #12  0x2B6CD46CF99F
>> #13  0x2B6CD494AA84
>> #14  0x489A6D in petscinitialize_
>> #15  0x487CEC in MAIN__ at ex5f.F:?
>> --------------------------------------------------------------------------
>> mpirun noticed that process rank 0 with PID 18354 on node Talon-login.HPC.MsState.Edu exited on signal 11 (Segmentation fault).
>> --------------------------------------------------------------------------
>> gmake[3]: Entering directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> gmake[3]: Leaving directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> gmake[2]: Leaving directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> gmake[2]: Entering directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> gmake[2]: Leaving directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3/src/snes/examples/tutorials'
>> Completed test examples
>> gmake[1]: Leaving directory `/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3'
>> =========================================
>> Now to evaluate the computer systems you plan use - do:
>> make PETSC_DIR=/cavs/projects/sams/codes/raptor/petsc/petsc-3.5.3 PETSC_ARCH=arch-linux2-cxx-opt streams NPMAX=<number of MPI processes you intend to use>
>> Talon-login[115] bhatia$ 
>> 
>> 
>> Thanks,
>> Manav
>> 
> 
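[Editor's note: the `RLIMIT_MEMLOCK is 32768 bytes` warning in the Talon-login trace points at the locked-memory limit libibverbs needs for memory registration. Checking it is sketched below; the limits.conf lines are an assumption about a typical Linux/pam_limits setup and require admin rights to apply.]

```shell
# Show the current locked-memory limit for this shell
# (the log above reports 32768 bytes, i.e. only 32 KiB)
ulimit -l

# A cluster administrator would typically raise it in
# /etc/security/limits.conf, e.g.:
#   * soft memlock unlimited
#   * hard memlock unlimited
# and then have users log in again so pam_limits picks it up.
```

Without admin rights, excluding the openib BTL (as the Open MPI FAQ linked in the log suggests) sidesteps the registration limit entirely on the login node.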


