[petsc-users] Error with PETSc on K computer

TAY wee-beng zonexo at gmail.com
Wed Jun 1 02:17:10 CDT 2016


Hi Satish,

Only partially working:

[t00196@b04-036 tutorials]$ mpiexec -n 2 ./ex4f90
jwe1050i-w The hardware barrier couldn't be used and continues 
processing using the software barrier.
taken to (standard) corrective action, execution continuing.
jwe1050i-w The hardware barrier couldn't be used and continues 
processing using the software barrier.
taken to (standard) corrective action, execution continuing.
Vec Object:Vec Object:initial vector:initial vector: 1 MPI processes
   type: seq
10
20
30
40
50
60
  1 MPI processes
   type: seq
10
20
30
40
50
60
[1]PETSC ERROR: 
------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
probably memory access out of range
[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[1]PETSC ERROR: or see 
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS 
X to find memory corruption errors
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: ---------------------  Stack Frames 
------------------------------------
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[1]PETSC ERROR:       INSTEAD the line number of the start of the function
[1]PETSC ERROR:       is given.
[1]PETSC ERROR: [1] F90Array1dCreate line 50 
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[1]PETSC ERROR: --------------------- Error Message 
------------------------------------------[0]PETSC ERROR: 
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see 
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS 
X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames 
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] F90Array1dCreate line 50 
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[0]PETSC ERROR: --------------------- Error Message 
--------------------------------------------------------------
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[1]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown 
Wed Jun  1 13:23:41 2016
[1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC 
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" 
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" 
--LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec 
--known-endian=big --with-shared-libraries=0 
--with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK 
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug 
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 
--with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[1]PETSC ERROR: #1 User provided function() line 0 in  unknown file
--------------------------------------------------------------------------
[mpi::mpi-api::mpi-abort]
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[b04-036:28998] 
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84) 
[0xffffffff11360404]
[b04-036:28998] 
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c) 
[0xffffffff1110391c]
[b04-036:28998] 
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(MPI_Abort+0x6c) 
[0xffffffff1111b5ec]
[b04-036:28998] 
/opt/FJSVtclang/GM-1.2.0-20/lib64/libtrtmet_c.so.1(MPI_Abort+0x2c) 
[0xffffffff00281bf0]
[b04-036:28998] ./ex4f90 [0x292548]
[b04-036:28998] ./ex4f90 [0x29165c]
[b04-036:28998] 
/opt/FJSVxosmmm/lib64/libmpgpthread.so.1(_IO_funlockfile+0x5c) 
[0xffffffff121e1974]
[b04-036:28998] ./ex4f90 [0x9f6748]
[b04-036:28998] ./ex4f90 [0x9f0ea4]
[b04-036:28998] ./ex4f90 [0x2c76a0]
[b04-036:28998] ./ex4f90(MAIN__+0x38c) [0x10688c]
[b04-036:28998] ./ex4f90(main+0xec) [0x268e91c]
[b04-036:28998] /lib64/libc.so.6(__libc_start_main+0x194) 
[0xffffffff138cb81c]
[b04-036:28998] ./ex4f90 [0x1063ac]
[1]PETSC ERROR: 
------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the 
batch system) has told this process to end
[1]PETSC ERROR: Tr--------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[0]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown 
Wed Jun  1 13:23:41 2016
[0]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC 
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" 
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" 
--LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec 
--known-endian=big --with-shared-libraries=0 
--with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK 
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug 
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 
--with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
--------------------------------------------------------------------------
[mpi::mpi-api::mpi-abort]
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[b04-036:28997] 
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84) 
[0xffffffff11360404]
[b04-036:28997] 
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c) 
[0xffffffff1110391c]
[b04-036:28997] 
/opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(MPI_Abort+0x6c) 
[0xffffffff1111b5ec]
[b04-036:28997] 
/opt/FJSVtclang/GM-1.2.0-20/lib64/libtrtmet_c.so.1(MPI_Abort+0x2c) 
[0xffffffff00281bf0]
[b04-036:28997] ./ex4f90 [0x292548]
[b04-036:28997] ./ex4f90 [0x29165c]
[b04-036:28997] 
/opt/FJSVxosmmm/lib64/libmpgpthread.so.1(_IO_funlockfile+0x5c) 
[0xffffffff121e1974]
[b04-036:28997] ./ex4f90 [0x9f6748]
[b04-036:28997] ./ex4f90 [0x9f0ea4]
[b04-036:28997] ./ex4f90 [0x2c76a0]
[b04-036:28997] ./ex4f90(MAIN__+0x38c) [0x10688c]
[b04-036:28997] ./ex4f90(main+0xec) [0x268e91c]
[b04-036:28997] /lib64/libc.so.6(__libc_start_main+0x194) 
[0xffffffff138cb81c]
[b04-036:28997] ./ex4f90 [0x1063ac]
[0]PETSC ERROR: 
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the 
batch system) has told this process to end
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see 
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS 
X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames 
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] F90Array1dCreate line 50 
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[0]PETSC ERROR: --------------------- Error Message 
--------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[0]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown 
Wed Jun  1 13:23:41 2016
[0]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC 
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" 
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" 
--LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec 
--known-endian=big --with-shared-libraries=0 
--with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK 
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug 
--with-fortran-interfaces=1 --with-debuy option -start_in_debugger or 
-on_error_attach_debugger
[1]PETSC ERROR: or see 
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS 
X to find memory corruption errors
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: ---------------------  Stack Frames 
------------------------------------
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[1]PETSC ERROR:       INSTEAD the line number of the start of the function
[1]PETSC ERROR:       is given.
[1]PETSC ERROR: [1] F90Array1dCreate line 50 
/.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
[1]PETSC ERROR: --------------------- Error Message 
--------------------------------------------------------------
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
[1]PETSC ERROR: ./ex4f90 on a petsc-3.6.3_debug named b04-036 by Unknown 
Wed Jun  1 13:23:41 2016
[1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC 
--with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" 
--CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" 
--LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec 
--known-endian=big --with-shared-libraries=0 
--with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK 
--prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug 
--with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 
--with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[1]PETSC ERROR: #2 User provided function() line 0 in  unknown file
gging=1 --useThreads=0 --with-hypre=1 
--with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
[0]PETSC ERROR: #2 User provided function() line 0 in  unknown file
[ERR.] PLE 0019 plexec One of MPI processes was 
aborted.(rank=0)(nid=0x04180034)(CODE=1938,793745140674134016,15104)
[t00196@b04-036 tutorials]$
[ERR.] PLE 0021 plexec The interactive job has aborted with the 
signal.(sig=24)
[INFO] PJM 0083 pjsub Interactive job 5211401 completed.
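
The crash point in this run is F90Array1dCreate in f90_cwrap.c, the C helper
that builds the Fortran (:) pointer descriptor, so the vector itself prints
fine and the failure happens at the array hand-off. For reference, a minimal
sketch of the pattern ex4f90 exercises, reconstructed from memory for the
3.6-era Fortran interface (include paths and details may differ slightly
from the actual example source):

! Minimal sketch, not the exact ex4f90 source: every VecGetArrayF90
! call goes through F90Array1dCreate in f90_cwrap.c, which is where
! the SEGV above is reported.
program vec_sketch
  implicit none
#include <petsc/finclude/petscsys.h>
#include <petsc/finclude/petscvec.h>
#include <petsc/finclude/petscvec.h90>

  PetscErrorCode       ierr
  PetscInt             i, n
  Vec                  x
  PetscScalar, pointer :: xx(:)

  call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
  n = 6
  call VecCreateSeq(PETSC_COMM_SELF, n, x, ierr)

  ! Hand the raw Vec storage to Fortran as a 1-D pointer
  call VecGetArrayF90(x, xx, ierr)
  do i = 1, n
     xx(i) = 10.0*i        ! matches the 10,20,...,60 printed above
  end do
  call VecRestoreArrayF90(x, xx, ierr)

  call VecView(x, PETSC_VIEWER_STDOUT_SELF, ierr)
  call VecDestroy(x, ierr)
  call PetscFinalize(ierr)
end program vec_sketch

If even this 1-D case dies inside the descriptor constructor, the 3-D DMDA
variant (F90Array3dCreate in my code's trace below) cannot work either,
which suggests a mismatch between the Fujitsu compiler's array-descriptor
layout and the F90 interface PETSc's configure detected, rather than a bug
in the CFD code itself.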

Thank you

Yours sincerely,

TAY wee-beng

On 1/6/2016 12:21 PM, Satish Balay wrote:
> Do PETSc examples using VecGetArrayF90() work?
>
> say src/vec/vec/examples/tutorials/ex4f90.F
>
> Satish
>
> On Tue, 31 May 2016, TAY wee-beng wrote:
>
>> Hi,
>>
>> I'm trying to run my MPI CFD code on Japan's K computer. My code runs if I
>> don't use the PETSc DMDAVecGetArrayF90 subroutine. If it is called, as in
>>
>> call DMDAVecGetArrayF90(da_u,u_local,u_array,ierr)
>>
>> I get the error below. My code runs fine on other clusters with
>> the new Intel compilers. I used to have problems with DM when using the old
>> Intel compilers. On the K computer, I'm using Fujitsu's Fortran compiler.
>> How can I troubleshoot this?
>>
>> Btw, I also tested the ex13f90 example and it didn't work either. The error is
>> below.
>>
>>
>> My code error:
>>
>> size_x,size_y,size_z 76x130x136
>> total grid size =  1343680
>> recommended cores (50k / core) =  26.87360000000000
>> 0
>> 1
>> 1
>> [3]PETSC ERROR: [1]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>> probably memory access out of range
>> [1]PETSC ERROR: Try option -start_in_debugger or
>> -on_error_attach_debugger
>> [1]PETSC ERROR: or see
>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
>> to find memory corruption errors
>> [1]PETSC ERROR: likely location of problem given in stack below
>> [1]PETSC ERROR: ---------------------  Stack Frames
>> ------------------------------------
>> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>> available,
>> [1]PETSC ERROR:       INSTEAD the line number of the start of the
>> function
>> [1]PETSC ERROR:       is given.
>> [1]PETSC ERROR: [1] F90Array3dCreate line 244
>> /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
>> 1
>> ------------------------------------------------------------------------
>> [3]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>> probably memory access out of range
>> [3]PETSC ERROR: Try option -start_in_debugger or
>> -on_error_attach_debugger
>> [3]PETSC ERROR: or see
>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>> [3]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
>> to find memory corruption errors
>> [3]PETSC ERROR: likely location of problem given in stack below
>> [3]PETSC ERROR: ---------------------  Stack Frames
>> ------------------------------------
>> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>> probably memory access out of range
>> [0]PETSC ERROR: Try option -start_in_debugger or
>> -on_error_attach_debugger
>> [0]PETSC ERROR: or see
>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
>> to find memory corruption errors
>> [0]PETSC ERROR: likely location of problem given in stack below
>> [0]PETSC ERROR: ---------------------  Stack Frames
>> ------------------------------------
>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>> available,
>> [0]PETSC ERROR:       INSTEAD the line number of the start of the
>> function
>> [0]PETSC ERROR:       is given.
>> [0]PETSC ERROR: [0] F90Array3dCreate line 244
>> /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
>> [0]PETSC ERROR: --------------------- Error Message
>> ----------------------------------------- 1
>> [2]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [2]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>> probably memory access out of range
>> [2]PETSC ERROR: Try option -start_in_debugger or
>> -on_error_attach_debugger
>> [2]PETSC ERROR: or see
>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>> [2]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
>> to find memory corruption errors
>> [2]PETSC ERROR: likely location of problem given in stack below
>> [2]PETSC ERROR: ---------------------  Stack Frames
>> ------------------------------------
>> [2]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>> available,
>> [2]PETSC ERROR:       INSTEAD the line number of the start of the
>> function
>> [2]PETSC ERROR:       is given.
>> [2]PETSC ERROR: [2] F90Array3dCreate line 244
>> /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
>> [2]PETSC ERROR: --------------------- Error Message
>> -----------------------------------------[3]PETSC ERROR: Note: The EXACT line
>> numbers in the stack are not available,
>> [3]PETSC ERROR:       INSTEAD the line number of the start of the
>> function
>> [3]PETSC ERROR:       is given.
>> [3]PETSC ERROR: [3] F90Array3dCreate line 244
>> /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
>> [3]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>> [3]PETSC ERROR: Signal received
>> [3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>> [3]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
>> [3]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by
>> Unknown Wed Jun  1 12:54:34 2016
>> [3]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
>> --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
>> --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
>> --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
>> --with-shared----------------------
>> [0]PETSC ERROR: Signal received
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
>> [0]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by
>> Unknown Wed Jun  1 12:54:34 2016
>> [0]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
>> --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
>> --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
>> --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
>> --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
>> --with-scalapack-lib=-SCALAPACK
>> --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
>> --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
>> --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
>> [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
>> --------------------------------------------------------------------------
>> [m---------------------
>> [2]PETSC ERROR: Signal received
>> [2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>> [2]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
>> [2]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by
>> Unknown Wed Jun  1 12:54:34 2016
>> [2]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
>> --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
>> --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
>> --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
>> --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
>> --with-scalapack-lib=-SCALAPACK
>> --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
>> --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
>> --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
>> [2]PETSC ERROR: #1 User provided function() line 0 in  unknown file
>> --------------------------------------------------------------------------
>> [m[1]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>> [1]PETSC ERROR: Signal received
>> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>> [1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
>> [1]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by
>> Unknown Wed Jun  1 12:54:34 2016
>> [1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
>> --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
>> --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
>> --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
>> --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
>> --with-scalapack-lib=-SCALAPACK
>> --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
>> --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
>> --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
>> [1]PETSC ERROR: #1 User provided function() line 0 ilibraries=0
>> --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK
>> --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
>> --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
>> --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
>> [3]PETSC ERROR: #1 User provided function() line 0 in  unknown file
>> --------------------------------------------------------------------------
>> [mpi::mpi-api::mpi-abort]
>> MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
>> with errorcode 59.
>>
>> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>> You may or may not see output from other processes, depending on
>> exactly when Open MPI kills them.
>> --------------------------------------------------------------------------
>> [b04-036:28416]
>> /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84)
>> [0xffffffff11360404]
>> [b04-036:28416]
>> /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c)
>> [0xffffffff1110391c]
>> [b04-036:28416] /opt/FJSVtclang/GM-1.2.0-2pi::mpi-api::mpi-abort]
>> MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
>> with errorcode 59.
>>
>> ex13f90 error:
>>
>>
>> [t00196@b04-036 tutorials]$ mpiexec -np 2 ./ex13f90
>> jwe1050i-w The hardware barrier couldn't be used and continues processing
>> using the software barrier.
>> taken to (standard) corrective action, execution continuing.
>> jwe1050i-w The hardware barrier couldn't be used and continues processing
>> using the software barrier.
>> taken to (standard) corrective action, execution continuing.
>> Hi! We're solving van der Pol using  2 processes.
>>
>>   t     x1         x2
>> [1]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: Caught signal number 10 BUS: Bus Error, possibly illegal
>> memory access
>> [1]PETSC ERROR: Try option -start_in_debugger or
>> -on_error_attach_debugger
>> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [0]PETSC ERROR: Caught signal number 10 BUS: Bus Error, possibly illegal
>> memory access
>> [0]PETSC ERROR: Try option -start_in_debugger or
>> -on_error_attach_debugger
>> [0]PETSC ERROR: or see
>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
>> to find memory corruption errors
>> [1]PETSC ERROR: or see
>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
>> to find memory corruption errors
>> [1]PETSC ERROR: likely location of problem given in stack below
>> [1]PETSC ERROR: ---------------------  Stack Frames
>> ------------------------------------
>> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>> available,
>> [1]PETSC ERROR:       INSTEAD the line number of the start of the
>> function
>> [1]PETSC ERROR:       is given.
>> [1]PETSC ERROR: [1] F90Array4dCreate line 337
>> /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
>> [0]PETSC ERROR: likely location of problem given in stack below
>> [0]PETSC ERROR: ---------------------  Stack Frames
>> ------------------------------------
>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>> available,
>> [0]PETSC ERROR:       INSTEAD the line number of the start of the
>> function
>> [0]PETSC ERROR:       is given.
>> [0]PETSC ERROR: [0] F90Array4dCreate line 337
>> /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
>> [1]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>> [1]PETSC ERROR: Signal received
>> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>> [1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
>> [1]PETSC ERROR: ./ex13f90 on a petsc-3.6.3_debug named b04-036 by Unknown
>> Wed Jun  1 13:04:34 2016
>> [1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC
>> --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0"
>> --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED=
>> --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big
>> --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2
>> --with-scalapack-lib=-SCALAPACK
>> --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug
>> --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1
>> --with-hyp
>>
>>
>>
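
For context on the original report: the quoted DMDAVecGetArrayF90 call hands
a 3-D array pointer to Fortran, so it goes through F90Array3dCreate rather
than F90Array1dCreate. A minimal sketch of that pattern follows; da_u,
u_local and u_array are the names from the message above, while the grid
sizes, stencil settings and include paths are illustrative assumptions for
PETSc 3.6, not the poster's actual CFD code:

! Minimal sketch (assumed setup, not the poster's code): local-vector
! access on a 3-D DMDA. DMDAVecGetArrayF90 builds the (:,:,:)
! descriptor in F90Array3dCreate, the function named in the traces.
program dmda_sketch
  implicit none
#include <petsc/finclude/petscsys.h>
#include <petsc/finclude/petscvec.h>
#include <petsc/finclude/petscdm.h>
#include <petsc/finclude/petscdmda.h>
#include <petsc/finclude/petscdmda.h90>

  PetscErrorCode       ierr
  DM                   da_u
  Vec                  u_local
  PetscScalar, pointer :: u_array(:,:,:)

  call PetscInitialize(PETSC_NULL_CHARACTER, ierr)

  ! 8x8x8 grid, 1 dof, stencil width 1 (illustrative values only)
  call DMDACreate3d(PETSC_COMM_WORLD,                                  &
       DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,           &
       DMDA_STENCIL_STAR, 8, 8, 8,                                     &
       PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 1, 1,                 &
       PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER,     &
       da_u, ierr)
  call DMCreateLocalVector(da_u, u_local, ierr)

  ! The 3-D pointer hand-off; this is the call that the traces show
  ! dying inside F90Array3dCreate on the K computer.
  call DMDAVecGetArrayF90(da_u, u_local, u_array, ierr)
  u_array = 0.0
  call DMDAVecRestoreArrayF90(da_u, u_local, u_array, ierr)

  call VecDestroy(u_local, ierr)
  call DMDestroy(da_u, ierr)
  call PetscFinalize(ierr)
end program dmda_sketch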
