[petsc-users] Error with PETSc on K computer

Satish Balay balay at mcs.anl.gov
Tue May 31 23:21:50 CDT 2016


Do the PETSc examples that use VecGetArrayF90() work?

For example, src/vec/vec/examples/tutorials/ex4f90.F
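
If building that example on the K computer is inconvenient, a minimal standalone check of the same interface is sketched below. This is only a sketch: the program name is arbitrary, the include paths assume the petsc/finclude layout of the 3.6 series, and it should be saved with a .F extension and compiled with the same mpifrt wrapper and flags used for the PETSc build. If this small test also crashes inside the F90Array*Create wrappers, that would point at the Fujitsu compiler's handling of F90 array descriptors rather than at your CFD code.

      program vecf90test
#include <petsc/finclude/petscsys.h>
#include <petsc/finclude/petscvec.h>
#include <petsc/finclude/petscvec.h90>
      PetscErrorCode       ierr
      Vec                  x
      PetscInt             n
      PetscScalar          one
      PetscScalar, pointer :: xx(:)

      call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
      n   = 8
      one = 1.0
      call VecCreateSeq(PETSC_COMM_SELF,n,x,ierr)
      call VecSet(x,one,ierr)
!     The call under test: get an F90 pointer to the vector's data
      call VecGetArrayF90(x,xx,ierr)
      print*, 'first entry = ', xx(1)
      call VecRestoreArrayF90(x,xx,ierr)
      call VecDestroy(x,ierr)
      call PetscFinalize(ierr)
      end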

Satish

On Tue, 31 May 2016, TAY wee-beng wrote:

> Hi,
> 
> I'm trying to run my MPI CFD code on Japan's K computer. The code runs if I
> don't make use of the PETSc DMDAVecGetArrayF90 subroutine. As soon as it is
> called, i.e.
> 
> call DMDAVecGetArrayF90(da_u,u_local,u_array,ierr)
> 
> I get the error below. I have no problem with my code on other clusters using
> the new Intel compilers. I used to have problems with DM when using the old
> Intel compilers. Now, on the K computer, I'm using Fujitsu's Fortran compiler.
> How can I troubleshoot this?
> 
> By the way, I also tested the ex13f90 example and it didn't work either. Its
> error is below as well.
> 
> 
> The error from my code:
> 
> size_x,size_y,size_z 76x130x136
> total grid size =  1343680
> recommended cores (50k / core) =  26.87360000000000
> 0
> 1
> 1
> [3]PETSC ERROR: [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [1]PETSC ERROR: likely location of problem given in stack below
> [1]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [1]PETSC ERROR:       INSTEAD the line number of the start of the function
> [1]PETSC ERROR:       is given.
> [1]PETSC ERROR: [1] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> 1
> ------------------------------------------------------------------------
> [3]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [3]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [3]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [3]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [3]PETSC ERROR: likely location of problem given in stack below
> [3]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> [0]PETSC ERROR: --------------------- Error Message ----------------------------------------- 1
> [2]PETSC ERROR: ------------------------------------------------------------------------
> [2]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [2]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [2]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [2]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [2]PETSC ERROR: likely location of problem given in stack below
> [2]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> [2]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [2]PETSC ERROR:       INSTEAD the line number of the start of the function
> [2]PETSC ERROR:       is given.
> [2]PETSC ERROR: [2] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> [2]PETSC ERROR: --------------------- Error Message -----------------------------------------[3]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [3]PETSC ERROR:       INSTEAD the line number of the start of the function
> [3]PETSC ERROR:       is given.
> [3]PETSC ERROR: [3] F90Array3dCreate line 244 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> [3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [3]PETSC ERROR: Signal received
> [3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [3]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> [3]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 12:54:34 2016
> [3]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared----------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> [0]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 12:54:34 2016
> [0]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> --------------------------------------------------------------------------
> [m---------------------
> [2]PETSC ERROR: Signal received
> [2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [2]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> [2]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 12:54:34 2016
> [2]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> [2]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> --------------------------------------------------------------------------
> [m[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Signal received
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> [1]PETSC ERROR: ./a-debug.out on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 12:54:34 2016
> [1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> [1]PETSC ERROR: #1 User provided function() line 0 ilibraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hypre-dir=/home/hp150306/t00196/lib/hypre-2.10.0b-p4
> [3]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> --------------------------------------------------------------------------
> [mpi::mpi-api::mpi-abort]
> MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD with errorcode 59.
> 
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> [b04-036:28416] /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(orte_errmgr_base_error_abort+0x84) [0xffffffff11360404]
> [b04-036:28416] /opt/FJSVtclang/GM-1.2.0-20/lib64/libmpi.so.0(ompi_mpi_abort+0x51c) [0xffffffff1110391c]
> [b04-036:28416] /opt/FJSVtclang/GM-1.2.0-2pi::mpi-api::mpi-abort]
> MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD with errorcode 59.
> 
> The ex13f90 error:
> 
> 
> [t00196@b04-036 tutorials]$ mpiexec -np 2 ./ex13f90
> jwe1050i-w The hardware barrier couldn't be used and continues processing using the software barrier.
> taken to (standard) corrective action, execution continuing.
> jwe1050i-w The hardware barrier couldn't be used and continues processing using the software barrier.
> taken to (standard) corrective action, execution continuing.
> Hi! We're solving van der Pol using  2 processes.
> 
>    t     x1         x2
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: Caught signal number 10 BUS: Bus Error, possibly illegal memory access
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 10 BUS: Bus Error, possibly illegal memory access
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [1]PETSC ERROR: likely location of problem given in stack below
> [1]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [1]PETSC ERROR:       INSTEAD the line number of the start of the function
> [1]PETSC ERROR:       is given.
> [1]PETSC ERROR: [1] F90Array4dCreate line 337 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] F90Array4dCreate line 337 /.global/volume2/home/hp150306/t00196/source/petsc-3.6.3/src/sys/f90-src/f90_cwrap.c
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Signal received
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> [1]PETSC ERROR: ./ex13f90 on a petsc-3.6.3_debug named b04-036 by Unknown Wed Jun  1 13:04:34 2016
> [1]PETSC ERROR: Configure options --with-cc=mpifcc --with-cxx=mpiFCC --with-fc=mpifrt --with-64-bit-pointers=1 --CC=mpifcc --CFLAGS="-Xg -O0" --CXX=mpiFCC --CXXFLAGS="-Xg -O0" --FC=mpifrt --FFLAGS="-X9 -O0" --LD_SHARED= --LDDFLAGS= --with-openmp=1 --with-mpiexec=mpiexec --known-endian=big --with-shared-libraries=0 --with-blas-lapack-lib=-SSL2 --with-scalapack-lib=-SCALAPACK --prefix=/home/hp150306/t00196/lib/petsc-3.6.3_debug --with-fortran-interfaces=1 --with-debugging=1 --useThreads=0 --with-hypre=1 --with-hyp
> 
> 
> 


