[petsc-users] Newbie question: Strange failure when calling PetscIntView from slepc application

Stefano Zampini stefano.zampini at gmail.com
Fri Apr 9 02:46:07 CDT 2021


As the error message says, use valgrind (https://www.valgrind.org/) to catch this kind of issue.
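A likely suspect, though this is an assumption on my part rather than something the log confirms: in the PETSc Fortran bindings every call takes a trailing PetscErrorCode argument, and the PetscIntView call in the code below omits it (which also leaves ierr uninitialized when CHKERRA reads it). The corrected call would be:

      ! the Fortran binding needs ierr as the final argument;
      ! omitting it makes PETSc read past the supplied arguments
      call PetscIntView(ISIZE, JALOC, PETSC_VIEWER_STDOUT_WORLD, ierr)
      CHKERRA(ierr)

Running the failing executable under valgrind (e.g. mpiexec.hydra -n 1 valgrind ./trashy.exe) should point at the same spot.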

> On Apr 9, 2021, at 10:43 AM, dazza simplythebest <sayosale at hotmail.com> wrote:
> 
> Dear All,
>               I am getting a puzzling 'Segmentation Violation' error when I try to
> write out an integer array using PetscIntView from a Fortran code. I have written the
> small code below, which reproduces the problem. All it does is create
> a PetscInt array, initialise it, and then try to write it out to the screen.
> Interestingly, PetscIntView does seem to write all the values correctly
> (they agree with a direct write), but it then fails before it can return to
> the main program (see the output pasted in below).
> 
> I think I must be doing something quite silly, but just 
> can't quite see what it is! Any suggestions will be very welcome.
>   Many thanks,
>                    Dan
>   
> Code:
> 
>       MODULE ALL_STAB_ROUTINES
>       IMPLICIT NONE
>       CONTAINS
> 
>       SUBROUTINE  WRITE_ROWS_TO_PETSC_MATRIX( ISIZE, JALOC)
> #include <slepc/finclude/slepceps.h>
>       use slepceps
>       IMPLICIT NONE
>       PetscInt, INTENT (IN) ::  ISIZE
>       PetscInt, INTENT(INOUT), DIMENSION(0:ISIZE-1)  :: JALOC
>       PetscErrorCode   :: ierr
> 
>       write(*,*)'check 02: ',shape(jaloc),lbound(jaloc),ubound(jaloc)
>       write(*,*)jaloc
> 
>       write(*,*)'now for PetscIntView ...'
>       call PetscIntView(ISIZE,JALOC, PETSC_VIEWER_STDOUT_WORLD)
>       CHKERRA(ierr)
> 
>       END SUBROUTINE WRITE_ROWS_TO_PETSC_MATRIX
> 
>       END MODULE ALL_STAB_ROUTINES    
>      
>       program  stabbo
>       USE  MPI
> #include <slepc/finclude/slepceps.h>
>       use slepceps
>       USE ALL_STAB_ROUTINES
>       IMPLICIT NONE    
>       PetscInt, ALLOCATABLE, DIMENSION(:) :: JALOC
>       PetscInt, PARAMETER ::  ISIZE = 10
>       PetscInt, parameter ::  FOUR=4
>       PetscErrorCode   :: ierr_pets
>       call SlepcInitialize(PETSC_NULL_CHARACTER,ierr_pets)
>                
>       ALLOCATE(JALOC(0:ISIZE-1))
>       JALOC = FOUR
>       write(*,*)'check 01: ',shape(jaloc),lbound(jaloc),ubound(jaloc)
>       CALL WRITE_ROWS_TO_PETSC_MATRIX(ISIZE, JALOC)
>       CALL SlepcFinalize(ierr_pets)
>       END PROGRAM STABBO   
> 
> Output:
> 
> dan at super01 /data/work/rotplane/omega_to_zero/stability/test/tmp10/tmp3 $ mpiexec.hydra -n 1 ./trashy.exe
>  check 01:           10           0           9
>  check 02:           10           0           9
>                      4                     4                     4
>                      4                     4                     4
>                      4                     4                     4
>                      4
>  now for PetscIntView ...
> 0: 4 4 4 4 4 4 4 4 4 4
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.14.5, Mar 03, 2021
> [0]PETSC ERROR: ./trashy.exe on a  named super01 by darren Fri Apr  9 16:28:25 2021
> [0]PETSC ERROR: Configure options --package-prefix-hash=/home/darren/petsc-hash-pkgs --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-mpiexec=mpiexec.hydra COPTFLAGS="-g -O" FOPTFLAGS="-g -O" CXXOPTFLAGS="-g -O" --with-64-bit-indices=1 --with-scalar-type=complex --with-precision=double --with-debugging=1 --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2020.0.166/linux/mkl --with-mkl_pardiso-dir=/opt/intel/compilers_and_libraries_2020.0.166/linux/mkl --with-mkl_cpardiso-dir=/opt/intel/compilers_and_libraries_2020.0.166/linux/mkl --download-mumps --download-scalapack --download-cmake PETSC_ARCH=arch-ci-linux-intel-mkl-cmplx-ilp64-dbg-ftn-with-external
> [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> [0]PETSC ERROR: Checking the memory for corruption.
> Abort(50176059) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 50176059) - process 0


