[petsc-users] vecview problem

Barry Smith bsmith at mcs.anl.gov
Mon Jul 9 09:01:13 CDT 2012


   There is no such viewer as PETSC_VIEWER_DEFAULT (you just got lucky it didn't crash in the first call). Maybe you want PETSC_VIEWER_STDOUT_WORLD?

   Barry

PETSC_VIEWER_DEFAULT is a PetscViewerFormat value; it is for setting the particular format a viewer uses, not a viewer itself.
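
For example, a minimal sketch of the corrected calls (assuming the rest of setVector() stays as in the program below, and the PETSc 3.3 Fortran interface):

    ! pass a real viewer (the stdout viewer on PETSC_COMM_WORLD), not a format constant
    call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr); CHKERRQ(ierr)

    ! PETSC_VIEWER_DEFAULT is a PetscViewerFormat; it goes to PetscViewerSetFormat, not VecView
    call PetscViewerSetFormat(PETSC_VIEWER_STDOUT_WORLD,PETSC_VIEWER_DEFAULT,ierr); CHKERRQ(ierr)
    call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr); CHKERRQ(ierr)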

On Jul 9, 2012, at 8:50 AM, Klaij, Christiaan wrote:

> I'm having a segmentation fault with vecview in fortran90
> with this program. Am I doing something wrong?
> 
> $ cat bug.F90
> module bug
> 
>  use petscksp
>  implicit none
> #include "finclude/petsckspdef.h"
> 
>  PetscErrorCode, public :: ierr
>  Vec, private :: x
> 
>  public bugSubroutine
> 
> contains
> 
>  subroutine bugSubroutine()
>    call setVector()
>    call VecView(x,PETSC_VIEWER_DEFAULT,ierr); CHKERRQ(ierr)
>  end subroutine bugSubroutine
> 
>  subroutine setVector()
>    call VecCreate(PETSC_COMM_WORLD,x,ierr); CHKERRQ(ierr)
>    call VecSetSizes(x,PETSC_DECIDE,5,ierr); CHKERRQ(ierr)
>    call VecSetType(x,VECMPI,ierr); CHKERRQ(ierr)
>    call VecView(x,PETSC_VIEWER_DEFAULT,ierr); CHKERRQ(ierr)
>  end subroutine setVector
> 
> end module bug
> 
> program testBug
> 
>  use bug
>  use petscksp
>  implicit none
> #include "finclude/petsckspdef.h"
> 
>  call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>  call bugSubroutine();
>  call PetscFinalize(ierr)
> 
> end program testBug
> 
> $ mpiexec -n 1 ./bug
> Vector Object: 1 MPI processes
>  type: mpi
> Process [0]
> 0
> 0
> 0
> 0
> 0
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] VecView line 747 /home/CKlaij/ReFRESCO/Dev/trunk/Libs/build/petsc-3.3-p1/src/vec/vec/interface/vector.c
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Signal received!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 1, Fri Jun 15 09:30:49 CDT 2012
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: ./bug on a linux_64b named lin0133 by cklaij Mon Jul  9 15:46:34 2012
> [0]PETSC ERROR: Libraries linked from /home/CKlaij/ReFRESCO/Dev/trunk/Libs/build/petsc-3.3-p1/linux_64bit_debug/lib
> [0]PETSC ERROR: Configure run at Wed Jun 20 12:08:20 2012
> [0]PETSC ERROR: Configure options --with-mpi-dir=/opt/refresco/libraries_cklaij/openmpi-1.4.5 --with-clanguage=c++ --with-x=1 --with-debugging=1 --with-hypre-include=/opt/refresco/libraries_cklaij/hypre-2.7.0b/include --with-hypre-lib=/opt/refresco/libraries_cklaij/hypre-2.7.0b/lib/libHYPRE.a --with-ml-include=/opt/refresco/libraries_cklaij/ml-6.2/include --with-ml-lib=/opt/refresco/libraries_cklaij/ml-6.2/lib/libml.a --with-blas-lapack-dir=/opt/intel/mkl
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 59.
> 
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> 
> (gdb) run
> Starting program: /home/CKlaij/Programming/PETSc/Laplace/bug
> [Thread debugging using libthread_db enabled]
> Vector Object: 1 MPI processes
>  type: mpi
> Process [0]
> 0
> 0
> 0
> 0
> 0
> 
> Program received signal SIGSEGV, Segmentation fault.
> 0x000000000043f7ee in VecView (vec=0xd8b1f0, viewer=0x69706d00000000)
>    at /home/CKlaij/ReFRESCO/Dev/trunk/Libs/build/petsc-3.3-p1/src/vec/vec/interface/vector.c:753
> 753       PetscValidHeaderSpecific(viewer,PETSC_VIEWER_CLASSID,2);
> (gdb) bt
> #0  0x000000000043f7ee in VecView (vec=0xd8b1f0, viewer=0x69706d00000000)
>    at /home/CKlaij/ReFRESCO/Dev/trunk/Libs/build/petsc-3.3-p1/src/vec/vec/interface/vector.c:753
> #1  0x00000000004319e8 in vecview_ (x=0xb56e20, vin=0x822788, ierr=0xb56e38)
>    at /home/CKlaij/ReFRESCO/Dev/trunk/Libs/build/petsc-3.3-p1/src/vec/vec/interface/ftn-custom/zvectorf.c:56
> #2  0x000000000042c936 in bug_mp_bugsubroutine_ ()
> #3  0x000000000042cafb in testbug () at bug.F90:37
> 
> 
> dr. ir. Christiaan Klaij
> CFD Researcher
> Research & Development
> E C.Klaij at marin.nl
> T +31 317 49 33 44
> 
> MARIN
> 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands
> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl
> 


