[petsc-users] Segmentation Violation in a very simple fortran code

Valerio Grazioso graziosov at me.queensu.ca
Thu Jun 17 23:19:57 CDT 2010


Hi, I'm stuck on an implementation of a 3D Poisson solver with PETSc in Fortran 90.
I was getting a strange Segmentation Violation, and after some debugging I realized that the problem was in the vector created with

DACreateGlobalVector() 
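
As far as I can tell from its man page, the Fortran calling sequence is simply

    call DACreateGlobalVector(da,g,ierr)

with the DA as input and the new global vector (plus the error code) as output, so I don't think I'm calling it incorrectly.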

So I've written a simple test code:

****************************************************************************

program PetscTest

implicit none

! PETSc 3.1 headers (these define Vec, DA, PetscScalar, etc.)
#include "finclude/petscsys.h"
#include "finclude/petscvec.h"
#include "finclude/petscda.h"

Vec            q
PetscScalar    alpha
DA             da
PetscErrorCode err
PetscInt       i3,i1

call PetscInitialize(PETSC_NULL_CHARACTER,err)

i3=4
i1=1

! 4x4x4 non-periodic grid, star stencil, 1 dof per node, stencil width 1,
! with PETSc deciding the process decomposition
call DACreate3d(PETSC_COMM_WORLD,DA_NONPERIODIC,DA_STENCIL_STAR,i3,i3,i3,PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,i1,i1, &
                PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da,err)

call DACreateGlobalVector(da,q,err)

! fill the distributed vector with 1.0 and print it
alpha=1.0
call VecSet(q,alpha,err)
call VecView(q,PETSC_VIEWER_STDOUT_WORLD)

call VecDestroy(q,err)

call PetscFinalize(PETSC_NULL_CHARACTER,err)

end program PetscTest

****************************************************************************
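
For what it's worth, I compile it as PetscTest.F90 (capital F, so the #include lines get run through the preprocessor) using the usual PETSc example makefile pattern, roughly:

    include ${PETSC_DIR}/conf/variables
    include ${PETSC_DIR}/conf/rules

    PetscTest: PetscTest.o chkopts
    	-${FLINKER} -o PetscTest PetscTest.o ${PETSC_LIB}

and then run "make PetscTest".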


And this is the output that I get when I run the executable with

mpirun -n 2 ./PetscTest -da_view

****************************************************************************

Processor [0] M 4 N 4 P 4 m 1 n 1 p 2 w 1 s 1
X range of indices: 0 4, Y range of indices: 0 4, Z range of indices: 0 2
Processor [1] M 4 N 4 P 4 m 1 n 1 p 2 w 1 s 1
X range of indices: 0 4, Y range of indices: 0 4, Z range of indices: 2 4
Process [0]
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
Process [1]
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[1]PETSC ERROR:       INSTEAD the line number of the start of the function
[1]PETSC ERROR:       is given.
[0]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: --------------------- Error Message ------------------------------------
[1]PETSC ERROR: Signal received!
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Release Version 3.1.0, Patch 3, Fri Jun  4 15:34:52 CDT 2010
[1]PETSC ERROR: See docs/changes/index.html for recent updates.
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[1]PETSC ERROR: See docs/index.html for manual pages.
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: ./PetscTest on a linux-gnu named up0001 by hpc2231 Thu Jun 17 23:58:50 2010
[1]PETSC ERROR: Libraries linked from /home/hpc2231/lib/petsc-3.1-p3/linux-gnu-c-debug/lib
[1]PETSC ERROR: Configure run at Thu Jun 17 23:00:54 2010
[1]PETSC ERROR: Configure options LIBS="-limf -lm" --download-hypre=1 --with-fortran-interfaces=1
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: --------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD 
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 3, Fri Jun  4 15:34:52 CDT 2010
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./PetscTest on a linux-gnu named up0001 by hpc2231 Thu Jun 17 23:58:50 2010
[0]PETSC ERROR: Libraries linked from /home/hpc2231/lib/petsc-3.1-p3/linux-gnu-c-debug/lib
[0]PETSC ERROR: Configure run at Thu Jun 17 23:00:54 2010
[0]PETSC ERROR: Configure options LIBS="-limf -lm" --download-hypre=1 --with-fortran-interfaces=1
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 10883 on
node up0001 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[up0001:10881] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[up0001:10881] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

****************************************************************************

I've been looking through the troubleshooting page and the FAQ with no success, and now I'm stuck...

Any ideas or suggestions?

Thanks

Valerio