[petsc-users] errors when repeatedly using MatSetValues and MatAssembly in a loop
Shaohao Chen
shchen at www.phys.lsu.edu
Thu May 16 11:52:31 CDT 2013
Dear writers and users of PETSc,
I received the errors attached below. It seems that I over-allocated memory, so the system killed my
job. I call MatSetValues and MatAssembly repeatedly inside a big loop, and the error message appears
during this loop. The structure of my code is shown below. Could you please give me some hints on
where I might be over-allocating memory? Thanks!
Structure of my code:
Mat A;
MatCreate(A);
MatSetSizes(A);
MatSetUp(A);
MatSetValues(A); // set initial values
MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
MatSetOption(A, MAT_NEW_NONZERO_LOCATIONS, PETSC_FALSE); // fix nonzero structure for all uses below
------ begin loop -----
...
MatSetValues(A); // update values of some parts of the matrix
MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); // the error message appears here, after tens of steps of the loop
MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
KSPSolve(A,...);
...
------ end loop -----
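For completeness, here is a minimal, self-contained sketch of the structure above. It is not my actual application: it assumes a square matrix of global size n created with MatSetUp (no explicit preallocation), a simple tridiagonal stencil, only the diagonal updated inside the loop, and the PETSc 3.4 calling sequence in which KSPSetOperators still takes a MatStructure flag.

/* Minimal sketch of the structure described above (not the actual application). */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PetscInt       n = 100, i, step, nsteps = 50, rstart, rend, col[3];
  PetscScalar    val[3], diag;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);

  /* Set initial values on the locally owned rows (tridiagonal stencil). */
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    col[0] = i - 1; col[1] = i;   col[2] = i + 1;
    val[0] = -1.0;  val[1] = 2.0; val[2] = -1.0;
    if (i == 0)          { ierr = MatSetValues(A, 1, &i, 2, &col[1], &val[1], INSERT_VALUES);CHKERRQ(ierr); }
    else if (i == n - 1) { ierr = MatSetValues(A, 1, &i, 2, col, val, INSERT_VALUES);CHKERRQ(ierr); }
    else                 { ierr = MatSetValues(A, 1, &i, 3, col, val, INSERT_VALUES);CHKERRQ(ierr); }
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatSetOption(A, MAT_NEW_NONZERO_LOCATIONS, PETSC_FALSE);CHKERRQ(ierr); /* fix nonzero structure */

  ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr);
  ierr = VecSetSizes(b, PETSC_DECIDE, n);CHKERRQ(ierr);
  ierr = VecSetFromOptions(b);CHKERRQ(ierr);
  ierr = VecDuplicate(b, &x);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

  /* ------ begin loop ------ */
  for (step = 0; step < nsteps; step++) {
    /* Update values of entries that already exist in the nonzero structure (here: the diagonal only). */
    for (i = rstart; i < rend; i++) {
      diag = 2.0 + 0.01 * (PetscReal)step;
      ierr = MatSetValues(A, 1, &i, 1, &i, &diag, INSERT_VALUES);CHKERRQ(ierr);
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

    ierr = KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);CHKERRQ(ierr); /* PETSc 3.4 signature */
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
  }
  /* ------ end loop ------ */

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

The real code follows this same calling sequence, just with its own matrix entries, a larger problem, and many more loop steps.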
==== attached errors =====
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function is given.
[0]PETSC ERROR: [0] MatStashScatterGetMesg_Private line 617 src/mat/utils/matstash.c
[0]PETSC ERROR: [0] MatAssemblyEnd_MPIAIJ line 673 src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: [0] MatAssemblyEnd line 4930 src/mat/interface/matrix.c
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.4.0, May, 13, 2013
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: /home/shaohao/program-basis/ais/ais-tdse on a arch-linux2-c-debug named qb002 by shaohao Thu May 16 10:55:09 2013
[0]PETSC ERROR: Libraries linked from /usr/local/packages/petsc/3.4.0/intel-11.1-mvapich-1.1/lib
[0]PETSC ERROR: Configure run at Tue May 14 14:20:15 2013
[0]PETSC ERROR: Configure options --prefix=/usr/local/packages/petsc/3.4.0/intel-11.1-mvapich-1.1 --with-mpi=1 --with-mpi-compilers=1 --with-c-support=1 --with-fortran=1 --with-c++-support=1 --with-lapack-lib=/usr/local/packages/lapack/3.4.2/intel-11.1/lib/liblapack.a --with-blas-lib=/usr/local/packages/lapack/3.4.2/intel-11.1/lib/libblas.a --with-expat=1 --with-expat-dir=/usr
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
[0] [MPI Abort by user] Aborting Program!
Abort signaled by rank 0: MPI Abort by user Aborting program !
Exit code -3 signaled from qb002
Killing remote processes...MPI process terminated unexpectedly
DONE
--
Shaohao Chen
Department of Physics & Astronomy,
Louisiana State University,
Baton Rouge, LA