[petsc-users] KSPSetUp with PETSc/MUMPS
Matthew Knepley
knepley at gmail.com
Thu May 26 06:44:59 CDT 2016
Usually this means you have an uninitialized variable that is causing you
to overwrite memory. Fortran is so lax about checking this that it's one
reason to switch to C.
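For example, a pattern like the following (purely illustrative, not taken from
your code) corrupts memory silently and only crashes much later, in an
unrelated routine such as KSPSetUp:

    #include <petscsys.h>

    int main(int argc, char **argv)
    {
      PetscErrorCode ierr;
      PetscInt       n;      /* never initialized: holds whatever happens to be on the stack */
      PetscScalar    *work;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      ierr = PetscMalloc1(10, &work);CHKERRQ(ierr);
      work[n] = 1.0;         /* out-of-range write whenever n >= 10; nothing complains here,
                                the damage only shows up later */
      ierr = PetscFree(work);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }

If you use gfortran, building with run-time checks such as -fcheck=all and
-finit-real=snan often exposes this kind of bug directly.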
Thanks,
Matt
On Thu, May 26, 2016 at 1:46 AM, Constantin Nguyen Van <
constantin.nguyen.van at openmailbox.org> wrote:
> Thanks for all your answers.
> I'm sorry about the syntax mistake in MatLoad; it was introduced afterwards.
>
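> For reference, here is a minimal C sketch of the usual MatLoad calling
> sequence (generic, not taken from my attached code; the file name is just a
> placeholder):
>
>     #include <petscmat.h>
>
>     int main(int argc, char **argv)
>     {
>       Mat            A;
>       PetscViewer    viewer;
>       PetscErrorCode ierr;
>
>       ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
>       ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
>       ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
>       ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr);   /* set the type before loading (MatSetFromOptions also works) */
>       ierr = MatLoad(A, viewer);CHKERRQ(ierr);
>       ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>       ierr = MatDestroy(&A);CHKERRQ(ierr);
>       ierr = PetscFinalize();
>       return ierr;
>     }
>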
> I recompiled PETSc with --with-debugging=yes and ran my code again.
> Now I also see this strange behaviour: when I run the code without
> valgrind and with one proc, I get this error message:
>
> BEGIN PROC 0
> ITERATION 1
> ECHO 1
> ECHO 2
> INFOG(28): 2
> BASIS OK 0
> END PROC 0
> BEGIN PROC 0
> ITERATION 2
> ECHO 1
> ECHO 2
> INFOG(28): 2
> BASIS OK 0
> END PROC 0
> BEGIN PROC 0
> ITERATION 3
> ECHO 1
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS
> X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames
> ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> [0]PETSC ERROR: INSTEAD the line number of the start of the function
> [0]PETSC ERROR: is given.
> [0]PETSC ERROR: [0] MatGetRowIJ_SeqAIJ_Inode_Symmetric line 69
> /home/j10077/librairie/petsc-mumps/petsc-3.6.4/src/mat/impls/aij/seq/inode.c
> [0]PETSC ERROR: [0] MatGetRowIJ_SeqAIJ_Inode line 235
> /home/j10077/librairie/petsc-mumps/petsc-3.6.4/src/mat/impls/aij/seq/inode.c
> [0]PETSC ERROR: [0] MatGetRowIJ line 7099
> /home/j10077/librairie/petsc-mumps/petsc-3.6.4/src/mat/interface/matrix.c
> [0]PETSC ERROR: [0] MatGetOrdering_ND line 17
> /home/j10077/librairie/petsc-mumps/petsc-3.6.4/src/mat/order/spnd.c
> [0]PETSC ERROR: [0] MatGetOrdering line 185
> /home/j10077/librairie/petsc-mumps/petsc-3.6.4/src/mat/order/sorder.c
> [0]PETSC ERROR: [0] MatGetOrdering line 185
> /home/j10077/librairie/petsc-mumps/petsc-3.6.4/src/mat/order/sorder.c
> [0]PETSC ERROR: [0] PCSetUp_LU line 99
> /home/j10077/librairie/petsc-mumps/petsc-3.6.4/src/ksp/pc/impls/factor/lu/lu.c
> [0]PETSC ERROR: [0] PCSetUp line 945
> /home/j10077/librairie/petsc-mumps/petsc-3.6.4/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: [0] KSPSetUp line 247
> /home/j10077/librairie/petsc-mumps/petsc-3.6.4/src/ksp/ksp/interface/itfunc.c
>
> But when I run it under valgrind, it works fine.
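>
> For context, here is a simplified C sketch of the kind of loop I described
> (it is not the attached Fortran code; the ICNTL(24)/INFOG(28) calls just
> stand in for the null-basis part, which is more involved in the real code):
>
>     #include <petscksp.h>
>
>     int main(int argc, char **argv)
>     {
>       Mat            A, F;
>       KSP            ksp;
>       PC             pc;
>       PetscViewer    viewer;
>       PetscInt       it, infog28;
>       PetscErrorCode ierr;
>
>       ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
>
>       /* load the matrix once (file name is a placeholder) */
>       ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
>       ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
>       ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr);
>       ierr = MatLoad(A, viewer);CHKERRQ(ierr);
>       ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>
>       for (it = 0; it < 4; it++) {
>         ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
>         ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
>         ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);
>         ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
>         ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
>         ierr = PCFactorSetMatSolverPackage(pc, MATSOLVERMUMPS);CHKERRQ(ierr);
>         ierr = PCFactorSetUpMatSolverPackage(pc);CHKERRQ(ierr);  /* create the MUMPS factor object */
>         ierr = PCFactorGetMatrix(pc, &F);CHKERRQ(ierr);
>         ierr = MatMumpsSetIcntl(F, 24, 1);CHKERRQ(ierr);         /* ask MUMPS to detect null pivots */
>         ierr = KSPSetUp(ksp);CHKERRQ(ierr);                      /* the crash happens in this call */
>         ierr = MatMumpsGetInfog(F, 28, &infog28);CHKERRQ(ierr);  /* INFOG(28): number of null pivots */
>         ierr = PetscPrintf(PETSC_COMM_WORLD, "INFOG(28): %D\n", infog28);CHKERRQ(ierr);
>         /* ... build and check the null basis here ... */
>         ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
>       }
>
>       ierr = MatDestroy(&A);CHKERRQ(ierr);
>       ierr = PetscFinalize();
>       return ierr;
>     }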
>
> On 2016-05-25 20:04, Barry Smith wrote:
>
>> First, run it under valgrind:
>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>
>> On May 25, 2016, at 2:35 AM, Constantin Nguyen Van
>>> <constantin.nguyen.van at openmailbox.org> wrote:
>>>
>>> Hi,
>>>
>>> I'm a new user of PETSc and I am trying to use it with the MUMPS
>>> functionality to compute a null basis.
>>> I wrote a code that computes the same null basis 4 times. It works
>>> fine when I run it with several procs, but with only one processor
>>> I get an error at the 2nd iteration, when KSPSetUp is called.
>>> Furthermore, when it is run with a debug build of PETSc
>>> (--with-debugging=yes), it works fine with one or several processors.
>>> Do you have any idea why it doesn't work with one processor and a
>>> non-debug build?
>>>
>>> Thanks.
>>> Constantin.
>>>
>>> PS: You can find the code and the files required to run it enclosed.
>>>
>>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener