[petsc-users] MG Preconditioning

Jed Brown jedbrown at mcs.anl.gov
Fri Aug 31 23:29:45 CDT 2012


On Fri, Aug 31, 2012 at 11:22 PM, Gaetan Kenway <kenway at utias.utoronto.ca> wrote:

> Hi Again
>
> I also tried the petsc-3.2 version and I still get the same backtrace.
>
> If it's not possible to figure out where the communicator segfault is
> coming from, it's not a huge deal... I've just set the option using
> PetscOptionsSetValue() and then used PCSetFromOptions() to pull it back
> out. That seems to work fine.
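
A minimal sketch of that workaround in C (assuming the petsc-3.3 API; the
option name -pc_mg_levels and the routine itself are only illustrative):

    #include <petscksp.h>

    /* Push an option into the database, then have the PC read it back
       out when PCSetFromOptions() is called. */
    PetscErrorCode SetMGOptionManually(KSP ksp)
    {
      PC             pc;
      PetscErrorCode ierr;

      ierr = PetscOptionsSetValue("-pc_mg_levels", "3");CHKERRQ(ierr);
      ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
      ierr = PCSetFromOptions(pc);CHKERRQ(ierr);
      return 0;
    }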
>
> Even avoiding the above problem with PetscOptionsSetValue(), I'm still
> receiving error code 73 when I run the multigrid solver. I've included
> the backtrace output below, but it's not a lot of help since the code
> exited cleanly through my error-checking procedure:
>
>  =================================================================
> PETSc or MPI Error. Error Code 73. Detected on Proc  0
> Error at line:   122 in file: solveADjointTransposePETSc.F90
>  =================================================================
>
> Program received signal SIGSEGV, Segmentation fault.
> 0xb68c98f0 in PMPI_Abort () from /usr/local/lib/libmpi.so.0
> (gdb) bt
> #0  0xb68c98f0 in PMPI_Abort () from /usr/local/lib/libmpi.so.0
> #1  0xb519f190 in pmpi_abort__ () from /usr/local/lib/libmpi_f77.so.0
> #2  0xb44c9e4c in echk (ierr=@0x49, file=..., line=@0x7a,
> .tmp.FILE.len_V$eb=30) at terminate.f90:154
> #3  0xb44bd68f in solveadjointtransposepetsc () at
> solveADjointTransposePETSc.F90:122
> #4  0xb44138a9 in f2py_rout_sumb_solveadjointtransposepetsc () from
> /tmp/tmpKYF_DT/sumb.so
> #5  0xb440fd35 in fortran_call () from /tmp/tmpKYF_DT/sumb.so
> #6  0x0805fd6a in PyObject_Call ()
> #7  0x080dd5b0 in PyEval_EvalFrameEx ()
> #8  0x080dfbb2 in PyEval_EvalCodeEx ()
> #9  0x08168f1f in ?? ()
> #10 0x0805fd6a in PyObject_Call ()
> #11 0x080dcbeb in PyEval_EvalFrameEx ()
> #12 0x080dfbb2 in PyEval_EvalCodeEx ()
> #13 0x080de145 in PyEval_EvalFrameEx ()
> #14 0x080dfbb2 in PyEval_EvalCodeEx ()
> #15 0x080dfca7 in PyEval_EvalCode ()
> #16 0x080fd956 in PyRun_FileExFlags ()
> #17 0x080fdbb2 in PyRun_SimpleFileExFlags ()
> #18 0x0805b6d3 in Py_Main ()
> #19 0x0805a8ab in main ()
>

This stack doesn't involve PETSc at all.
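
The trace ends in the user's own error handler. As a minimal C sketch
(names illustrative, modeled on the echk routine in terminate.f90 visible
in the backtrace), an abort-style check like this explains why the visible
stack stops inside the MPI library rather than inside PETSc:

    #include <stdio.h>
    #include <mpi.h>

    /* On a nonzero PETSc/MPI error code, report it and abort.  The
       process then dies inside MPI_Abort, so the backtrace never shows
       the PETSc call that actually produced the error code. */
    static void check_err(int ierr, const char *file, int line)
    {
      if (ierr) {
        fprintf(stderr, "PETSc or MPI Error. Error Code %d. "
                "Error at line: %d in file: %s\n", ierr, line, file);
        MPI_Abort(MPI_COMM_WORLD, ierr);
      }
    }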


>
> Valgrind was clean right up until the end, where I get the normal error
> message:
>
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try
> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory
> corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames
> ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] MatGetVecs line 8142
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/mat/interface/matrix.c
> [0]PETSC ERROR: [0] KSPGetVecs line 774
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/iterativ.c
> [0]PETSC ERROR: [0] PCSetUp_MG line 508
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: [0] PCSetUp line 810
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: [0] KSPSetUp line 182
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: [0] KSPSolve line 351
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itfunc.c
>
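
That stack, with PCSetUp_MG reaching KSPGetVecs/MatGetVecs, usually means a
level KSP has no operator from which to build its work vectors. A minimal
sketch of wiring up PCMG by hand, assuming the petsc-3.3 C API and
hypothetical matrices (fine-grid operator Afine, interpolations P[l]):

    #include <petscksp.h>

    /* Give PCMG its levels and interpolations, and let Galerkin
       coarsening form the coarse operators from the fine one, so that
       every level KSP has a matrix when PCSetUp_MG runs. */
    PetscErrorCode SetupMG(KSP ksp, Mat Afine, Mat *P, PetscInt nlevels)
    {
      PC             pc;
      PetscInt       l;
      PetscErrorCode ierr;

      ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
      ierr = PCSetType(pc, PCMG);CHKERRQ(ierr);
      ierr = PCMGSetLevels(pc, nlevels, PETSC_NULL);CHKERRQ(ierr);
      for (l = 1; l < nlevels; l++) {          /* level 0 is coarsest */
        ierr = PCMGSetInterpolation(pc, l, P[l]);CHKERRQ(ierr);
      }
      ierr = PCMGSetGalerkin(pc, PETSC_TRUE);CHKERRQ(ierr);
      ierr = KSPSetOperators(ksp, Afine, Afine, SAME_NONZERO_PATTERN);CHKERRQ(ierr);
      return 0;
    }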
> I will try it on a different system tomorrow to see if I have any more
> luck.
>
> Thanks,
>
> Gaetan
>
>
>
> On Fri, Aug 31, 2012 at 11:08 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>
>> On Fri, Aug 31, 2012 at 10:06 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>
>>> True, but the backtrace also shows that comm = 0x0 on the call to
>>> KSPCreate(), which leads me to believe that your petsc4py has not
>>> initialized PETSc, and therefore not initialized PETSC_COMM_WORLD.
>>>
>>
>> Good catch; maybe PetscFunctionBegin() should check that PETSc has been
>> initialized.
>>
>
>
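
On the initialization point, a minimal guard in C (a sketch; petsc4py
normally initializes PETSc when the module is imported):

    #include <petscsys.h>

    int main(int argc, char **argv)
    {
      PetscBool      inited;
      PetscErrorCode ierr;

      /* comm = 0x0 inside KSPCreate() is a classic symptom of creating
         objects before PetscInitialize() has set up PETSC_COMM_WORLD. */
      ierr = PetscInitialized(&inited);CHKERRQ(ierr);
      if (!inited) {
        ierr = PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);CHKERRQ(ierr);
      }
      /* ... create and use KSP/PC objects here ... */
      ierr = PetscFinalize();CHKERRQ(ierr);
      return 0;
    }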