[petsc-users] MG Preconditioning
Gaetan Kenway
kenway at utias.utoronto.ca
Fri Aug 31 22:06:14 CDT 2012
Running the simple code through valgrind yields only one non-Python-related
error, shown below:
==23697== Invalid read of size 4
==23697== at 0x6592465: PMPI_Attr_get (in /usr/local/lib/libmpi.so.0.0.2)
==23697== by 0x8BED7C8: PetscCommDuplicate (tagm.c:145)
==23697== by 0x8BF2EF6: PetscHeaderCreate_Private (inherit.c:51)
==23697== by 0x958F7C9: KSPCreate (itcreate.c:564)
==23697== by 0x93DAC39: PCMGSetLevels (mg.c:219)
==23697== by 0x93BC73B: pcmgsetlevels_ (zmgf.c:15)
==23697== by 0x7A5CE1C: mgtest_ (mgtest.F:15)
==23697== Address 0x8c is not stack'd, malloc'd or (recently) free'd
==23697==
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] PetscCommDuplicate line 144
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/sys/objects/tagm.c
[0]PETSC ERROR: [0] PetscHeaderCreate_Private line 30
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/sys/objects/inherit.c
[0]PETSC ERROR: [0] KSPCreate line 557
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itcreate.c
[0]PETSC ERROR: [0] PCMGSetLevels line 180
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c
Gaetan
On Fri, Aug 31, 2012 at 11:03 PM, Gaetan Kenway <kenway at utias.utoronto.ca> wrote:
> Hi again
>
> I distilled the problem down further to a simple subroutine in mgtest.F,
> copied directly from ex8f.F
>
> subroutine MGTest()
> implicit none
>
> #include "include/finclude/petsc.h"
>
> KSP ksp
> PC pc
> PetscErrorCode ierr
> PetscInt nlevels, two
> two = 2
>
> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
> call KSPGetPC(ksp,pc,ierr)
> call PCSetType(pc,PCMG,ierr)
> call PCMGSetLevels(pc,two,PETSC_NULL_OBJECT,ierr)
>
> call KSPDestroy(ksp,ierr)
>
> end subroutine MGTest
>
> When I ran this simple case with gdb I got the following backtrace:
>
> #0 0xb694d465 in PMPI_Attr_get () from /usr/local/lib/libmpi.so.0
> #1 0xb555d7c9 in PetscCommDuplicate (comm_in=0x0, comm_out=0x88d7378,
> first_tag=0x88d73a0) at
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/sys/objects/tagm.c:145
> #2 0xb5562ef7 in PetscHeaderCreate_Private (h=0x88d7370, classid=1211235,
> type=-1, class_name=0xb61cfd70 "KSP", descr=0xb61cfd74 "Krylov Method",
> mansec=0xb61cfd70 "KSP", comm=0x0,
> des=0xb5eeac70 <KSPDestroy>, vie=0xb5efafd7 <KSPView>) at
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/sys/objects/inherit.c:51
> #3 0xb5eff7ca in KSPCreate (comm=0x0, inksp=0x88d6ee0) at
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itcreate.c:564
> #4 0xb5d4ac3a in PCMGSetLevels (pc=0x89089a0, levels=2, comms=0xb4bbf5a0)
> at /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c:219
> #5 0xb5d2c73c in pcmgsetlevels_ (pc=0xbfffec98, levels=0xbfffec8c,
> comms=0xb4bbf5a0, ierr=0xbfffec94) at
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/ftn-custom/zmgf.c:15
> #6 0xb4affe1d in mgtest () at mgtest.F:15
> #7 0xb4aad769 in f2py_rout_warp_mgtest (capi_self=0x8618758,
> capi_args=0xb7c3002c, capi_keywds=0x0, f2py_func=0xb4affdc0 <mgtest>) at
> warpmodule.c:4273
> #8 0xb4aa58cf in fortran_call (fp=0x8618758, arg=0xb7c3002c, kw=0x0) at
> /usr/lib/python2.6/dist-packages/numpy/f2py/src/fortranobject.c:318
> #9 0x0805fd6a in PyObject_Call ()
> #10 0x080dd5b0 in PyEval_EvalFrameEx ()
> #11 0x080dfbb2 in PyEval_EvalCodeEx ()
> #12 0x080dfca7 in PyEval_EvalCode ()
> #13 0x080fd956 in PyRun_FileExFlags ()
> #14 0x080fdbb2 in PyRun_SimpleFileExFlags ()
> #15 0x0805b6d3 in Py_Main ()
> #16 0x0805a8ab in main ()
>
> The only difference is that I'm calling the routine from Python and PETSc
> is initialized from petsc4py.
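>
> To check whether that difference matters, a minimal standalone Fortran
> driver around MGTest (just a sketch; the program name is made up and the
> include path is assumed to match the one above) could be run without
> Python at all:
>
>       program mgtestmain
>       implicit none
> #include "include/finclude/petsc.h"
>       PetscErrorCode ierr
>
> !     Initialize PETSc from Fortran, run the reproduction, clean up.
>       call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>       call MGTest()
>       call PetscFinalize(ierr)
>       end program mgtestmain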
>
> Any ideas?
>
> Thanks,
> Gaetan
>
>
> On Fri, Aug 31, 2012 at 9:48 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>
>> src/ksp/pc/examples/tests/ex8f.F is a simple example, but I made the
>> modification below and it runs fine.
>>
>> diff --git a/src/ksp/pc/examples/tests/ex8f.F
>> b/src/ksp/pc/examples/tests/ex8f.F
>> --- a/src/ksp/pc/examples/tests/ex8f.F
>> +++ b/src/ksp/pc/examples/tests/ex8f.F
>> @@ -35,7 +35,7 @@
>> Vec x,b,u
>> PC pc
>> PetscInt n,dim,istart,iend
>> - PetscInt i,j,jj,ii,one,zero
>> + PetscInt i,j,jj,ii,one,two,zero
>> PetscErrorCode ierr
>> PetscScalar v
>> external MyResidual
>> @@ -50,6 +50,7 @@
>> pfive = .5d0
>> n = 6
>> dim = n*n
>> + two = 2
>> one = 1
>> zero = 0
>>
>> @@ -138,7 +139,7 @@
>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
>> call KSPGetPC(ksp,pc,ierr)
>> call PCSetType(pc,PCMG,ierr)
>> - call PCMGSetLevels(pc,one,PETSC_NULL_OBJECT,ierr)
>> + call PCMGSetLevels(pc,two,PETSC_NULL_OBJECT,ierr)
>> call PCMGSetResidual(pc,zero,PCMGDefaultResidual,A,ierr)
>>
>> call PCMGSetResidual(pc,zero,MyResidual,A,ierr)
>> @@ -147,6 +148,7 @@
>> ! also serves as the preconditioning matrix.
>>
>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr)
>> + call KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD,ierr)
>>
>>
>> call KSPDestroy(ksp,ierr)
>>
>> $ mpirun.hydra -n 2 ./ex8f
>> KSP Object: 2 MPI processes
>> type: gmres
>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
>> Orthogonalization with no iterative refinement
>> GMRES: happy breakdown tolerance 1e-30
>> maximum iterations=10000, initial guess is zero
>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>> left preconditioning
>> using DEFAULT norm type for convergence test
>> PC Object: 2 MPI processes
>> type: mg
>> MG: type is MULTIPLICATIVE, levels=2 cycles=v
>> Cycles per PCApply=1
>> Not using Galerkin computed coarse grid matrices
>> Coarse grid solver -- level -------------------------------
>> KSP Object: (mg_coarse_) 2 MPI processes
>> type: preonly
>> maximum iterations=1, initial guess is zero
>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>> left preconditioning
>> using NONE norm type for convergence test
>> PC Object: (mg_coarse_) 2 MPI processes
>> type: redundant
>> Redundant preconditioner: Not yet setup
>> Down solver (pre-smoother) on level 1 -------------------------------
>> KSP Object: (mg_levels_1_) 2 MPI processes
>> type: chebyshev
>> Chebyshev: eigenvalue estimates: min = 0, max = 0
>> maximum iterations=2, initial guess is zero
>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>> left preconditioning
>> using NONE norm type for convergence test
>> PC Object: (mg_levels_1_) 2 MPI processes
>> type: sor
>> SOR: type = local_symmetric, iterations = 1, local iterations =
>> 1, omega = 1
>> Up solver (post-smoother) same as down solver (pre-smoother)
>> linear system matrix = precond matrix:
>> Matrix Object: 2 MPI processes
>> type: mpiaij
>> rows=36, cols=36
>> total: nonzeros=156, allocated nonzeros=252
>> total number of mallocs used during MatSetValues calls =0
>> not using I-node (on process 0) routines
>>
>>
>>
>> On Fri, Aug 31, 2012 at 7:27 PM, Gaetan Kenway <kenway at utias.utoronto.ca> wrote:
>>
>>> Hi
>>>
>>> I'm having an issue with the PCMG preconditioner in PETSc. I'm trying to
>>> use PCMG to precondition a linear system of equations resulting from the
>>> discretization of the Euler equations. The system comes from a multi-block
>>> code, so I can easily generate restriction operators with a geometric
>>> method.
>>>
>>> I'm currently trying to set up a simple two-level V-cycle. For this I use:
>>>
>>> ! Create the restriction operator between finest level and one below
>>> call createRestrictOperator(RL1, 1)
>>>
>>> call PCSetFromOptions(pc, PETScIerr)
>>> nlevels=2
>>> !call PCMGSetLevels(pc, nlevels, PETSC_NULL_OBJECT,PETScIerr) ! Doesn't
>>> work from fortran!!!
>>> call PCMGSetGalerkin(pc, PETSC_TRUE, PETScIerr)
>>> call PCMGSetRestriction(pc, 1, RL1, PETScierr)
>>>
>>> call PCView(pc, PETScierr)
>>>
>>> I run with the command line argument -pc_mg_levels 2 .
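>>>
>>> The same setting can also be placed in the options database
>>> programmatically instead of on the command line (just a sketch, untested
>>> here; it assumes the PC type is already mg when the options are applied),
>>> right before the existing PCSetFromOptions() call:
>>>
>>> call PetscOptionsSetValue('-pc_mg_levels','2',PETScIerr)
>>> call PCSetFromOptions(pc, PETScIerr)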
>>>
>>> First issue: PCMGSetLevels() does not work correctly from Fortran. The
>>> traceback I get when I run with the commented-out PCMGSetLevels line above
>>> re-enabled (and without the command-line argument) is:
>>>
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>>> probably memory access out of range
>>> [0]PETSC ERROR: Try option -start_in_debugger or
>>> -on_error_attach_debugger
>>> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
>>> [0]PETSC ERROR: likely location of problem given in stack below
>>> [0]PETSC ERROR: --------------------- Stack Frames
>>> ------------------------------------
>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>>> available,
>>> [0]PETSC ERROR: INSTEAD the line number of the start of the
>>> function
>>> [0]PETSC ERROR: is given.
>>> [0]PETSC ERROR: [0] PetscCommDuplicate line 144
>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/sys/objects/tagm.c
>>> [0]PETSC ERROR: [0] PetscHeaderCreate_Private line 30
>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/sys/objects/inherit.c
>>> [0]PETSC ERROR: [0] KSPCreate line 557
>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itcreate.c
>>> [0]PETSC ERROR: [0] PCMGSetLevels line 180
>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c
>>>
>>> I can get around the issue by using -pc_mg_levels 2 and not calling
>>> PCMGSetLevels(), but I would like to use the routine directly rather than
>>> setting the level count manually through an option. The result of PCView()
>>> is listed below, and it appears to be what I expect (given the default
>>> options):
>>>
>>> PC Object: 1 MPI processes
>>> type: mg
>>> MG: type is MULTIPLICATIVE, levels=2 cycles=v
>>> Cycles per PCApply=1
>>> Using Galerkin computed coarse grid matrices
>>> Coarse grid solver -- level -------------------------------
>>> KSP Object: (mg_coarse_) 1 MPI processes
>>> type: preonly
>>> maximum iterations=1, initial guess is zero
>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>>> left preconditioning
>>> using NONE norm type for convergence test
>>> PC Object: (mg_coarse_) 1 MPI processes
>>> type: lu
>>> LU: out-of-place factorization
>>> tolerance for zero pivot 2.22045e-14
>>> matrix ordering: nd
>>> Down solver (pre-smoother) on level 1 -------------------------------
>>> KSP Object: (mg_levels_1_) 1 MPI processes
>>> type: chebyshev
>>> Chebyshev: eigenvalue estimates: min = 0, max = 0
>>> maximum iterations=2, initial guess is zero
>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>>> left preconditioning
>>> using NONE norm type for convergence test
>>> PC Object: (mg_levels_1_) 1 MPI processes
>>> type: sor
>>> SOR: type = local_symmetric, iterations = 1, local iterations =
>>> 1, omega = 1
>>> Up solver (post-smoother) same as down solver (pre-smoother)
>>> linear system matrix followed by preconditioner matrix:
>>> Matrix Object: 1 MPI processes
>>> type: seqbaij
>>> rows=92160, cols=92160, bs=5
>>> total: nonzeros=5582400, allocated nonzeros=5582400
>>> total number of mallocs used during MatSetValues calls =0
>>> block size is 5
>>> Matrix Object: 1 MPI processes
>>> type: seqbaij
>>> rows=92160, cols=92160, bs=5
>>> total: nonzeros=3089600, allocated nonzeros=3089600
>>> total number of mallocs used during MatSetValues calls =0
>>> block size is 5
>>>
>>>
>>> So that's OK. When it gets to actually solving my system with KSPSolve()
>>> I get error code 73, which is "object in argument is in wrong state, e.g.
>>> unassembled mat".
>>>
>>
>> In general, please don't paraphrase error messages.
>>
>> With these problems, try running in a debugger to get a real trace, and run
>> under valgrind. The most common problem is stack corruption due to passing
>> the wrong number or types of arguments.
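>>
>> As a minimal illustration of that failure mode (a sketch, not a diagnosis
>> of this particular crash): every PETSc routine called from Fortran takes a
>> trailing PetscErrorCode argument, so dropping it makes the C stub read and
>> write whatever happens to sit in the missing argument's slot on the stack.
>>
>>       KSP ksp
>>       PetscErrorCode ierr
>>
>>       call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
>> !     Writing "call KSPDestroy(ksp)" here, with the trailing ierr dropped,
>> !     is exactly the kind of call that corrupts the stack; the correct form
>> !     always carries ierr:
>>       call KSPDestroy(ksp,ierr)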
>>
>>
>>>
>>> and the traceback is:
>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>>> available,
>>> [0]PETSC ERROR: INSTEAD the line number of the start of the
>>> function
>>> [0]PETSC ERROR: is given.
>>> [0]PETSC ERROR: [0] MatGetVecs line 8142
>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/mat/interface/matrix.c
>>> [0]PETSC ERROR: [0] KSPGetVecs line 774
>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/iterativ.c
>>> [0]PETSC ERROR: [0] PCSetUp_MG line 508
>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c
>>> [0]PETSC ERROR: [0] PCSetUp line 810
>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/interface/precon.c
>>> [0]PETSC ERROR: [0] KSPSetUp line 182
>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itfunc.c
>>> [0]PETSC ERROR: [0] KSPSolve line 351
>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itfunc.c
>>>
>>> This would indicate to me that it's an issue with one of the matrices
>>> involved: the linear operator A, the preconditioner P, or my provided
>>> restriction matrix RL1. I'm pretty sure A and P are OK, since everything
>>> works fine when I use my usual block Jacobi / additive Schwarz with ILU(k)
>>> on the subdomains, which leaves the restriction operator as the suspect. I
>>> added the following code before PCView():
>>>
>>> call MatGetVecs(RL1, vec1, vec2, ierr)
>>> call VecSet(vec1, 1.0d0, ierr)
>>> call MatMult(RL1, vec1, vec2, ierr)
>>>
>>> which ran through just fine (I check each of the ierr values, but those
>>> checks are not shown here).
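>>>
>>> One more check that could go in the same place (a sketch; mR and nR are
>>> just names I am making up): confirm that RL1 has the expected global
>>> shape, i.e. rows equal to the coarse-level size and columns equal to the
>>> fine-level size (92160 here), since the failing path in the traceback
>>> above goes through PCSetUp_MG -> KSPGetVecs -> MatGetVecs.
>>>
>>> PetscInt mR, nR
>>> call MatGetSize(RL1, mR, nR, ierr)
>>> write(*,*) 'RL1 global size:', mR, nR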
>>>
>>> So I'm at a loss as to what is causing the error.
>>> I'm using an up-to-date copy of petsc-3.3 from the repository.
>>>
>>> Thanks,
>>>
>>> Gaetan
>>>
>>
>>
>