[petsc-users] MG Preconditioning
Gaetan Kenway
kenway at utias.utoronto.ca
Fri Aug 31 19:27:06 CDT 2012
Hi
I'm having an issue with the PCMG preconditioner in PETSc. I'm trying to
use PCMG to precondition a linear system of equations resulting from the
discretization of the Euler equations. The system comes from a multi-block
code, so I can easily generate restriction operators using a geometric
method.
I'm currently trying to set up a simple two-level V-cycle. For this I use:
! Create the restriction operator between finest level and one below
call createRestrictOperator(RL1, 1)
call PCSetFromOptions(pc, PETScIerr)
nlevels=2
!call PCMGSetLevels(pc, nlevels, PETSC_NULL_OBJECT, PETScIerr) ! Doesn't work from Fortran!!!
call PCMGSetGalerkin(pc, PETSC_TRUE, PETScIerr)
call PCMGSetRestriction(pc, 1, RL1, PETScIerr)
call PCView(pc, PETScIerr)
I run with the command-line argument -pc_mg_levels 2.
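(For context, the pc used above is the preconditioner of my outer KSP; a
minimal sketch of how it is obtained is below, with the declarations and
the KSPGetPC/PCSetType calls filled in for completeness, since they are not
shown in the fragment above.)

! Sketch only: how the pc above is obtained. Assumes the usual PETSc
! Fortran includes (e.g. finclude/petscksp.h) are in scope.
KSP ksp
PC pc
Mat RL1
PetscInt nlevels
PetscErrorCode PETScIerr

call KSPGetPC(ksp, pc, PETScIerr)    ! pc used in the calls above
call PCSetType(pc, PCMG, PETScIerr)  ! select the multigrid preconditioner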
First issue: PCMGSetLevels does not work correctly from Fortran. The
traceback I get when I run (with the commented-out line above enabled and
without the command-line argument) is:
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] PetscCommDuplicate line 144
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/sys/objects/tagm.c
[0]PETSC ERROR: [0] PetscHeaderCreate_Private line 30
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/sys/objects/inherit.c
[0]PETSC ERROR: [0] KSPCreate line 557
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itcreate.c
[0]PETSC ERROR: [0] PCMGSetLevels line 180
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c
I can get around that issue by using -pc_mg_levels 2 and not calling
PCMGSetLevels(), but I would like to actually use this routine rather than
set the levels manually via an option. The result of PCView is listed
below, which appears to be what I expect (given the default options):
PC Object: 1 MPI processes
  type: mg
    MG: type is MULTIPLICATIVE, levels=2 cycles=v
      Cycles per PCApply=1
      Using Galerkin computed coarse grid matrices
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_coarse_) 1 MPI processes
      type: preonly
      maximum iterations=1, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_coarse_) 1 MPI processes
      type: lu
        LU: out-of-place factorization
        tolerance for zero pivot 2.22045e-14
        matrix ordering: nd
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object: (mg_levels_1_) 1 MPI processes
      type: chebyshev
        Chebyshev: eigenvalue estimates: min = 0, max = 0
      maximum iterations=2, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_1_) 1 MPI processes
      type: sor
        SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix followed by preconditioner matrix:
  Matrix Object: 1 MPI processes
    type: seqbaij
    rows=92160, cols=92160, bs=5
    total: nonzeros=5582400, allocated nonzeros=5582400
    total number of mallocs used during MatSetValues calls =0
      block size is 5
  Matrix Object: 1 MPI processes
    type: seqbaij
    rows=92160, cols=92160, bs=5
    total: nonzeros=3089600, allocated nonzeros=3089600
    total number of mallocs used during MatSetValues calls =0
      block size is 5
So that's OK. When it gets to actually solving my system with KSPSolve() I
get a returned error code 73, which is "object in argument is in wrong
state, e.g. unassembled mat", and the traceback is:
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] MatGetVecs line 8142
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] KSPGetVecs line 774
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/iterativ.c
[0]PETSC ERROR: [0] PCSetUp_MG line 508
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: [0] PCSetUp line 810
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSPSetUp line 182
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] KSPSolve line 351
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itfunc.c
This would indicate to me that it's an issue with one of the matrices
involved: either the linear operator A, the preconditioner P, or my
provided restriction matrix RL1. I'm pretty sure A and P are OK, since
everything works just fine when I use my usual Block Jacobi / Additive
Schwarz with ILU(k) on the subdomains, which leaves the restriction
operator as the issue. I added the following code before PCView():
call MatGetVecs(RL1, vec1, vec2, ierr)
call VecSet(vec1, 1.0, ierr)
call MatMult(RL1, vec1, vec2, ierr)
which ran through just fine (I checked each of the ierrs, but those are
not shown).
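A further check I could do along the same lines is sketched below (vecF,
vecC and the size variables are just local helpers; my understanding is
that PCMG will apply the transpose of the restriction as the interpolation,
since I only set the restriction):

! Sketch of additional sanity checks on RL1
PetscInt nRows, nCols
Vec vecF, vecC
PetscErrorCode ierr

! Compare the global sizes of RL1 against the fine/coarse problem sizes
call MatGetSize(RL1, nRows, nCols, ierr)

! vecF lives in the domain (fine) space, vecC in the range (coarse) space
call MatGetVecs(RL1, vecF, vecC, ierr)
call VecSet(vecF, 1.0d0, ierr)

! Apply R once, then R^T once (R^T is what PCMG would use as interpolation
! if no separate interpolation matrix is provided)
call MatMult(RL1, vecF, vecC, ierr)
call MatMultTranspose(RL1, vecC, vecF, ierr)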
So I'm at a loss as to what is causing the error.
I'm using an up-to-date copy of petsc-3.3 from the repository.
Thanks,
Gaetan