[petsc-users] MG Preconditioning
Jed Brown
jedbrown at mcs.anl.gov
Fri Aug 31 20:48:18 CDT 2012
src/ksp/pc/examples/tests/ex8f.F is a simple example, but I made the
modification below and it runs fine.
diff --git a/src/ksp/pc/examples/tests/ex8f.F b/src/ksp/pc/examples/tests/ex8f.F
--- a/src/ksp/pc/examples/tests/ex8f.F
+++ b/src/ksp/pc/examples/tests/ex8f.F
@@ -35,7 +35,7 @@
Vec x,b,u
PC pc
PetscInt n,dim,istart,iend
- PetscInt i,j,jj,ii,one,zero
+ PetscInt i,j,jj,ii,one,two,zero
PetscErrorCode ierr
PetscScalar v
external MyResidual
@@ -50,6 +50,7 @@
pfive = .5d0
n = 6
dim = n*n
+ two = 2
one = 1
zero = 0
@@ -138,7 +139,7 @@
call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
call KSPGetPC(ksp,pc,ierr)
call PCSetType(pc,PCMG,ierr)
- call PCMGSetLevels(pc,one,PETSC_NULL_OBJECT,ierr)
+ call PCMGSetLevels(pc,two,PETSC_NULL_OBJECT,ierr)
call PCMGSetResidual(pc,zero,PCMGDefaultResidual,A,ierr)
call PCMGSetResidual(pc,zero,MyResidual,A,ierr)
@@ -147,6 +148,7 @@
! also serves as the preconditioning matrix.
call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr)
+ call KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD,ierr)
call KSPDestroy(ksp,ierr)
$ mpirun.hydra -n 2 ./ex8f
KSP Object: 2 MPI processes
type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
Orthogonalization with no iterative refinement
GMRES: happy breakdown tolerance 1e-30
maximum iterations=10000, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using DEFAULT norm type for convergence test
PC Object: 2 MPI processes
type: mg
MG: type is MULTIPLICATIVE, levels=2 cycles=v
Cycles per PCApply=1
Not using Galerkin computed coarse grid matrices
Coarse grid solver -- level -------------------------------
KSP Object: (mg_coarse_) 2 MPI processes
type: preonly
maximum iterations=1, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
PC Object: (mg_coarse_) 2 MPI processes
type: redundant
Redundant preconditioner: Not yet setup
Down solver (pre-smoother) on level 1 -------------------------------
KSP Object: (mg_levels_1_) 2 MPI processes
type: chebyshev
Chebyshev: eigenvalue estimates: min = 0, max = 0
maximum iterations=2, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000
left preconditioning
using NONE norm type for convergence test
PC Object: (mg_levels_1_) 2 MPI processes
type: sor
SOR: type = local_symmetric, iterations = 1, local iterations = 1,
omega = 1
Up solver (post-smoother) same as down solver (pre-smoother)
linear system matrix = precond matrix:
Matrix Object: 2 MPI processes
type: mpiaij
rows=36, cols=36
total: nonzeros=156, allocated nonzeros=252
total number of mallocs used during MatSetValues calls =0
not using I-node (on process 0) routines
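For reference, the relevant Fortran calls from the modified example reduce to
the sketch below (assuming pc was obtained from KSPGetPC as in the example):
the level count is passed as a PetscInt variable and PETSC_NULL_OBJECT is
passed for the optional per-level communicators.

      PetscInt two
      PetscErrorCode ierr

      two = 2
!     two levels: level 1 smoother plus level 0 coarse solve
      call PCSetType(pc,PCMG,ierr)
      call PCMGSetLevels(pc,two,PETSC_NULL_OBJECT,ierr)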
On Fri, Aug 31, 2012 at 7:27 PM, Gaetan Kenway <kenway at utias.utoronto.ca> wrote:
> Hi
>
> I'm having an issue with the PCMG preconditioner in PETSc. I'm trying to
> use PCMG to precondition a linear system of equations resulting from the
> discretization of the Euler equations. It is from a multi-block code, so I
> have the ability to easily generate restriction operators using a geometric
> method.
>
> I'm currently trying to set up a simple two-level V-cycle. For this I use:
>
> ! Create the restriction operator between finest level and one below
> call createRestrictOperator(RL1, 1)
>
> call PCSetFromOptions(pc, PETScIerr)
> nlevels=2
> !call PCMGSetLevels(pc, nlevels, PETSC_NULL_OBJECT,PETScIerr) ! Doesn't
> work from fortran!!!
> call PCMGSetGalerkin(pc, PETSC_TRUE, PETScIerr)
> call PCMGSetRestriction(pc, 1, RL1, PETScierr)
>
> call PCView(pc, PETScierr)
>
> I run with the command-line argument -pc_mg_levels 2.
>
> First issue: PCMGSetLevels does not work correctly from Fortran. The
> traceback I get when I run (with the commented-out PCMGSetLevels line above
> re-enabled and without the command-line argument) is:
>
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try
> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory
> corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames
> ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> [0]PETSC ERROR: INSTEAD the line number of the start of the function
> [0]PETSC ERROR: is given.
> [0]PETSC ERROR: [0] PetscCommDuplicate line 144
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/sys/objects/tagm.c
> [0]PETSC ERROR: [0] PetscHeaderCreate_Private line 30
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/sys/objects/inherit.c
> [0]PETSC ERROR: [0] KSPCreate line 557
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itcreate.c
> [0]PETSC ERROR: [0] PCMGSetLevels line 180
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c
>
> I can get around that issue by using -pc_mg_levels 2 and not calling
> PCMGSetLevels(), but I would like to actually use this routine rather than
> set the number of levels manually via an option. The result of PCView is
> listed below, which appears to be what I expect (given the default options):
>
> PC Object: 1 MPI processes
> type: mg
> MG: type is MULTIPLICATIVE, levels=2 cycles=v
> Cycles per PCApply=1
> Using Galerkin computed coarse grid matrices
> Coarse grid solver -- level -------------------------------
> KSP Object: (mg_coarse_) 1 MPI processes
> type: preonly
> maximum iterations=1, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (mg_coarse_) 1 MPI processes
> type: lu
> LU: out-of-place factorization
> tolerance for zero pivot 2.22045e-14
> matrix ordering: nd
> Down solver (pre-smoother) on level 1 -------------------------------
> KSP Object: (mg_levels_1_) 1 MPI processes
> type: chebyshev
> Chebyshev: eigenvalue estimates: min = 0, max = 0
> maximum iterations=2, initial guess is zero
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
> left preconditioning
> using NONE norm type for convergence test
> PC Object: (mg_levels_1_) 1 MPI processes
> type: sor
> SOR: type = local_symmetric, iterations = 1, local iterations = 1,
> omega = 1
> Up solver (post-smoother) same as down solver (pre-smoother)
> linear system matrix followed by preconditioner matrix:
> Matrix Object: 1 MPI processes
> type: seqbaij
> rows=92160, cols=92160, bs=5
> total: nonzeros=5582400, allocated nonzeros=5582400
> total number of mallocs used during MatSetValues calls =0
> block size is 5
> Matrix Object: 1 MPI processes
> type: seqbaij
> rows=92160, cols=92160, bs=5
> total: nonzeros=3089600, allocated nonzeros=3089600
> total number of mallocs used during MatSetValues calls =0
> block size is 5
>
>
> So that's OK. When it gets to actually solving my system with KSPSolve(), I
> get a returned error code of 73, which is "object in argument is in wrong
> state, e.g. unassembled mat".
>
In general, please don't paraphrase error messages.
With these problems, try running in a debugger to get a real stack trace,
and run under valgrind. The most common problem is stack corruption due to
passing the wrong number or types of arguments.
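For example, something along these lines (the executable name and options here
are placeholders; adapt to your setup):

$ mpirun.hydra -n 2 valgrind --tool=memcheck -q --num-callers=20 ./your_app -pc_mg_levels 2
$ mpirun.hydra -n 2 ./your_app -start_in_debugger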
>
> and the traceback is:
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> [0]PETSC ERROR: INSTEAD the line number of the start of the function
> [0]PETSC ERROR: is given.
> [0]PETSC ERROR: [0] MatGetVecs line 8142
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/mat/interface/matrix.c
> [0]PETSC ERROR: [0] KSPGetVecs line 774
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/iterativ.c
> [0]PETSC ERROR: [0] PCSetUp_MG line 508
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: [0] PCSetUp line 810
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: [0] KSPSetUp line 182
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: [0] KSPSolve line 351
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itfunc.c
>
> Which would indicate to me it's an issue with one of the matrices involved:
> either the linear operator A, the preconditioner P, or my provided
> restriction matrix RL1. A and P I'm pretty sure are OK, since it works just
> fine when I use my usual block Jacobi/additive Schwarz with ILU(k) on the
> subdomains, which leaves the restriction operator as the issue. I added the
> following code before PCView():
>
> call MatGetVecs(RL1, vec1, vec2, ierr)
> call VecSet(vec1, 1.0, ierr)
> call MatMult(RL1, vec1, vec2, ierr)
>
> which ran through just fine (I check each of the ierrs but those are not
> shown).
>
> So I'm at a loss as to what is causing the error.
> I'm using an up-to-date copy of petsc-3.3 from the repository.
>
> Thanks,
>
> Gaetan
>