[petsc-users] PCMG Solver Issue
Jacob Oest
jacoboest87 at gmail.com
Thu Apr 4 15:43:12 CDT 2019
Dear all,

I am using petsc4py and can't seem to figure out an issue. I need to solve the linear system KU = P. K and U are set up correctly, and with the default KSP solvers I can solve the system without trouble. However, when I try to solve with PCMG I run into two issues:

a) I get 'PETSc error code 73 - object is in wrong state' when I try to set up or solve the KSP.

b) What is the petsc4py equivalent of PCMGSetGalerkin? I can't seem to find it.
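For (b), the only route I have found so far is the options database rather than a dedicated petsc4py call; I am not sure this is the intended way, and the option name and value below are my assumptions from the C-level PCMGSetGalerkin() documentation:

```python
# Workaround sketch (assumed, not verified): request Galerkin-computed
# coarse operators via the options database before the PC is set up.
# The option name '-pc_mg_galerkin' and value 'pmat' come from the C
# documentation and may differ between PETSc versions.
opts = petsc.Options()
opts['pc_mg_galerkin'] = 'pmat'
pc.setFromOptions()
```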
Below is how I set up the solver, followed by the error message and the output of KSPView().

Thanks in advance for any help you can provide!
*THE CODE:*
RTOL = 1.0e-5
ATOL = 1.0e-50
DTOL = 1.0e3
MAXITSGLOBAL = 200
COARSE_RTOL = 1.0e-8
COARSE_ATOL = 1.0e-50
COARSE_DTOL = 1.0e3
COARSE_MAXITS = 30
COARSE_RESTART = 30
SMOOTH_SWEEPS = 4

# Fine grid Krylov method
self.ksp = petsc.KSP().create()
self.ksp.setType('fgmres')
self.ksp.setTolerances(rtol=RTOL, atol=ATOL, divtol=DTOL, max_it=MAXITSGLOBAL)
self.ksp.setInitialGuessNonzero(True)

# Set operators
self.ksp.setOperators(mesh.K, mesh.K)

# Get the preconditioner and set it up
pc = self.ksp.getPC()
pc.setType('mg')
self.ksp.setFromOptions()

# Set all levels in the PCMG (zeroth entry is the finest level)
da_list = list()
da_list.append(mesh.da_nodes)
da_list.extend(mesh.da_nodes.coarsenHierarchy(3))

# PCMG-specific options
pc.setMGLevels(mesh.mg_levels)
pc.setMGType(0)       # multiplicative
pc.setMGCycleType(1)  # v-cycle
for k in range(1, mesh.mg_levels):
    r, _ = da_list[k].createInterpolation(da_list[k-1])
    pc.setMGInterpolation(k, r)

# Avoid the defaults for the coarse grid solver
cksp = pc.getMGCoarseSolve()
cksp.setType('gmres')
cksp.setGMRESRestart(COARSE_RESTART)
cksp.setTolerances(rtol=COARSE_RTOL, atol=COARSE_ATOL, divtol=COARSE_DTOL, max_it=COARSE_MAXITS)
cpc = cksp.getPC()
cpc.setType('sor')

# Set smoothers on all levels
for k in range(1, mesh.mg_levels):
    dksp = pc.getMGSmoother(k)
    dpc = dksp.getPC()
    dksp.setType('gmres')
    dksp.setGMRESRestart(SMOOTH_SWEEPS)
    dksp.setTolerances(max_it=SMOOTH_SWEEPS)
    dpc.setType('sor')
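One thing I am unsure about: I never attach operators to the individual levels, and I suspect that is related to the error below. If Galerkin coarsening is not active, I assume each level's KSP needs its operator set explicitly, along the lines of this sketch (K_coarse is a hypothetical list of per-level matrices, coarsest first, which my code does not currently build):

```python
# Hypothetical sketch: attach an operator to every PCMG level.
# K_coarse[l] is an assumed list of system matrices, one per level,
# with l = 0 the coarsest (PCMG numbering); I do not build these yet.
pc.getMGCoarseSolve().setOperators(K_coarse[0], K_coarse[0])
for l in range(1, mesh.mg_levels):
    pc.getMGSmoother(l).setOperators(K_coarse[l], K_coarse[l])
```

Is something like this required, or is there a way to have PCMG build the coarse operators for me from petsc4py?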
*The Error:*
petsc4py.PETSc.Error: error code 73
[0] KSPSolve() line 725 in /tmp/pip-req-build-goqa79r0/src/ksp/ksp/interface/itfunc.c
[0] KSPSetUp() line 391 in /tmp/pip-req-build-goqa79r0/src/ksp/ksp/interface/itfunc.c
[0] PCSetUp() line 932 in /tmp/pip-req-build-goqa79r0/src/ksp/pc/interface/precon.c
[0] PCSetUp_MG() line 819 in /tmp/pip-req-build-goqa79r0/src/ksp/pc/impls/mg/mg.c
[0] KSPCreateVecs() line 947 in /tmp/pip-req-build-goqa79r0/src/ksp/ksp/interface/iterativ.c
[0] Object is in wrong state
[0] You requested a vector from a KSP that cannot provide one
*The KSPView():*
KSP Object: 1 MPI processes
  type: fgmres
    restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    happy breakdown tolerance 1e-30
  maximum iterations=200, nonzero initial guess
  tolerances: relative=1e-05, absolute=1e-50, divergence=1000.
  left preconditioning
  using DEFAULT norm type for convergence test
PC Object: 1 MPI processes
  type: mg
    PC has not been set up so information may be incomplete
    type is MULTIPLICATIVE, levels=4 cycles=v
      Cycles per PCApply=1
      Not using Galerkin computed coarse grid matrices
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_coarse_) 1 MPI processes
      type: gmres
        restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        happy breakdown tolerance 1e-30
      maximum iterations=30, initial guess is zero
      tolerances: relative=1e-08, absolute=1e-50, divergence=1000.
      left preconditioning
      using DEFAULT norm type for convergence test
    PC Object: (mg_coarse_) 1 MPI processes
      type: sor
      PC has not been set up so information may be incomplete
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object: (mg_levels_1_) 1 MPI processes
      type: gmres
        restart=4, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        happy breakdown tolerance 1e-30
      maximum iterations=4, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_1_) 1 MPI processes
      type: sor
      PC has not been set up so information may be incomplete
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 2 -------------------------------
    KSP Object: (mg_levels_2_) 1 MPI processes
      type: gmres
        restart=4, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        happy breakdown tolerance 1e-30
      maximum iterations=4, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_2_) 1 MPI processes
      type: sor
      PC has not been set up so information may be incomplete
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 3 -------------------------------
    KSP Object: (mg_levels_3_) 1 MPI processes
      type: gmres
        restart=4, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        happy breakdown tolerance 1e-30
      maximum iterations=4, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_3_) 1 MPI processes
      type: sor
      PC has not been set up so information may be incomplete
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=4131, cols=4131, bs=3
    total: nonzeros=275625, allocated nonzeros=275625
    total number of mallocs used during MatSetValues calls =0
      using I-node routines: found 1377 nodes, limit used is 5