[petsc-users] mg pre-conditioner default setup from PETSc-3.4 to PETSc-3.7
Mark Adams
mfadams at lbl.gov
Wed Sep 13 10:48:05 CDT 2017
Two iterations for the eigenvalue estimate is too low, and GMRES converges
slowly. I'm surprised this does not diverge, or just die, for a Laplacian,
because Chebyshev needs an upper bound on the spectrum. Cheby will scale the
estimate up by some safety factor (is it really large now?). Try
-mg_levels_esteig_ksp_max_it 10 (the old default). I usually use 5.
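For example, a minimal sketch of how that looks on the run line (the
executable name and the bracketed part are just placeholders; keep whatever
solver options you already pass):

  mpiexec -n 128 ./your_app <your existing options> \
      -mg_levels_esteig_ksp_max_it 10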
Also, I would suggest using CG (-mg_levels_esteig_ksp_type cg); it
converges much faster. If your problem is not very asymmetric, it is fine.
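Putting both suggestions together, the run line would look something like
this (again just a sketch with placeholder names; -ksp_view lets you confirm
the new eigenvalue estimates in output like the listing quoted below):

  mpiexec -n 128 ./your_app <your existing options> \
      -mg_levels_esteig_ksp_type cg \
      -mg_levels_esteig_ksp_max_it 10 \
      -ksp_view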
On Wed, Sep 13, 2017 at 11:35 AM, Hong <hzhang at mcs.anl.gov> wrote:
> Federico :
>
>>
>> Coarse grid solver -- level -------------------------------
>> KSP Object: (mg_levels_0_) 128 MPI processes
>> type: chebyshev
>> Chebyshev: eigenvalue estimates: min = 0.223549, max = 2.45903
>> Chebyshev: eigenvalues estimated using gmres with translations [0. 0.1; 0. 1.1]
>> KSP Object: (mg_levels_0_esteig_) 128 MPI processes
>> type: gmres
>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>> GMRES: happy breakdown tolerance 1e-30
>> maximum iterations=10, initial guess is zero
>> *tolerances: relative=1e-12*, absolute=1e-50, divergence=10000.
>> left preconditioning
>> *using PRECONDITIONED norm type for convergence test*
>> maximum iterations=2, initial guess is zero
>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
>> left preconditioning
>> using NONE norm type for convergence test
>>
>
> Chebyshev requires an estimate of the operator's eigenvalues, for which we
> use a few GMRES iterations. The default options shown above are the ones
> used for that eigenvalue estimate.
>
> Hong
>
>