[petsc-users] Usage of AMG as preconditioner

Matthew Knepley knepley at gmail.com
Fri Sep 28 07:18:51 CDT 2018


On Fri, Sep 28, 2018 at 8:13 AM Michael Werner <michael.werner at dlr.de>
wrote:

>
> Matthew Knepley writes:
>
> > On Fri, Sep 28, 2018 at 7:43 AM Michael Werner
> > <michael.werner at dlr.de> wrote:
> >
> >>
> >> Matthew Knepley writes:
> >>
> >> > On Fri, Sep 28, 2018 at 3:23 AM Michael Werner
> >> > <michael.werner at dlr.de> wrote:
> >> >
> >> >> Hello,
> >> >>
> >> >> I'm having trouble with getting the AMG preconditioners
> >> >> working. I
> >> >> tried all of them (gamg, ml, hypre-boomeramg), with varying
> >> >> degrees of "success":
> >> >>
> >> >> - GAMG:
> >> >> CMD options: -ksp_rtol 1e-8 -ksp_monitor_true_residual
> >> >> -ksp_max_it 20 -ksp_type fgmres -pc_type gamg
> >> >> -pc_gamg_sym_graph TRUE -pc_gamg_agg_nsmooths 1 -ksp_view
> >> >> always crashes with the following error:
> >> >> Error: error code 77
> >> >> [0] KSPSolve() line 780 in /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/ksp/interface/itfunc.c
> >> >> [0] KSPSolve_GMRES() line 233 in /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/ksp/impls/gmres/gmres.c
> >> >> [0] KSPInitialResidual() line 67 in /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/ksp/interface/itres.c
> >> >> [0] KSP_PCApply() line 281 in /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/include/petsc/private/kspimpl.h
> >> >> [0] PCApply() line 462 in /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/pc/interface/precon.c
> >> >> [0] PCApply_MG() line 377 in /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/pc/impls/mg/mg.c
> >> >> [0] PCMGMCycle_Private() line 20 in /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/pc/impls/mg/mg.c
> >> >> [0] KSPSolve() line 780 in /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/ksp/interface/itfunc.c
> >> >> [0] KSPSolve_Chebyshev() line 381 in /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/ksp/impls/cheby/cheby.c
> >> >> [0] Petsc has generated inconsistent data
> >> >> [0] Eigen estimator failed: DIVERGED_NANORINF at iteration 0
> >> >>
> >> >> When I'm using a different solver for -mg_levels_ksp_type, such
> >> >> as gmres, GAMG no longer crashes, but I don't see convergence of
> >> >> the problem (convergence history and ksp_view output are attached
> >> >> below).
> >> >>
> >> >
> >> > It uses unpreconditioned GMRES to estimate spectral bounds for
> >> > the operator before using a Chebyshev smoother.
> >> > If your matrix does not have a nice, connected, positive spectrum,
> >> > Chebyshev will not work. However, the fact that you get
> >> > DIVERGED_NANORINF in the estimator tells me that you have a
> >> > problem in the matrix.
> >> >
> >>
> >> The error above (DIVERGED_NANORINF) only appears for
> >> -mg_levels_ksp_type chebyshev. When I use GMRES
> >> (-mg_levels_ksp_type gmres) there are no errors; the KSP just
> >> never converges.
> >>
> >> >
> >> >> - Hypre
> >> >> With the default settings, BoomerAMG just returns a vector of all
> >> >> zeros after one iteration.
> >> >> When I change the relaxation type
> >> >> -pc_hypre_boomeramg_relax_type_all to Jacobi, I get similar
> >> >> results to GAMG: the solver works without errors, but doesn't
> >> >> converge. The output for Hypre is also attached below.
> >> >>
> >> >> - ML
> >> >> With default settings the result is just like BoomerAMG: a vector
> >> >> of all zeros after one iteration.
> >> >> When I change -mg_levels_ksp_type the behaviour is identical to
> >> >> GAMG.
> >> >>
> >> >> Since none of the packages worked, I'm assuming that the error
> >> >> lies with me/my code,
> >> >
> >> >
> >> > It looks like a value in the matrix might be bad.
> >> >
> >> >
> >> >> so I'll give a short overview of what I'm trying to do.
> >> >> The matrix I'm trying to precondition is the Jacobian of a flow
> >> >> field originating from an unstructured finite-volume CFD code.
> >
> >
> > Compressible or incompressible?
> >
> >   Matt
> >
>
> The matrix is generated by a compressible CFD code. The current
> test case is quasi-incompressible due to very low flow velocity
> (flow around a cylinder).
>

Ah. I do not believe multigrid is going to work for compressible flow,
at least not the straightforward kind of multigrid that we have in
PETSc. MG is designed for elliptic problems. Hyperbolic problems
require extensive changes, and it is still a research problem to get
them working. Definitely no freely available package has it working,
and we cannot test the others ourselves.

Moreover, in the incompressible limit, the compressible formulation
can become very stiff, leading to solver failure. That is why the
slightly compressible (adiabatic?) formulations exist.

  Thanks,

     Matt


> Kind regards,
> Michael
>
> >
> >> >> It has a blocked structure, as each node of the original mesh
> >> >> holds several variables (pressure, density, velocities). However,
> >> >> I'm not using a DMPlex, since I get the assembled Jacobian in
> >> >> binary format from the external CFD code.
> >> >> When I'm using a direct solver (lu) I get correct results, so the
> >> >> basic code should be fine.
> >> >
> >> >
> >> > So LU works for the same problem that gave the NANORINF in GMRES?
> >> >
> >> > I would recommend using a Manufactured Solution to check the
> >> > operation. Then you can start by feeding in the exact solution,
> >> > and see that nothing fails. Also, I would make a very small test
> >> > case, so that you can send the matrix and/or code to us to check.
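> >> >
> >> > As a minimal sketch of that check (hypothetical names: A is the
> >> > assembled Jacobian, xexact the known exact solution, b the
> >> > right-hand side):
> >> >
> >> >   Vec            r;
> >> >   PetscReal      norm;
> >> >   PetscErrorCode ierr;
> >> >
> >> >   ierr = VecDuplicate(b, &r);CHKERRQ(ierr);
> >> >   ierr = MatMult(A, xexact, r);CHKERRQ(ierr);   /* r = A*x_exact     */
> >> >   ierr = VecAXPY(r, -1.0, b);CHKERRQ(ierr);     /* r = A*x_exact - b */
> >> >   ierr = VecNorm(r, NORM_2, &norm);CHKERRQ(ierr);
> >> >   ierr = PetscPrintf(PETSC_COMM_WORLD, "||A x - b|| = %g\n", (double)norm);CHKERRQ(ierr);
> >> >   ierr = VecDestroy(&r);CHKERRQ(ierr);
> >> >
> >> > If that residual is already large or NaN, the matrix or right-hand
> >> > side is suspect, independent of any preconditioner.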
> >> >
> >>
> >> Yes, LU works for the same matrix/problem. I didn't change
> >> anything besides using -pc_type lu instead of -pc_type gamg.
> >> What do you mean by "Manufactured Solution"?
> >> Concerning the test case: I'll set one up and send you the matrix
> >> in binary format.
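> >> In case the exact calls matter, I would dump it roughly like this
> >> (a sketch; J is the assembled Jacobian, the file name is arbitrary):
> >>
> >>   PetscViewer    viewer;
> >>   PetscErrorCode ierr;
> >>
> >>   ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "jacobian.bin",
> >>                                FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
> >>   ierr = MatView(J, viewer);CHKERRQ(ierr);
> >>   ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
> >>
> >> which can then be read back with MatLoad() from a viewer opened
> >> with FILE_MODE_READ.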
> >>
> >> >
> >> >> However, for larger problems LU is not feasible due to the very
> >> >> large memory requirements, therefore I wanted to switch over to
> >> >> multigrid.
> >> >>
> >> >> Currently, PETSc has no information about the original geometry.
> >> >> I tried using setCoordinates, but for Hypre and ML it didn't make
> >> >> a difference, and GAMG crashed with an error:
> >> >>
> >> >
> >> > This is only an optimization, and currently GAMG does not know
> >> > what to do for multiple fields (it handles single-field problems
> >> > by building the nullspace of the symbol for the coarse grid). You
> >> > could provide this when you want to optimize things.
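> >> >
> >> > For whenever you do want to provide it, a rough sketch (assuming
> >> > the Jacobian J interlaces the 4 unknowns per node and ownership
> >> > ranges fall on node boundaries; one constant vector per field,
> >> > purely to illustrate the MatSetNearNullSpace() API):
> >> >
> >> >   Vec            vecs[4];
> >> >   MatNullSpace   nns;
> >> >   PetscInt       f, row, rstart, rend;
> >> >   PetscErrorCode ierr;
> >> >
> >> >   ierr = MatGetOwnershipRange(J, &rstart, &rend);CHKERRQ(ierr);
> >> >   for (f = 0; f < 4; ++f) {
> >> >     /* indicator vector of field f: 1 on its dofs, 0 elsewhere */
> >> >     ierr = MatCreateVecs(J, &vecs[f], NULL);CHKERRQ(ierr);
> >> >     ierr = VecSet(vecs[f], 0.0);CHKERRQ(ierr);
> >> >     for (row = rstart + f; row < rend; row += 4) {
> >> >       ierr = VecSetValue(vecs[f], row, 1.0, INSERT_VALUES);CHKERRQ(ierr);
> >> >     }
> >> >     ierr = VecAssemblyBegin(vecs[f]);CHKERRQ(ierr);
> >> >     ierr = VecAssemblyEnd(vecs[f]);CHKERRQ(ierr);
> >> >     ierr = VecNormalize(vecs[f], NULL);CHKERRQ(ierr);
> >> >   }
> >> >   ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_FALSE, 4, vecs, &nns);CHKERRQ(ierr);
> >> >   ierr = MatSetNearNullSpace(J, nns);CHKERRQ(ierr);
> >> >   ierr = MatNullSpaceDestroy(&nns);CHKERRQ(ierr);
> >> >   for (f = 0; f < 4; ++f) {ierr = VecDestroy(&vecs[f]);CHKERRQ(ierr);}
> >> >
> >> > As far as I understand, constants per field are close to what GAMG
> >> > assumes by default once the block size is set, so the payoff would
> >> > come from adding problem-specific low-energy modes, not from this
> >> > sketch as-is.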
> >> >
> >> >   Thanks,
> >> >
> >> >     Matt
> >>
> >> Ok, thanks for the clarification. In that case I won't bother with
> >> it for now. No need to optimize the code before the other problems
> >> are solved.
> >>
> >> Thank you for your answer!
> >>
> >> Kind regards,
> >> Michael
> >> >
> >> >
> >> >> [0] PCSetCoordinates() line 1883 in /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/pc/interface/precon.c
> >> >> [0] PCSetCoordinates_AGG() line 199 in /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/pc/impls/gamg/agg.c
> >> >> [0] Petsc has generated inconsistent data
> >> >> [0] Don't know how to create null space for ndm=2, ndf=4. Use
> >> >> MatSetNearNullSpace.
> >> >>
> >> >> I would be glad if you could give me some advice on how to deal
> >> >> with this.
> >> >>
> >> >> Kind regards,
> >> >> Michael
> >> >>
> >> >>
> >>
> >>
>
>
> --
> ____________________________________________________
>
> Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR)
> Institut für Aerodynamik und Strömungstechnik | Bunsenstr. 10 |
> 37073 Göttingen
>
> Michael Werner
> Phone 0551 709-2627 | Fax 0551 709-2811 |
> Michael.Werner at dlr.de
> DLR.de
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/