On Fri, Sep 28, 2018 at 7:43 AM Michael Werner <michael.werner@dlr.de> wrote:

Matthew Knepley writes:

> On Fri, Sep 28, 2018 at 3:23 AM Michael Werner
> <michael.werner@dlr.de> wrote:
>
>> Hello,
>>
>> I'm having trouble getting the AMG preconditioners working. I
>> tried all of them (gamg, ml, hypre-boomeramg), with varying
>> degrees of "success":
>>
>> - GAMG:
>> CMD options: -ksp_rtol 1e-8 -ksp_monitor_true_residual -ksp_max_it 20
>> -ksp_type fgmres -pc_type gamg -pc_gamg_sym_graph TRUE
>> -pc_gamg_agg_nsmooths 1 -ksp_view
>> always crashes with the following error:
>> Error: error code 77
>> [0] KSPSolve() line 780 in
>> /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/ksp/interface/itfunc.c
>> [0] KSPSolve_GMRES() line 233 in
>> /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/ksp/impls/gmres/gmres.c
>> [0] KSPInitialResidual() line 67 in
>> /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/ksp/interface/itres.c
>> [0] KSP_PCApply() line 281 in
>> /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/include/petsc/private/kspimpl.h
>> [0] PCApply() line 462 in
>> /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/pc/interface/precon.c
>> [0] PCApply_MG() line 377 in
>> /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/pc/impls/mg/mg.c
>> [0] PCMGMCycle_Private() line 20 in
>> /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/pc/impls/mg/mg.c
>> [0] KSPSolve() line 780 in
>> /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/ksp/interface/itfunc.c
>> [0] KSPSolve_Chebyshev() line 381 in
>> /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/ksp/impls/cheby/cheby.c
>> [0] Petsc has generated inconsistent data
>> [0] Eigen estimator failed: DIVERGED_NANORINF at iteration 0
>>
>> When I'm using a different solver for -mg_levels_ksp_type, such as
>> gmres, GAMG no longer crashes, but I don't see convergence of the
>> problem (convergence history and ksp_view output are attached
>> below).
>>
>
> It uses unpreconditioned GMRES to estimate spectral bounds for the
> operator before using a Chebyshev smoother. If your matrix does not
> have a nice, connected, positive spectrum, Chebyshev will not work.
> However, the fact that you get DIVERGED_NANORINF in the estimator
> tells me that you have a problem in the matrix.
>

The error above (DIVERGED_NANORINF) only appears for
-mg_levels_ksp_type chebyshev. When I use GMRES
(-mg_levels_ksp_type gmres) there are no errors; the KSP just
never converges.
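
If it helps, I could also check the assembled matrix directly for bad
entries with something like the following (a rough, untested sketch in
C; "A" stands for the loaded Jacobian, inside a routine where
PetscInitialize has already been called):

  Vec            x, y;
  PetscReal      nrm;
  PetscErrorCode ierr;

  ierr = MatCreateVecs(A, &x, &y);CHKERRQ(ierr);
  ierr = VecSetRandom(x, NULL);CHKERRQ(ierr);            /* random input vector */
  ierr = MatMult(A, x, y);CHKERRQ(ierr);                 /* y = A*x */
  ierr = VecNorm(y, NORM_INFINITY, &nrm);CHKERRQ(ierr);
  if (PetscIsInfOrNanReal(nrm)) {
    ierr = PetscPrintf(PETSC_COMM_WORLD, "A*x contains Inf or NaN\n");CHKERRQ(ierr);
  }
  ierr = MatNorm(A, NORM_FROBENIUS, &nrm);CHKERRQ(ierr); /* NaN here also points to a bad entry */
  ierr = PetscPrintf(PETSC_COMM_WORLD, "||A||_F = %g\n", (double)nrm);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&y);CHKERRQ(ierr);

If one of those norms comes back as Inf/NaN, the problem would be in
the assembled values rather than in the preconditioner.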

>
>> - Hypre
>> With the default settings, BoomerAMG just returns a vector of all
>> zeros after one iteration. When I change the relaxation type
>> -pc_hypre_boomeramg_relax_type_all to Jacobi, I get similar
>> results to those with GAMG: the solver works without errors, but
>> doesn't converge. The output for Hypre is also attached below.
>>
>> - ML
>> With default settings the result is just like BoomerAMG: a vector
>> of all zeros after one iteration. When I change -mg_levels_ksp_type
>> the behaviour is identical to GAMG.
>>
>>
>> Since none of the packages worked, I'm assuming that the error
>> lies with me/my code,
>
>
> It looks like a value in the matrix might be bad.
>
>
>> so I'll give a short overview of what I'm trying to do.
>> The matrix I'm trying to precondition is the Jacobian of a flow
>> field originating from an unstructured finite-volume CFD code.

Compressible or incompressible?

  Matt

>> It has a blocked structure as each node of the original mesh holds
>> several variables (pressure, density, velocities). However, I'm
>> not using a DMPlex since I get the assembled Jacobian in binary
>> format from the external CFD code.
>> When I'm using direct (lu) I get correct results, so the basic
>> code should be fine.
>
>
> So LU works for the same problem that gave the NANORINF in GMRES?
>
> I would recommend using a Manufactured Solution to check the
> operation. Then you can start by feeding in the exact solution, and
> see that nothing fails. Also, I would make a very small test case,
> so that you can send the matrix and/or code to us to check.
>

Yes, LU works for the same matrix/problem. I didn't change
anything besides using -pc_type lu instead of -pc_type gamg.
What do you mean by "Manufactured Solution"?
Concerning the test case: I'll set one up and send you the matrix
in binary format.
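I assume dumping it and reading it back will look roughly like this
(an untested sketch; the file name and the variable names A and B are
placeholders, and the block size of 4 is only a guess based on the
ndf=4 in the GAMG error further below):

  Mat            B;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  /* write the assembled Jacobian A to a PETSc binary file */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "jacobian.bin", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  ierr = MatView(A, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* read it back in a small test driver */
  ierr = MatCreate(PETSC_COMM_WORLD, &B);CHKERRQ(ierr);
  ierr = MatSetFromOptions(B);CHKERRQ(ierr);
  ierr = MatSetBlockSize(B, 4);CHKERRQ(ierr);            /* 4 unknowns per node, only a guess */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "jacobian.bin", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = MatLoad(B, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);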

>
>> However, for larger problems lu is not feasible due to the very
>> large memory requirements, therefore I wanted to switch over to
>> multigrid.
>>
>> Currently, PETSc has no information about the original geometry. I
>> tried using setCoordinates, but for Hypre and ml it didn't make a
>> difference, and GAMG crashed with an error:
>>
>
> This is only an optimization, and currently GAMG does not know what
> to do for multiple fields (it handles single field problems by
> building the nullspace of the symbol for the coarse grid). You could
> provide this when you want to optimize things.
>
> Thanks,
>
> Matt

Ok, thanks for the clarification. In that case I won't bother with
it for now. No need to optimize the code before the other problems
are solved.
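
(For later reference, I assume the MatSetNearNullSpace call mentioned
in the error further below would look roughly like this untested
sketch; the constant-vector space is only a placeholder, since which
near-kernel vectors actually make sense for a 4-dof-per-node flow
Jacobian is exactly what I don't know yet:)

  MatNullSpace   nearnull;
  PetscErrorCode ierr;

  /* placeholder: a near null space containing only the constant vector;
     real near-kernel vectors for the flow Jacobian would be passed
     in place of (0, NULL) */
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nearnull);CHKERRQ(ierr);
  ierr = MatSetNearNullSpace(A, nearnull);CHKERRQ(ierr);  /* A is the assembled Jacobian */
  ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);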

Thank you for your answer!

Kind regards,
Michael
>
>
>> [0] PCSetCoordinates() line 1883 in
>> /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/pc/interface/precon.c
>> [0] PCSetCoordinates_AGG() line 199 in
>> /home/yoda/wern_mc/Programme/Anaconda/envs/GSA_27/weitere_programme/petsc-3.10.0/src/ksp/pc/impls/gamg/agg.c
>> [0] Petsc has generated inconsistent data
>> [0] Don't know how to create null space for ndm=2, ndf=4. Use
>> MatSetNearNullSpace.
>>
>> I would be glad if you could give me some advice on how to deal
>> with this.
>>
>> Kind regards,
>> Michael
>>
>>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/