[petsc-users] GAMG preconditioning
Mark Adams
mfadams at lbl.gov
Mon Apr 12 11:46:54 CDT 2021
Can you briefly describe your application?
AMG usually only works well for straightforward elliptic problems, at least
right out of the box.
On Mon, Apr 12, 2021 at 11:35 AM Milan Pelletier via petsc-users <
petsc-users at mcs.anl.gov> wrote:
> Dear all,
>
> I am currently trying to use PETSc with the CG solver and the GAMG preconditioner.
> I have started with the following set of parameters:
> -ksp_type cg
> -pc_type gamg
> -pc_gamg_agg_nsmooths 1
> -pc_gamg_threshold 0.02
> -mg_levels_ksp_type chebyshev
> -mg_levels_pc_type sor
> -mg_levels_ksp_max_it 2
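>
> For reference, here is a minimal sketch of the driver side that would pick
> these options up at runtime (the helper name SolveWithOptions is just for
> illustration, and A, b, x are assumed to be created and assembled elsewhere):
>
>   #include <petscksp.h>
>
>   /* Sketch: the -ksp_type, -pc_type, -pc_gamg_xxx and -mg_levels_xxx flags
>      above are read from the options database by KSPSetFromOptions(). */
>   PetscErrorCode SolveWithOptions(Mat A, Vec b, Vec x)
>   {
>     KSP            ksp;
>     PetscErrorCode ierr;
>     ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
>     ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
>     ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* applies the command-line options */
>     ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
>     ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
>     return 0;
>   }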
>
> Unfortunately, the preconditioning seems to run extremely slowly. I tried
> playing around with the numbers to check whether I could notice any
> difference, but could not observe significant changes.
> As a comparison, the KSPSetUp call with the GAMG PC takes more than 10 times
> longer than completing the whole computation (preconditioning + ~400 KSP
> iterations to convergence) of a similar case using the following
> parameters:
> -ksp_type cg
> -pc_type ilu
> -pc_factor_levels 0
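>
> A sketch of how the setup and solve phases can be timed separately for such
> a comparison (using ksp, b, x as in the sketch above; PetscTime is just one
> possible choice, and -log_view gives a more detailed breakdown):
>
>   PetscLogDouble t0, t1, t2;
>   PetscErrorCode ierr;
>   ierr = PetscTime(&t0);CHKERRQ(ierr);
>   ierr = KSPSetUp(ksp);CHKERRQ(ierr);       /* PC setup: GAMG hierarchy or ILU factors */
>   ierr = PetscTime(&t1);CHKERRQ(ierr);
>   ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr); /* Krylov iterations */
>   ierr = PetscTime(&t2);CHKERRQ(ierr);
>   ierr = PetscPrintf(PETSC_COMM_WORLD, "setup %g s, solve %g s\n", t1 - t0, t2 - t1);CHKERRQ(ierr);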
>
> The matrix size for my case is ~1,850,000 x 1,850,000, with
> ~38,000,000 non-zero entries (i.e. ~20 per row). For both the ILU and AMG
> cases I use matseqaij/vecseq storage (as a first step I work with only 1 MPI
> process).
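>
> A sketch of what creating a SEQAIJ matrix of this size with ~20 nonzeros per
> row might look like (uniform preallocation here for simplicity; exact
> per-row counts can be passed through the nnz[] argument instead):
>
>   Mat            A;
>   PetscInt       n = 1850000;  /* ~1,850,000 unknowns */
>   PetscErrorCode ierr;
>   ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 20, NULL, &A);CHKERRQ(ierr);
>   /* ... fill with MatSetValues(), then MatAssemblyBegin()/MatAssemblyEnd() ... */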
>
> Is there something wrong with the parameter set I have been using?
> I understand that the preconditioning overhead with AMG is higher than
> with ILU, but I would also expect CG/GAMG to be competitive against CG/ILU,
> especially considering the relatively large problem size.
>
> For information, I am using the PETSc version built from commit
> 6840fe907c1a3d26068082d180636158471d79a2 (release branch from April 7,
> 2021).
>
> Any clue or idea would be greatly appreciated!
> Thanks for your help,
>
> Best regards,
> Milan Pelletier