[petsc-users] Guidance on GAMG preconditioning

Justin Chang jychang48 at gmail.com
Thu Jun 4 10:31:44 CDT 2015


Yeah, I saw his recommendation and am trying it out, but I am not sure what
most of those parameters mean. For instance:

1) What does -pc_gamg_agg_nsmooths refer to?

2) Does increasing the -pc_gamg_threshold value make the coarsening faster?

3) What does -mg_levels_ksp_max_it refer to?

I am not too worried about "bad" solutions from anisotropy (because it is
driven by the velocity field, not the actual material properties or mesh
orientation), and we have a work-around for it (via optimization). I am more
concerned about the time to solution, especially for bigger problems.

Thanks,
Justin

On Thu, Jun 4, 2015 at 9:04 AM, Matthew Knepley <knepley at gmail.com> wrote:

> On Thu, Jun 4, 2015 at 8:12 AM, Justin Chang <jychang48 at gmail.com> wrote:
>
>> Hello everyone,
>>
>> Apologies if this sounds like a newbie question, but I am attempting to
>> play around with the GAMG preconditioner for my anisotropic diffusion
>> problem, and I have no idea how to "fine-tune" the parameters so that I get
>> better performance. I understand that this depends on both the material
>> properties and the type of mesh used, but for starters, here is what I am
>> doing and some things I have noticed:
>>
>> - I am solving on a unit cube with a small hole in the middle. The outer
>> boundary condition is 0 and the inner one is unity. I have a tensorial
>> dispersion diffusivity (with constant velocity). I have 6 different-sized
>> grids for this problem, ranging from 36K to 1M dofs. I was able to solve
>> all of them using the CG solver with the Jacobi preconditioner.
>>
>> - When I solve them using CG and GAMG (with no other command-line options
>> set), I get slower wall-clock times but far fewer KSP iterations. I also
>> notice that the FLOPS/s metric is much smaller.
>>
>> - For certain meshes, my CG/GAMG solve fails after 2 iterations with
>> DIVERGED_INDEFINITE_PC. This does not happen when I run on one processor
>> or with CG/Jacobi.
>>
>> From what I have read online and in the petsc-users archives, it sounds
>> like the GAMG preconditioner should perform well on nice elliptic problems
>> like the one I am solving. When I looked at SNES ex12 test case 39 from
>> builder.py, it only had -pc_type gamg, so I am guessing I need to set
>> additional command-line options. If so, where should I start?
>>
>
> Mark recommends starting with
>
> -ksp_type cg
> -pc_type gamg
> -pc_gamg_agg_nsmooths 1
> -pc_gamg_threshold 0.02  #  [0 - 0.1]
> #-mg_levels_ksp_type richardson
> -mg_levels_ksp_type chebyshev
> -mg_levels_pc_type sor
> #-mg_levels_pc_type jacobi
> -mg_levels_ksp_max_it 2   # [1-8]
>
> and messing with the # stuff.
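>
> For example, these can all go on one command line (a sketch; ./myapp here
> is a stand-in for your own executable):
>
>   ./myapp -ksp_type cg -pc_type gamg -pc_gamg_agg_nsmooths 1 \
>     -pc_gamg_threshold 0.02 -mg_levels_ksp_type chebyshev \
>     -mg_levels_pc_type sor -mg_levels_ksp_max_it 2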
>
> 1) GAMG should definitely be more scalable than CG/Jacobi. You should be
> able to see this by plotting the time and iteration count as a function of
> problem size: GAMG should take a roughly constant number of iterations,
> whereas CG's count will grow. GAMG has higher overhead, so for smaller
> problem sizes CG can be better.
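>
> A quick way to collect that data (a sketch; it assumes a DM-based code that
> understands -dm_refine, and ./myapp is again a stand-in):
>
>   for r in 0 1 2 3 4 5; do
>     ./myapp -dm_refine $r -ksp_type cg -pc_type gamg \
>       -ksp_converged_reason -log_summary
>   done
>
> -ksp_converged_reason prints the iteration count at convergence, and
> -log_summary reports where the solve time goes.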
>
> 2) The DIVERGED_INDEFINITE_PC failure can happen if the GAMG preconditioner
> is nonsymmetric, e.g. due to a varying number of smoother iterations or the
> subsolver choice. Try richardson instead of chebyshev.
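>
> That is, swap in the commented-out alternative from the options above:
>
> -mg_levels_ksp_type richardson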
>
> 3) If your coefficient is highly anisotropic, GAMG may need help making
> good coarse basis vectors; Mark has some experimental stuff for this. The
> threshold parameter determines how fast you coarsen (you can see the coarse
> problem sizes in -ksp_view). If coarsening is too fast, convergence
> deteriorates; too slow, and the coarse problems become expensive.
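>
> For example, sweep a few values across the [0 - 0.1] range suggested above
> and compare the per-level problem sizes that -ksp_view reports:
>
> -pc_gamg_threshold 0.0  -ksp_view
> -pc_gamg_threshold 0.02 -ksp_view
> -pc_gamg_threshold 0.08 -ksp_view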
>
>     Mark, how do you turn on heavy-edge matching?
>
>   Thanks,
>
>      Matt
>
>
>> Thanks,
>> Justin
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>