[petsc-users] Guidance on GAMG preconditioning

Justin Chang jychang48 at gmail.com
Thu Jun 4 08:12:18 CDT 2015


Hello everyone,

Apologies if this sounds like a newbie question. I am attempting to play
around with the GAMG preconditioner for my anisotropic diffusion problem,
but I have no idea how to "fine tune" the parameters so that I get better
performance. I understand that this depends on both the material properties
and the type of mesh used, but for starters, here is what I am doing and
some things I have noticed:

- I am solving on a unit cube with a small hole in the middle. The boundary
condition is zero on the outer surface and unity on the inner one. I have a
tensorial dispersion diffusivity (with constant velocity). I have six grids
of different sizes to solve this problem on, ranging from 36K to 1M dofs. I
was able to solve all of them using the CG/Jacobi solver and preconditioner
combination.
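
For reference, the baseline runs use nothing fancier than the following
(the executable name and process count are just placeholders):

    mpiexec -n <nprocs> ./myapp -ksp_type cg -pc_type jacobi \
        -ksp_converged_reason -log_summary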

- When I try to solve them using CG and GAMG (I did not set any other
command-line options), I seem to get slower wall-clock times but far fewer
KSP iterations. I also notice that the FLOPS/s metric is much smaller.
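
In other words, the only change from the baseline run is the
preconditioner, e.g.:

    mpiexec -n <nprocs> ./myapp -ksp_type cg -pc_type gamg \
        -ksp_converged_reason -log_summary

Adding -ksp_view shows how many levels GAMG builds and which smoothers it
uses on each level, if that information is helpful.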

- For certain meshes, my CG/GAMG solve fails to converge after 2 iterations
due to DIVERGED_INDEFINITE_PC. This does not happen when I run on one
processor or when I use the CG/Jacobi combination.
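
My only guess so far is that CG needs a symmetric positive definite
preconditioner, and that the default smoother may lose symmetry when run in
parallel. If that is the cause, would forcing a symmetric smoother on every
level, e.g.

    -mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi

be the right workaround, or am I off base here?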

From what I have read online and on the petsc-users mailing list, it sounds
like the GAMG preconditioner should give better performance for nice
elliptic problems like the one I am solving. When I looked at the SNES ex12
test case 39 from builder.py, it only had -pc_type gamg. I am guessing that
I need to set additional command-line options; if so, where should I start?
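
For what it is worth, the knobs I have found in the manual pages so far are
things like

    -pc_gamg_agg_nsmooths <n>    (smoothed vs. unsmoothed aggregation)
    -pc_gamg_threshold <tol>     (drop tolerance for the aggregation graph)
    -pc_gamg_square_graph <bool> (square the graph before aggregating)
    -pc_gamg_coarse_eq_limit <n> (size at which to stop coarsening)

but I do not know which of these, if any, matter for an anisotropic
problem, or what sensible values would be.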

Thanks,
Justin