[petsc-users] Overcoming slow convergence with GMRES+Hypre BoomerAMG

Christopher, Joshua jchristopher at anl.gov
Thu Mar 2 15:22:38 CST 2023

Hi Barry and Mark,

Thank you for looking into my problem. The two equations I am solving with PETSc are equations 6 and 7 from this paper: https://ris.utwente.nl/ws/portalfiles/portal/5676495/Roghair+Paper_final_draft_v1.pdf

I just tried MUMPS and SuperLU_DIST on my full-size problem (3,000,000 unknowns). To clarify, I did a direct solve with -ksp_type preonly. They take a very long time: about 30 minutes for MUMPS and 18 minutes for SuperLU_DIST (see attached output). For reference, the same matrix took 658 iterations of BoomerAMG and about 20 seconds of walltime. Maybe I am already getting a great deal with BoomerAMG!

I'll try removing some terms from my solve (e.g. dropping the second equation, then reducing the second equation to just its elliptic portion, etc.) and also try a simpler geometry. I'll keep you updated as I run into trouble along that route. I wasn't aware of field-split preconditioners; I'll do some reading on them and give them a try as well.

Thank you again,
From: Barry Smith <bsmith at petsc.dev>
Sent: Thursday, March 2, 2023 7:47 AM
To: Christopher, Joshua <jchristopher at anl.gov>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Overcoming slow convergence with GMRES+Hypre BoomerAMG

  Have you tried MUMPS (or SuperLU_DIST) on the full-size problem with the 3,000,000 unknowns? That is at the high end of problem sizes you can handle with direct solvers, but it is worth comparing with BoomerAMG. You likely want to use more nodes and fewer cores per node with MUMPS to be able to access more memory. If you need to solve multiple right-hand sides with the same matrix, the factors will be reused, making the second and later solves much faster.
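
  For concreteness, a minimal option set for such a direct solve might look like the following (assuming a general unsymmetric matrix; swap mumps for superlu_dist to compare the two packages):

-ksp_type preonly
-pc_type lu
-pc_factor_mat_solver_type mumps
-ksp_error_if_not_converged

  The last option makes PETSc stop immediately if the factorization fails (e.g. runs out of memory), rather than silently returning a bad solution.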

  I agree with Mark, with iterative solvers you are likely to end up with PCFIELDSPLIT.
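
  As a sketch only, a field-split configuration for a coupled Poisson / convection-diffusion system might start from options like these (this assumes the two fields have been registered with PCFieldSplitSetIS() or come from a DM; the sub-preconditioner choices here are illustrative, not a recommendation):

-ksp_type fgmres
-pc_type fieldsplit
-pc_fieldsplit_type multiplicative
-fieldsplit_0_pc_type hypre
-fieldsplit_1_pc_type ilu

  Here split 0 would be the elliptic (Poisson) block, where AMG typically works well, and split 1 the convection-diffusion block; FGMRES is used in case the inner solves make the preconditioner variable.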


On Mar 1, 2023, at 7:17 PM, Christopher, Joshua via petsc-users <petsc-users at mcs.anl.gov> wrote:


I am trying to solve the leaky-dielectric model equations with PETSc using a second-order discretization scheme (with limiting to first order as needed) using the finite volume method. The leaky dielectric model is a coupled system of two equations, consisting of a Poisson equation and a convection-diffusion equation.  I have tested on small problems with simple geometry (~1000 DoFs) using:

-ksp_type gmres
-pc_type hypre
-pc_hypre_type boomeramg

and I get convergence to an rtol of 1.e-5 in about 4 iterations. I tested this in parallel with 2 cores, and previously I was also able to successfully use a direct solver in serial on this problem. When I scale up to my production problem, I get significantly worse convergence. My production problem has ~3 million DoFs, more complex geometry, and is solved on ~100 cores across two nodes. The boundary conditions change a little because of the geometry, but are of the same classifications (e.g. only Dirichlet and Neumann). On the production case, I need 600-4000 iterations to converge. I've attached the output from the first solve, which took 658 iterations to converge, using the following output options:


My matrix is non-symmetric, its condition number can be around 10^6, and the eigenvalues reported by PETSc (using -ksp_view_eigenvalues) have been real and positive.

I have tried other preconditioners (superlu, mumps, gamg, mg), but hypre+boomeramg has performed the best so far. The literature seems to indicate that AMG is the best approach for solving these equations in a coupled fashion.

Do you have any advice on speeding up the convergence of this system?

Thank you,

-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: petsc_preonly_mumps.txt
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20230302/75f03a69/attachment-0002.txt>
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: petsc_preonly_superlu.txt
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20230302/75f03a69/attachment-0003.txt>
