[petsc-users] Overcoming slow convergence with GMRES+Hypre BoomerAMG

Christopher, Joshua jchristopher at anl.gov
Fri Mar 3 11:24:32 CST 2023


I am solving these equations in the context of electrically-driven fluid flows as that first paper describes. I am using a PIMPLE scheme to advance the fluid equations in time, and my goal is to do a coupled solve of the electric equations similar to what is described in this paper: https://www.sciencedirect.com/science/article/pii/S0045793019302427. They are using the SIMPLE scheme in this paper. My fluid flow should eventually reach steady behavior, and likewise the time derivative in the charge density should trend towards zero. They preferred using BiCGStab with a direct LU preconditioner for solving their electric equations. I tried to test that combination, but my case is halting for unknown reasons in the middle of the PETSc solve. I'll try with more nodes and see if I am running out of memory, but the computer is a little overloaded at the moment so it may take a while to run.
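
For reference, the combination I am trying to test (BiCGStab with a direct LU preconditioner) amounts to roughly the following runtime options; MUMPS here is just one choice of package, and SuperLU_DIST would be selected with -pc_factor_mat_solver_type superlu_dist:

-ksp_type bcgs
-pc_type lu
-pc_factor_mat_solver_type mumps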

I sent Pierre Jolivet my matrix and RHS, and they said the matrix does not appear to follow a parallel numbering and instead looks like it uses the natural ordering. When they renumbered the system with ParMETIS, they got very fast convergence. I am using PETSc through a library, so I will reach out to the library authors and see if there is an issue in the library.
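
In case it is useful, dumping the assembled system to binary files for this kind of inspection can be done with options along these lines (assuming the library forwards its command-line options to PETSc; the file names are arbitrary):

-ksp_view_mat binary:matrix.dat
-ksp_view_rhs binary:rhs.dat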

Thank you,
Joshua
________________________________
From: Barry Smith <bsmith at petsc.dev>
Sent: Thursday, March 2, 2023 3:47 PM
To: Christopher, Joshua <jchristopher at anl.gov>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Overcoming slow convergence with GMRES+Hypre BoomerAMG




[Untitled.png]

  Are you solving this as a time-dependent problem? Are you using an implicit scheme (like backward Euler) for rho? In ODE language, are you solving the differential-algebraic equation?

Is epsilon bounded away from 0?

On Mar 2, 2023, at 4:22 PM, Christopher, Joshua <jchristopher at anl.gov> wrote:

Hi Barry and Mark,

Thank you for looking into my problem. The two equations I am solving with PETSc are equations 6 and 7 from this paper: https://ris.utwente.nl/ws/portalfiles/portal/5676495/Roghair+Paper_final_draft_v1.pdf

I just used MUMPS and SuperLU_DIST on my full-size problem (with 3,000,000 unknowns). To clarify, I did a direct solve with -ksp_type preonly. Both take a very long time: about 30 minutes for MUMPS and 18 minutes for SuperLU_DIST (see attached output). For reference, the same matrix took 658 iterations of BoomerAMG and about 20 seconds of walltime. Maybe I am already getting a great deal with BoomerAMG!
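
For completeness, the direct solves were set up with options along the lines of the following, swapping the factorization package between the two runs:

-ksp_type preonly
-pc_type lu
-pc_factor_mat_solver_type mumps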

I'll try removing some terms from my solve (e.g. dropping the second equation, then reducing the second equation to just its elliptic portion, etc.) and also try a simpler geometry. I'll keep you updated as I run into trouble with that route. I wasn't aware of Field Split preconditioners; I'll do some reading on them and give them a try as well.

Thank you again,
Joshua
________________________________
From: Barry Smith <bsmith at petsc.dev>
Sent: Thursday, March 2, 2023 7:47 AM
To: Christopher, Joshua <jchristopher at anl.gov>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Overcoming slow convergence with GMRES+Hypre BoomerAMG


  Have you tried MUMPS (or SuperLU_DIST) on the full-size problem with the 5,000,000 unknowns? It is at the high end of problem sizes you can do with direct solvers but is worth comparing with BoomerAMG. You likely want to use more nodes and fewer cores per node with MUMPS to be able to access more memory. If you need to solve multiple right-hand sides with the same matrix, the factors will be reused, so the second and later solves will be much faster.
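
  If memory does become the constraint, the MUMPS ICNTL parameters are also reachable from the options database, for example something like the following (ICNTL(4) raises the diagnostic print level, ICNTL(14) increases the percentage added to the estimated working space; the values are only a starting point):

-mat_mumps_icntl_4 2
-mat_mumps_icntl_14 50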

  I agree with Mark, with iterative solvers you are likely to end up with PCFIELDSPLIT.
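
  A fieldsplit setup for the two coupled fields would look something like the options below, once the potential and charge-density unknowns have been registered with the preconditioner (via PCFieldSplitSetIS() or a DM); the inner solver choices are only placeholders:

-pc_type fieldsplit
-pc_fieldsplit_type multiplicative
-fieldsplit_0_pc_type hypre
-fieldsplit_0_pc_hypre_type boomeramg
-fieldsplit_1_pc_type hypre
-fieldsplit_1_pc_hypre_type boomeramg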

  Barry


On Mar 1, 2023, at 7:17 PM, Christopher, Joshua via petsc-users <petsc-users at mcs.anl.gov> wrote:

Hello,

I am trying to solve the leaky-dielectric model equations with PETSc, using a second-order finite volume discretization (with limiting to first order as needed). The leaky-dielectric model is a coupled system of two equations, consisting of a Poisson equation and a convection-diffusion equation. I have tested on small problems with simple geometry (~1000 DoFs) using:

-ksp_type gmres
-pc_type hypre
-pc_hypre_type boomeramg

and I get RTOL convergence to 1.e-5 in about 4 iterations. I tested this in parallel with 2 cores, and was previously also able to solve this problem successfully with a direct solver in serial. When I scale up to my production problem, I get significantly worse convergence. My production problem has ~3 million DoFs, more complex geometry, and is solved on ~100 cores across two nodes. The boundary conditions change a little because of the geometry, but are of the same classifications (e.g. only Dirichlet and Neumann). On the production case, I need 600-4000 iterations to converge. I've attached the output from the first solve, which took 658 iterations to converge, using the following output options:

-ksp_view_pre
-ksp_view
-ksp_converged_reason
-ksp_monitor_true_residual
-ksp_test_null_space

My matrix is non-symmetric, the condition number can be around 10^6, and the eigenvalues reported by PETSc have been real and positive (using -ksp_view_eigenvalues).

I have tried using other preconditioners (superlu, mumps, gamg, mg), but hypre+boomeramg has performed the best so far. The literature seems to indicate that AMG is the best approach for solving these equations in a coupled fashion.
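
Each of the alternatives mentioned above corresponds to a runtime option along the lines of the following, tried one at a time (the direct packages serve as the factorization backend behind -pc_type lu):

-pc_type gamg
-pc_type mg
-pc_type lu -pc_factor_mat_solver_type superlu_dist
-pc_type lu -pc_factor_mat_solver_type mumps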

Do you have any advice on speeding up the convergence of this system?

Thank you,
Joshua
<petsc_gmres_boomeramg.txt>

<petsc_preonly_mumps.txt><petsc_preonly_superlu.txt>
