[petsc-users] Overcoming slow convergence with GMRES+Hypre BoomerAMG
Barry Smith
bsmith at petsc.dev
Thu Mar 2 07:47:21 CST 2023
Have you tried MUMPS (or SuperLU_DIST) on the full-size problem with the 5,000,000 unknowns? It is at the high end of problem sizes you can do with direct solvers but is worth comparing with BoomerAMG. You likely want to use more nodes and fewer cores per node with MUMPS to be able to access more memory. If you need to solve multiple right-hand sides with the same matrix, the factorization will be reused, so the second and later solves will be much faster.
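As a minimal sketch, the options to try a parallel direct solve would be along these lines (swap mumps for superlu_dist to try SuperLU_DIST instead):

-ksp_type preonly
-pc_type lu
-pc_factor_mat_solver_type mumps

and then launch with fewer MPI ranks per node than you use now (the exact rank/node layout is machine-dependent, so treat any particular count as illustrative) so that each rank has more memory available for the factors.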
I agree with Mark: with iterative solvers you are likely to end up with PCFIELDSPLIT.
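As a rough sketch of what that could look like for your coupled Poisson/convection-diffusion system (this assumes the two fields are registered with the preconditioner via PCFieldSplitSetIS() or a DM, so they pick up the default fieldsplit_0_/fieldsplit_1_ prefixes):

-pc_type fieldsplit
-pc_fieldsplit_type multiplicative
-fieldsplit_0_ksp_type preonly
-fieldsplit_0_pc_type hypre
-fieldsplit_0_pc_hypre_type boomeramg
-fieldsplit_1_ksp_type preonly
-fieldsplit_1_pc_type hypre
-fieldsplit_1_pc_hypre_type boomeramg

That way BoomerAMG is applied to each field separately instead of to the monolithic coupled matrix, which is usually where AMG struggles.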
Barry
> On Mar 1, 2023, at 7:17 PM, Christopher, Joshua via petsc-users <petsc-users at mcs.anl.gov> wrote:
>
> Hello,
>
> I am trying to solve the leaky-dielectric model equations with PETSc using a second-order discretization scheme (with limiting to first order as needed) using the finite volume method. The leaky dielectric model is a coupled system of two equations, consisting of a Poisson equation and a convection-diffusion equation. I have tested on small problems with simple geometry (~1000 DoFs) using:
>
> -ksp_type gmres
> -pc_type hypre
> -pc_hypre_type boomeramg
>
> and I get RTOL convergence to 1.e-5 in about 4 iterations. I tested this in parallel with 2 cores, and previously I was also able to successfully solve this problem with a direct solver in serial. When I scale up to my production problem, I get significantly worse convergence. My production problem has ~3 million DoFs, more complex geometry, and is solved on ~100 cores across two nodes. The boundary conditions change a little because of the geometry, but are of the same types (e.g. only Dirichlet and Neumann). For the production case, I need 600-4000 iterations to converge. I've attached the output from the first solve, which took 658 iterations to converge, using the following output options:
>
> -ksp_view_pre
> -ksp_view
> -ksp_converged_reason
> -ksp_monitor_true_residual
> -ksp_test_null_space
>
> My matrix is non-symmetric, the condition number can be around 10e6, and the eigenvalues reported by PETSc have been real and positive (using -ksp_view_eigenvalues).
>
> I have tried using other preconditioners (superlu, mumps, gamg, mg) but hypre+boomeramg has performed the best so far. The literature seems to indicate that AMG is the best approach for solving these equations in a coupled fashion.
>
> Do you have any advice on speeding up the convergence of this system?
>
> Thank you,
> Joshua
> <petsc_gmres_boomeramg.txt>