[petsc-users] Solver/Preconditioner suggestions
Matthew Knepley
knepley at gmail.com
Thu May 19 13:02:57 CDT 2022
On Thu, May 19, 2022 at 8:00 AM Alfredo J Duarte Gomez <aduarteg at utexas.edu>
wrote:
> Hello Matthew,
>
> Thank you for your suggestion on the Laplace solver.
>
> About the other systems, I would say B and C are mostly dominated by
> advection, except for diffusive layers at the boundaries.
>
> A is a bit more difficult to judge: it can be expected to have a very
> large advective component, but it also has a large reactive component and
> the diffusion-dominated layers at the boundaries.
>
Is your mesh resolving the boundary layers?
Thanks,
Matt
> Thank you,
>
> -Alfredo
>
> On Thu, May 19, 2022 at 12:31 PM Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> On Thu, May 19, 2022 at 7:27 AM Alfredo J Duarte Gomez <
>> aduarteg at utexas.edu> wrote:
>>
>>> Good afternoon PETSc users,
>>>
>>> I am looking for some suggestions on preconditioners/solvers.
>>>
>>> Currently, I have a custom preconditioner that solves four independent
>>> systems; let's call them A, B, C, and D.
>>>
>>> A is an advective-diffusive-reactive system; due to some of its
>>> coefficients it is the system with the highest condition number and
>>> therefore the most difficult to solve.
>>>
>>> B and C are more "standard" advective-diffusive-reactive systems. Their
>>> condition numbers are not as high as A's.
>>>
>>> D is simply the Laplacian, used to solve an elliptic Poisson equation.
>>>
>>> For more context, A, B, and C need to be recomputed about once per time
>>> step, while D is a one-time cost.
>>>
>>> The problem is 2-D on a structured grid, with sizes on the order of 1-10
>>> million grid points. These runs usually use somewhere between 100 and 400
>>> processors.
>>>
>>> Currently I am solving A, B, and C with the HYPRE Euclid ILU(1)
>>> algorithm, and D with the direct solver MUMPS.
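>>>
>>> For reference, a rough sketch of this setup as runtime options (assuming
>>> the solvers are driven through PETSc's hypre and MUMPS interfaces; the
>>> Krylov choices here are illustrative):
>>>
>>>   # A, B, and C: hypre Euclid with one level of fill
>>>   -ksp_type gmres -pc_type hypre -pc_hypre_type euclid -pc_hypre_euclid_level 1
>>>   # D: direct solve with MUMPS
>>>   -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps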
>>>
>>> While these were very useful to get the code working, I am now trying to
>>> get better parallel scaling, efficiency, and performance. HYPRE Euclid
>>> does not seem to scale well beyond about 60 processors, and MUMPS has
>>> very large memory requirements.
>>>
>>> Does anyone have suggestions on more scalable ILU algorithms for A, B,
>>> and C, or any other good alternatives?
>>>
>>
>> Are these advectively dominated?
>>
>>
>>> From what I have read, multigrid methods are probably the best
>>> alternative for D, but I have very little experience with these and they
>>> seem to require a lot of parameters. Does anyone have pointers on a good
>>> setup for a multigrid preconditioner?
>>>
>>
>> This is the easy one. If you are using DMDA, just turn on -pc_type mg and
>> give a number of levels, and it should be fine. If not, then use GAMG and
>> it should be fine. You could also use hypre BoomerAMG for this, since it
>> is optimized for the 2D Laplacian.
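>>
>> A minimal sketch of those three choices as runtime options (the level
>> count below is illustrative; tune it to your grid):
>>
>>   # geometric multigrid on a DMDA
>>   -pc_type mg -pc_mg_levels 5
>>   # algebraic multigrid, no DMDA needed
>>   -pc_type gamg
>>   # hypre BoomerAMG
>>   -pc_type hypre -pc_hypre_type boomeramg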
>>
>> Thanks,
>>
>> Matt
>>
>>
>>> Thank you and have a good day,
>>>
>>> -Alfredo
>>>
>>> --
>>> Alfredo Duarte
>>> Graduate Research Assistant
>>> The University of Texas at Austin
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>
>
> --
> Alfredo Duarte
> Graduate Research Assistant
> The University of Texas at Austin
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/