[petsc-users] [petsc-maint] Monolithic AMG with fieldsplit as smoother

Mark Adams mfadams at lbl.gov
Thu Jul 27 08:50:19 CDT 2023


I would not worry about the null space (if you have elasticity or the
equivalent, use hypre for now), and the strength of connections is not very
useful in my experience (it is confounded by high order, and no one I know
of has bothered to deploy a fancy strength-of-connections method in a
library). If you have anisotropies or material discontinuities, honestly,
AMG does not do as well as advertised. That said, we could talk after you
get up and running.
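
If you later do want GAMG on the elasticity-like part, the usual step is
to hand it the rigid body modes as the near-null space. A minimal sketch
(assuming you already have a Vec "coords" of nodal coordinates; the
variable names here are illustrative, not from your code):

  MatNullSpace nullsp;
  MatNullSpaceCreateRigidBody(coords, &nullsp); /* rigid body modes from coordinates */
  MatSetNearNullSpace(A, nullsp);               /* A is your assembled operator */
  MatNullSpaceDestroy(&nullsp);

Or, from the command line, just -pc_type hypre -pc_hypre_type boomeramg
for now, as above.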

If your problems are very hard, then, as Matt said, old-fashioned geometric
MG using modern unstructured (FE) discretizations and mesh management is
something to consider. PETSc has support for this, and we are actively
using and developing that support. Antony Jameson has been doing this for
decades; here is an example of a new project doing something similar:
https://arxiv.org/abs/2307.04528. Tobin Isaac, in PETSc, and many others
have done things like this, but they tend to be customized for an
application, whereas AMG strives to be general.
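
For what it is worth, if your code builds on a DM, plain geometric
multigrid can be driven entirely from the options database. A rough
sketch (assuming a DMDA or DMPlex discretization; adjust the refinement
to taste):

  -dm_refine 3 -pc_type mg -mg_levels_ksp_type chebyshev
  -mg_levels_pc_type jacobi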

Mark

On Thu, Jul 27, 2023 at 1:10 AM Matthew Knepley <knepley at gmail.com> wrote:

> On Thu, Jul 27, 2023 at 12:48 AM Jed Brown <jed at jedbrown.org> wrote:
>
>> AMG is subtle here. With AMG for systems, you typically feed it elements
>> of the near null space. In the case of (smoothed) aggregation, the coarse
>> space will have a regular block structure with block sizes equal to the
>> number of near-null vectors. You can use pc_fieldsplit options to select
>> which fields you want in each split.
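>>
>> For example (illustrative options only, assuming a blocked matrix where
>> fields 0-2 are the displacement components and field 3 is pressure):
>>
>>   -mg_levels_pc_type fieldsplit
>>   -mg_levels_pc_fieldsplit_block_size 4
>>   -mg_levels_pc_fieldsplit_0_fields 0,1,2
>>   -mg_levels_pc_fieldsplit_1_fields 3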
>>
>> However, AMG also needs a strength-of-connection measure, and if your
>> system is so weird that you need to fieldsplit the smoothers (e.g., a
>> saddle point problem or a hyperbolic system), then it's likely that
>> you'll also need a custom strength of connection to obtain reasonable
>> coarsening.
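>>
>> (In GAMG, the basic knob for this is the strength-of-connection drop
>> tolerance, e.g. -pc_gamg_threshold 0.05, but that is a blunt instrument
>> for systems like these.)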
>>
>
> For this reason, sometimes GMG is easier for systems since you just
> rediscretize.
>
>   Thanks,
>
>      Matt
>
>
>> Barry Smith <bsmith at petsc.dev> writes:
>>
>> >    See the very end of the section
>> > https://petsc.org/release/manual/ksp/#multigrid-preconditioners on how
>> > to control the smoothers (and the coarse grid solve) for multigrid in
>> > PETSc, including for algebraic multigrid.
>> >
>> >    So, for example, -mg_levels_pc_type fieldsplit would be the starting
>> > point. Depending on the block size of the matrices, it may automatically
>> > do simple splits; you can control the details of the fieldsplit
>> > preconditioner with -mg_levels_pc_fieldsplit_... and the details for each
>> > split with -mg_levels_fieldsplit_....
>> >
>> >    See src/ksp/ksp/tutorials/ex42.c for example usage.
>> >
>> >    Feel free to ask more specific questions once you get started.
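>> >
>> >    As a concrete illustration (a sketch only; the right split options
>> > depend on your block size and fields), the pieces combine like
>> >
>> >      -pc_type gamg
>> >      -mg_levels_ksp_type richardson
>> >      -mg_levels_pc_type fieldsplit
>> >      -mg_levels_pc_fieldsplit_type additive
>> >      -mg_levels_fieldsplit_0_pc_type sor
>> >      -mg_levels_fieldsplit_1_pc_type jacobi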
>> >
>> >> On Jul 26, 2023, at 9:47 PM, Michael Wick <michael.wick.1980 at gmail.com>
>> wrote:
>> >>
>> >> Hello PETSc team:
>> >>
>> >> I wonder if the current PETSc implementation supports using AMG
>> monolithically for a multi-field problem and using fieldsplit in the
>> smoother.
>> >>
>> >> Thank you very much,
>> >>
>> >> Mike
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>