[petsc-dev] ASM for each field solve on GPUs

Mark Adams mfadams at lbl.gov
Wed Dec 30 18:26:17 CST 2020


On Wed, Dec 30, 2020 at 6:47 PM Jed Brown <jed at jedbrown.org> wrote:

> Mark Adams <mfadams at lbl.gov> writes:
>
> > I see that ASM has a DM and can get subdomains from it. I have a DMForest
> > and I would like an ASM that has a subdomain for each field. How might I
> go
> > about doing this? (the fields are not coupled in the matrix so this would
> > give a block diagonal matrix, and thus exact with LU sub-solvers.)
>
> The fields are already not coupled or you want to filter the matrix and
> give back a single matrix with coupling removed?
>

They are not coupled. It is a block diagonal matrix, but it is not stored
that way by Plex.
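
Concretely, something like this is what I have in mind for the ASM route (a
rough, untested sketch; it assumes dm is the DMForest/Plex and pc is an ASM
PC that has already been created):

  /* Build one index set per field from the DM and use those as the
     non-overlapping ASM subdomains (one LU sub-solve per field). */
  PetscInt        nf;
  IS             *fields;
  PetscErrorCode  ierr;

  ierr = DMCreateFieldIS(dm, &nf, NULL, &fields);CHKERRQ(ierr);
  ierr = PCASMSetLocalSubdomains(pc, nf, fields, fields);CHKERRQ(ierr);
  ierr = PCASMSetOverlap(pc, 0);CHKERRQ(ierr); /* fields are uncoupled, so no overlap */

(PCFieldSplitSetIS() could be fed the same index sets for the FieldSplit route.)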


>
> You can use Fieldsplit to get the math of field-based block Jacobi (or
> ASM, but overlap with fields tends to be expensive). Neither FieldSplit nor
> ASM can run the (additive) solves concurrently (and most libraries would
> need something to drive the threads).
>

No overlap, MPI serial. Very simple. I just want a direct solver and I want
to exploit the 10-way parallelism that is just sitting there.
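
Something like this, I take it, for the FieldSplit version you suggest (again
a rough, untested sketch; ksp is the outer KSP and dm is the same DM as above,
and the comment shows roughly the equivalent runtime options):

  /* Additive FieldSplit with a direct sub-solver per field; roughly
     equivalent runtime options (one prefix per split):
       -pc_type fieldsplit -pc_fieldsplit_type additive
       -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu
       -fieldsplit_0_pc_factor_mat_solver_type superlu
     and likewise for the other splits. */
  PC pc;

  ierr = KSPSetDM(ksp, dm);CHKERRQ(ierr);  /* let FieldSplit take the splits from the DM */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
  ierr = PCFieldSplitSetType(pc, PC_COMPOSITE_ADDITIVE);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);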

And I see that PCApply_ASM would need some very particular non-blocking
semantics in KSPSolve and VecScatter to make that work, so I assume either a
new apply driver would be required, or the current one could be rearranged a
bit, if you don't mind losing a little of the cache reuse that comes from
doing each block all at once.
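
To be concrete about the rearrangement I mean, something like the following
(purely a conceptual sketch of the apply loop, not the actual PCApply_ASM
code; restriction[], x_sub[], y_sub[] and the sub-KSPs ksp[] stand in for
what ASM already sets up):

  /* Do all restrictions first, then launch all sub-solves, then all
     prolongations, instead of restrict/solve/prolong per block. With a
     non-blocking KSPSolve the middle loop could run the n_local solves
     concurrently (e.g. on separate GPU streams); today each one blocks. */
  PetscInt i;

  for (i = 0; i < n_local; ++i) {
    ierr = VecScatterBegin(restriction[i], x, x_sub[i], INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
    ierr = VecScatterEnd(restriction[i], x, x_sub[i], INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  }
  for (i = 0; i < n_local; ++i) {
    ierr = KSPSolve(ksp[i], x_sub[i], y_sub[i]);CHKERRQ(ierr); /* would need to return before completion */
  }
  ierr = VecSet(y, 0.0);CHKERRQ(ierr);
  for (i = 0; i < n_local; ++i) {
    ierr = VecScatterBegin(restriction[i], y_sub[i], y, ADD_VALUES, SCATTER_REVERSE);CHKERRQ(ierr);
    ierr = VecScatterEnd(restriction[i], y_sub[i], y, ADD_VALUES, SCATTER_REVERSE);CHKERRQ(ierr);
  }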


>
> > I am then going to want to get these separate solves to be run in
> parallel
> > on a GPU (I'm talking with Sherry about getting SuperLU working on these
> > small problems). In looking at PCApply_ASM it looks like this will take
> > some thought. KSPSolve would need to be non-blocking, etc., or a new
> apply
> > op might be needed.
> >
> > Thanks,
> > Mark
>