[petsc-dev] ASM for each field solve on GPUs

Jed Brown jed at jedbrown.org
Wed Dec 30 19:31:48 CST 2020


Mark Adams <mfadams at lbl.gov> writes:

> On Wed, Dec 30, 2020 at 6:47 PM Jed Brown <jed at jedbrown.org> wrote:
>
>> Mark Adams <mfadams at lbl.gov> writes:
>>
>> > I see that ASM has a DM and can get subdomains from it. I have a DMForest
>> > and I would like an ASM that has a subdomain for each field. How might I
>> > go about doing this? (The fields are not coupled in the matrix, so this
>> > would give a block diagonal matrix and thus be exact with LU sub-solvers.)
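
A minimal sketch of one way to set this up, assuming the per-field index
sets can be pulled from the DM (DMCreateFieldIS and PCASMSetLocalSubdomains
are existing PETSc calls; the variable names are illustrative):

    DM       dm;       /* the DMForest/Plex attached to the solver */
    PC       pc;
    PetscInt nf;
    IS      *isfield;

    DMCreateFieldIS(dm, &nf, NULL, &isfield);        /* one IS per field */
    PCSetType(pc, PCASM);
    PCASMSetOverlap(pc, 0);                          /* no overlap: block Jacobi over fields */
    PCASMSetLocalSubdomains(pc, nf, isfield, NULL);  /* one subdomain per field */

With -sub_ksp_type preonly -sub_pc_type lu the sub-solves are then exact LU.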
>>
>> Are the fields already not coupled, or do you want to filter the matrix
>> and give back a single matrix with the coupling removed?
>>
>
> They are not coupled. It is a block diagonal matrix, but it is not stored
> that way by Plex.

Just use your direct solver. So long as the zeros aren't stored, the factorization sees the parallelism (because nested dissection (ND) ordering shows the blocks are decoupled).
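
As a sketch, that is just the standard direct-solver options, e.g. with
PETSc's native LU and nested dissection ordering:

    -ksp_type preonly -pc_type lu -pc_factor_mat_ordering_type nd

or -pc_factor_mat_solver_type superlu to hand the factorization to SuperLU
(which applies its own ordering).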

>> You can use FieldSplit to get the math of field-based block Jacobi (or
>> ASM, but overlap with fields tends to be expensive). Neither FieldSplit
>> nor ASM can run the (additive) solves concurrently (and most libraries
>> would need something to drive the threads).
>>
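
A hedged sketch of the FieldSplit route, assuming two fields and the
standard PCFIELDSPLIT option prefixes:

    -pc_type fieldsplit -pc_fieldsplit_type additive
    -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu
    -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu

As noted above, the splits are mathematically independent but still execute
one after another.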
>
> No overlap, MPI serial. Very simple. I just want a direct solver and I want
> to exploit the 10-way parallelism that is just sitting there.
>
> And I see that PCApply_ASM would need some very particular non-blocking
> semantics in KSPSolve and VecScatter to work, so I would assume that a new
> driver would be required, or that the current one could be rearranged a
> bit if you don't mind losing a little cache reuse from doing each block
> all at once.
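
For reference, the apply loop in question has roughly this shape (a
simplified sketch, not the exact PETSc source; restriction, x_sub, y_sub,
and subksp are illustrative names). The serialization is the blocking
KSPSolve inside the loop:

    for (i = 0; i < n_local; i++) {  /* one block per subdomain (here: per field) */
      VecScatterBegin(restriction[i], x, x_sub[i], INSERT_VALUES, SCATTER_FORWARD);
      VecScatterEnd(restriction[i], x, x_sub[i], INSERT_VALUES, SCATTER_FORWARD);
      KSPSolve(subksp[i], x_sub[i], y_sub[i]);  /* blocks until solve i finishes */
      VecScatterBegin(restriction[i], y_sub[i], y, ADD_VALUES, SCATTER_REVERSE);
      VecScatterEnd(restriction[i], y_sub[i], y, ADD_VALUES, SCATTER_REVERSE);
    }
    /* launching the per-block solves concurrently (e.g., on GPU streams) would
       need non-blocking solve/scatter semantics or a restructured driver */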
>
>
>>
>> > I am then going to want to run these separate solves in parallel on a
>> > GPU (I'm talking with Sherry about getting SuperLU working on these
>> > small problems). Looking at PCApply_ASM, it seems this will take some
>> > thought: KSPSolve would need to be non-blocking, etc., or a new apply
>> > op might be needed.
>> >
>> > Thanks,
>> > Mark
>>

