[petsc-dev] PCASM: subdomains spanning multiple processes
Barry Smith
bsmith at mcs.anl.gov
Tue Apr 6 16:12:31 CDT 2010
On Apr 6, 2010, at 4:04 PM, Jed Brown wrote:
> On Tue, 6 Apr 2010 12:57:28 -0500, Barry Smith <bsmith at mcs.anl.gov>
> wrote:
>>
>> We should have this for both ASM and block Jacobi. Not
>> particularly difficult to write some code for it,
>>
>> BUT: the resulting morass of code would be nightmarish: three
>> versions of ASM code for (1) one block per process, (2) several
>> blocks per process, (3) a partial block per process, and similarly
>> for block Jacobi. It would be good if we could have a single
>> efficient code base for all the variants, but that may be
>> difficult/impossible to design. We may just need to end up with a
>> morass of code.
>
> Is there any reason for block Jacobi to be separate code from ASM
> overlap 0?
Not necessarily. It may just be historical, and we could remove some
code from PETSc in cleaning this up :-)
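
For reference, a minimal sketch (not from the thread; the helper name and
block counts are made up, and current PETSc calling sequences are used) of
how the two are driven at the user level today: with zero overlap, ASM
plays the same role as block Jacobi, and the requested block count selects
between cases (1) and (2) above.

#include <petscksp.h>

/* Sketch: ASM with zero overlap does the same job as block Jacobi at
   the user level; the total block count chooses between case (1), one
   block per process (the default), and case (2), several per process.
   Case (3), a block shared by several processes, has no analogue yet. */
static PetscErrorCode ConfigureBlockSolve(KSP ksp, PetscBool use_bjacobi)
{
  PC             pc;
  PetscMPIInt    size;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = MPI_Comm_size(PetscObjectComm((PetscObject)pc), &size);CHKERRQ(ierr);
  if (use_bjacobi) {
    ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
    ierr = PCBJacobiSetTotalBlocks(pc, 4*size, NULL);CHKERRQ(ierr);   /* case (2): four blocks per rank */
  } else {
    ierr = PCSetType(pc, PCASM);CHKERRQ(ierr);
    ierr = PCASMSetOverlap(pc, 0);CHKERRQ(ierr);                      /* overlap 0 behaves like block Jacobi */
    ierr = PCASMSetTotalSubdomains(pc, 4*size, NULL, NULL);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}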
> Note that the present code has no special cases to
> distinguish (1) and (2) above.
There are PCSetUp_BJacobi_Singleblock() and
PCSetUp_BJacobi_Multiblock(), plus separate single-block and
multi-block destroy and apply routines for bjacobi. Most of this exists
to handle the special case where one does not need to make a COPY of
the matrix (since [SB]AIJ matrices already store the "diagonal" block).
Maybe this can be vastly simplified, but the code still has to NOT copy
the matrix when it doesn't need to.
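
A minimal sketch of that special case, assuming the MatGetDiagonalBlock()
interface (which hands back a reference to the stored diagonal block
rather than a copy); the helper name is made up, and b and x are assumed
to be sequential vectors conforming to the block, as in bjacobi.

#include <petscksp.h>

/* Sketch: solve with the "diagonal" block that [SB]AIJ matrices already
   store, obtained as a reference (no copy) via MatGetDiagonalBlock(). */
static PetscErrorCode SolveLocalBlockNoCopy(Mat A, Vec b, Vec x)
{
  Mat            Adiag;    /* reference to the stored diagonal block */
  KSP            subksp;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatGetDiagonalBlock(A, &Adiag);CHKERRQ(ierr);
  ierr = KSPCreate(PETSC_COMM_SELF, &subksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(subksp, Adiag, Adiag);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(subksp);CHKERRQ(ierr);
  ierr = KSPSolve(subksp, b, x);CHKERRQ(ierr);
  ierr = KSPDestroy(&subksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}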
Barry
>
> My thought would be that PCASMSetLocalSubdomains would accept IS
> defined on a communicator other than PETSC_COMM_SELF (the subdomains
> would be built on the provided communicator). I'm not sure whether
> MatIncreaseOverlap already supports parallel index sets;
> MatGetSubmatrix could be modified to extract submatrices on the
> communicator associated with the index sets (instead of the
> communicator of the original matrix, as it currently does).
>
> Jed
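
For concreteness, a rough sketch of what the proposed calling sequence
might look like. None of this works in current PETSc, where the index
sets passed to PCASMSetLocalSubdomains must live on PETSC_COMM_SELF; the
rank pairing and helper name are arbitrary illustrations, and current
calling sequences are used.

#include <petscksp.h>

/* Sketch of the PROPOSED usage (not supported today): the IS handed to
   PCASMSetLocalSubdomains lives on a subcommunicator spanning several
   ranks, so the subdomain, its submatrix, and its solve would span
   those ranks. */
static PetscErrorCode SetSharedSubdomain(PC pc, PetscInt n, const PetscInt idx[])
{
  MPI_Comm       subcomm;
  PetscMPIInt    rank;
  IS             is;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MPI_Comm_rank(PetscObjectComm((PetscObject)pc), &rank);CHKERRQ(ierr);
  /* e.g. every pair of consecutive ranks shares one subdomain */
  ierr = MPI_Comm_split(PetscObjectComm((PetscObject)pc), rank/2, rank, &subcomm);CHKERRQ(ierr);
  /* under the proposal, the IS communicator (not PETSC_COMM_SELF)
     would determine where the subdomain is built */
  ierr = ISCreateGeneral(subcomm, n, idx, PETSC_COPY_VALUES, &is);CHKERRQ(ierr);
  ierr = PCASMSetLocalSubdomains(pc, 1, &is, NULL);CHKERRQ(ierr);
  ierr = ISDestroy(&is);CHKERRQ(ierr);
  ierr = MPI_Comm_free(&subcomm);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}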