[petsc-dev] PCASM: subdomains spanning multiple processes

Jed Brown jed at 59A2.org
Tue Apr 6 16:04:12 CDT 2010


On Tue, 6 Apr 2010 12:57:28 -0500, Barry Smith <bsmith at mcs.anl.gov> wrote:
> 
>     We should have this for both ASM and block Jacobi. It would not be
> particularly difficult to write some code for it,
> 
>     BUT: the resulting morass of code would be nightmarish: three
> versions of ASM code for (1) one block per process, (2) several blocks
> per process, (3) a partial block per process, and similarly for block
> Jacobi. It would be good if we could have a single efficient code base
> for all the variants, but that may be difficult or impossible to
> design. We may just need to end up with a morass of code.

Is there any reason for block Jacobi to be separate code from ASM with
overlap 0?  Note that the present code has no special cases to
distinguish (1) and (2) above.
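
To make the equivalence concrete, here is a minimal sketch (not anything
in the current source tree; the helper function itself is hypothetical,
the PCASM calls are the existing public API): ASM with zero overlap
applies one subdomain solve per local block with no ghost updates, which
is exactly block Jacobi.

  #include <petscksp.h>

  /* Hypothetical helper: configure an existing KSP's preconditioner as
     ASM with zero overlap, which is algorithmically block Jacobi
     (one subdomain solve per local block, no overlap communication). */
  PetscErrorCode UseASMAsBlockJacobi(KSP ksp)
  {
    PC             pc;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCASM);CHKERRQ(ierr);
    ierr = PCASMSetOverlap(pc,0);CHKERRQ(ierr); /* overlap 0 => block Jacobi */
    PetscFunctionReturn(0);
  }

Or, equivalently from the command line, compare -pc_type asm
-pc_asm_overlap 0 with -pc_type bjacobi.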

My thought would be that PCASMSetLocalSubdomains would accept index sets
defined on a communicator other than PETSC_COMM_SELF (the subdomains
would then be built on the provided communicator).  I'm not sure whether
MatIncreaseOverlap already supports parallel index sets, and
MatGetSubMatrix could be modified to extract submatrices on the
communicator associated with the index sets (instead of the communicator
of the original matrix, as it currently does).
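
To make the proposal concrete, a minimal sketch of what a call site
might look like.  This is the proposed usage, not current behavior
(today the index sets passed to PCASMSetLocalSubdomains are expected to
live on PETSC_COMM_SELF); the pairing of ranks and the ownership
arithmetic below are only illustrative assumptions, and error checking
is omitted.

  /* Assume A is an assembled MPI matrix and pc is a PCASM preconditioner. */
  MPI_Comm    subcomm;
  PetscMPIInt rank;
  PetscInt    rstart,rend;
  IS          is;

  MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
  /* pair ranks {0,1},{2,3},... ; each pair would share one subdomain */
  MPI_Comm_split(PETSC_COMM_WORLD,rank/2,rank,&subcomm);

  MatGetOwnershipRange(A,&rstart,&rend);
  /* each rank contributes its locally owned rows to the shared subdomain;
     the IS lives on the two-process subcommunicator, not PETSC_COMM_SELF */
  ISCreateStride(subcomm,rend-rstart,rstart,1,&is);

  /* proposed: PCASM would notice the parallel communicator and set up the
     subdomain solve on subcomm; MatIncreaseOverlap/MatGetSubMatrix would
     then have to operate on that communicator as well */
  PCASMSetLocalSubdomains(pc,1,&is,NULL);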

Jed


