[petsc-dev] Handling ADD_VALUES with DMComposite

Jed Brown jed at 59A2.org
Wed Nov 17 09:26:11 CST 2010


On Tue, Nov 16, 2010 at 21:14, Jed Brown <jed at 59a2.org> wrote:

> discuss matrix insertion in the multi-physics context.


I just had a chat with Dave about this, and I realize that some of my
early ideas are crappy for block formats.  Suppose I have a matrix
logically partitioned as J = [A B; C D].  I would like to either (1)
hold all of J as MPIAIJ or (2) hold J as MatBlock (this needs an MPI
implementation; maybe that is MatDD, maybe it is based on Dave's
implementation in PETSc-Ext) where A and D are (S)BAIJ.  The case of A
and D living on sub-communicators is more involved, but should
eventually be possible.
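
For concreteness, option (2) might be built like this.  The
constructor MatCreateBlock() is an assumed name (nothing by that name
exists in PETSc today; Dave's PETSc-Ext has a similar facility), and
everything else is standard API:

  #include <petscmat.h>

  /* Sketch only: assemble J = [A B; C D] from existing pieces.  A and
     D would be created as (S)BAIJ; B and C as AIJ or anything else. */
  PetscErrorCode BuildBlockJ(MPI_Comm comm,Mat A,Mat B,Mat C,Mat D,Mat *J)
  {
    Mat            sub[4];
    PetscErrorCode ierr;

    PetscFunctionBegin;
    sub[0] = A; sub[1] = B;
    sub[2] = C; sub[3] = D;
    ierr = MatCreateBlock(comm,2,2,sub,J);CHKERRQ(ierr); /* hypothetical */
    PetscFunctionReturn(0);
  }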

So what if I could get lightweight references to the blocks with an
API like MatGetSubMatrices?  In the MatBlock case, it would just give
me references to the pieces, but in the AIJ case, it would give me a
Mat of a new type that only implements MatSetValues (and variants).
That implementation would end up just calling MatSetValues(J,...) with
fixed-up indices, as in the sketch below.
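
A minimal sketch of the AIJ-backed shim, assuming a MatShell-style
context; the struct and function names here are made up for
illustration:

  #include <petscmat.h>

  /* Context the lightweight submatrix shim would carry: a reference to
     the parent matrix and the maps placing this block inside it.  The
     type and its name are illustrative, not existing PETSc API. */
  typedef struct {
    Mat                    J;     /* fully assembled parent, e.g. MPIAIJ */
    ISLocalToGlobalMapping rmap;  /* block row -> global row of J */
    ISLocalToGlobalMapping cmap;  /* block col -> global col of J */
  } SubShim;

  /* The one operation the shim implements: translate block-relative
     indices and forward the values to MatSetValues(J,...). */
  static PetscErrorCode MatSetValues_SubShim(Mat S,PetscInt m,const PetscInt rows[],
                                             PetscInt n,const PetscInt cols[],
                                             const PetscScalar v[],InsertMode addv)
  {
    SubShim        *sub;
    PetscInt       *grows,*gcols;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = MatShellGetContext(S,(void**)&sub);CHKERRQ(ierr);
    ierr = PetscMalloc2(m,PetscInt,&grows,n,PetscInt,&gcols);CHKERRQ(ierr);
    ierr = ISLocalToGlobalMappingApply(sub->rmap,m,rows,grows);CHKERRQ(ierr);
    ierr = ISLocalToGlobalMappingApply(sub->cmap,n,cols,gcols);CHKERRQ(ierr);
    ierr = MatSetValues(sub->J,m,grows,n,gcols,v,addv);CHKERRQ(ierr);
    ierr = PetscFree2(grows,gcols);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }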

The only downside of this is that the user would have to make a
separate MatSetValues call for each part, which costs nothing when J
is MatBlock, but is somewhat inefficient when J is AIJ.  I think this
is an acceptable penalty since it allows the MatBlock pieces of J to
be stored in better formats like (S)BAIJ.  It also exposes some
concurrency if assembly were parallelized with threads (in which case
MatSetValues would need to hold locks).
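
The user-facing pattern would then look something like this, where
MatGetSubMatrixShim() stands in for whatever the real getter ends up
being called:

  #include <petscmat.h>

  /* Sketch: one MatSetValues call per part, each with block-relative
     indices that the shim fixes up into J's numbering. */
  PetscErrorCode AssembleBlocks(Mat J,IS isrow[],IS iscol[])
  {
    Mat            A,D;            /* views of the diagonal blocks of J */
    PetscInt       row = 0,col = 0;
    PetscScalar    one = 1.0;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = MatGetSubMatrixShim(J,isrow[0],iscol[0],&A);CHKERRQ(ierr); /* hypothetical */
    ierr = MatGetSubMatrixShim(J,isrow[1],iscol[1],&D);CHKERRQ(ierr); /* hypothetical */
    ierr = MatSetValues(A,1,&row,1,&col,&one,ADD_VALUES);CHKERRQ(ierr);
    ierr = MatSetValues(D,1,&row,1,&col,&one,ADD_VALUES);CHKERRQ(ierr);
    ierr = MatAssemblyBegin(J,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(J,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }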


In a similar direction, I think that DMComposite's subvector monkeying
(via VecPlaceArray) could be replaced by a Vec API to get the pieces.
This could do VecPlaceArray for PETSc's current Vec types, but it
would return the (no-copy) pieces for a VecBlock.  This would permit
the pieces to have type VecGhost, for example, thus allowing no-copy
access from the global multi-physics vector to ghosted single-physics
vectors.  Of course users could not use VecBlock with MPIAIJ (or it
would need an extra copy), but this would seem to permit a fast
implementation in both the fully assembled and heavily split cases.
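
A sketch of what that getter might look like.  The Vec type name
"block", VecBlockGetSubVec(), and the signature are all assumptions
for illustration; only the VecPlaceArray branch reflects what
DMComposite actually does today:

  #include <petscvec.h>

  /* i: which piece; lo,n: that piece's offset and local length in X.
     In a real API these would come from the DM, not the caller. */
  PetscErrorCode VecGetPieceSketch(Vec X,PetscInt i,PetscInt lo,PetscInt n,Vec *piece)
  {
    PetscBool      isblock;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = PetscObjectTypeCompare((PetscObject)X,"block",&isblock);CHKERRQ(ierr);
    if (isblock) {
      /* VecBlock: hand back the stored piece itself, no copy; it can
         be any type, e.g. VecGhost.  VecBlockGetSubVec() is made up. */
      ierr = VecBlockGetSubVec(X,i,piece);CHKERRQ(ierr);
    } else {
      /* contiguous storage: alias the range [lo,lo+n) of X's local
         array, the same VecPlaceArray trick DMComposite plays today */
      MPI_Comm     comm;
      PetscScalar *x;
      ierr = PetscObjectGetComm((PetscObject)X,&comm);CHKERRQ(ierr);
      ierr = VecCreateMPI(comm,n,PETSC_DECIDE,piece);CHKERRQ(ierr);
      ierr = VecGetArray(X,&x);CHKERRQ(ierr);
      ierr = VecPlaceArray(*piece,x+lo);CHKERRQ(ierr);
      ierr = VecRestoreArray(X,&x);CHKERRQ(ierr);
    }
    PetscFunctionReturn(0);
  }

In the aliasing branch the caller would eventually VecResetArray() and
destroy the piece; the VecBlock branch could just hand out a reference.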

Jed