[petsc-users] Creating a larger matrix from submatrices

Anush Krishnan anush at bu.edu
Sat Nov 9 20:02:05 CST 2013


On 8 November 2013 22:22, Jed Brown <jedbrown at mcs.anl.gov> wrote:

> > I would like to create matrix G and vector R such that
> >
> > G*P = R
> >
> > where
> >
> > G = /Gx\
> >     \Gy/
> > and
> >
> > R = /Rx\
> >     \Ry/
> >
> > and vector R is created from the DMComposite of the DAs of U and V.
>
> DMCreateGlobalVector() gives you the vector R.  Then use MatCreate() and
> MatSetSizes() where the local and global row sizes match R and the
> column sizes match P.  For your staggered grid, you should be able to
> preallocate simply by using MatSeqAIJSetPreallocation() and
> MatMPIAIJSetPreallocation() with a constant row size of 2 for the
> "diagonal block" and 1 in the off-diagonal block.  (You can also
> preallocate exactly to be more accurate with memory, but may as well
> know you are assembling the correct thing first.)
>
> You can call DMCompositeGetGlobalISs() to get index sets holding the
> global indices for the U and V rows of your matrix, or just compute them
> from the ownership range and the sizes.  The column indices will come
> from the DMDA that holds P.
>
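[Editor's note: the setup described in the quoted reply can be sketched as below. This is a minimal illustration, not code from the thread: the function name BuildG and the DM arguments dmUV (the DMComposite of the U and V DAs) and dmP (the DMDA holding P) are assumed, and the error-checking style is that of recent PETSc releases (PetscCall).]

```c
#include <petscdmcomposite.h>
#include <petscmat.h>

/* Sketch: build the gradient matrix G and the vector R so that G*P = R.
 * dmUV is assumed to be the DMComposite of the U and V DMDAs; dmP is
 * assumed to be the DMDA that holds P.  Both names are illustrative. */
PetscErrorCode BuildG(DM dmUV, DM dmP, Mat *G, Vec *R)
{
  Vec      P;
  PetscInt rlocal, rglobal, clocal, cglobal;

  PetscFunctionBeginUser;
  PetscCall(DMCreateGlobalVector(dmUV, R)); /* R has the U,V layout      */
  PetscCall(DMCreateGlobalVector(dmP, &P)); /* only used to read sizes   */
  PetscCall(VecGetLocalSize(*R, &rlocal));
  PetscCall(VecGetSize(*R, &rglobal));
  PetscCall(VecGetLocalSize(P, &clocal));
  PetscCall(VecGetSize(P, &cglobal));

  /* row sizes match R, column sizes match P */
  PetscCall(MatCreate(PETSC_COMM_WORLD, G));
  PetscCall(MatSetSizes(*G, rlocal, clocal, rglobal, cglobal));
  PetscCall(MatSetType(*G, MATAIJ));

  /* constant row sizes: 2 nonzeros in the diagonal block, 1 in the
   * off-diagonal block, as suggested above */
  PetscCall(MatSeqAIJSetPreallocation(*G, 2, NULL));
  PetscCall(MatMPIAIJSetPreallocation(*G, 2, NULL, 1, NULL));

  PetscCall(VecDestroy(&P));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```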


Hi Jed,

Thank you very much! That worked.

I also have a couple of general questions regarding this:

1. In this case, I followed your advice and created a single matrix G of
type MPIAIJ and assembled it. How do I decide whether I should instead use
block matrices (MPIBAIJ) or MatGetLocalSubMatrix to set it up?

2. Various functions exist to help us access the correct indices when
required (e.g. DMDAGetAO, DMCompositeGetGlobalISs, DMDAGetCorners,
VecGetOwnershipRange, etc.), and sometimes there are multiple ways to
obtain the same indices. In such cases, are some of these functions
preferable to others (either because they are faster or because they
allocate less memory)?

Thank you,
Anush