[petsc-users] Customizing MatSetValuesBlocked(...)

Jed Brown jedbrown at mcs.anl.gov
Wed Aug 8 16:04:42 CDT 2012


On Wed, Aug 8, 2012 at 2:53 PM, Jinquan Zhong <jzhong at scsolutions.com> wrote:

> The dense matrix is so big (800 GB) that it would cost too much to move
> it around in the distributed memory.  What I tried to do would be to
> retain the 800 GB already in core from ScaLAPACK untouched and
> repartition their indices.  The next step would be to repartition the
> indices for each block after I complete the construction of the sparse
> matrix.  This sparse matrix is mainly populated with the 800 GB of
> entries from the dense matrix, along with about 3 GB of entries from
> other matrices.
>
> Had that been true as you said, wouldn’t that mean a lot of data
> exchange among the distributed memory?  Or am I missing something huge.
>

You can change the indexing so that Elemental's matrix uses the same memory.
PETSc uses yet another indexing; see
petsc-dev/src/mat/impls/elemental/matelem.cxx for the index translation
between the ordering PETSc presents to the user and the one Elemental uses
internally. In any case, if you configure an Elemental matrix to use your
memory, then, because the (simpler) PETSc ordering keeps vectors
contiguous [1], you can combine it with traditional sparse matrices using
MATNEST.

[1] Note that PETSc's ownership ranges fundamentally refer to the Vec
distribution; the matrix entries can be stored elsewhere.
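
To make that point concrete, here is a minimal sketch (not from the
original message; the size N and the values are placeholders, and it
assumes a PETSc build with Elemental support).  A matrix of type
MATELEMENTAL is addressed through PETSc's global row/column indices even
though Elemental stores the entries in its own distribution, and
MatGetOwnershipRange() reports the Vec layout rather than where the
entries physically live:

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscInt    N = 1000;           /* placeholder global size */
  PetscInt    i, rstart, rend;
  PetscScalar v = 1.0;            /* placeholder value */

  PetscInitialize(&argc, &argv, NULL, NULL);

  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);
  MatSetType(A, MATELEMENTAL);    /* dense, stored in Elemental's own layout */
  MatSetUp(A);

  /* The ownership range describes the Vec layout used by MatMult();
     the entries themselves live wherever Elemental's layout puts them. */
  MatGetOwnershipRange(A, &rstart, &rend);
  for (i = rstart; i < rend; i++) {
    MatSetValues(A, 1, &i, 1, &i, &v, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  MatDestroy(&A);
  PetscFinalize();
  return 0;
}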


There is nothing especially hard about doing the above, but you will have
to do some careful indexing math. Look at the MATNEST man page and the
Elemental documentation.
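
For the MATNEST step, the fragment below is only a hedged sketch: Adense,
B01, B10, and Asparse are hypothetical names for the big dense Elemental
block and the sparse coupling blocks, assumed to be created and assembled
elsewhere with compatible layouts.

#include <petscmat.h>

/* Sketch only: glue a 2x2 block system together without copying the
   dense data.  Adense, B01, B10, and Asparse are placeholders. */
PetscErrorCode BuildNest(Mat Adense, Mat B01, Mat B10, Mat Asparse, Mat *Anest)
{
  Mat blocks[4] = {Adense, B01, B10, Asparse};  /* row-major 2x2; NULL marks an empty block */

  PetscFunctionBeginUser;
  /* Passing NULL index sets lets MATNEST derive the row/column layouts
     from the blocks' own ownership ranges. */
  MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, Anest);
  PetscFunctionReturn(0);
}

The resulting nest matrix supports MatMult() like any other Mat, so it can
be handed to a KSP, with the usual caveat that preconditioning a MATNEST
typically means something like PCFIELDSPLIT.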