[petsc-users] MatPtAP for involving MPIDENSE and MPIAIJ matrices

Hong hzhang at mcs.anl.gov
Wed Oct 21 10:42:33 CDT 2015


Bikash,
I implemented MatTransposeMatMult_MPIDense_MPIDense()
in the branch hzhang/mattransmatmult_dense.
Once it passes our nightly tests, I'll merge it into the petsc master branch.

See petsc/src/mat/examples/tests/ex104.c, in which I added a simple example
computing B = P^T * A * P, where P is an MPIDENSE matrix and A is an MPIAIJ
matrix.
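
For reference, here is a minimal sketch of the two-step product, assuming A
and P are already assembled as MPIAIJ and MPIDENSE; the wrapper name is only
illustrative, and ex104.c may organize things differently:

#include <petscmat.h>

/* B = P^T * A * P, formed as AP = A*P followed by B = P^T*AP.
   The second call dispatches to the new
   MatTransposeMatMult_MPIDense_MPIDense() when P and AP are MPIDENSE. */
PetscErrorCode PtAP_DenseP(Mat A, Mat P, Mat *B)
{
  PetscErrorCode ierr;
  Mat            AP;

  PetscFunctionBeginUser;
  ierr = MatMatMult(A, P, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &AP);CHKERRQ(ierr);
  ierr = MatTransposeMatMult(P, AP, MAT_INITIAL_MATRIX, PETSC_DEFAULT, B);CHKERRQ(ierr);
  ierr = MatDestroy(&AP);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}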

Let us know if you see any bugs or performance issues.

Hong

On Fri, Oct 16, 2015 at 10:25 AM, Jed Brown <jed at jedbrown.org> wrote:

> Hong <hzhang at mcs.anl.gov> writes:
>
> > Jed:
> >>
> >>
> >> > I plan to implement MatTransposeMatMult_MPIDense_MPIDense via
> >> >
> >> > 1. add MatTransposeMatMult_elemental_elemental()
> >> > 2. C_dense = P_dense^T * B_dense
> >> >     via MatConvert_dense_elemental() and MatConvert_elemental_dense()
> >>
> >> The above involves a ton of data movement and MPIDense is a logical
> >> distribution for matrices with a modest number of columns.  I think I
> >> would just do the local GEMM and then MPI_Reduce_scatter it.
> >>
> > This would work.
> >
> > However, I recall that you did a smart ordering which allows
> > MatConvert_mpidense_elemental() to use the same physical matrix storage
> > for petsc and elemental,
>
> Same storage for vectors.  This is your code, but I think you'll find
> that it moves the matrix entries around.  (Note that Elemental [MC,MR]
> is a 2D distribution while MPIDense is 1D.)  Also, I think it would be
> better if this particular operation did not depend on Elemental.
>
> You could write a conversion to Elemental [VC,*], which would then match
> the MPIDense distribution and thus not need to move any matrix entries.
>
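
For concreteness, a rough sketch of the "local GEMM then MPI_Reduce_scatter"
approach Jed describes above. The function name, the row-major local storage,
and the rowcounts argument are assumptions made for illustration only; PETSc's
MatDense is actually stored column-major, so real code would index differently.

#include <mpi.h>
#include <stdlib.h>

/* C = P^T * B with P (mloc x k) and B (mloc x n) distributed by rows.
   rowcounts[r] = number of rows of the k x n result C owned by rank r,
   so Cown must hold rowcounts[myrank]*n entries. */
void ptb_reduce_scatter(const double *P, const double *B, double *Cown,
                        int mloc, int k, int n, const int *rowcounts,
                        MPI_Comm comm)
{
  double *Cfull = (double*)calloc((size_t)k*n, sizeof(double));
  int    *recvcounts, nproc, i, j, l, r;

  /* local GEMM: Cfull += Plocal^T * Blocal (full k x n partial sum) */
  for (i = 0; i < mloc; i++)
    for (j = 0; j < k; j++)
      for (l = 0; l < n; l++)
        Cfull[(size_t)j*n + l] += P[(size_t)i*k + j] * B[(size_t)i*n + l];

  /* sum the partial products over all ranks and hand each rank only the
     rows of C it owns, in a single collective */
  MPI_Comm_size(comm, &nproc);
  recvcounts = (int*)malloc(nproc*sizeof(int));
  for (r = 0; r < nproc; r++) recvcounts[r] = rowcounts[r]*n;
  MPI_Reduce_scatter(Cfull, Cown, recvcounts, MPI_DOUBLE, MPI_SUM, comm);

  free(recvcounts);
  free(Cfull);
}

Each rank contributes one k x n partial product, and the single collective
both sums the contributions and delivers each rank only the rows of C it owns,
so no dense matrix entries need to be redistributed beforehand.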