[petsc-users] MatPtAP for involving MPIDENSE and MPIAIJ matrices
Jed Brown
jed at jedbrown.org
Fri Oct 16 10:25:29 CDT 2015
Hong <hzhang at mcs.anl.gov> writes:
> Jed:
>>
>>
>> > I plan to implement MatTransposeMatMult_MPIDense_MPIDense via
>> >
>> > 1. add MatTransposeMatMult_elemental_elemental()
>> > 2. C_dense = P_dense^T * B_dense
>> > via MatConvert_dense_elemental() and MatConvert_elemental_dense()
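For concreteness, the proposed flow at the PETSc API level might look like
the following sketch, given MPIDense matrices P_dense and B_dense and
assuming the MatTransposeMatMult_elemental_elemental() from step 1 is in
place (error checking omitted):

  Mat P_elem, B_elem, C_elem, C_dense;
  MatConvert(P_dense, MATELEMENTAL, MAT_INITIAL_MATRIX, &P_elem);
  MatConvert(B_dense, MATELEMENTAL, MAT_INITIAL_MATRIX, &B_elem);
  MatTransposeMatMult(P_elem, B_elem, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C_elem);
  MatConvert(C_elem, MATMPIDENSE, MAT_INITIAL_MATRIX, &C_dense);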
>>
>> The above involves a ton of data movement, and MPIDense is a logical
>> distribution for matrices with a modest number of columns. I think I
>> would just do the local GEMM and then MPI_Reduce_scatter it.
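A minimal sketch of that approach, assuming 1D row-distributed inputs with
a common row layout; the names, row-major storage, and CBLAS interface are
illustrative here (PETSc dense matrices actually store column-major):

  #include <stdlib.h>
  #include <mpi.h>
  #include <cblas.h>

  /* C = P^T * B with P_loc (mloc x K) and B_loc (mloc x N) the local row
   * blocks on each rank.  Every rank's local GEMM produces a full K x N
   * contribution; MPI_Reduce_scatter sums them and hands each rank its
   * block of rows of C.  recvcounts[r] = (rows of C owned by rank r) * N. */
  static void ptb_reduce_scatter(const double *P_loc, const double *B_loc,
                                 double *C_loc, int mloc, int K, int N,
                                 const int *recvcounts, MPI_Comm comm)
  {
    double *C_full = malloc((size_t)K * N * sizeof(*C_full));
    cblas_dgemm(CblasRowMajor, CblasTrans, CblasNoTrans,
                K, N, mloc, 1.0, P_loc, K, B_loc, N, 0.0, C_full, N);
    MPI_Reduce_scatter(C_full, C_loc, recvcounts, MPI_DOUBLE, MPI_SUM, comm);
    free(C_full);
  }

This communicates only the K x N product once, instead of redistributing
the entries of P and B.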
>>
> This would work.
>
> However, I recall that you did a smart ordering which allows
> MatConvert_mpidense_elemental() to use the same physical matrix storage
> for PETSc and Elemental,
Same storage for vectors. This is your code, but I think you'll find
that it moves the matrix entries around. (Note that Elemental [MC,MR]
is a 2D distribution while MPIDense is 1D.) Also, I think it would be
better if this particular operation did not depend on Elemental.
You could write a conversion to Elemental [VC,*], which would then match
the MPIDense distribution and thus not need to move any matrix entries.
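A toy illustration of the ownership difference (the grid-rank numbering
below is an assumption, and the PETSc/Elemental index permutation is
ignored):

  #include <stdio.h>

  /* In a 2D [MC,MR] distribution over an r x c grid, the owner of entry
   * (i,j) depends on the column index j, so one row's entries land on c
   * different processes; in a rows-only ([VC,*]-style) distribution the
   * owner depends on i alone, so converting from a row-distributed
   * MPIDense matrix moves no entries. */
  static int owner_2d(int i, int j, int r, int c) { return (i % r) + (j % c) * r; }
  static int owner_rows(int i, int p)             { return i % p; }

  int main(void)
  {
    int r = 2, c = 3, p = r * c;
    for (int j = 0; j < 6; j++)   /* entries of row 0 */
      printf("(0,%d): [MC,MR] owner %d, rows-only owner %d\n",
             j, owner_2d(0, j, r, c), owner_rows(0, p));
    return 0;
  }

Output: the [MC,MR] owners cycle 0, 2, 4, ... while the rows-only owner is
always 0.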