[petsc-users] MatPtAP for involving MPIDENSE and MPIAIJ matrices
Hong
hzhang at mcs.anl.gov
Fri Oct 16 10:12:47 CDT 2015
Jed:
>
>
> > I plan to implement MatTransposeMatMult_MPIDense_MPIDense via
> >
> > 1. add MatTransposeMatMult_elemental_elemental()
> > 2. C_dense = P_dense^T * B_dense
> > via MatConvert_dense_elemental() and MatConvert_elemental_dense()
>
> The above involves a ton of data movement and MPIDense is a logical
> distribution for matrices with a modest number of columns. I think I
> would just do the local GEMM and then MPI_Reduce_scatter it.
>
This would work.
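Something like the following is how I read the reduce-scatter approach
(just a rough sketch, not a final implementation: P and B are MPIDense
with the same row layout, C is assumed to be already created with a
matching layout, the function name is a placeholder, and the naive loops
stand in for a local GEMM):

  /* Sketch: C = P^T * B with MPIDense P, B (same row layout) and a
   * preallocated MPIDense C whose rows follow P's column layout. */
  #include <petscmat.h>

  PetscErrorCode PtB_ViaReduceScatter(Mat P, Mat B, Mat C)
  {
    const PetscScalar *pa, *ba;
    const PetscInt    *ranges;
    PetscScalar       *w, *wloc, *ca;
    PetscInt           m, K, N, cm, plda, blda, clda, i, j, l;
    PetscMPIInt        size, r, *counts;
    MPI_Comm           comm;

    PetscFunctionBeginUser;
    PetscCall(PetscObjectGetComm((PetscObject)P, &comm));
    PetscCallMPI(MPI_Comm_size(comm, &size));
    PetscCall(MatGetLocalSize(P, &m, NULL)); /* local rows of P (= local rows of B) */
    PetscCall(MatGetSize(P, NULL, &K));      /* global columns of P = global rows of C */
    PetscCall(MatGetSize(B, NULL, &N));      /* global columns of B = columns of C */
    PetscCall(MatGetLocalSize(C, &cm, NULL));

    /* Local partial product W = P_loc^T * B_loc, stored row-major (K x N) so
       MPI_Reduce_scatter can hand each rank a contiguous block of rows of C. */
    PetscCall(PetscCalloc1((size_t)K * N, &w));
    PetscCall(PetscMalloc1((size_t)cm * N, &wloc));
    PetscCall(MatDenseGetLDA(P, &plda));
    PetscCall(MatDenseGetLDA(B, &blda));
    PetscCall(MatDenseGetArrayRead(P, &pa));
    PetscCall(MatDenseGetArrayRead(B, &ba));
    for (i = 0; i < K; i++)
      for (j = 0; j < N; j++)
        for (l = 0; l < m; l++) w[i * N + j] += pa[i * plda + l] * ba[j * blda + l];
    PetscCall(MatDenseRestoreArrayRead(P, &pa));
    PetscCall(MatDenseRestoreArrayRead(B, &ba));

    /* Sum the partial products; rank r receives the rows of C it owns. */
    PetscCall(PetscMalloc1(size, &counts));
    PetscCall(MatGetOwnershipRanges(C, &ranges));
    for (r = 0; r < size; r++) counts[r] = (PetscMPIInt)((ranges[r + 1] - ranges[r]) * N);
    PetscCallMPI(MPI_Reduce_scatter(w, wloc, counts, MPIU_SCALAR, MPIU_SUM, comm));

    /* Copy the received row-major block into C's column-major local array. */
    PetscCall(MatDenseGetArray(C, &ca));
    PetscCall(MatDenseGetLDA(C, &clda));
    for (i = 0; i < cm; i++)
      for (j = 0; j < N; j++) ca[j * clda + i] = wloc[i * N + j];
    PetscCall(MatDenseRestoreArray(C, &ca));
    PetscCall(MatAssemblyBegin(C, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(C, MAT_FINAL_ASSEMBLY));

    PetscCall(PetscFree(w));
    PetscCall(PetscFree(wloc));
    PetscCall(PetscFree(counts));
    PetscFunctionReturn(PETSC_SUCCESS);
  }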
However, I recall that you implemented a smart ordering which allows
MatConvert_mpidense_elemental() to use the same physical matrix storage for
PETSc and Elemental, interpreted logically in Elemental's layout. As an
example, petsc/src/mat/examples/tests/ex103.c:
mpiexec -n 6 ./ex103
Outplace MatConvert, A_elemental:
Mat Object: 6 MPI processes
type: elemental
Elemental matrix (cyclic ordering)
0 0 0 0 0
1 1 1 1 1
2 2 2 2 2
3 3 3 3 3
4 4 4 4 4
5 5 5 5 5
0 0 0 0 0
1 1 1 1 1
2 2 2 2 2
3 3 3 3 3
Elemental matrix (explicit ordering)
Mat Object: 6 MPI processes
type: mpidense
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00
1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00
2.0000000000000000e+00 2.0000000000000000e+00 2.0000000000000000e+00 2.0000000000000000e+00 2.0000000000000000e+00
2.0000000000000000e+00 2.0000000000000000e+00 2.0000000000000000e+00 2.0000000000000000e+00 2.0000000000000000e+00
3.0000000000000000e+00 3.0000000000000000e+00 3.0000000000000000e+00 3.0000000000000000e+00 3.0000000000000000e+00
3.0000000000000000e+00 3.0000000000000000e+00 3.0000000000000000e+00 3.0000000000000000e+00 3.0000000000000000e+00
4.0000000000000000e+00 4.0000000000000000e+00 4.0000000000000000e+00 4.0000000000000000e+00 4.0000000000000000e+00
5.0000000000000000e+00 5.0000000000000000e+00 5.0000000000000000e+00 5.0000000000000000e+00 5.0000000000000000e+00
i.e., the Elemental and PETSc dense matrices have the same row ownership.
If MatConvert() involves no data movement, then it would be easier to
use Elemental.
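For concreteness, the conversion-based path (steps 1-2 above) would look
roughly like the sketch below; this assumes the proposed
MatTransposeMatMult support for MATELEMENTAL exists, and the function and
matrix names are placeholders:

  /* Sketch: C = P^T * B by converting MPIDense inputs to Elemental,
   * multiplying there, and converting the result back. */
  #include <petscmat.h>

  PetscErrorCode PtB_ViaElemental(Mat P, Mat B, Mat *C)
  {
    Mat Pe, Be, Ce;

    PetscFunctionBeginUser;
    /* If the smart ordering applies, these conversions reuse the local
       storage and involve no interprocess data movement. */
    PetscCall(MatConvert(P, MATELEMENTAL, MAT_INITIAL_MATRIX, &Pe));
    PetscCall(MatConvert(B, MATELEMENTAL, MAT_INITIAL_MATRIX, &Be));

    /* C_elemental = P_elemental^T * B_elemental (step 2 of the plan). */
    PetscCall(MatTransposeMatMult(Pe, Be, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &Ce));

    /* Convert the result back to the PETSc dense layout. */
    PetscCall(MatConvert(Ce, MATMPIDENSE, MAT_INITIAL_MATRIX, C));

    PetscCall(MatDestroy(&Pe));
    PetscCall(MatDestroy(&Be));
    PetscCall(MatDestroy(&Ce));
    PetscFunctionReturn(PETSC_SUCCESS);
  }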
Hong