[petsc-dev] MatMatMult
Pierre Jolivet
Pierre.Jolivet at enseeiht.fr
Tue Jun 6 11:42:55 CDT 2017
On Tue, 6 Jun 2017 10:57:10 -0500, Hong wrote:
> Pierre:
>
>> Hello,
>> I'm implementing MatMatMult for A of type MPIBAIJ, B and C of type
>> MPIDENSE.
>> 1) I copied some code from the case where A is of type
>> MPIAIJ. Any reason why communications and computations are not
>> overlapped in the MatMatMult implementation?
>> http://www.mcs.anl.gov/petsc/petsc-current/src/mat/impls/aij/mpi/mpimatmatmult.c.html#line556 [1]
>
> Do you mean overlapping MPI_Isend()/MPI_Waitany() in
> MatMPIDenseScatter() with the local call
> MatMatMultNumeric_SeqAIJ_SeqDense()?
Yes.
> I guess you can implement the overlap. If it gives better
> performance, you can improve MatMatMultNumeric_MPIAIJ_MPIDense().
OK, I was wondering if you had done some benchmarking, but I guess I'll
do it.
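For reference, the kind of overlap I have in mind is sketched below: a
generic, self-contained MPI program (not PETSc code; all names such as
local_gemm are illustrative) that posts the transfers of the dense
blocks of B, computes the purely local contribution while messages are
in flight, and folds in remote blocks as they arrive with MPI_Waitany().

/* Generic sketch of communication/computation overlap for a distributed
 * C = A*B with dense B; not PETSc code, all names are illustrative.
 * Each rank owns one n x n block of B and one block row of A. */
#include <mpi.h>
#include <stdlib.h>

/* illustrative local kernel: C += A*B for n x n row-major blocks */
static void local_gemm(const double *A, const double *B, double *C, int n)
{
  for (int i = 0; i < n; i++)
    for (int k = 0; k < n; k++)
      for (int j = 0; j < n; j++)
        C[i*n+j] += A[i*n+k] * B[k*n+j];
}

int main(int argc, char **argv)
{
  int rank, size, n = 64;
  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  double *A = calloc((size_t)size*n*n, sizeof(*A)); /* my block row of A */
  double *B = calloc((size_t)n*n, sizeof(*B));      /* my block of B     */
  double *C = calloc((size_t)n*n, sizeof(*C));      /* my block of C     */
  double *recv = calloc((size_t)(size-1)*n*n, sizeof(*recv));
  MPI_Request *rreq = malloc((size_t)(size-1)*sizeof(*rreq));
  MPI_Request *sreq = malloc((size_t)(size-1)*sizeof(*sreq));

  /* 1) start all transfers of the dense blocks of B ... */
  for (int p = 0, r = 0; p < size; p++) {
    if (p == rank) continue;
    MPI_Irecv(recv + (size_t)r*n*n, n*n, MPI_DOUBLE, p, 0, MPI_COMM_WORLD, &rreq[r]);
    MPI_Isend(B, n*n, MPI_DOUBLE, p, 0, MPI_COMM_WORLD, &sreq[r]);
    r++;
  }

  /* 2) ... compute the purely local contribution while they are in flight ... */
  local_gemm(A + (size_t)rank*n*n, B, C, n);

  /* 3) ... then fold in remote blocks in whatever order they arrive */
  for (int done = 0; done < size-1; done++) {
    int r;
    MPI_Status st;
    MPI_Waitany(size-1, rreq, &r, &st);
    local_gemm(A + (size_t)st.MPI_SOURCE*n*n, recv + (size_t)r*n*n, C, n);
  }
  MPI_Waitall(size-1, sreq, MPI_STATUSES_IGNORE);

  free(A); free(B); free(C); free(recv); free(rreq); free(sreq);
  MPI_Finalize();
  return 0;
}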
Yes, of course I set (*C)->ops->matmultnumeric =
MatMatMultNumeric_MPIBAIJ_MPIDense in
MatMatMultSymbolic_MPIBAIJ_MPIDense.
However, the routine MatMatMultSymbolic_MPIBAIJ_MPIDense is never
reached when calling MatMatMult with scall == MAT_REUSE_MATRIX
(http://www.mcs.anl.gov/petsc/petsc-current/src/mat/interface/matrix.c.html#line9487).
Just to be sure, I added a dummy printf in
MatMatMultSymbolic_MPIBAIJ_MPIDense: nothing is displayed with
MAT_REUSE_MATRIX, while MAT_INITIAL_MATRIX works as intended.
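To make sure we are talking about the same mechanism, here is a
self-contained model of the dispatch as I understand it (illustrative
names and types, not PETSc source): the symbolic routine runs only on
MAT_INITIAL_MATRIX and records the numeric routine in the product, so
on MAT_REUSE_MATRIX only that stored pointer is called and the symbolic
routine is, by design, never reached.

/* Standalone model of the MatMatMult() reuse dispatch; illustrative
 * names and types, not PETSc source. */
#include <stdio.h>

typedef struct Product Product;
struct Product {
  void (*matmultnumeric)(Product *C); /* models (*C)->ops->matmultnumeric */
};

typedef enum { MAT_INITIAL_MATRIX, MAT_REUSE_MATRIX } Reuse;

static void numeric_mpibaij_mpidense(Product *C)
{
  (void)C;
  printf("numeric (MPIBAIJ x MPIDense)\n");
}

static void symbolic_mpibaij_mpidense(Product *C)
{
  printf("symbolic (MPIBAIJ x MPIDense)\n");   /* would allocate C here */
  C->matmultnumeric = numeric_mpibaij_mpidense; /* record numeric routine */
}

static void matmatmult(Reuse scall, Product *C)
{
  if (scall == MAT_INITIAL_MATRIX) symbolic_mpibaij_mpidense(C);
  C->matmultnumeric(C); /* on reuse, only the stored pointer is called */
}

int main(void)
{
  Product C = {0};
  matmatmult(MAT_INITIAL_MATRIX, &C); /* prints symbolic + numeric */
  matmatmult(MAT_REUSE_MATRIX, &C);   /* prints numeric only */
  return 0;
}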
Any other suggestion?
Thank you for your help,
Pierre
>> 2) I'm having trouble when scall == MAT_REUSE_MATRIX. Here,
>> http://www.mcs.anl.gov/petsc/petsc-current/src/mat/impls/dense/mpi/mpidense.c.html#line1208 [2]
>> it looks like the numeric part of the MatMatMult (which is called
>> when scall == MAT_REUSE_MATRIX) is hardwired to this routine:
>> http://www.mcs.anl.gov/petsc/petsc-current/src/mat/impls/aij/mpi/mpimatmatmult.c.html#line376 [3]
>> Thus, at runtime, this call fails.
>
> Links:
> ------
> [1] http://www.mcs.anl.gov/petsc/petsc-current/src/mat/impls/aij/mpi/mpimatmatmult.c.html#line556
> [2] http://www.mcs.anl.gov/petsc/petsc-current/src/mat/impls/dense/mpi/mpidense.c.html#line1208
> [3] http://www.mcs.anl.gov/petsc/petsc-current/src/mat/impls/aij/mpi/mpimatmatmult.c.html#line376