[petsc-dev] MatMatMult
Zhang, Hong
hzhang at mcs.anl.gov
Wed Jun 7 22:14:53 CDT 2017
Pierre:
You must call MatMatMultSymbolic_MPIBAIJ_MPIDense() to create the dense matrix C (= A*B) and an internal submatrix 'workB'.
'workB' is used (and reused) to store the off-process rows of B needed for the local part of the product C.
See MatMatMultSymbolic_MPIAIJ_MPIDense().
Without this local storage, the numeric product fails, as shown by your test.
Again, for parallel MatMatMult() you must let MatMatMultSymbolic() create C, then reuse it.
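In code, the calling sequence described above looks roughly like this (an untested sketch against the PETSc 3.7 Mat API; the wrapper name and matrix names are illustrative):

```c
#include <petscmat.h>

/* Sketch: compute C = A*B repeatedly with changing values in A and B
   but fixed sizes and (for AIJ) fixed nonzero pattern. */
PetscErrorCode repeated_products(Mat A, Mat B, Mat *C)
{
  PetscErrorCode ierr;

  /* First call: MAT_INITIAL_MATRIX runs the symbolic phase, which
     allocates C and the internal 'workB' buffer for off-process rows
     of B, and sets C's matmultnumeric function pointer. */
  ierr = MatMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, C);CHKERRQ(ierr);

  /* Later calls: MAT_REUSE_MATRIX reruns only the numeric phase,
     reusing C and workB. This is valid only because C was created by
     the MAT_INITIAL_MATRIX call above, not by the user directly. */
  ierr = MatMatMult(A, B, MAT_REUSE_MATRIX, PETSC_DEFAULT, C);CHKERRQ(ierr);
  return 0;
}
```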
Hong
________________________________________
From: Barry Smith [bsmith at mcs.anl.gov]
Sent: Wednesday, June 07, 2017 9:31 PM
To: Pierre Jolivet
Cc: Zhang, Hong; For users of the development version of PETSc
Subject: Re: [petsc-dev] MatMatMult
> On Jun 7, 2017, at 1:29 AM, Pierre Jolivet <pierre.jolivet at enseeiht.fr> wrote:
>
>>
>> On 6 Jun 2017, at 11:21 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>
>>>
>>> On Jun 6, 2017, at 1:52 PM, Pierre Jolivet <Pierre.Jolivet at enseeiht.fr> wrote:
>>>
>>> On Tue, 6 Jun 2017 13:44:06 -0500, Hong wrote:
>>>> Pierre :
>>>>
>>>>> Yes, of course I defined (*C)->ops->matmultnumeric =
>>>>> MatMatMultNumeric_MPIBAIJ_MPIDense in
>>>>> MatMatMultSymbolic_MPIBAIJ_MPIDense.
>>>>> However, the routine MatMatMultSymbolic_MPIBAIJ_MPIDense is never
>>>>> reached when calling MatMatMult with scall == MAT_REUSE_MATRIX
>>>>>
>>>>> (http://www.mcs.anl.gov/petsc/petsc-current/src/mat/interface/matrix.c.html#line9487;
>>>>> just to be sure, I added a dummy printf in
>>>>> MatMatMultSymbolic_MPIBAIJ_MPIDense and nothing is displayed with
>>>>> MAT_REUSE_MATRIX; MAT_INITIAL_MATRIX works as intended)
>>>>
>>>> MatMatMultSymbolic_xxx() is called only for MAT_INITIAL_MATRIX,
>>>> during which, it defines
>>>> (*C)->ops->matmultnumeric = MatMatMultNumeric_MPIBAIJ_MPIDense;
>>>
>>> Sorry, I was not thorough enough. What I meant to say is that I never call MatMatMult with MAT_INITIAL_MATRIX, so MatMatMultSymbolic_xxx is never called.
>>
>> It is not designed to work that way. The reuse is specifically to reuse a matrix obtained with a call with MAT_INITIAL_MATRIX, not to reuse an arbitrary matrix the user provides. The reason is that the symbolic stage may/does build additional data structures that are used for the multiplies and that the user cannot see or build directly.
>
> I’m doing:
> MatCreateDense(PETSC_COMM_WORLD, m, n, M, N, in, &B);
> MatCreateDense(PETSC_COMM_WORLD, m, n, M, N, out, &C);
> MatMatMult(A, B, MAT_REUSE_MATRIX, PETSC_DEFAULT, &C);
>
> This works perfectly fine with A of type MPIAIJ, but once again, everything is hardwired for this only type.
Hmm, I modified src/mat/examples/tests/ex109.c to create the dense D matrix with a MatDuplicate() instead of MAT_INITIAL_MATRIX, and when I run in parallel I get
$ petscmpiexec -n 2 ./ex109
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Container does not exist
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.6, unknown
[0]PETSC ERROR: ./ex109 on a arch-basic named Barrys-MacBook-Pro.local by barrysmith Wed Jun 7 21:24:59 2017
[0]PETSC ERROR: Configure options PETSC_ARCH=arch-basic
[0]PETSC ERROR: #1 MatMPIDenseScatter() line 487 in /Users/barrysmith/Src/petsc/src/mat/impls/aij/mpi/mpimatmatmult.c
[0]PETSC ERROR: #2 MatMatMultNumeric_MPIAIJ_MPIDense() line 557 in /Users/barrysmith/Src/petsc/src/mat/impls/aij/mpi/mpimatmatmult.c
[0]PETSC ERROR: #3 MatMatMult() line 9492 in /Users/barrysmith/Src/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: #4 main() line 79 in /Users/barrysmith/Src/petsc/src/mat/examples/tests/ex109.c
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -malloc_test
it does run sequentially.
> According to your answer, Barry, I think the manual should be adjusted accordingly, because it reads as if what I'm trying to achieve should be possible as is (the fact that it is possible with MPIAIJ is also puzzling):
> "In the special case where matrix B (and hence C) are dense you can create the correctly sized matrix C yourself and then call this routine with MAT_REUSE_MATRIX, rather than first having MatMatMult() create it for you. […]” (http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatMatMult.html)
Yes, you are right this documentation is wrong. This is only true for sequential dense matrices. I've fixed the manual page in master.
Barry
>
>>> Is there another way?
>>> I cannot use MAT_INITIAL_MATRIX because I want to manage the underlying memory, and I'm guessing MAT_INITIAL_MATRIX will delete whatever is in C and reallocate memory on its own.
>>
>> The memory in C is just a dense array so presumably you want to use some "application" memory for this space?
>>
>> The simplest way for you to achieve this is to call MatMatMultSymbolic(), then dig out the underlying array of the matrix, call PetscFree() on it, and put in your own array. Yes, this is slightly hacky, but it will work fine within the current PETSc model.
>
> Yeah, alright, I already need to tinker with the MatDensePlaceArray/MatDenseResetArray so I can figure something out as well for this.
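The MatDensePlaceArray()/MatDenseResetArray() route mentioned here could look roughly like the following (an untested sketch; the function name is illustrative, and it assumes the product is wanted in an application-owned array `out` of the right local size):

```c
#include <petscmat.h>

/* Sketch: let MatMatMult() build C (and its hidden internal data
   structures) once, then temporarily point C at application memory
   for subsequent reused products. */
PetscErrorCode product_into_user_array(Mat A, Mat B, PetscScalar *out, Mat *C)
{
  PetscErrorCode ierr;

  /* Symbolic + first numeric phase; PETSc allocates C's own array. */
  ierr = MatMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, C);CHKERRQ(ierr);

  /* Swap in the application's memory, recompute the product there... */
  ierr = MatDensePlaceArray(*C, out);CHKERRQ(ierr);
  ierr = MatMatMult(A, B, MAT_REUSE_MATRIX, PETSC_DEFAULT, C);CHKERRQ(ierr);

  /* ...and restore C's original array when done. */
  ierr = MatDenseResetArray(*C);CHKERRQ(ierr);
  return 0;
}
```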
> Thanks,
> Pierre
>
>> Barry
>>
>>>
>>> Thanks in advance for your help,
>>> Pierre
>>>
>>>> Then MatMatMult(A,B,MAT_REUSE_MATRIX,..) calls
>>>> (*(*C)->ops->matmultnumeric)(A,B,*C); (line 9432 in matrix.c)
>>>>
>>>> which should go to MatMatMultNumeric_MPIBAIJ_MPIDense.
>>>>
>>>> You may follow a debugging process
>>>> using petsc/src/mat/examples/tests/ex109.c
>>>>
>>>> Are you working on a branch of petsc? If so, I may take a look and
>>>> see what the problem is.
>>>>
>>>> Hong
>>>>
>>>>>>> 2) I'm having trouble when scall == MAT_REUSE_MATRIX. Here,
>>>>>>> http://www.mcs.anl.gov/petsc/petsc-current/src/mat/impls/dense/mpi/mpidense.c.html#line1208
>>>>>>> it looks like the numeric part of the MatMatMult (which is
>>>>>>> called when scall == MAT_REUSE_MATRIX) is hardwired to this
>>>>>>> routine:
>>>>>>> http://www.mcs.anl.gov/petsc/petsc-current/src/mat/impls/aij/mpi/mpimatmatmult.c.html#line376