[petsc-users] MatAssemblyEnd hangs during a parallel calculation with PETSc>3.3
Projet_TRIOU
triou at cea.fr
Thu Jan 16 04:11:56 CST 2014
Sorry, a typo: it is really MatMPISBAIJSetPreallocation().
Yes, I call MatMPISBAIJSetPreallocation() on ALL processes, and sometimes
a local part of the matrix has zero rows. It worked well with PETSc 3.3
and earlier versions in this particular case.
Sorry, it will be difficult for me to send you the whole code to reproduce
the problem.
Maybe we could reproduce the hang with one of the PETSc example tests?
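Something along these lines might work as a minimal sketch (untested, and the
sizes, block size, and values below are only placeholders): run it on 3
processes so that the last rank owns zero rows of an MPISBAIJ matrix, then see
whether MatAssemblyEnd() hangs on 3.4.3 but not on 3.3.

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscInt       nlocal, i, rstart, rend;
  PetscMPIInt    rank, size;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);

  /* The last rank owns zero rows; the others share a small global matrix */
  nlocal = (rank == size - 1) ? 0 : 4;

  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);CHKERRQ(ierr);
  ierr = MatSetType(A, MATMPISBAIJ);CHKERRQ(ierr);

  /* Preallocation is called on every rank, including the empty one */
  ierr = MatMPISBAIJSetPreallocation(A, 1, 2, NULL, 1, NULL);CHKERRQ(ierr);

  /* Insert only diagonal entries (upper triangle for SBAIJ) and assemble */
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}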
Pierre
> Hmm, sometimes you talk about MatMPIAIJSetPreallocation() and sometimes MatMPISBAIJSetPreallocation(); which one is it?
>
> If a process has zero rows that should not affect the nonew variable.
>
> It is crucial that the SetPreallocation routines be called on ALL processes that share the matrix, even if they have zero rows, but it shouldn’t matter what the values are for a matrix with zero rows.
>
> Can you send us your test case so we can track down the problem?
>
> Barry
>
>
>
> On Jan 15, 2014, at 5:57 AM, Projet_TRIOU <triou at cea.fr> wrote:
>
>> Hi all,
>>
>> I switched from PETSc 3.3 to PETSc 3.4.3 and all was fine except for
>> one of my test cases on 3 processors, where one processor was
>> handling an empty local part of the global matrix.
>>
>> My code hangs during the call to MatAssemblyEnd:
>>
>> ierr = MatMPIAIJSetPreallocation(MatricePetsc, PETSC_DEFAULT, d_nnz.addr(), PETSC_DEFAULT, o_nnz.addr());
>> ...
>> ierr = MatAssemblyEnd(MatricePetsc, MAT_FINAL_ASSEMBLY);
>>
>> When I debugged, I noticed on the empty processor that, in
>> src/mat/impls/aij/mpi/mpiaij.c:
>>
>> if (!((Mat_SeqAIJ*)aij->B->data)->nonew) {
>>   ierr = MPI_Allreduce(&mat->was_assembled,&other_disassembled,1,MPIU_BOOL,MPI_PROD,PetscObjectComm((PetscObject)mat));CHKERRQ(ierr);
>>   if (mat->was_assembled && !other_disassembled) {
>>     ierr = MatDisAssemble_MPIAIJ(mat);CHKERRQ(ierr);
>>   }
>> }
>>
>> ((Mat_SeqAIJ*)aij->B->data)->nonew was 0 on the "empty" processor
>> and -2 on the 2 others...
>>
>> I worked around the problem with a different call to MatMPISBAIJSetPreallocation():
>>
>> if (nb_rows==0) // Local matrix is empty
>>   ierr = MatMPISBAIJSetPreallocation(MatricePetsc, block_size_, 0, NULL, 0, NULL);
>> else
>>   ierr = MatMPISBAIJSetPreallocation(MatricePetsc, block_size_, PETSC_DEFAULT, d_nnz.addr(), PETSC_DEFAULT, o_nnz.addr());
>>
>> Now it runs well, so I don't know whether this is a PETSc regression or whether I was incorrectly
>> calling MatMPISBAIJSetPreallocation() with empty d_nnz/o_nnz arrays...
>>
>> Thanks,
>>
>> Pierre
>>
--
*Trio_U support team*
Marthe ROUX (Saclay)
Pierre LEDAC (Grenoble)