[petsc-users] MatAssemblyEnd hangs during a parallel calculation with PETSc>3.3
Matthew Knepley
knepley at gmail.com
Wed Jan 15 11:09:58 CST 2014
On Wed, Jan 15, 2014 at 5:57 AM, Projet_TRIOU <triou at cea.fr> wrote:
> Hi all,
>
> I switched from PETSc 3.3 to PETSc 3.4.3 and all was fine except for
> one of my test cases on 3 processors, where one processor was
> dealing with an empty local part of the global matrix.
>
> My code hangs during the call to MatAssemblyEnd():
>
> ierr = MatMPIAIJSetPreallocation(MatricePetsc, PETSC_DEFAULT, d_nnz.addr(),
>                                  PETSC_DEFAULT, o_nnz.addr());
> ...
> ierr = MatAssemblyEnd(MatricePetsc, MAT_FINAL_ASSEMBLY);
>
> When I debugged, I noticed on the empty processor that, in
> src/mat/impls/aij/mpi/mpiaij.c:
>
> if (!((Mat_SeqAIJ*)aij->B->data)->nonew) {
>   ierr = MPI_Allreduce(&mat->was_assembled,&other_disassembled,1,MPIU_BOOL,
>                        MPI_PROD,PetscObjectComm((PetscObject)mat));CHKERRQ(ierr);
>   if (mat->was_assembled && !other_disassembled) {
>     ierr = MatDisAssemble_MPIAIJ(mat);CHKERRQ(ierr);
>   }
> }
>
Good call. This is definitely a bug. I will get it fixed.
Matt
> ((Mat_SeqAIJ*)aij->B->data)->nonew was 0 on the "empty" processor
> and -2 on the 2 others...
>
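The hang is the classic pattern of a collective call guarded by a rank-local
condition: with nonew == 0 the "empty" rank enters the MPI_Allreduce() above,
while the ranks with nonew == -2 skip it, so the reduction never completes.
A minimal sketch of that pattern in plain MPI (hypothetical guard, not the
PETSc source itself) is:

/* A collective reached by only one rank: that rank blocks forever. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int rank, local_flag, global_flag;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  local_flag = (rank == 0);          /* only the "empty" rank's guard is true */
  if (local_flag) {
    /* rank 0 waits here for partners that never call the collective */
    MPI_Allreduce(&local_flag, &global_flag, 1, MPI_INT, MPI_PROD,
                  MPI_COMM_WORLD);
  }
  printf("rank %d passed the guard\n", rank);   /* never printed by rank 0 */

  MPI_Finalize();
  return 0;
}

Run on 3 ranks, ranks 1 and 2 print and move on while rank 0 sits in
MPI_Allreduce() forever, which mirrors the hang reported in MatAssemblyEnd().
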
> I bypassed my problem with a different call to MatMPIAIJSetPreallocation():
>
> if (nb_rows == 0) // Local matrix is empty
>   ierr = MatMPISBAIJSetPreallocation(MatricePetsc, block_size_, 0, NULL,
>                                      0, NULL);
> else
>   ierr = MatMPISBAIJSetPreallocation(MatricePetsc, block_size_, PETSC_DEFAULT,
>                                      d_nnz.addr(), PETSC_DEFAULT, o_nnz.addr());
>
> Now it runs well. So I don't know if this is a PETSc regression, or if I
> was wrong to call MatMPISBAIJSetPreallocation() with empty d_nnz/o_nnz
> arrays...
>
> Thanks,
>
> Pierre
>
>
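For anyone wanting to test this, here is a self-contained sketch that only
guesses at the configuration described above (3 ranks, rank 0 owning zero
rows, explicit d_nnz/o_nnz arrays with PETSC_DEFAULT for the scalar nz
arguments; the matrix name, sizes and AIJ format are made up, this is not the
poster's code):

/* Sketch: MPIAIJ matrix where one rank has an empty local part.
 * Intended for 3 MPI ranks; uses the 3.4-era ierr/CHKERRQ style. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscErrorCode ierr;
  PetscMPIInt    rank;
  PetscInt       nlocal, i, rstart, rend;
  PetscInt       d_nnz[4], o_nnz[4];      /* room for at most 4 local rows */

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

  nlocal = (rank == 0) ? 0 : 4;           /* rank 0 has an empty local part */
  for (i = 0; i < nlocal; i++) { d_nnz[i] = 1; o_nnz[i] = 0; }

  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);CHKERRQ(ierr);
  ierr = MatSetType(A, MATMPIAIJ);CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(A, PETSC_DEFAULT, d_nnz,
                                   PETSC_DEFAULT, o_nnz);CHKERRQ(ierr);

  /* Insert a diagonal entry in every locally owned row, then assemble.
     MatAssemblyBegin/End are collective: every rank must reach them. */
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    ierr = MatSetValue(A, i, i, 1.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

With the fix in place (or on PETSc 3.3) this should assemble cleanly on 3
ranks; whether it triggers the 3.4.3 hang depends on the internal nonew state
of the off-diagonal block, as described above.
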
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener