[petsc-users] MatAssemblyEnd hangs during a parallel calculation with PETSc>3.3

Matthew Knepley knepley at gmail.com
Wed Jan 15 14:31:49 CST 2014


On Wed, Jan 15, 2014 at 2:28 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>   Hmm, sometimes you talk about MatMPIAIJSetPreallocation() and sometimes
> MatMPISBAIJSetPreallocation(). Which one is it?
>
>   If a process has zero rows, that should not affect the nonew variable.
>
>    It is crucial that the SetPreallocation routines be called on ALL
> processes that share the matrix, even if they have zero rows, but it
> shouldn't matter what the values are on a process with zero rows.
>
>    Can you send us your test case so we can track down the problem?
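
A reduced stand-alone case in that spirit might look like the sketch below.
This is only a hypothetical reproducer, not Pierre's application code; the
sizes and nnz values are made up. Every rank calls MatMPIAIJSetPreallocation,
including the one whose local part is empty:

/* Hypothetical reduced case (not from the thread): run on 3 ranks so the
 * last rank owns zero rows; preallocation is still called on every rank. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscErrorCode ierr;
  PetscMPIInt    rank, size;
  PetscInt       nlocal, i, rstart, rend;
  PetscInt       d_nnz[4], o_nnz[4];

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size);CHKERRQ(ierr);

  nlocal = (rank == size - 1) ? 0 : 4;           /* last rank owns no rows */
  for (i = 0; i < nlocal; i++) { d_nnz[i] = 1; o_nnz[i] = 0; }

  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);CHKERRQ(ierr);
  ierr = MatSetType(A, MATMPIAIJ);CHKERRQ(ierr);
  /* called unconditionally on ALL ranks, the zero-row rank included */
  ierr = MatMPIAIJSetPreallocation(A, PETSC_DEFAULT, d_nnz, PETSC_DEFAULT, o_nnz);CHKERRQ(ierr);

  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    ierr = MatSetValue(A, i, i, 1.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Run with something like "mpiexec -n 3 ./reproducer" so that one rank really
has an empty local part; whether this stripped-down version triggers the same
hang as the full application is of course not guaranteed.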


I sent a mail to petsc-dev explaining the problem.

   Matt


>
>     Barry
>
>
>
> On Jan 15, 2014, at 5:57 AM, Projet_TRIOU <triou at cea.fr> wrote:
>
> > Hi all,
> >
> > I switched from PETSc 3.3 to PETSc 3.4.3 and all was fine except for
> > one of my test cases on 3 processors, where one processor was
> > dealing with an empty local part of the global matrix.
> >
> > My code hangs during the call to MatAssemblyEnd:
> >
> > ierr = MatMPIAIJSetPreallocation(MatricePetsc, PETSC_DEFAULT, d_nnz.addr(), PETSC_DEFAULT, o_nnz.addr());
> > ...
> > ierr = MatAssemblyEnd(MatricePetsc, MAT_FINAL_ASSEMBLY);
> >
> > When I debugged, I noticed on the empty processor that in
> > src/mat/impls/aij/mpi/mpiaij.c:
> >
> >  if (!((Mat_SeqAIJ*)aij->B->data)->nonew) {
> >    ierr = MPI_Allreduce(&mat->was_assembled,&other_disassembled,1,MPIU_BOOL,MPI_PROD,PetscObjectComm((PetscObject)mat));CHKERRQ(ierr);
> >    if (mat->was_assembled && !other_disassembled) {
> >      ierr = MatDisAssemble_MPIAIJ(mat);CHKERRQ(ierr);
> >    }
> >
> > ((Mat_SeqAIJ*)aij->B->data)->nonew was 0 on the "empty" processor
> > and -2 on the 2 others...
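
That difference explains the hang: the MPI_Allreduce above is collective over
the matrix's communicator but sits behind the test on ->nonew, so the rank
where nonew is 0 enters the reduction while the two ranks where it is -2 skip
it, and the reduction can never complete. The same pattern in a toy program
(plain MPI, not PETSc source; the flag values just mimic the 0 / -2 situation):

/* Toy illustration only: a collective guarded by a rank-dependent
 * condition deadlocks, because only some ranks ever enter it. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int rank, flag, in = 1, out = 0;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  flag = (rank == 0) ? 0 : -2;   /* plays the role of ->nonew */

  if (!flag) {
    /* only rank 0 reaches this collective, so it blocks forever */
    MPI_Allreduce(&in, &out, 1, MPI_INT, MPI_PROD, MPI_COMM_WORLD);
  }

  printf("rank %d past the reduction\n", rank);  /* rank 0 never prints this */
  MPI_Finalize();
  return 0;
}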
> >
> > I bypassed my problem with a different call to MatMPIAIJSetPreallocation():
> >
> >      if (nb_rows==0) // Local matrix is empty
> >         ierr = MatMPISBAIJSetPreallocation(MatricePetsc, block_size_, 0, NULL, 0, NULL);
> >      else
> >         ierr = MatMPISBAIJSetPreallocation(MatricePetsc, block_size_, PETSC_DEFAULT, d_nnz.addr(), PETSC_DEFAULT, o_nnz.addr());
> >
> > Now it runs well, so I don't know whether this is a PETSc regression or whether
> > I was incorrectly calling MatMPISBAIJSetPreallocation with empty d_nnz/o_nnz arrays...
> >
> > Thanks,
> >
> > Pierre
> >
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener