[petsc-users] A bad commit affects MOOSE

Stefano Zampini stefano.zampini at gmail.com
Tue Apr 3 16:04:48 CDT 2018


What about

PetscCommGetPkgComm(MPI_Comm comm, const char* package, MPI_Comm* pkgcomm)

with a key for each of the external packages PETSc can use?
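For concreteness, here is a rough sketch of what such a getter could look like on top of MPI attribute caching. This is only my illustration, not existing PETSc code; the names PkgCommGet, PkgKeyval and pkgtab are made up. The idea is: one MPI keyval per package name, the cached attribute is a dup of the user communicator created on first use and reused afterwards.

#include <mpi.h>
#include <stdlib.h>
#include <string.h>

#define MAXPKG 8
static struct { const char *name; int keyval; } pkgtab[MAXPKG];
static int npkg = 0;

/* release the cached dup when the outer communicator is freed */
static int PkgCommDelete(MPI_Comm comm, int keyval, void *attr, void *extra)
{
  MPI_Comm *dup = (MPI_Comm *)attr;
  MPI_Comm_free(dup);
  free(dup);
  return MPI_SUCCESS;
}

/* look up (or create) the keyval associated with a package name */
static int PkgKeyval(const char *package)
{
  int i;
  for (i = 0; i < npkg; i++)
    if (!strcmp(pkgtab[i].name, package)) return pkgtab[i].keyval;
  MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN, PkgCommDelete,
                         &pkgtab[npkg].keyval, NULL);
  pkgtab[npkg].name = strdup(package);
  return pkgtab[npkg++].keyval;
}

int PkgCommGet(MPI_Comm comm, const char *package, MPI_Comm *pkgcomm)
{
  MPI_Comm *cached;
  int       keyval = PkgKeyval(package), found;

  MPI_Comm_get_attr(comm, keyval, &cached, &found);
  if (!found) {                    /* first request: dup once and cache it */
    cached = (MPI_Comm *)malloc(sizeof(MPI_Comm));
    MPI_Comm_dup(comm, cached);
    MPI_Comm_set_attr(comm, keyval, cached);
  }
  *pkgcomm = *cached;              /* every later request reuses the dup */
  return 0;
}

So e.g. PkgCommGet(PETSC_COMM_WORLD, "hypre", &hcomm) would call MPI_Comm_dup() only on the first request, and the cached dup is freed automatically when the outer communicator is freed.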


> On Apr 3, 2018, at 10:56 PM, Kong, Fande <fande.kong at inl.gov> wrote:
> 
> I think we could add an inner comm for each external package. If the same comm is passed in again, we just retrieve the same communicator for that external package instead of calling MPI_Comm_dup() (at least the HYPRE team claimed this would be fine). I have not seen any issue with this idea so far.
> 
> I might be missing something here.
> 
> 
> Fande,
> 
> On Tue, Apr 3, 2018 at 1:45 PM, Satish Balay <balay at mcs.anl.gov> wrote:
> On Tue, 3 Apr 2018, Smith, Barry F. wrote:
> 
> >
> >
> > > On Apr 3, 2018, at 11:59 AM, Balay, Satish <balay at mcs.anl.gov> wrote:
> > >
> > > On Tue, 3 Apr 2018, Smith, Barry F. wrote:
> > >
> > >>   Note that PETSc does one MPI_Comm_dup() for each hypre matrix. Internally hypre does at least one MPI_Comm_create() per hypre BoomerAMG solver. So even if PETSc does not do the MPI_Comm_dup(), you will still be limited by hypre's MPI_Comm_create().
> > >>
> > >>    I will compose an email to hypre cc:ing everyone to get information from them.
> > >
> > > Actually I don't see any calls to MPI_Comm_dup() in the hypre sources [there are stubs for it for a non-MPI build]
> > >
> > > There was that call to MPI_Comm_create() in the stack trace [via hypre_BoomerAMGSetup]
> >
> >    This is what I said: MPI_Comm_create() is called for each solver and hence consumes a slot for each solver.
> 
> Oops, sorry - misread the text.
> 
> Satish
> 
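
As a side note, the slot exhaustion described above is easy to reproduce outside PETSc. The following is just a standalone sketch of mine: with MPICH-derived MPI libraries the context-id pool is only a few thousand, so the loop fails quickly; other implementations may allow far more.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  MPI_Comm dup;
  int      i, err;

  MPI_Init(&argc, &argv);
  /* return errors instead of aborting so the failure point is visible */
  MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

  for (i = 0; i < 100000; i++) {
    err = MPI_Comm_dup(MPI_COMM_WORLD, &dup);   /* never freed, on purpose */
    if (err != MPI_SUCCESS) {
      printf("MPI_Comm_dup() failed after %d live duplicates\n", i);
      break;
    }
  }
  MPI_Finalize();
  return 0;
}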
