[petsc-users] Does petsc duplicate the users communicator?

Junchao Zhang junchao.zhang at gmail.com
Fri Jul 9 12:57:09 CDT 2021


On Fri, Jul 9, 2021 at 12:13 PM Kozdon, Jeremy (CIV) <jekozdon at nps.edu>
wrote:

> This is all super helpful! Thanks.
>
> It seems to me that we do not need to carry around a reference to the
> communicator in Julia then.
>
> Mainly I wanted to use `PetscObjectGetComm` everywhere once `PetscObjects`
> were created, but someone pointed out this might run into problems with
> the garbage collector.
>
> In my mind it makes sense to rely on `PetscObjectGetComm`, since you don’t
> know whether a given object is one you originally created or some derived
> object with a different processor distribution (such as what I believe
> happens with multigrid).
>
I think you are right.
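
A minimal sketch of that pattern in C (hedged: a tiny standalone program,
with error checking in the usual ierr/CHKERRQ style):

    #include <petscvec.h>

    int main(int argc, char **argv)
    {
      Vec            x;
      MPI_Comm       vcomm;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      ierr = VecCreate(PETSC_COMM_WORLD, &x);CHKERRQ(ierr);
      /* Recover the communicator actually attached to the object instead
         of caching the one originally passed in. */
      ierr = PetscObjectGetComm((PetscObject)x, &vcomm);CHKERRQ(ierr);
      ierr = VecDestroy(&x);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }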

>
> > On Jul 9, 2021, at 8:52 AM, Junchao Zhang <junchao.zhang at gmail.com>
> > wrote:
> >
> > On Thu, Jul 8, 2021 at 11:21 PM Barry Smith <bsmith at petsc.dev> wrote:
> >
> >    Whenever PETSc is handed a communicator it looks for an attribute
> > inside of the communicator that contains the "PETSc" version of that
> > communicator. If it does not find the attribute, it adds one containing
> > a new communicator; if it does find one, it increases its reference
> > count by one. The routine it uses to perform this is
> > PetscCommDuplicate(). We do it this way so that PETSc communication
> > will never interfere with the user's use of their communicators.
> > PetscCommDestroy() decreases the reference count of the inner
> > communicator by one. So, for example, if you use "comm" to create two
> > PETSc objects, PETSc will create an attribute on "comm" with a new
> > communicator; when both objects are destroyed, PetscCommDestroy() will
> > have been called twice and the inner (PETSc) communicator will be
> > destroyed.
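> >
> >    A minimal sketch of that life cycle (a fragment inside an
> > initialized PETSc program; error checking elided, and "comm" is assumed
> > to be a user MPI communicator):
> >
> >       Vec x, y;
> >       VecCreate(comm, &x); /* attribute added to comm; inner comm reference count = 1 */
> >       VecCreate(comm, &y); /* attribute found; reference count = 2 */
> >       VecDestroy(&x);      /* PetscCommDestroy(): reference count = 1 */
> >       VecDestroy(&y);      /* reference count = 0; inner comm destroyed */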
> >
> >   If someone did
> >
> >       Use MPI to create a new communicator
> >       VecCreate(comm,...)
> >       Use MPI to destroy the new communicator
> >       ....
> >       VecDestroy()
> >
> > The code above will work correctly. In "Use MPI to destroy the new
> > communicator", MPI finds that comm has an attribute
> > Petsc_InnerComm_keyval, so it invokes the PETSc function
> > Petsc_InnerComm_Attr_Delete_Fn (which was given to MPI at
> > PetscInitialize). Petsc_InnerComm_Attr_Delete_Fn cuts the link between
> > comm and its inner PETSc comm (which is still used by the Vec in this
> > example). The inner PETSc comm remains valid and accessible via
> > PetscObjectComm(). It will be destroyed when its reference count
> > (managed by PETSc) reaches zero (probably in VecDestroy).
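> >
> > A concrete C sketch of that sequence (a fragment; error checking
> > elided):
> >
> >     MPI_Comm newcomm;
> >     Vec      x;
> >     MPI_Comm_dup(MPI_COMM_WORLD, &newcomm); /* use MPI to create a new communicator */
> >     VecCreate(newcomm, &x);                 /* PETSc attaches its inner comm to newcomm */
> >     MPI_Comm_free(&newcomm);                /* invokes Petsc_InnerComm_Attr_Delete_Fn */
> >     /* ... x remains usable; PetscObjectComm((PetscObject)x) is still valid ... */
> >     VecDestroy(&x);                         /* inner comm reference count reaches zero */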
> >
> >
> > I am not sure what will happen, since PETSc keeps a reference to the
> > outer communicator from its own inner communicator, and destroying the
> > user communicator will cause an attempt to destroy the attribute
> > containing the inner PETSc communicator. I had always just assumed the
> > user would not delete any MPI communicators they made and passed to
> > PETSc until they were done with PETSc. It may work correctly, but it
> > may not.
> >
> > The reality is that very few MPI codes have complicated life cycles
> > for MPI communicators.
> >
> > Barry
> >
> >
> > > On Jul 8, 2021, at 10:17 PM, Kozdon, Jeremy (CIV) <jekozdon at nps.edu>
> > > wrote:
> > >
> > > Sorry if this is clearly stated somewhere in the docs; I'm still
> > > getting familiar with the PETSc codebase and was unable to find the
> > > answer by searching (nor could I determine where this would be done
> > > in the source).
> > >
> > > Does PETSc duplicate MPI communicators? Or does the user's program
> > > need to make sure that the communicator remains valid for the life of
> > > a PETSc object?
> > >
> > > The attached little test code seems to suggest that there is some
> > > duplication of MPI communicators behind the scenes.
> > >
> > > This came up when working on Julia wrappers for PETSc. (Julia has a
> > > garbage collector, so we need to make sure that references are
> > > properly kept when needed.)
> > >
> > > <try.c>
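> > >
> > > A hypothetical sketch of such a test (not the actual try.c): it
> > > compares the communicator recovered from the object with the one
> > > passed in:
> > >
> > >     MPI_Comm comm, vcomm;
> > >     Vec      x;
> > >     int      result;
> > >     MPI_Comm_dup(MPI_COMM_WORLD, &comm);
> > >     VecCreate(comm, &x);
> > >     PetscObjectGetComm((PetscObject)x, &vcomm);
> > >     MPI_Comm_compare(comm, vcomm, &result);
> > >     /* result == MPI_CONGRUENT (same group, different context) would
> > >        suggest PETSc attached a duplicated communicator rather than
> > >        using comm itself */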
> >
>
>

