[petsc-users] counter->tag = *maxval - 128
Fande Kong
fdkong.jd at gmail.com
Wed Jan 13 12:04:58 CST 2021
On Tue, Jan 12, 2021 at 6:49 PM Barry Smith <bsmith at petsc.dev> wrote:
>
> Fande,
>
> /* hope that any still active tags were issued right at the beginning
> of the run */
>
> PETSc actually starts with *maxval (see line 130). It is only when it
> runs out that it does this silly thing for the reason indicated in the
> comment.
>
> PETSc should actually keep track of which tags have been
> "returned" and, if the counter gets to zero, use those returned tags instead
> of starting again at the top, which could clash with the same value used for
> another reason. In other words, the current code is buggy, but it has
> always been "good enough".
>
I agree that it is "good enough" for most people in most cases. However,
I worry that once we reuse still-active tags there is almost no way to debug
the resulting message clashes, or at least it is not easy.
Thanks,
Fande
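The bookkeeping Barry describes could be sketched as a small free stack of returned tags that is consulted before any fresh tag is handed out. This is just an illustration with hypothetical names (`TagCounter`, `TagGet`, `TagReturn` are not PETSc's actual API), stripped of MPI so the logic stands alone:

```c
#include <assert.h>

#define TAG_MAX  32767  /* stand-in for *maxval obtained from MPI_TAG_UB */
#define FREE_CAP 1024

/* Hypothetical counter: a countdown plus a stack of returned tags. */
typedef struct {
  int next;            /* next fresh tag to hand out, counting down */
  int freed[FREE_CAP]; /* tags explicitly returned by their owners */
  int nfreed;
} TagCounter;

static void TagCounterInit(TagCounter *c) { c->next = TAG_MAX; c->nfreed = 0; }

/* Prefer a recycled tag; only hand out fresh ones while they last. */
static int TagGet(TagCounter *c)
{
  if (c->nfreed > 0) return c->freed[--c->nfreed];
  if (c->next < 1)  return -1; /* truly out of tags: fail instead of clashing */
  return c->next--;
}

static void TagReturn(TagCounter *c, int tag)
{
  if (c->nfreed < FREE_CAP) c->freed[c->nfreed++] = tag;
}
```

With this scheme a tag is only ever reissued after its owner has given it back, so the "hope that any still active tags were issued right at the beginning" heuristic is never needed; the cost is an extra array per communicator and the discipline of returning tags.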
>
> Barry
>
>
>
> On Jan 12, 2021, at 10:41 AM, Fande Kong <fdkong.jd at gmail.com> wrote:
>
> Hi All,
>
> I am curious: why do we subtract 128 from the max tag value? Can we
> use the max tag value directly?
>
> Thanks,
>
> Fande,
>
>
> PetscErrorCode PetscCommGetNewTag(MPI_Comm comm,PetscMPIInt *tag)
> {
>   PetscErrorCode   ierr;
>   PetscCommCounter *counter;
>   PetscMPIInt      *maxval,flg;
>
>   ierr = MPI_Comm_get_attr(comm,Petsc_Counter_keyval,&counter,&flg);CHKERRQ(ierr);
>   if (!flg) SETERRQ(PETSC_COMM_SELF,PETSC_ERR_ARG_CORRUPT,"Bad MPI communicator supplied; must be a PETSc communicator");
>
>   if (counter->tag < 1) {
>     ierr = PetscInfo1(NULL,"Out of tags for object, starting to recycle. Comm reference count %d\n",counter->refcount);CHKERRQ(ierr);
>     ierr = MPI_Comm_get_attr(MPI_COMM_WORLD,MPI_TAG_UB,&maxval,&flg);CHKERRQ(ierr);
>     if (!flg) SETERRQ(PETSC_COMM_SELF,PETSC_ERR_LIB,"MPI error: MPI_Comm_get_attr() is not returning a MPI_TAG_UB");
>     counter->tag = *maxval - 128; /* hope that any still active tags were issued right at the beginning of the run */
>   }
>
>   *tag = counter->tag--;
>   if (PetscDefined(USE_DEBUG)) {
>     /* Hanging here means that some processes have called PetscCommGetNewTag() and others have not. */
>     ierr = MPI_Barrier(comm);CHKERRQ(ierr);
>   }
>   return(0);
> }
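Stripped of MPI, the allocation scheme above amounts to the countdown below (the constant is illustrative; the real bound comes from the MPI_TAG_UB attribute, which the MPI standard only guarantees to be at least 32767):

```c
#include <assert.h>

#define MAXVAL 32767 /* illustrative stand-in for *maxval from MPI_TAG_UB */

static int counter_tag = MAXVAL; /* PETSc starts at the top... */

static int GetNewTag(void)
{
  if (counter_tag < 1) {
    /* ...and on exhaustion jumps back to 128 below the top. The offset
       skips the first 128 tags ever issued, which are the ones most
       likely still held by long-lived objects created at startup. */
    counter_tag = MAXVAL - 128;
  }
  return counter_tag--;
}
```

So after MAXVAL allocations the scheme reissues 32639, 32638, ... downward; subtracting 128 only protects the very first 128 tags, and any later tag that is still active can clash with a reissued one, which is the behavior Barry calls buggy but "good enough".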
>
>
>