[petsc-users] PetscSFReduceBegin can not handle MPI_CHAR?
Zhang, Junchao
jczhang at mcs.anl.gov
Thu Apr 4 17:34:57 CDT 2019
I updated the branch and made a PR. I tried MPI_SUM on MPI_CHAR: we do not implement UnpackAdd for that type, which is correct. Unfortunately, MPICH's MPI_Reduce_local did not report an error (it should have), so we did not generate an error either.
--Junchao Zhang
On Thu, Apr 4, 2019 at 10:37 AM Jed Brown <jed at jedbrown.org> wrote:
Fande Kong via petsc-users <petsc-users at mcs.anl.gov> writes:
> Hi Jed,
>
> One more question. Is it fine to use the same SF to exchange two groups of
> data at the same time? What is the better way to do this?
This should work because of the non-overtaking property defined by MPI: messages sent between the same pair of processes on the same communicator are matched in the order they were sent, so the two in-flight reductions cannot interfere with each other.
> Fande Kong,
>
> ierr = PetscSFReduceBegin(ptap->sf,MPIU_INT,rmtspace,space,MPIU_REPLACE);CHKERRQ(ierr);
> ierr = PetscSFReduceBegin(ptap->sf,MPI_CHAR,rmtspace2,space2,MPIU_REPLACE);CHKERRQ(ierr);
> Doing some calculations
> ierr = PetscSFReduceEnd(ptap->sf,MPIU_INT,rmtspace,space,MPIU_REPLACE);CHKERRQ(ierr);
> ierr = PetscSFReduceEnd(ptap->sf,MPI_CHAR,rmtspace2,space2,MPIU_REPLACE);CHKERRQ(ierr);
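For readers outside the thread, the pattern being asked about can be sketched as a self-contained routine. This is only an illustration of the quoted snippet: the function name `TwoReductions` and the buffer names (`leafints`, `rootints`, `leafchars`, `rootchars`) are hypothetical, and a previously created `PetscSF` is assumed. Note that MPIU_REPLACE is used, which PetscSF supports for MPI_CHAR; an arithmetic op such as MPI_SUM on MPI_CHAR is the unsupported case discussed above.

```c
#include <petscsf.h>

/* Sketch: two independent reductions in flight on the same PetscSF.
   The integer and char reductions use separate buffers, so MPI's
   non-overtaking guarantee keeps the two message streams matched
   correctly even though both use the same SF. */
PetscErrorCode TwoReductions(PetscSF sf,
                             const PetscInt *leafints, PetscInt *rootints,
                             const char *leafchars, char *rootchars)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  /* Start both reductions before finishing either one */
  ierr = PetscSFReduceBegin(sf,MPIU_INT,leafints,rootints,MPIU_REPLACE);CHKERRQ(ierr);
  ierr = PetscSFReduceBegin(sf,MPI_CHAR,leafchars,rootchars,MPIU_REPLACE);CHKERRQ(ierr);
  /* ... overlap local computation here ... */
  ierr = PetscSFReduceEnd(sf,MPIU_INT,leafints,rootints,MPIU_REPLACE);CHKERRQ(ierr);
  ierr = PetscSFReduceEnd(sf,MPI_CHAR,leafchars,rootchars,MPIU_REPLACE);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```

Each Begin/End pair must use the same datatype, buffers, and op; interleaving the Ends in the same order as the Begins, as above, matches the usage in the quoted code.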