[petsc-users] PetscSF Object on Distributed DMPlex for Halo Data Exchange
Toby Isaac
toby.isaac at gmail.com
Fri May 20 20:07:46 CDT 2022
The PetscSF that is created automatically is the "point sf" (
https://petsc.org/main/docs/manualpages/DM/DMGetPointSF/): it records which
local mesh points (cells, faces, edges, and vertices) are copies of points
owned by other processes.
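(If you just want to look at that automatically created point SF, an untested
snippet like the one below, assuming `dm` is your distributed DMPlex, will
print it. This is my own sketch, not taken from any PETSc example.)

    PetscSF pointSF;

    /* The point SF describes which local mesh points are copies of
       points owned by other ranks. */
    PetscCall(DMGetPointSF(dm, &pointSF));
    PetscCall(PetscSFView(pointSF, PETSC_VIEWER_STDOUT_WORLD));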
In a finite volume application we typically want to assign degrees of
freedom just to cells: some applications need only one degree of freedom per
cell, others need several.
You encode where you want degrees of freedom in a PetscSection and set that
as the section for the DM in DMSetLocalSection() (
https://petsc.org/release/docs/manualpages/DM/DMSetLocalSection.html)
(A C example of these steps that sets degrees of freedom for *vertices*
instead of cells is `src/dm/impls/plex/tutorials/ex7.c`.)
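Here is an untested sketch (mine, not from that example) of the cell-centered
case, assuming `dm` is your distributed DMPlex and `dof` is the number of
unknowns you want per cell:

    static PetscErrorCode SetCellDofs(DM dm, PetscInt dof)
    {
      PetscSection s;
      PetscInt     pStart, pEnd, cStart, cEnd, c;

      PetscFunctionBeginUser;
      /* The chart covers all mesh points; only cells get dofs here. */
      PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
      PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd)); /* height 0 = cells */
      PetscCall(PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s));
      PetscCall(PetscSectionSetChart(s, pStart, pEnd));
      for (c = cStart; c < cEnd; ++c) PetscCall(PetscSectionSetDof(s, c, dof));
      PetscCall(PetscSectionSetUp(s));
      PetscCall(DMSetLocalSection(dm, s));
      PetscCall(PetscSectionDestroy(&s));
      PetscFunctionReturn(0);
    }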
After that you can call DMGetSectionSF() (
https://petsc.org/main/docs/manualpages/DM/DMGetSectionSF/) to get the
PetscSF that you want for halo exchange: the one for your solution
variables.
After that, the only calls you typically need in a finite volume code are
PetscSFBcastBegin() to start a halo exchange and PetscSFBcastEnd() to
complete it.
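As a concrete (untested) sketch, assuming the section above has been set on
`dm` and that `gvec` and `lvec` come from DMCreateGlobalVector() and
DMCreateLocalVector(), a halo exchange could look like:

    static PetscErrorCode HaloExchange(DM dm, Vec gvec, Vec lvec)
    {
      PetscSF            sf;
      const PetscScalar *garray;
      PetscScalar       *larray;

      PetscFunctionBeginUser;
      PetscCall(DMGetSectionSF(dm, &sf));
      PetscCall(VecGetArrayRead(gvec, &garray));
      PetscCall(VecGetArray(lvec, &larray));
      /* Roots of the section SF are the global (owned) entries; leaves are
         the local entries, so the ghost copies receive the owners' values. */
      PetscCall(PetscSFBcastBegin(sf, MPIU_SCALAR, garray, larray, MPI_REPLACE));
      PetscCall(PetscSFBcastEnd(sf, MPIU_SCALAR, garray, larray, MPI_REPLACE));
      PetscCall(VecRestoreArray(lvec, &larray));
      PetscCall(VecRestoreArrayRead(gvec, &garray));
      PetscFunctionReturn(0);
    }

(This is essentially what DMGlobalToLocalBegin()/DMGlobalToLocalEnd() do for
you, so you can also just call those.)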
You say
> What I want to do is to exchange data (probably just MPI_Reduce)
which confuses me, because halo exchange is a point-to-point exchange and
not a reduction. Can you clarify?
On Fri, May 20, 2022 at 8:35 PM Mike Michell <mi.mike1021 at gmail.com> wrote:
> Dear PETSc developer team,
>
> Hi, I am using DMPlex for a finite-volume code and trying to understand
> the usage of PetscSF. What is a typical procedure for doing halo data
> exchange at a parallel boundary using a PetscSF object on a DMPlex? Is there
> any example I can refer to for the usage of PetscSF with a distributed DMPlex?
>
> Assuming to use the attached mock-up code and mesh, if I give
> "-dm_distribute_overlap 1 -over_dm_view" to run the code, I can see a
> PetscSF object is already created, although I have not called
> "PetscSFCreate" in the code. How can I import & use that PetscSF already
> created by the code to do the halo data exchange?
>
> What I want to do is to exchange data (probably just MPI_Reduce) in a
> parallel boundary region using PetscSF and its functions. I might need to
> have an overlapping layer or not.
>
> Thanks,
> Mike
>