[petsc-users] PetscSF Object on Distributed DMPlex for Halo Data Exchange

Mike Michell mi.mike1021 at gmail.com
Fri May 20 19:35:27 CDT 2022

Dear PETSc developer team,

Hi, I am using DMPlex for a finite-volume code and am trying to understand the
usage of PetscSF. What is the typical procedure for doing halo data exchange
at the parallel boundary using a PetscSF object on a DMPlex? Is there an
example I can refer to for using PetscSF with a distributed DMPlex?

Using the attached mock-up code and mesh, if I run with
"-dm_distribute_overlap 1 -over_dm_view", I can see that a PetscSF object has
already been created, although I have not called "PetscSFCreate" in the code.
How can I retrieve and use that PetscSF, already created by the code, to do
the halo data exchange?
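For reference, here is a minimal sketch of what I imagine retrieving that object could look like, assuming DMGetPointSF is the right accessor for the SF that distribution builds (the function name GetHaloSF and the -halo_sf_view option are just placeholders of mine):

```c
#include <petscdmplex.h>

/* Sketch: pull out the SF that DMPlexDistribute already built, rather
   than calling PetscSFCreate myself.  In the "point SF", leaves are
   the local copies of points owned elsewhere (ghosts) and roots are
   the owners' points. */
static PetscErrorCode GetHaloSF(DM dm)
{
  PetscSF sf;

  PetscFunctionBeginUser;
  PetscCall(DMGetPointSF(dm, &sf));
  /* inspect it, e.g. with -halo_sf_view on the command line */
  PetscCall(PetscSFViewFromOptions(sf, NULL, "-halo_sf_view"));
  PetscFunctionReturn(0);
}
```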

What I want to do is exchange data (probably just an MPI reduction) in the
parallel-boundary region using PetscSF and its functions. I may or may not
need an overlapping layer.
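Concretely, something like the following is what I have in mind (a rough sketch only; the array u, its one-scalar-per-point layout, and MPI_SUM as the op are placeholder assumptions of mine, not from the attached code):

```c
#include <petscdmplex.h>

/* Sketch: accumulate contributions from ghost (leaf) copies back onto
   the owning (root) points, then broadcast the owners' values back so
   every copy of a shared point agrees.  'u' is a placeholder array
   with one PetscScalar per local plex point; passing it as both root
   and leaf data works because roots and leaves index into the same
   local point range. */
static PetscErrorCode HaloExchange(DM dm, PetscScalar *u)
{
  PetscSF sf;

  PetscFunctionBeginUser;
  PetscCall(DMGetPointSF(dm, &sf));
  /* leaves -> roots: sum ghost contributions into the owned values */
  PetscCall(PetscSFReduceBegin(sf, MPIU_SCALAR, u, u, MPIU_SUM));
  PetscCall(PetscSFReduceEnd(sf, MPIU_SCALAR, u, u, MPIU_SUM));
  /* roots -> leaves: overwrite ghost values with the owners' results */
  PetscCall(PetscSFBcastBegin(sf, MPIU_SCALAR, u, u, MPI_REPLACE));
  PetscCall(PetscSFBcastEnd(sf, MPIU_SCALAR, u, u, MPI_REPLACE));
  PetscFunctionReturn(0);
}
```

Is this reduce-then-broadcast pattern the intended way to use the SF from the distributed DMPlex, or is there a preferred approach?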

-------------- next part --------------
A non-text attachment was scrubbed...
Name: Q_PetscSF.tar
Type: application/x-tar
Size: 133120 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20220520/eb28507d/attachment-0001.tar>
