[petsc-users] PetscSF Object on Distributed DMPlex for Halo Data Exchange
Matthew Knepley
knepley at gmail.com
Sat May 21 09:33:56 CDT 2022
On Fri, May 20, 2022 at 4:45 PM Mike Michell <mi.mike1021 at gmail.com> wrote:
> Thanks for the reply.
>
> > "What I want to do is to exchange data (probably just MPI_Reduce)" which
> confuses me, because halo exchange is a point-to-point exchange and not a
> reduction. Can you clarify?
> PetscSFReduceBegin/End seem to be the functions that do reductions on a
> PetscSF object. What I meant was either a reduction or an exchange, not
> specifically a "reduction".
>
> As a follow-up question:
> Assume that the code has its own local solution arrays (not PETSc
> types), and that the Plex's DAG indices belonging to the halo region are
> the only information I need (not the detailed section description, such
> as degrees of freedom on vertices, cells, etc.); I have another
> PetscSection for printing out my solution.
> Also, if I can convert those DAG indices into my local cell/vertex
> indices, can I just use the PetscSF object created from DMGetPointSF(),
> instead of "creating a PetscSection + DMGetSectionSF()"? In other words,
> can I use the PetscSF object returned by DMGetPointSF() for the halo
> communication?
>
No, because the point SF indexes information by mesh point number. You
would need to build a new SF that indexes your dofs. The steps you would
go through are exactly the same as if you just told us what the Section
is that indexes your data.
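As a rough, untested sketch (assuming one scalar dof per cell and a
distributed Plex `dm`; adjust the dof counts to whatever your scheme
needs), the steps look like:

  PetscSection s;
  PetscSF      sf;
  PetscInt     cStart, cEnd, c;

  /* Cells are the height-0 points of the Plex DAG */
  PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
  PetscCall(PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s));
  PetscCall(PetscSectionSetChart(s, cStart, cEnd));
  for (c = cStart; c < cEnd; ++c) PetscCall(PetscSectionSetDof(s, c, 1));
  PetscCall(PetscSectionSetUp(s));
  PetscCall(DMSetLocalSection(dm, s));
  PetscCall(PetscSectionDestroy(&s)); /* the DM keeps its own reference */
  /* The SF that indexes your dofs, built from the point SF + section */
  PetscCall(DMGetSectionSF(dm, &sf));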
Thanks,
Matt
> Thanks,
> Mike
>
>
>> The PetscSF that is created automatically is the "point sf" (
>> https://petsc.org/main/docs/manualpages/DM/DMGetPointSF/): it says which
>> mesh points (cells, faces, edges and vertices) are duplicates of others.
>>
>> In a finite volume application we typically want to assign degrees of
>> freedom just to cells: some applications may only have one degree of
>> freedom, others may have multiple.
>>
>> You encode where you want degrees of freedom in a PetscSection and set
>> that as the section for the DM in DMSetLocalSection() (
>> https://petsc.org/release/docs/manualpages/DM/DMSetLocalSection.html)
>>
>> (A C example of these steps, which sets degrees of freedom for
>> *vertices* instead of cells, is `src/dm/impls/plex/tutorials/ex7.c`.)
>>
>> After that you can call DMGetSectionSF() (
>> https://petsc.org/main/docs/manualpages/DM/DMGetSectionSF/) to get the
>> PetscSF that you want for halo exchange: the one for your solution
>> variables.
>>
>> After that, the only calls you typically need in a finite volume code
>> are PetscSFBcastBegin() to start a halo exchange and PetscSFBcastEnd()
>> to complete it.
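>> As a hedged sketch of those calls (the array names are illustrative:
>> `u_global` holds only the owned dofs in the global section layout, and
>> `u_local` holds owned + halo dofs in the local section layout; both are
>> PetscScalar arrays):
>>
>>   PetscSF sf;
>>   PetscCall(DMGetSectionSF(dm, &sf));
>>   /* Halo exchange: copy owned (root) values into halo (leaf) entries */
>>   PetscCall(PetscSFBcastBegin(sf, MPIU_SCALAR, u_global, u_local, MPI_REPLACE));
>>   PetscCall(PetscSFBcastEnd(sf, MPIU_SCALAR, u_global, u_local, MPI_REPLACE));
>>   /* The "reduce" direction: accumulate halo contributions onto owners */
>>   PetscCall(PetscSFReduceBegin(sf, MPIU_SCALAR, u_local, u_global, MPIU_SUM));
>>   PetscCall(PetscSFReduceEnd(sf, MPIU_SCALAR, u_local, u_global, MPIU_SUM));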
>>
>> You say
>>
>> > What I want to do is to exchange data (probably just MPI_Reduce)
>>
>> which confuses me, because halo exchange is a point-to-point exchange and
>> not a reduction. Can you clarify?
>>
>>
>>
>> On Fri, May 20, 2022 at 8:35 PM Mike Michell <mi.mike1021 at gmail.com>
>> wrote:
>>
>>> Dear PETSc developer team,
>>>
>>> Hi, I am using DMPlex for a finite-volume code and trying to
>>> understand the usage of PetscSF. What is a typical procedure for doing
>>> halo data exchange at a parallel boundary using a PetscSF object on a
>>> DMPlex? Is there an example I can refer to for the usage of PetscSF
>>> with a distributed DMPlex?
>>>
>>> Using the attached mock-up code and mesh, if I run with
>>> "-dm_distribute_overlap 1 -over_dm_view", I can see that a PetscSF
>>> object has already been created, although I have not called
>>> "PetscSFCreate" in the code. How can I access and use that PetscSF,
>>> which the code already created, to do the halo data exchange?
>>>
>>> What I want to do is exchange data (probably just MPI_Reduce) in a
>>> parallel boundary region using PetscSF and its functions. I may or may
>>> not need an overlap layer.
>>>
>>> Thanks,
>>> Mike
>>>
>>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/