[petsc-users] PetscSF Object on Distributed DMPlex for Halo Data Exchange

Toby Isaac toby.isaac at gmail.com
Sun May 22 13:56:29 CDT 2022


Here's a diagram of a 1D mesh with overlap and 3 partitions, showing what
the PetscSF data is for each (see the attached overlap-svg-diagram.png).
The number of roots is the number of mesh points in the local
representation, and the number of leaves is the number of mesh points that
are duplicates of mesh points on other processes. With that in mind,
answering your questions:

> (1) It seems that the "roots" means the number of vertices not considering
> the overlap layer, and the "leaves" seems to be the number of distributed
> vertices for each processor, including the overlap layer. Can you confirm
> that this understanding is correct? I have tried to find clearer examples
> in the PETSc team's articles on Star Forests, but I am still unclear about
> the exact relation & graphical notation of roots & leaves in an SF in the
> case of DMPlex solution arrays.

No, the number of roots for a DMPlex is the number of mesh points in the
local portion of the mesh; the leaves are the local mesh points that are
duplicates of points owned by other processes.

> (2) If that is so, there is an issue in that I cannot define the "root
> data" and "leaf data" generally. I am trying to follow
> "src/vec/is/sf/tutorials/ex1f.F90"; however, in that example, the sizes of
> the roots and leaves are predefined as 6. How can I generalize that? I can
> get the size of the leaves using the DAG depth (or height), which is equal
> to the number of vertices each proc has, but how can I get the size of my
> "roots" region from the SF? Is there any example of that? This question is
> connected to how I can define the "rootdata" for "PetscSFBcastBegin/End()".

Does the diagram help you generalize?
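
If it helps to see it in code, here is a minimal sketch in C (assuming
"dm" is your distributed DMPlex) of querying the sizes of the root and
leaf spaces directly from the SF, so nothing has to be hard-coded the way
the size 6 is in ex1f.F90:

  PetscSF            sf;
  PetscInt           nroots, nleaves;
  const PetscInt    *ilocal;
  const PetscSFNode *iremote;

  PetscCall(DMGetPointSF(dm, &sf));
  PetscCall(PetscSFGetGraph(sf, &nroots, &nleaves, &ilocal, &iremote));
  /* nroots  : number of mesh points in the local representation        */
  /* nleaves : number of local points that are duplicates of points on  */
  /*           other processes; ilocal[i] is the local index of leaf i  */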

> (3) More importantly, with the attached PetscSection & SF layout, my
> vector is only resolved for the size equal to the "number of roots" for
> each proc, but not for the overlapping area (i.e., the "leaves"). What I
> wish to do is to exchange (or reduce) the solution data between procs in
> the overlapping region. Can I get some advice on why my vector does not
> encompass the "leaves" region? Is there an example that does something
> similar?

Going back to my first response: if you use a section to say how many
pieces of data are associated with each local mesh point, then a PetscSF is
constructed for you that requires no further manipulation.
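
Concretely, here is a rough (untested) sketch in C of the whole sequence,
assuming a distributed DMPlex "dm" with one scalar unknown per vertex, as
in your finite-volume setup; "garray" and "larray" are placeholder names
for your own solution storage:

  PetscSection sec;
  PetscSF      sectionSF;
  PetscInt     pStart, pEnd, vStart, vEnd, p, nroots, nlocal;
  PetscScalar *garray, *larray;

  /* 1. Say how many dofs live on each mesh point: here, 1 per vertex */
  PetscCall(PetscSectionCreate(PetscObjectComm((PetscObject)dm), &sec));
  PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
  PetscCall(PetscSectionSetChart(sec, pStart, pEnd));
  PetscCall(DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd)); /* vertices */
  for (p = vStart; p < vEnd; p++) PetscCall(PetscSectionSetDof(sec, p, 1));
  PetscCall(PetscSectionSetUp(sec));
  PetscCall(DMSetLocalSection(dm, sec));

  /* 2. The SF over the dofs is now built for you */
  PetscCall(DMGetSectionSF(dm, &sectionSF));

  /* 3. Roots live in the global (owned) layout, leaves in the local
     layout, which includes the overlap; size the arrays accordingly.
     PetscSectionGetOffset(sec, p, &off) maps a mesh point p to its
     position in larray. */
  PetscCall(PetscSFGetGraph(sectionSF, &nroots, NULL, NULL, NULL));
  PetscCall(PetscSectionGetStorageSize(sec, &nlocal));
  PetscCall(PetscMalloc2(nroots, &garray, nlocal, &larray));

  /* 4. Halo exchange: copy owner values onto their ghost copies ... */
  PetscCall(PetscSFBcastBegin(sectionSF, MPIU_SCALAR, garray, larray, MPI_REPLACE));
  PetscCall(PetscSFBcastEnd(sectionSF, MPIU_SCALAR, garray, larray, MPI_REPLACE));

  /* ... or accumulate ghost contributions back onto the owners */
  PetscCall(PetscSFReduceBegin(sectionSF, MPIU_SCALAR, larray, garray, MPIU_SUM));
  PetscCall(PetscSFReduceEnd(sectionSF, MPIU_SCALAR, larray, garray, MPIU_SUM));

  PetscCall(PetscFree2(garray, larray));
  PetscCall(PetscSectionDestroy(&sec));

An array sized by the local section (larray above) does encompass the
"leaves" region, which is what your question (3) is after.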


On Sun, May 22, 2022 at 10:47 AM Mike Michell <mi.mike1021 at gmail.com> wrote:

> Thank you for the reply.
> The PetscSection and PetscSF objects are defined as in the attached
> mock-up code (Q_PetscSF_1.tar). 1 DOF is defined on each vertex, since my
> solution is determined on each vertex with 1 DOF from a finite-volume method.
>
> As follow up questions:
> (1) It seems that the "roots" means the number of vertices not considering
> the overlap layer, and the "leaves" seems to be the number of distributed
> vertices for each processor, including the overlap layer. Can you confirm
> that this understanding is correct? I have tried to find clearer examples
> in the PETSc team's articles on Star Forests, but I am still unclear about
> the exact relation & graphical notation of roots & leaves in an SF in the
> case of DMPlex solution arrays.
>
> (2) If that is so, there is an issue in that I cannot define the "root
> data" and "leaf data" generally. I am trying to follow
> "src/vec/is/sf/tutorials/ex1f.F90"; however, in that example, the sizes of
> the roots and leaves are predefined as 6. How can I generalize that? I can
> get the size of the leaves using the DAG depth (or height), which is equal
> to the number of vertices each proc has, but how can I get the size of my
> "roots" region from the SF? Is there any example of that? This question is
> connected to how I can define the "rootdata" for "PetscSFBcastBegin/End()".
>
> (3) More importantly, with the attached PetscSection & SF layout, my
> vector is only resolved for the size equal to the "number of roots" for
> each proc, but not for the overlapping area (i.e., the "leaves"). What I
> wish to do is to exchange (or reduce) the solution data between procs in
> the overlapping region. Can I get some advice on why my vector does not
> encompass the "leaves" region? Is there an example that does something
> similar?
>
> Thanks,
> Mike
>
>
>> On Fri, May 20, 2022 at 4:45 PM Mike Michell <mi.mike1021 at gmail.com>
>> wrote:
>>
>>> Thanks for the reply.
>>>
>>> > "What I want to do is to exchange data (probably just MPI_Reduce)"
>>> which confuses me, because halo exchange is a point-to-point exchange and
>>> not a reduction.  Can you clarify?
>>> PetscSFReduceBegin/End seem to be the functions that do reductions on a
>>> PetscSF object. What I intended to mention was either reduction or
>>> exchange, not specifically "reduction".
>>>
>>> As a follow-up question:
>>> Assume that the code has its own local solution arrays (not a PETSc
>>> type), and that the plex's DAG indices belonging to the halo region are
>>> the only information I want (not the detailed section description, such
>>> as degrees of freedom on vertices, cells, etc.); I have another
>>> PetscSection for printing out my solution.
>>> Also, if I can convert those DAG indices into my local cell/vertex
>>> indices, can I just use the PetscSF object created from DMGetPointSF(),
>>> instead of "creating a PetscSection + DMGetSectionSF()"? In other words,
>>> can I use the PetscSF object obtained from DMGetPointSF() for the halo
>>> communication?
>>>
>>
>> No, because that point SF will index information by point number. You
>> would need to build a new SF that indexes your dofs. The steps you would
>> go through are exactly the same as if you just told us what the Section
>> is that indexes your data.
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> Thanks,
>>> Mike
>>>
>>>
>>>> The PetscSF that is created automatically is the "point SF" (
>>>> https://petsc.org/main/docs/manualpages/DM/DMGetPointSF/): it says
>>>> which mesh points (cells, faces, edges and vertices) are duplicates of
>>>> others.
>>>>
>>>> In a finite volume application we typically want to assign degrees of
>>>> freedom just to cells: some applications may only have one degree of
>>>> freedom, others may have multiple.
>>>>
>>>> You encode where you want degrees of freedom in a PetscSection and set
>>>> that as the section for the DM in DMSetLocalSection() (
>>>> https://petsc.org/release/docs/manualpages/DM/DMSetLocalSection.html)
>>>>
>>>> (A c example of these steps that sets degrees of freedom for *vertices*
>>>> instead of cells is `src/dm/impls/plex/tutorials/ex7.c`)
>>>>
>>>> After that you can call DMGetSectionSF() (
>>>> https://petsc.org/main/docs/manualpages/DM/DMGetSectionSF/) to get the
>>>> PetscSF that you want for halo exchange: the one for your solution
>>>> variables.
>>>>
>>>> After that, the only calls you typically need in a finite volume code
>>>> are PetscSFBcastBegin() to start a halo exchange and PetscSFBcastEnd() to
>>>> complete it.
>>>>
>>>> You say
>>>>
>>>> > What I want to do is to exchange data (probably just MPI_Reduce)
>>>>
>>>> which confuses me, because halo exchange is a point-to-point exchange
>>>> and not a reduction.  Can you clarify?
>>>>
>>>>
>>>>
>>>> On Fri, May 20, 2022 at 8:35 PM Mike Michell <mi.mike1021 at gmail.com>
>>>> wrote:
>>>>
>>>>> Dear PETSc developer team,
>>>>>
>>>>> Hi, I am using DMPlex for a finite-volume code and trying to
>>>>> understand the usage of PetscSF. What is a typical procedure for doing halo
>>>>> data exchange at a parallel boundary using a PetscSF object on a DMPlex?
>>>>> Is there any example I can refer to for the usage of PetscSF with a
>>>>> distributed DMPlex?
>>>>>
>>>>> Using the attached mock-up code and mesh, if I run with
>>>>> "-dm_distribute_overlap 1 -over_dm_view", I can see that a PetscSF
>>>>> object is already created, although I have not called "PetscSFCreate"
>>>>> in the code. How can I access and use the PetscSF already created by
>>>>> the code to do the halo data exchange?
>>>>>
>>>>> What I want to do is to exchange data (probably just MPI_Reduce) in a
>>>>> parallel boundary region using PetscSF and its functions. I may or may
>>>>> not need an overlap layer.
>>>>>
>>>>> Thanks,
>>>>> Mike
>>>>>
>>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>
[Attachment: overlap-svg-diagram.png (PNG, 197 KB), the diagram of the 1D
mesh with overlap and 3 partitions referenced above:
http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20220522/5493d208/attachment-0001.png]

