[petsc-users] PetscSF Object on Distributed DMPlex for Halo Data Exchange

Matthew Knepley knepley at gmail.com
Tue May 31 09:06:01 CDT 2022


On Tue, May 31, 2022 at 10:04 AM Mike Michell <mi.mike1021 at gmail.com> wrote:

> As a follow-up question on your example, is it possible to call
> PetscSFCreateRemoteOffsets() from Fortran?
>
> My code is written in .F90, and there is no petscsf.h under "petsc/finclude/",
> so the code currently cannot find PetscSFCreateRemoteOffsets().
>

I believe if you pass in NULL for remoteOffsets, that function will be
called internally.
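
For reference, here is a minimal C sketch of that route (the helper name and
variables are mine, and I may have the root/leaf roles of the two sections
swapped; DMGetSectionSF() arranges all of this for you in any case):

  #include <petsc.h>

  /* Sketch: build an SF over dofs from the point SF plus two sections.
     Passing NULL for remoteOffsets lets PETSc compute the offsets
     internally (via PetscSFCreateRemoteOffsets()). */
  static PetscErrorCode BuildDofSF(DM dm, PetscSF *sectionSF)
  {
    PetscSF      pointSF;
    PetscSection localSec, globalSec;

    PetscFunctionBeginUser;
    PetscCall(DMGetPointSF(dm, &pointSF));
    PetscCall(DMGetLocalSection(dm, &localSec));
    PetscCall(DMGetGlobalSection(dm, &globalSec));
    PetscCall(PetscSFCreateSectionSF(pointSF, globalSec, NULL, localSec, sectionSF));
    PetscFunctionReturn(0);
  }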

  Thanks,

     Matt


> Thanks,
> Mike
>
> On Tue, May 24, 2022 at 8:46 PM Matthew Knepley <knepley at gmail.com> wrote:
>
>> I will also point out that Toby has created a nice example showing how to
>> create an SF for halo exchange between local vectors.
>>
>>   https://gitlab.com/petsc/petsc/-/merge_requests/5267
>>
>>   Thanks,
>>
>>      Matt
>>
>> On Sun, May 22, 2022 at 9:47 PM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>>> On Sun, May 22, 2022 at 4:28 PM Mike Michell <mi.mike1021 at gmail.com>
>>> wrote:
>>>
>>>> Thanks for the reply. The diagram makes sense and is helpful for
>>>> understanding the 1D representation.
>>>>
>>>> However, something is still unclear. From your diagram, the number of
>>>> roots per process seems to vary with run arguments such as
>>>> "-dm_distribute_overlap", because "the number of roots for a DMPlex is the
>>>> number of mesh points in the local portion of the mesh" (cited from your
>>>> answer to my question (1)) will end up changing with that argument.
>>>> However, in my mock-up code the number of roots is independent of the
>>>> -dm_distribute_overlap argument. The sum of the "number of roots" across
>>>> processes was always equal to the number of physical vertices in my mesh,
>>>> if I define the section layout on vertices with 1 DOF. But in your diagram
>>>> example, the sum of "nroots" is larger than the actual number of mesh
>>>> points, which is 13.
>>>>
>>>
>>> I do not understand your question. Notice that -dm_distribute_overlap
>>> does _not_ change the owned points for any process. It only puts in new
>>> leaves, so it also never changes the roots for this way of using the SF.
>>>
>>>
>>>> Also, it is still unclear how to get the size of the "roots" from the
>>>> PetscSection & PetscSF on a distributed DMPlex.
>>>>
>>>
>>> For an SF mapping ghost dofs in a global vector, the number of roots is
>>> just the size of the local portion of the vector.
>>>
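>>> If you want to check those sizes directly, a quick sketch (variable names
>>> are illustrative) would be:
>>>
>>>   PetscSF  sf;
>>>   PetscInt nroots, nleaves;
>>>
>>>   PetscCall(DMGetSectionSF(dm, &sf));
>>>   /* nroots is the size of the root (owned) space on this process,
>>>      nleaves is the number of leaf entries that receive data */
>>>   PetscCall(PetscSFGetGraph(sf, &nroots, &nleaves, NULL, NULL));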
>>>
>>>> In your diagram, how do you tell your code to allocate the arrays of size
>>>> "nroots=7 for P0, nroots=9 for P1, and nroots=7 for P2" before you call
>>>> PetscSFBcastBegin/End()? It seems that we need to define arrays of size
>>>> nroots & nleaves before calling PetscSFBcastBegin/End().
>>>>
>>>
>>> I just want to note that this usage is different from the canonical
>>> usage in Plex. It is fine to do this, but this will not match what I do in
>>> the library if you look.
>>> In Plex, I distinguish two linear spaces:
>>>
>>>   1) Global space: This is the vector space for the solvers. Each point
>>> is uniquely represented and owned by some process
>>>
>>>   2) Local space: This is the vector space for assembly. Some points are
>>> represented multiple times.
>>>
>>> I create an SF that maps from the global space (roots) to the local
>>> space (leaves), and it is called in DMGlobalToLocal() (and
>>> associated functions). This
>>> is more natural in FEM. You seem to want an SF that maps between global
>>> vectors. This will also work. The roots would be the local dofs, and the
>>> leaves
>>> would be shared dofs.
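>>>
>>> In code, the canonical Plex pattern is roughly this sketch (assuming a DM
>>> that already has a local section set):
>>>
>>>   Vec gvec, lvec;
>>>
>>>   PetscCall(DMCreateGlobalVector(dm, &gvec)); /* solver space, owned dofs */
>>>   PetscCall(DMCreateLocalVector(dm, &lvec));  /* assembly space, with ghosts */
>>>   /* fill the shared (ghost) dofs of lvec from their owners */
>>>   PetscCall(DMGlobalToLocalBegin(dm, gvec, INSERT_VALUES, lvec));
>>>   PetscCall(DMGlobalToLocalEnd(dm, gvec, INSERT_VALUES, lvec));
>>>   /* accumulate local contributions back into the global vector */
>>>   PetscCall(DMLocalToGlobalBegin(dm, lvec, ADD_VALUES, gvec));
>>>   PetscCall(DMLocalToGlobalEnd(dm, lvec, ADD_VALUES, gvec));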
>>>
>>>   Does this make sense?
>>>
>>>      Thanks,
>>>
>>>        Matt
>>>
>>>
>>>> Thanks,
>>>> Mike
>>>>
>>>>> Here's a diagram of a 1D mesh with overlap and 3 partitions, showing
>>>>> what the PetscSF data is for each.  The number of roots is the number of
>>>>> mesh points in the local representation, and the number of leaves is the
>>>>> number of mesh points that are duplicates of mesh points on other
>>>>> processes.  With that in mind, to answer your questions:
>>>>>
>>>>> > (1) It seems that the "roots" means the number of vertices not
>>>>> counting the overlap layer, and the "leaves" seems to be the number of
>>>>> distributed vertices on each processor, including the overlap layer. Can
>>>>> you confirm that this understanding is correct? I have tried to find
>>>>> clearer examples in the PETSc team's articles on Star Forest, but I am
>>>>> still unclear about the exact relation & graphical notation of roots &
>>>>> leaves in an SF in the case of DMPlex solution arrays.
>>>>>
>>>>> No, the number of roots for a DMPlex is the number of mesh points in
>>>>> the local portion of the mesh.
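>>>>>
>>>>> For the point SF specifically, a quick way to see this is the following
>>>>> sketch (names illustrative):
>>>>>
>>>>>   PetscSF  pointSF;
>>>>>   PetscInt pStart, pEnd, nroots;
>>>>>
>>>>>   PetscCall(DMGetPointSF(dm, &pointSF));
>>>>>   PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
>>>>>   PetscCall(PetscSFGetGraph(pointSF, &nroots, NULL, NULL, NULL));
>>>>>   /* nroots == pEnd - pStart: every local mesh point is a root */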
>>>>>
>>>>> > (2) If so, there is an issue that I cannot define the "root data"
>>>>> and "leaf data" generally. I am trying to follow
>>>>> "src/vec/is/sf/tutorials/ex1f.F90"; however, in that example the sizes of
>>>>> the roots and leaves are predefined as 6. How can I generalize that? I can
>>>>> get the size of the leaves using the DAG depth (or height), which is equal
>>>>> to the number of vertices each proc has. But how can I get the size of my
>>>>> "roots" region from the SF? Is there any example of that? This question is
>>>>> connected to how I can define the "rootdata" for PetscSFBcastBegin/End().
>>>>>
>>>>> Does the diagram help you generalize?
>>>>>
>>>>> > (3) More importantly, with the attached PetscSection & SF layout, my
>>>>> vector is only resolved to the size equal to the "number of roots" on each
>>>>> proc, but not for the overlapping area (i.e., the "leaves"). What I wish to
>>>>> do is to exchange (or reduce) the solution data between procs in the
>>>>> overlapping region. Can I get some advice on why my vector does not cover
>>>>> the "leaves" region? Is there any example doing something similar?
>>>>>
>>>>> Going back to my first response: if you use a section to say how many
>>>>> pieces of data are associated with each local mesh point, then a PetscSF is
>>>>> constructed that requires no more manipulation from you.
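>>>>>
>>>>> Once the section is set on the DM, the exchange itself is a sketch like
>>>>> this (names illustrative; the raw arrays could come from VecGetArray() or
>>>>> your own storage, and I am assuming a recent PETSc where
>>>>> PetscSFBcastBegin() takes an MPI_Op argument):
>>>>>
>>>>>   PetscSF      sf;
>>>>>   PetscScalar *rootdata, *leafdata; /* rootdata over owned dofs, leafdata over local dofs */
>>>>>
>>>>>   PetscCall(DMGetSectionSF(dm, &sf));
>>>>>   /* owner -> ghost: copy owned values into the overlap region */
>>>>>   PetscCall(PetscSFBcastBegin(sf, MPIU_SCALAR, rootdata, leafdata, MPI_REPLACE));
>>>>>   PetscCall(PetscSFBcastEnd(sf, MPIU_SCALAR, rootdata, leafdata, MPI_REPLACE));
>>>>>   /* ghost -> owner: sum overlap contributions back into owned values */
>>>>>   PetscCall(PetscSFReduceBegin(sf, MPIU_SCALAR, leafdata, rootdata, MPI_SUM));
>>>>>   PetscCall(PetscSFReduceEnd(sf, MPIU_SCALAR, leafdata, rootdata, MPI_SUM));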
>>>>>
>>>>>
>>>>> On Sun, May 22, 2022 at 10:47 AM Mike Michell <mi.mike1021 at gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Thank you for the reply.
>>>>>> The PetscSection and PetscSF objects are defined as in the attached
>>>>>> mock-up code (Q_PetscSF_1.tar). 1 DOF is defined on each vertex, as my
>>>>>> solution is determined on each vertex with 1 DOF from a finite-volume method.
>>>>>>
>>>>>> As follow up questions:
>>>>>> (1) It seems that the "roots" means the number of vertices not
>>>>>> counting the overlap layer, and the "leaves" seems to be the number of
>>>>>> distributed vertices on each processor, including the overlap layer. Can
>>>>>> you confirm that this understanding is correct? I have tried to find
>>>>>> clearer examples in the PETSc team's articles on Star Forest, but I am
>>>>>> still unclear about the exact relation & graphical notation of roots &
>>>>>> leaves in an SF in the case of DMPlex solution arrays.
>>>>>>
>>>>>> (2) If so, there is an issue that I cannot define the "root data"
>>>>>> and "leaf data" generally. I am trying to follow
>>>>>> "src/vec/is/sf/tutorials/ex1f.F90"; however, in that example the sizes of
>>>>>> the roots and leaves are predefined as 6. How can I generalize that? I can
>>>>>> get the size of the leaves using the DAG depth (or height), which is equal
>>>>>> to the number of vertices each proc has. But how can I get the size of my
>>>>>> "roots" region from the SF? Is there any example of that? This question is
>>>>>> connected to how I can define the "rootdata" for PetscSFBcastBegin/End().
>>>>>>
>>>>>> (3) More importantly, with the attached PetscSection & SF layout, my
>>>>>> vector is only resolved to the size equal to the "number of roots" on each
>>>>>> proc, but not for the overlapping area (i.e., the "leaves"). What I wish to
>>>>>> do is to exchange (or reduce) the solution data between procs in the
>>>>>> overlapping region. Can I get some advice on why my vector does not cover
>>>>>> the "leaves" region? Is there any example doing something similar?
>>>>>>
>>>>>> Thanks,
>>>>>> Mike
>>>>>>
>>>>>>
>>>>>>> On Fri, May 20, 2022 at 4:45 PM Mike Michell <mi.mike1021 at gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Thanks for the reply.
>>>>>>>>
>>>>>>>> > "What I want to do is to exchange data (probably just
>>>>>>>> MPI_Reduce)" which confuses me, because halo exchange is a point-to-point
>>>>>>>> exchange and not a reduction.  Can you clarify?
>>>>>>>> PetscSFReduceBegin/End seem to be the functions that do a reduction
>>>>>>>> for a PetscSF object. What I intended to say was either reduction or
>>>>>>>> exchange, not specifically "reduction".
>>>>>>>>
>>>>>>>> As a follow-up question:
>>>>>>>> Assume that the code has its own local solution arrays (not PETSc
>>>>>>>> types), and that the plex's DAG indices belonging to the halo region are
>>>>>>>> the only information I want to know (not the detailed section description,
>>>>>>>> such as degrees of freedom on vertices, cells, etc.). I have another
>>>>>>>> PetscSection for printing out my solution.
>>>>>>>> Also, if I can convert those DAG indices into my local cell/vertex
>>>>>>>> indices, can I just use the PetscSF object created from DMGetPointSF(),
>>>>>>>> instead of "creating a PetscSection + DMGetSectionSF()"? In other words,
>>>>>>>> can I use the PetscSF object obtained from DMGetPointSF() for the halo
>>>>>>>> communication?
>>>>>>>>
>>>>>>>
>>>>>>> No, because that point SF will index information by point number.
>>>>>>> You would need to build a new SF that indexes your dofs. The steps you
>>>>>>> would go through are exactly the same as if you just told us what the
>>>>>>> Section is that indexes your data.
>>>>>>>
>>>>>>>   Thanks,
>>>>>>>
>>>>>>>      Matt
>>>>>>>
>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Mike
>>>>>>>>
>>>>>>>>
>>>>>>>> The PetscSF that is created automatically is the "point sf" (
>>>>>>>>> https://petsc.org/main/docs/manualpages/DM/DMGetPointSF/): it
>>>>>>>>> says which mesh points (cells, faces, edges and vertices) are duplicates of
>>>>>>>>> others.
>>>>>>>>>
>>>>>>>>> In a finite volume application we typically want to assign degrees
>>>>>>>>> of freedom just to cells: some applications may only have one degree of
>>>>>>>>> freedom, others may have multiple.
>>>>>>>>>
>>>>>>>>> You encode where you want degrees of freedom in a PetscSection and
>>>>>>>>> set that as the section for the DM in DMSetLocalSection() (
>>>>>>>>> https://petsc.org/release/docs/manualpages/DM/DMSetLocalSection.html
>>>>>>>>> )
>>>>>>>>>
>>>>>>>>> (A C example of these steps that sets degrees of freedom for
>>>>>>>>> *vertices* instead of cells is `src/dm/impls/plex/tutorials/ex7.c`)
>>>>>>>>>
>>>>>>>>> After that you can call DMGetSectionSF() (
>>>>>>>>> https://petsc.org/main/docs/manualpages/DM/DMGetSectionSF/) to get
>>>>>>>>> the PetscSF that you want for halo exchange: the one for your solution
>>>>>>>>> variables.
>>>>>>>>>
>>>>>>>>> After that, the only calls you typically need in a finite volume
>>>>>>>>> code are PetscSFBcastBegin() to start a halo exchange and PetscSFBcastEnd()
>>>>>>>>> to complete it.
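>>>>>>>>>
>>>>>>>>> A condensed sketch of those steps, putting one dof on every vertex
>>>>>>>>> (variable names are placeholders):
>>>>>>>>>
>>>>>>>>>   PetscSection s;
>>>>>>>>>   PetscSF      sectionSF;
>>>>>>>>>   PetscInt     pStart, pEnd, vStart, vEnd, p;
>>>>>>>>>
>>>>>>>>>   PetscCall(PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s));
>>>>>>>>>   PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
>>>>>>>>>   PetscCall(DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd)); /* vertices */
>>>>>>>>>   PetscCall(PetscSectionSetChart(s, pStart, pEnd));
>>>>>>>>>   for (p = vStart; p < vEnd; ++p) PetscCall(PetscSectionSetDof(s, p, 1));
>>>>>>>>>   PetscCall(PetscSectionSetUp(s));
>>>>>>>>>   PetscCall(DMSetLocalSection(dm, s));
>>>>>>>>>   PetscCall(PetscSectionDestroy(&s));
>>>>>>>>>   PetscCall(DMGetSectionSF(dm, &sectionSF)); /* the SF for halo exchange */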
>>>>>>>>>
>>>>>>>>> You say
>>>>>>>>>
>>>>>>>>> > What I want to do is to exchange data (probably just MPI_Reduce)
>>>>>>>>>
>>>>>>>>> which confuses me, because halo exchange is a point-to-point
>>>>>>>>> exchange and not a reduction.  Can you clarify?
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, May 20, 2022 at 8:35 PM Mike Michell <
>>>>>>>>> mi.mike1021 at gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Dear PETSc developer team,
>>>>>>>>>>
>>>>>>>>>> Hi, I am using DMPlex for a finite-volume code and trying to
>>>>>>>>>> understand the usage of PetscSF. What is a typical procedure for doing halo
>>>>>>>>>> data exchange at a parallel boundary using a PetscSF object on a DMPlex? Is
>>>>>>>>>> there any example I can refer to for the usage of PetscSF with a distributed
>>>>>>>>>> DMPlex?
>>>>>>>>>>
>>>>>>>>>> Using the attached mock-up code and mesh, if I run the code with
>>>>>>>>>> "-dm_distribute_overlap 1 -over_dm_view", I can see that a
>>>>>>>>>> PetscSF object is already created, although I have not called
>>>>>>>>>> "PetscSFCreate" in the code. How can I import & use that PetscSF, which is
>>>>>>>>>> already created, to do the halo data exchange?
>>>>>>>>>>
>>>>>>>>>> What I want to do is to exchange data (probably just MPI_Reduce)
>>>>>>>>>> in a parallel boundary region using PetscSF and its functions. I may or
>>>>>>>>>> may not need an overlapping layer.
>>>>>>>>>>
>>>>>>>>>> Thanks,
>>>>>>>>>> Mike
>>>>>>>>>>
>>>>>>>>>
>>>>>>>
>>>>>>
>>>
>>>
>>
>>
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

