<div dir="ltr">As a follow-up question on your example, is it possible to call PetscSFCreateRemoteOffsets() from Fortran? <br><br>My code is written in .F90, and in "petsc/finclude/" there is no petscsf.h, so the code currently cannot find PetscSFCreateRemoteOffsets(). <br><br>Thanks,<br>Mike<br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Tue, May 24, 2022 at 8:46 PM Matthew Knepley <<a href="mailto:knepley@gmail.com">knepley@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">I will also point out that Toby has created a nice example showing how to create an SF for halo exchange between local vectors.<div><br></div><div> <a href="https://gitlab.com/petsc/petsc/-/merge_requests/5267" target="_blank">https://gitlab.com/petsc/petsc/-/merge_requests/5267</a></div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Sun, May 22, 2022 at 9:47 PM Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">On Sun, May 22, 2022 at 4:28 PM Mike Michell <<a href="mailto:mi.mike1021@gmail.com" target="_blank">mi.mike1021@gmail.com</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">Thanks for the reply. The diagram makes sense and is helpful for understanding the 1D representation. <br><br>However, something is still unclear. 
From your diagram, the number of roots per process seems to vary according to run arguments, such as "-dm_distribute_overlap", because "the number of roots for a DMPlex is the number of mesh points in the local portion of the mesh (cited from your answer to my question (1))" will end up changing according to that argument. However, in my mock-up code, the number of roots is independent of the -dm_distribute_overlap argument. The sum of the "number of roots" across processes was always equal to the number of physical vertices in my mesh, if I define the section layout on vertices with 1 DOF. But in your diagram example, the sum of "nroots" is larger than the actual number of mesh points, which is 13.<br></div></div></blockquote><div><br></div><div>I do not understand your question. Notice the -dm_distribute_overlap does _not_ change the owned points for any process. It only puts in new leaves, so it also never</div><div>changes the roots for this way of using the SF.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">Also, it is still unclear how to get the size of the "roots" from the PetscSection & PetscSF on a distributed DMPlex.</div></div></blockquote><div><br></div><div>For an SF mapping ghost dofs in a global vector, the number of roots is just the size of the local portion of the vector.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"> In your diagram, how do you tell your code to allocate the "nroots=7 for P0, nroots=9 for P1, and nroots=7 for P2" arrays before you call PetscSFBcastBegin/End()? 
It seems that we need to define arrays of size nroots & nleaves before calling PetscSFBcastBegin/End().<br></div></div></blockquote><div><br></div><div>I just want to note that this usage is different from the canonical usage in Plex. It is fine to do this, but it will not match what I do in the library if you look.</div><div>In Plex, I distinguish two linear spaces:</div><div><br></div><div> 1) Global space: This is the vector space for the solvers. Each point is uniquely represented and owned by some process.</div><div><br></div><div> 2) Local space: This is the vector space for assembly. Some points are represented multiple times.</div><div><br></div><div>I create an SF that maps from the global space (roots) to the local space (leaves), and it is called in DMGlobalToLocal() (and associated functions). This</div><div>is more natural in FEM. You seem to want an SF that maps between global vectors. This will also work. The roots would be the local dofs, and the leaves</div><div>would be shared dofs.</div><div><br></div><div> Does this make sense?</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">Thanks,<br>Mike<br></div><div class="gmail_quote"><div dir="ltr" class="gmail_attr"><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>Here's a diagram of a 1D mesh with overlap and 3 partitions, showing what the PetscSF data is for each. The number of roots is the number of mesh points in the local representation, and the number of leaves is the number of mesh points that are duplicates of mesh points on other processes. With that in mind, answering your questions</div><div><br></div><div>> (1) It seems that the "roots" means the number of vertices, not considering
the overlap layer, and "leaves" seems to be the number of distributed vertices for each processor, including the overlap layer. Can you confirm that this understanding is correct? I have tried to find clearer examples in the PETSc team's articles on Star Forests, but I am still unclear about the exact relation & graphical notation of roots & leaves in an SF in the case of DMPlex solution arrays. <br></div><div><br></div><div>No, the number of roots for a DMPlex is the number of mesh points in the local portion of the mesh.</div><div><br></div><div>> (2) If it is so, there is an issue that I cannot define "root data" and
"leaf data" generally. I am trying to follow "src/vec/is/sf/tutorials/ex1f.F90"; however, in that example, the sizes of the roots and leaves are predefined as 6. How can I generalize that? I can get the size of the leaves using the DAG depth (or height), which is equal to the number of vertices each proc has. But how can I get the size of my "roots" region from the SF? Any example about that? This question is connected to how I can define "rootdata" for "PetscSFBcastBegin/End()".</div><div><br></div><div>Does the diagram help you generalize?</div><div><br></div><div>> (3) More importantly, with the attached PetscSection & SF layout, my
vector is only sized to the "number of roots" on each proc, but not for the overlapping area (i.e., the "leaves"). What I wish to do is to exchange (or reduce) the solution data between processes in the overlapping region. Can I get some advice on why my vector does not encompass the "leaves" region? Is there any example doing similar things?</div><div></div><div>Going back to my first response: if you use a section to say how many pieces of data are associated with each local mesh point, then a PetscSF is constructed that requires no more manipulation from you.</div><div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Sun, May 22, 2022 at 10:47 AM Mike Michell <<a href="mailto:mi.mike1021@gmail.com" target="_blank">mi.mike1021@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">Thank you for the reply. <br>The PetscSection and PetscSF objects are defined as in the attached mock-up code (Q_PetscSF_1.tar). 1 DOF is defined on each vertex, as my solution is determined on each vertex with 1 DOF from a finite-volume method. <br><br>As follow-up questions: <br>(1) It seems that the "roots" means the number of vertices, not considering the overlap layer, and "leaves" seems to be the number of distributed vertices for each processor, including the overlap layer. Can you confirm that this understanding is correct? I have tried to find clearer examples in the PETSc team's articles on Star Forests, but I am still unclear about the exact relation & graphical notation of roots & leaves in an SF in the case of DMPlex solution arrays. <br><br>(2) If it is so, there is an issue that I cannot define "root data" and "leaf data" generally. I am trying to follow "src/vec/is/sf/tutorials/ex1f.F90"; however, in that example, the sizes of the roots and leaves are predefined as 6. How can I generalize that? I can get the size of the leaves using the DAG depth (or height), which is equal to the number of vertices each proc has. But how can I get the size of my "roots" region from the SF? Any example about that? 
This question is connected to how I can define "rootdata" for "PetscSFBcastBegin/End()".<br><br>(3) More importantly, with the attached PetscSection & SF layout, my vector is only sized to the "number of roots" on each proc, but not for the overlapping area (i.e., the "leaves"). What I wish to do is to exchange (or reduce) the solution data between processes in the overlapping region. Can I get some advice on why my vector does not encompass the "leaves" region? Is there any example doing similar things?<br><br>Thanks,<br>Mike<br></div><br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"><br>On Fri, May 20, 2022 at 4:45 PM Mike Michell <<a href="mailto:mi.mike1021@gmail.com" target="_blank">mi.mike1021@gmail.com</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">Thanks for the reply. <br><br>> "What I want to do is to exchange data (probably just MPI_Reduce)" which confuses me, because halo exchange is a point-to-point exchange and not a reduction. Can you clarify?<br>PetscSFReduceBegin/End seem to be the functions that do reduction for a PetscSF object. What I intended was either reduction or exchange, not specifically "reduction".<br><br>As a follow-up question: <br>Assuming that the code has its own local solution arrays (not PETSc types), the plex's DAG indices belonging to the halo region are the only information that I want to know (not the detailed section description, such as degrees of freedom on vertices, cells, etc.). I have another PetscSection for printing out my solution. 
<br>Also, if I can convert those DAG indices into my local cell/vertex indices, can I just use the PetscSF object created from DMGetPointSF(), instead of "creating PetscSection + DMGetSectionSF()"? In other words, can I use the PetscSF object declared from DMGetPointSF() for the halo communication?<br></div></div></blockquote><div><br></div><div>No, because that point SF will index information by point number. You would need to build a new SF that indexes your dofs. The steps you would</div><div>go through are exactly the same as if you just told us what the Section is that indexes your data.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">Thanks,<br>Mike<br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr"><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>The PetscSF that is created automatically is the "point sf" (<a href="https://petsc.org/main/docs/manualpages/DM/DMGetPointSF/" target="_blank">https://petsc.org/main/docs/manualpages/DM/DMGetPointSF/</a>): it says which mesh points (cells, faces, edges and vertices) are duplicates of others.</div><div><br></div><div>In a finite volume application we typically want to assign degrees of freedom just to cells: some applications may only have one degree of freedom, others may have multiple.</div><div><br></div><div>You encode where you want degrees of freedom in a PetscSection and set that as the section for the DM in DMSetLocalSection() (<a href="https://petsc.org/release/docs/manualpages/DM/DMSetLocalSection.html" target="_blank">https://petsc.org/release/docs/manualpages/DM/DMSetLocalSection.html</a>)</div><div><br></div><div>(A C example of these steps that sets degrees of freedom for *vertices* instead of 
cells is `src/dm/impls/plex/tutorials/ex7.c`)<br></div><div><br></div><div>After that you can call DMGetSectionSF() (<a href="https://petsc.org/main/docs/manualpages/DM/DMGetSectionSF/" target="_blank">https://petsc.org/main/docs/manualpages/DM/DMGetSectionSF/</a>) to get the PetscSF that you want for halo exchange: the one for your solution variables.</div><div><br></div><div>After that, the only calls you typically need in a finite volume code are PetscSFBcastBegin() to start a halo exchange and PetscSFBcastEnd() to complete it.</div><div><br></div><div>You say</div><div><br></div><div>> What I want to do is to exchange data (probably just MPI_Reduce)</div><div><br></div><div>which confuses me, because halo exchange is a point-to-point exchange and not a reduction. Can you clarify?<br> </div><div><br></div><div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, May 20, 2022 at 8:35 PM Mike Michell <<a href="mailto:mi.mike1021@gmail.com" target="_blank">mi.mike1021@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Dear PETSc developer team,<br><br>Hi, I am using DMPlex for a finite-volume code and trying to understand the usage of PetscSF. What is the typical procedure for doing halo data exchange at the parallel boundary using a PetscSF object on a DMPlex? Is there any example I can refer to for the usage of PetscSF with a distributed DMPlex? <br><br>Using the attached mock-up code and mesh, if I give "-dm_distribute_overlap 1 -over_dm_view" to run the code, I can see a PetscSF object is already created, although I have not called "PetscSFCreate" in the code. How can I import & use the PetscSF already created by the code to do the halo data exchange?<br><br>What I want to do is to exchange data (probably just MPI_Reduce) in a parallel boundary region using PetscSF and its functions. 
I may or may not need an overlap layer. <br><br>Thanks,<br>Mike<br></div>
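The procedure asked about here is answered piecewise in the replies above; collected into one place, it is: define a PetscSection (1 DOF per vertex in this thread's setting), set it as the local section, get the section SF, and broadcast. The following is an uncompiled, pseudocode-style C sketch, not a working program: `dm`, `rootdata`, and `leafdata` are assumed to already exist, and error checking is omitted; see `src/dm/impls/plex/tutorials/ex7.c` for a complete example.

```c
/* Sketch only: 1 DOF per vertex, then halo exchange via the section SF.
   Assumes an existing DMPlex `dm` and arrays `rootdata`/`leafdata`. */
PetscSection s;
PetscSF      sf;
PetscInt     vStart, vEnd, v;

DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);   /* depth 0 = vertices */
PetscSectionCreate(PETSC_COMM_WORLD, &s);
PetscSectionSetChart(s, vStart, vEnd);
for (v = vStart; v < vEnd; ++v) PetscSectionSetDof(s, v, 1);
PetscSectionSetUp(s);
DMSetLocalSection(dm, s);

DMGetSectionSF(dm, &sf);   /* SF over dofs, unlike the point SF */
PetscSFBcastBegin(sf, MPIU_SCALAR, rootdata, leafdata, MPI_REPLACE);
PetscSFBcastEnd(sf, MPIU_SCALAR, rootdata, leafdata, MPI_REPLACE);
```

The MPI_REPLACE final argument to PetscSFBcastBegin/End follows the signature used in recent PETSc releases (3.15 and later); older releases omit it.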
</blockquote></div>
</blockquote></div></div>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>
</blockquote></div></div>
</blockquote></div>
</blockquote></div></div>
</blockquote></div></div>
</blockquote></div>
</blockquote></div>