[petsc-users] DMPlex: a Mapping Array between Natural and Distributed Cell Index

Matthew Knepley knepley at gmail.com
Mon Jul 18 15:50:13 CDT 2022


On Mon, Jul 18, 2022 at 12:14 PM Bora Jeong <boraj1021 at gmail.com> wrote:

> Thank you for the corrections. It works.
>
> However, I still have a problem.
> I tested the zero stratum for DMGetStratumIS() with the default depth label,
> since the graph of vertices is needed. Then PetscSFGetGraph() on the
> natural SF crashes. I checked via ISView() that pBcCompIS contains a proper
> set of integers.
>
> On the other hand, it works okay if DMGetStratumIS() is called with the dim
> stratum, i.e., the cell stratum.
> The problem is that when the dim stratum is used, the number of entries in
> the ileaves(i)% pointer does not match the actual number of vertices that
> each processor has. That is why I tried passing the zero stratum to
> DMGetStratumIS() instead of the cell stratum (i.e., "dim"), to see whether
> the graph changes. Can I get comments?
>

I cannot understand your question. I am not sure what you are using the set
of vertices or cells for.

  Thanks,

     Matt


> Best
>
> On Sun, Jul 17, 2022 at 9:59 AM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Fri, Jul 15, 2022 at 7:05 PM Bora Jeong <boraj1021 at gmail.com> wrote:
>>
>>> I found that iroots() and ileaves() have size nleaves and tested this
>>> through the attached sample code with the grid previously shared. Then I
>>> get the results written in the attached monitor file.
>>>
>>> It seems ileaves(i)%rank and ileaves(i)%index at line 127 of the
>>> attached code have garbage values, different from the values displayed by
>>> PetscSFView() at line 113.
>>>
>>> It is hard to understand why garbage values are returned from
>>> PetscSFGetGraph(). Any comments would be very much appreciated.
>>>
>>
>> Unfortunately, Fortran is not very good at reporting declaration errors.
>> The problem is that you did not include or use the Vec module. I have done
>> this and your example runs for me. I have included the modified code.
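>>
>> For reference, the top of the file needs something along these lines (a
>> sketch for the PETSc 3.17 Fortran modules; the program name is just a
>> placeholder):
>>
>>       program sftest
>> #include <petsc/finclude/petscdmplex.h>
>> #include <petsc/finclude/petscvec.h>
>>       use petscdmplex  ! DM/DMPlex interfaces used elsewhere in the example
>>       use petscvec     ! Vec module: provides PetscSF and type(PetscSFNode) in Fortran
>>       implicit none
>>       PetscErrorCode :: ierr
>>
>>       call PetscInitialize(PETSC_NULL_CHARACTER, ierr);CHKERRA(ierr)
>>       ! ... DM creation, distribution, and the PetscSFGetGraph() calls go here ...
>>       call PetscFinalize(ierr)
>>       end program sftest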
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> Best
>>>
>>> On Fri, Jul 15, 2022 at 3:53 PM Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> On Fri, Jul 15, 2022 at 2:25 PM Bora Jeong <boraj1021 at gmail.com> wrote:
>>>>
>>>>> Thank you for the comments. The updated sample code is attached here;
>>>>> DMPlexDistribute() is now called explicitly, and by defining a section
>>>>> before mesh distribution a natural SF could be created, as confirmed by
>>>>> PetscSFView().
>>>>>
>>>>> However, I am still struggling to call PetscSFGetGraph()
>>>>> https://petsc.org/main/docs/manualpages/PetscSF/PetscSFGetGraph/
>>>>> due to the data type definitions of ilocal and iremote. What is the
>>>>> proper size allocation for those two variables? Is it the number of
>>>>> processors, or the number of vertices?
>>>>>
>>>>
>>>> Since we need to pass back arrays, you need to pass us F90 pointers.
>>>> Here is an example of doing such a thing:
>>>>
>>>>
>>>> https://gitlab.com/petsc/petsc/-/blob/main/src/vec/is/sf/tutorials/ex1f.F90#L94
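>>>>
>>>> That is, ilocal and iremote are not allocated by you at all; they are F90
>>>> pointers that PetscSFGetGraph() points at PETSc's own arrays of length
>>>> nleaves (ilocal may not be set when the leaves are contiguous). A rough
>>>> sketch of the declarations and the call:
>>>>
>>>>       PetscInt                   :: nroots, nleaves
>>>>       PetscInt, pointer          :: ilocal(:)   ! set by PetscSFGetGraph(), do not allocate
>>>>       type(PetscSFNode), pointer :: iremote(:)  ! set by PetscSFGetGraph(), do not allocate
>>>>       PetscErrorCode             :: ierr
>>>>
>>>>       call PetscSFGetGraph(sf_nat, nroots, nleaves, ilocal, iremote, ierr);CHKERRA(ierr)
>>>>       ! iremote(i)%rank / iremote(i)%index give the owner and root index of leaf i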
>>>>
>>>>   Thanks,
>>>>
>>>>       Matt
>>>>
>>>>
>>>>> Best
>>>>>
>>>>> On Fri, Jul 15, 2022 at 9:09 AM Matthew Knepley <knepley at gmail.com>
>>>>> wrote:
>>>>>
>>>>>> On Fri, Jul 15, 2022 at 8:46 AM Bora Jeong <boraj1021 at gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Okay, I see what's happening now. First, this very useful
>>>>>>> functionality of PETSc (the natural-global conversion) certainly needs
>>>>>>> to be documented better. Currently, it is very difficult to use or
>>>>>>> follow this feature as an external user. I hope this will improve.
>>>>>>>
>>>>>>> My next question is this: if I load and distribute the mesh exactly
>>>>>>> as in the sample code, can I assume that my code never creates a
>>>>>>> natural SF during the distribution? In other words, can I always get
>>>>>>> the natural order of each node by simply stacking the preceding
>>>>>>> processors' node counts? For example, for proc=0 the natural node IDs
>>>>>>> might be just 1 to nnode_proc_0, for proc=1 they might be
>>>>>>> {nnode_proc_0 + 1 to nnode_proc_0 + nnode_proc_1}, and so on.
>>>>>>>
>>>>>>> Does that always hold in this case?
>>>>>>>
>>>>>>
>>>>>> No, but if you call DMPlexDistribute() yourself, rather than having
>>>>>> it called automatically by DMSetFromOptions(), you can
>>>>>> preserve the mapping:
>>>>>>
>>>>>>   https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexDistribute/
>>>>>>
>>>>>> The SF returned maps the original point distribution, which is in the
>>>>>> same order as the file, to the redistributed points, which are determined
>>>>>> by the mesh partitioner.
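>>>>>>
>>>>>> A rough sketch of that call sequence (here "overlap" is a PetscInt set
>>>>>> to 0, migrationSF is a PetscSF, and dmDist is a DM):
>>>>>>
>>>>>>       call DMPlexDistribute(dm, overlap, migrationSF, dmDist, ierr);CHKERRA(ierr)
>>>>>>       if (dmDist .ne. PETSC_NULL_DM) then
>>>>>>         ! migrationSF maps the original (file-ordered) points to the
>>>>>>         ! redistributed points chosen by the partitioner
>>>>>>         call DMDestroy(dm, ierr);CHKERRA(ierr)
>>>>>>         dm = dmDist
>>>>>>       end if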
>>>>>>
>>>>>>   Thanks,
>>>>>>
>>>>>>       Matt
>>>>>>
>>>>>>
>>>>>>> Best
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Jul 15, 2022 at 8:07 AM Matthew Knepley <knepley at gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> On Fri, Jul 15, 2022 at 7:17 AM Bora Jeong <boraj1021 at gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> A sample code that loads a dm from a sphere mesh and queries the
>>>>>>>>> natural SF is attached here. PetscSFView() reports that sf_nat is NULL.
>>>>>>>>>
>>>>>>>>
>>>>>>>> The Global-to-Natural mapping relates global dofs to natural dofs.
>>>>>>>> Thus, in order to compute this mapping the DM has to know
>>>>>>>> about the dof layout before distribution. This means you need to
>>>>>>>> set up a local section before distributing, as we do, for example, in
>>>>>>>> Plex test ex15. This makes things more complicated since everything
>>>>>>>> cannot be packaged up into DMSetFromOptions(), but we need
>>>>>>>> user input so I do not see a simpler way to do things.
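>>>>>>>>
>>>>>>>> Schematically, something like this (a sketch assuming one
>>>>>>>> cell-centered dof per cell; section is a PetscSection, dmDist a DM,
>>>>>>>> and c/cStart/cEnd/zero/one are PetscInt variables):
>>>>>>>>
>>>>>>>>       call DMSetUseNatural(dm, PETSC_TRUE, ierr);CHKERRA(ierr)
>>>>>>>>       ! local section set up before distribution: one dof on every cell
>>>>>>>>       call DMPlexGetHeightStratum(dm, zero, cStart, cEnd, ierr);CHKERRA(ierr)
>>>>>>>>       call PetscSectionCreate(PETSC_COMM_WORLD, section, ierr);CHKERRA(ierr)
>>>>>>>>       call PetscSectionSetChart(section, cStart, cEnd, ierr);CHKERRA(ierr)
>>>>>>>>       do c = cStart, cEnd - 1
>>>>>>>>         call PetscSectionSetDof(section, c, one, ierr);CHKERRA(ierr)
>>>>>>>>       end do
>>>>>>>>       call PetscSectionSetUp(section, ierr);CHKERRA(ierr)
>>>>>>>>       call DMSetLocalSection(dm, section, ierr);CHKERRA(ierr)
>>>>>>>>       call PetscSectionDestroy(section, ierr);CHKERRA(ierr)
>>>>>>>>       ! distributing after this also builds the natural SF
>>>>>>>>       call DMPlexDistribute(dm, zero, PETSC_NULL_SF, dmDist, ierr);CHKERRA(ierr)
>>>>>>>>       if (dmDist .ne. PETSC_NULL_DM) then
>>>>>>>>         call DMDestroy(dm, ierr);CHKERRA(ierr)
>>>>>>>>         dm = dmDist
>>>>>>>>       end if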
>>>>>>>>
>>>>>>>>   Thanks,
>>>>>>>>
>>>>>>>>     Matt
>>>>>>>>
>>>>>>>>
>>>>>>>>> Best
>>>>>>>>>
>>>>>>>>> On Fri, Jul 15, 2022 at 6:39 AM Matthew Knepley <knepley at gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>> On Thu, Jul 14, 2022 at 8:25 PM Bora Jeong <boraj1021 at gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>>> Okay, I checked it and you are correct. In my case, the natural
>>>>>>>>>>> node index can simply be found by stacking all the preceding
>>>>>>>>>>> processors' node counts for a particular processor, which is nice
>>>>>>>>>>> in its simplicity. However, one serious question is why this is
>>>>>>>>>>> happening in my code. In other words, why is the natural SF not
>>>>>>>>>>> created during the mesh distribution? My code needs to handle this
>>>>>>>>>>> natural indexing consistently for several different kinds of mesh
>>>>>>>>>>> files, so I need a guarantee of consistency in this behavior.
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> I can't tell what is going on in your code unless I can run it.
>>>>>>>>>> Do you have a simple example?
>>>>>>>>>>
>>>>>>>>>>   Thanks,
>>>>>>>>>>
>>>>>>>>>>      Matt
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>> Best,
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Jul 14, 2022 at 6:43 PM Matthew Knepley <
>>>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Jul 14, 2022 at 5:47 PM Bora Jeong <boraj1021 at gmail.com>
>>>>>>>>>>>> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Thank you for the comments. I get these errors when I call
>>>>>>>>>>>>> PetscSFView() after DMGetNaturalSF():
>>>>>>>>>>>>>
>>>>>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message
>>>>>>>>>>>>> --------------------------------------------------------------
>>>>>>>>>>>>> [2]PETSC ERROR: Null argument, when expecting valid pointer
>>>>>>>>>>>>> [2]PETSC ERROR: Null Pointer: Parameter # 1
>>>>>>>>>>>>> [2]PETSC ERROR: See https://petsc.org/release/faq/ for
>>>>>>>>>>>>> trouble shooting.
>>>>>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.17.0, unknown
>>>>>>>>>>>>> [2]PETSC ERROR: #1 PetscSFView() at
>>>>>>>>>>>>> [2]PETSC ERROR: #2 User provided function() at User file:0
>>>>>>>>>>>>> Abort(85) on node 2 (rank 0 in comm 16): application called
>>>>>>>>>>>>> MPI_Abort(MPI_COMM_SELF, 85) - process 0
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> Clearly NULL was returned, which means no natural SF was
>>>>>>>>>>>> created.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>> Below is the structure to load a mesh file from gmsh;
>>>>>>>>>>>>>
>>>>>>>>>>>>>   call DMCreate(PETSC_COMM_WORLD, dm, ierr);CHKERRA(ierr)
>>>>>>>>>>>>>   call DMSetType(dm, plex, ierr);CHKERRA(ierr)
>>>>>>>>>>>>>   call DMSetUseNatural(dm, PETSC_TRUE, ierr);CHKERRA(ierr)
>>>>>>>>>>>>>   call DMSetFromOptions(dm, ierr);CHKERRA(ierr)
>>>>>>>>>>>>>   call DMGetNaturalSF(dm, sf_nat, ierr);CHKERRA(ierr)
>>>>>>>>>>>>>   call PetscSFView(sf_nat, PETSC_VIEWER_STDOUT_WORLD,
>>>>>>>>>>>>> ierr);CHKERRA(ierr)
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> The natural SF is created during mesh distribution, which has
>>>>>>>>>>>> not happened here. This means that the order of cells is identical
>>>>>>>>>>>> to the order in the file the mesh was read from.
>>>>>>>>>>>>
>>>>>>>>>>>>   Thanks,
>>>>>>>>>>>>
>>>>>>>>>>>>       Matt
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>> Best
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Jul 14, 2022 at 10:49 AM Matthew Knepley <
>>>>>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Wed, Jul 13, 2022 at 10:17 PM Bora Jeong <
>>>>>>>>>>>>>> boraj1021 at gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Dear petsc team,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I am a user of DMPlex for a finite volume code, and I need to
>>>>>>>>>>>>>>> know the global index of each cell. Here the global index means
>>>>>>>>>>>>>>> the indexing found in the mesh file itself, before distribution
>>>>>>>>>>>>>>> over processors. The PETSc community seems to call this
>>>>>>>>>>>>>>> indexing "natural".
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> What I want to do is to create a local array (not a PETSc
>>>>>>>>>>>>>>> vector, but just an array variable in the program) to map the
>>>>>>>>>>>>>>> distributed cell ID to the natural cell ID, for example, an
>>>>>>>>>>>>>>> array "A";
>>>>>>>>>>>>>>> A(distributed_cell_ID) = natural_cell_ID
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> There are some PETSc functions that support mapping between
>>>>>>>>>>>>>>> global and natural vectors. However, I just need to define the
>>>>>>>>>>>>>>> array "A" as in the example above. What is a proper/smart way to
>>>>>>>>>>>>>>> achieve this? In other words, how can I extract the
>>>>>>>>>>>>>>> natural_cell_ID from a distributed local_cell_ID?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I turned on DMSetUseNatural(DM, PETSC_TRUE) before
>>>>>>>>>>>>>>> distribution, but after that, defining all the required section
>>>>>>>>>>>>>>> and star forest objects to get natural and global vectors does
>>>>>>>>>>>>>>> not seem a very direct way to reach my goal, which is just to
>>>>>>>>>>>>>>> extract the mapping array "A" above. Can I get any comments
>>>>>>>>>>>>>>> about it?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> There is only one thing created, the sfNatural PetscSF object,
>>>>>>>>>>>>>> which you can get with DMGetNaturalSF(). The roots of this SF
>>>>>>>>>>>>>> are the global numbers of the dofs stored in PETSc vectors, and
>>>>>>>>>>>>>> the leaves are the natural numbers of these dofs. Thus, when we
>>>>>>>>>>>>>> map global vectors to natural vectors in
>>>>>>>>>>>>>> DMPlexGlobalToNaturalBegin/End(), we call PetscSFBcastBegin/End().
>>>>>>>>>>>>>> Mapping natural to global, we call PetscSFReduceBegin/End(). You
>>>>>>>>>>>>>> could pull the information out of the SF using PetscSFGetGraph()
>>>>>>>>>>>>>> if you want.
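>>>>>>>>>>>>>>
>>>>>>>>>>>>>> For instance, to build your A(local_index) = natural_index array,
>>>>>>>>>>>>>> one possible sketch (untested, and it assumes the natural dofs of
>>>>>>>>>>>>>> each rank are contiguous, starting right after those of the lower
>>>>>>>>>>>>>> ranks, with contiguous leaves; sfNat is a PetscSF, leafNat and A
>>>>>>>>>>>>>> are allocatable PetscInt arrays, and the other variables are
>>>>>>>>>>>>>> PetscInt) is to give every leaf its natural number and reduce it
>>>>>>>>>>>>>> onto the roots:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>       call DMGetNaturalSF(dm, sfNat, ierr);CHKERRA(ierr)
>>>>>>>>>>>>>>       call PetscSFGetGraph(sfNat, nroots, nleaves, ilocal, iremote, ierr);CHKERRA(ierr)
>>>>>>>>>>>>>>       ! assumed offset of this rank in the natural numbering
>>>>>>>>>>>>>>       call MPI_Scan(nleaves, noffset, 1, MPIU_INTEGER, MPI_SUM, PETSC_COMM_WORLD, ierr)
>>>>>>>>>>>>>>       noffset = noffset - nleaves
>>>>>>>>>>>>>>       allocate(leafNat(nleaves), A(nroots))
>>>>>>>>>>>>>>       do i = 1, nleaves
>>>>>>>>>>>>>>         leafNat(i) = noffset + i - 1   ! 0-based natural number of leaf i
>>>>>>>>>>>>>>       end do
>>>>>>>>>>>>>>       A = -1
>>>>>>>>>>>>>>       ! push each natural number back to the global dof (root) it refers to
>>>>>>>>>>>>>>       call PetscSFReduceBegin(sfNat, MPIU_INTEGER, leafNat, A, MPI_REPLACE, ierr);CHKERRA(ierr)
>>>>>>>>>>>>>>       call PetscSFReduceEnd(sfNat, MPIU_INTEGER, leafNat, A, MPI_REPLACE, ierr);CHKERRA(ierr)
>>>>>>>>>>>>>>       ! A(j) now holds the natural index of local (distributed) dof j-1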
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>   Thanks,
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>     Matt
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>> Mo
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>

