[petsc-users] DMPlex: a Mapping Array between Natural and Distributed Cell Index

Bora Jeong boraj1021 at gmail.com
Fri Jul 15 08:46:39 CDT 2022


Okay, I understand what's happening now. First, this very valuable
functionality of PETSc (the natural-global conversion) really needs better
documentation. Currently, it is very difficult for an external user to use
or follow these features. I hope this will improve going forward.

Then my next question: if I load and distribute the mesh exactly as in the
attached sample code, can I assume that my code never creates a natural SF
during distribution? In other words, can I always recover the natural order
of each node simply by stacking the preceding processors' node counts? For
example, on proc 0 the natural node IDs would be just 1 to nnode_proc_0;
on proc 1 they would be {nnode_proc_0 + 1 to nnode_proc_0 + nnode_proc_1};
and so on.

Does that always make sense in this case?
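
For concreteness, here is a minimal sketch of the offset computation I have
in mind (in C, since that is PETSc's native API; my actual code is Fortran,
and nLocalNodes is a placeholder for whatever my code counts on each rank;
the "stacking" is just an exclusive prefix sum over ranks):

  PetscInt nLocalNodes = 0; /* set to this rank's owned-node count */
  PetscInt offset      = 0;

  /* Sum the node counts of all preceding ranks. On rank 0, MPI_Exscan
     leaves the receive buffer untouched, so offset stays 0. */
  MPI_Exscan(&nLocalNodes, &offset, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);

  /* Natural node IDs on this rank would then run from offset + 1 to
     offset + nLocalNodes (1-based, as in my example above). */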

Best


On Fri, Jul 15, 2022 at 8:07 AM Matthew Knepley <knepley at gmail.com> wrote:

> On Fri, Jul 15, 2022 at 7:17 AM Bora Jeong <boraj1021 at gmail.com> wrote:
>
>> Sample code that loads the dm and requests the natural SF for a sphere
>> mesh is attached here. PetscSFView() fails because sf_nat comes back NULL.
>>
>
> The Global-to-Natural mapping relates global dofs to natural dofs. Thus,
> in order to compute this mapping, the DM has to know
> about the dof layout before distribution. This means you need to set up a
> local section before distributing, as we do for example in
> Plex test ex15. This makes things more complicated, since not everything
> can be packaged up into DMSetFromOptions(), but we need
> user input, so I do not see a simpler way to do things.
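>
> Roughly, the pre-distribution sequence could look like the following C
> sketch of the ex15 idea (not a verbatim excerpt; one dof per cell is an
> assumption for a cell-centered finite volume layout):
>
>   DM           dmDist = NULL;
>   PetscSection s;
>   PetscInt     cStart, cEnd, c;
>
>   /* Chart over cells, one dof each, so the layout is known before
>      distribution */
>   PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
>   PetscCall(PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s));
>   PetscCall(PetscSectionSetChart(s, cStart, cEnd));
>   for (c = cStart; c < cEnd; ++c) PetscCall(PetscSectionSetDof(s, c, 1));
>   PetscCall(PetscSectionSetUp(s));
>   PetscCall(DMSetLocalSection(dm, s));
>   PetscCall(PetscSectionDestroy(&s));
>
>   /* Distribution is what creates sfNatural, given that
>      DMSetUseNatural(dm, PETSC_TRUE) was called */
>   PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
>   if (dmDist) { PetscCall(DMDestroy(&dm)); dm = dmDist; }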
>
>   Thanks,
>
>     Matt
>
>
>> Best
>>
>> On Fri, Jul 15, 2022 at 6:39 AM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>>> On Thu, Jul 14, 2022 at 8:25 PM Bora Jeong <boraj1021 at gmail.com> wrote:
>>>
>>>> Okay, I checked and you are correct. In my case the natural node index
>>>> can be recovered simply by stacking all the preceding processors' node
>>>> counts, which is convenient. However, one serious question is why this
>>>> is happening in my code. In other words, why is the natural SF not
>>>> created during the mesh distribution? My code needs to handle this
>>>> natural indexing consistently across several different kinds of mesh
>>>> files, so I need a guarantee that this behavior is consistent.
>>>>
>>>
>>> I can't tell what is going on in your code unless I can run it. Do you
>>> have a simple example?
>>>
>>>   Thanks,
>>>
>>>      Matt
>>>
>>>
>>>> Best,
>>>>
>>>> On Thu, Jul 14, 2022 at 6:43 PM Matthew Knepley <knepley at gmail.com>
>>>> wrote:
>>>>
>>>>> On Thu, Jul 14, 2022 at 5:47 PM Bora Jeong <boraj1021 at gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Thank you for the comments. I get these errors when I call
>>>>>> PetscSFView() after DMGetNaturalSF():
>>>>>>
>>>>>> [2]PETSC ERROR: --------------------- Error Message
>>>>>> --------------------------------------------------------------
>>>>>> [2]PETSC ERROR: Null argument, when expecting valid pointer
>>>>>> [2]PETSC ERROR: Null Pointer: Parameter # 1
>>>>>> [2]PETSC ERROR: See https://petsc.org/release/faq/ for trouble
>>>>>> shooting.
>>>>>> [2]PETSC ERROR: Petsc Release Version 3.17.0, unknown
>>>>>> [2]PETSC ERROR: #1 PetscSFView() at
>>>>>> [2]PETSC ERROR: #2 User provided function() at User file:0
>>>>>> Abort(85) on node 2 (rank 0 in comm 16): application called
>>>>>> MPI_Abort(MPI_COMM_SELF, 85) - process 0
>>>>>>
>>>>>
>>>>> Clearly NULL was returned, which means no natural SF was created.
>>>>>
>>>>>
>>>>>> Below is the code structure for loading a mesh file from Gmsh:
>>>>>>
>>>>>>   call DMCreate(PETSC_COMM_WORLD, dm, ierr);CHKERRA(ierr)
>>>>>>   call DMSetType(dm, plex, ierr);CHKERRA(ierr)
>>>>>>   call DMSetUseNatural(dm, PETSC_TRUE, ierr);CHKERRA(ierr)
>>>>>>   call DMSetFromOptions(dm, ierr);CHKERRA(ierr)
>>>>>>   call DMGetNaturalSF(dm, sf_nat, ierr);CHKERRA(ierr)
>>>>>>   call PetscSFView(sf_nat, PETSC_VIEWER_STDOUT_WORLD, ierr);CHKERRA(ierr)
>>>>>>
>>>>>
>>>>> The natural SF is created during mesh distribution. That has not
>>>>> happened here. This means that
>>>>> the order of cells is identical to the file it was read from.
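>>>>>
>>>>> The distribution step that would create it looks roughly like this in
>>>>> C (your snippet is Fortran, but the call sequence is analogous):
>>>>>
>>>>>   DM dmDist = NULL;
>>>>>
>>>>>   /* Distribute the serial mesh; this builds sfNatural because
>>>>>      DMSetUseNatural was turned on */
>>>>>   PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
>>>>>   if (dmDist) { PetscCall(DMDestroy(&dm)); dm = dmDist; }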
>>>>>
>>>>>   Thanks,
>>>>>
>>>>>       Matt
>>>>>
>>>>>
>>>>>> Best
>>>>>>
>>>>>> On Thu, Jul 14, 2022 at 10:49 AM Matthew Knepley <knepley at gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> On Wed, Jul 13, 2022 at 10:17 PM Bora Jeong <boraj1021 at gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Dear petsc team,
>>>>>>>>
>>>>>>>> I am using DMPlex in a finite volume code, and I need to know the
>>>>>>>> global index of each cell. Here, the global index means the indexing
>>>>>>>> found in the mesh file itself, before distribution over processors.
>>>>>>>> The PETSc community seems to call this indexing "natural".
>>>>>>>>
>>>>>>>> What I want to do is create a local array (not a PETSc vector, just
>>>>>>>> an ordinary array variable in the program) that maps the distributed
>>>>>>>> cell ID to the natural cell ID, for example an array "A":
>>>>>>>> A(distributed_node_ID) = natural_node_ID
>>>>>>>>
>>>>>>>> There are PETSc functions that support mapping between global and
>>>>>>>> natural vectors. However, I just need to define the array "A" as in
>>>>>>>> the example above. What is a proper/smart way to achieve this? In
>>>>>>>> other words, how can I extract the natural_cell_ID from a distributed
>>>>>>>> local_cell_ID?
>>>>>>>>
>>>>>>>> I turned on DMSetUseNatural(DM, PETSC_TRUE) before distribution, but
>>>>>>>> after that, defining all the section and star forest objects required
>>>>>>>> to get natural and global vectors does not seem like a direct route
>>>>>>>> to my goal, which is just to extract the mapping array "A" above.
>>>>>>>> Could I get some comments on this?
>>>>>>>>
>>>>>>>
>>>>>>> There is only one thing created, the sfNatural PetscSF object, which
>>>>>>> you can get with DMGetNaturalSF(). The roots of this SF are
>>>>>>> the global numbers of dofs stored in PETSc vectors, and the leaves
>>>>>>> are natural numbers for these dofs. Thus, when we map global
>>>>>>> vectors to natural vectors in DMPlexGlobalToNaturalBegin/End(), we
>>>>>>> call PetscSFBcastBegin/End(). Mapping natural to global we call
>>>>>>> PetscSFReduceBegin/End(). You could pull the information out of the
>>>>>>> SF using PetscSFGetGraph() if you want.
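>>>>>>>
>>>>>>> A minimal C sketch of pulling the graph out (the loop body is only a
>>>>>>> comment; how you store the pairs is up to you):
>>>>>>>
>>>>>>>   PetscSF            sf;
>>>>>>>   PetscInt           nroots, nleaves, i;
>>>>>>>   const PetscInt    *ilocal;
>>>>>>>   const PetscSFNode *iremote;
>>>>>>>
>>>>>>>   PetscCall(DMGetNaturalSF(dm, &sf));
>>>>>>>   PetscCall(PetscSFGetGraph(sf, &nroots, &nleaves, &ilocal, &iremote));
>>>>>>>   for (i = 0; i < nleaves; ++i) {
>>>>>>>     /* leaf i is natural dof ilocal[i] (or just i when ilocal is NULL);
>>>>>>>        its root is global dof iremote[i].index on rank iremote[i].rank */
>>>>>>>   }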
>>>>>>>
>>>>>>>   Thanks,
>>>>>>>
>>>>>>>     Matt
>>>>>>>
>>>>>>>
>>>>>>>> Regards
>>>>>>>> Mo
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>

