[petsc-users] local to global mapping for DMPlex

Matthew Knepley knepley at gmail.com
Wed Dec 20 16:51:27 CST 2017


I'm glad it worked. Maybe I should take what you have done and abstract it
into something generally useful for output.

  Thanks,

     Matt

On Wed, Dec 20, 2017 at 4:30 PM, Yann JOBIC <yann.jobic at univ-amu.fr> wrote:

> That's working just fine. I don't know whether this could be useful to
> someone; just in case, here is the methodology:
>
> I got the cell-vertex connectivity from the function
> DMPlexWriteTopology_Vertices_HDF5_Static in petsc/src/dm/impls/plex/plexhdf5.c.
> The interesting result is in cellIS.
>
> The vertex coordinates come from the function DMPlexWriteCoordinates_HDF5_Static
> in the same file. The global numbering is obtained by adding the position of
> the vertex in the Vec to the number of vertices owned by the previous processes.
> The cell global numbering is trivial: we use the function
> DMPlexGetCellNumbering.
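[Editor's note: the offset logic described above can be sketched in plain Python. This is a hypothetical helper, not PETSc code; `owned_counts` stands in for the per-rank vertex counts PETSc would accumulate, e.g. via a prefix sum.]

```python
# Sketch of the global vertex numbering described above (not PETSc code):
# each rank owns a contiguous block of global numbers, offset by the
# vertex counts of all lower-ranked processes.

def global_vertex_number(local_index, rank, owned_counts):
    """Global number of the vertex at `local_index` on `rank`,
    given owned_counts[r] = number of vertices owned by rank r."""
    offset = sum(owned_counts[:rank])
    return offset + local_index

# Example: 3 ranks owning 4, 3, and 2 vertices respectively.
counts = [4, 3, 2]
assert global_vertex_number(0, 0, counts) == 0  # first vertex on rank 0
assert global_vertex_number(0, 1, counts) == 4  # rank 1 starts after rank 0's 4
assert global_vertex_number(1, 2, counts) == 8  # rank 2 starts at 4 + 3
```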
>
> The global numbering is quite strange, but it is consistent. Here are some
> examples for 2x2 cells (4 vertices per cell).
> I attach a text file for easier reading.
> On 1 processor, we have:
> 6-----7-----8
> |     |     |
> |  2  |  3  |
> |     |     |
> 3-----4-----5
> |     |     |
> |  0  |  1  |
> |     |     |
> 0-----1-----2
>
> On 2 processors, we have:
> 0-----1-----2
> |     |     |
> |  0  |  1  |
> |     |     |
> 6-----7-----8
> |     |     |
> |  2  |  3  |
> |     |     |
> 3-----4-----5
> Cells 0 and 1 are on proc 0; 2 and 3 are on proc 1.
>
> On 4 processors, we have:
> 0-----1-----2
> |     |     |
> |  0  |  1  |
> |     |     |
> 7-----8-----4
> |     |     |
> |  3  |  2  |
> |     |     |
> 5-----6-----3
>
> One cell per proc: cell 0 on proc 0, cell 1 on proc 1, ...
>
> The attached text file (viewable with notepad++) contains all the ISView
> output, to aid understanding.
>
> Matthew, many thanks for the help!
>
> Regards,
>
> Yann
>
> Le 20/12/2017 à 17:40, Matthew Knepley a écrit :
>
> On Wed, Dec 20, 2017 at 11:08 AM, Yann JOBIC <yann.jobic at univ-amu.fr>
> wrote:
>
>> On 12/19/2017 05:50 PM, Matthew Knepley wrote:
>>
>> On Tue, Dec 19, 2017 at 11:40 AM, Yann JOBIC <yann.jobic at univ-amu.fr>
>> wrote:
>>
>>> Hello,
>>>
>>> We want to extract the cell connectivity from a DMPlex. We have no
>>> problem for a sequential run.
>>>
>>
>> Do you want it on disk? If so, you can just DMView() for HDF5. That
>> outputs the connectivity in a global numbering.
>> I can show you the calls I use inside if you want.
>>
>> I looked at the code, specifically DMPlexWriteTopology_Vertices_HDF5_Static.
>> cellIS should have what I want.
>> However, it seems that is not the case. Am I looking at the right spot in
>> the code?
>>
>
> cellIS has the entire connectivity. It sounds like you want cell-vertex
> connectivity only. You could get that with just
>
>   DMPlexUninterpolate(dm, &udm);
>   <get your cellIS from udm>
>   DMDestroy(&udm);
>
>
>> I also looked at DMPlexGetCellNumbering, which does exactly what I want
>> for the global numbering of cells, even if the ordering differs for
>> different numbers of processors.
>>
>
> Yes.
>
>
>> When I use DMPlexGetVertexNumbering, I get negative values, which are
>> correctly handled by DMPlexWriteTopology_Vertices_HDF5_Static, but I
>> really don't understand this part.
>>
>
> There are no shared cells in your partition, so you get all positive
> numbers. You have shared vertices, so the negative numbers are for vertices
> you do not own. If the negative number is n, the corresponding global
> number is -(n+1). Notice that the transformation is one-to-one and its
> square is the identity.
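[Editor's note: the -(n+1) encoding Matt describes can be checked with a small Python sketch, not PETSc code. It also demonstrates why the transformation's "square is the identity": applying it twice returns the original number.]

```python
# DMPlexGetVertexNumbering marks unowned (ghost) vertices with a negative
# entry n; the true global number is recovered as -(n + 1).

def decode(n):
    """Recover the global number from a (possibly negative) entry."""
    return -(n + 1) if n < 0 else n

def encode_ghost(g):
    """Mark global number g as unowned: the same -(g + 1) transformation."""
    return -(g + 1)

assert encode_ghost(5) == -6   # global vertex 5, not owned locally
assert decode(-6) == 5         # decoding recovers the global number
assert decode(3) == 3          # owned vertices pass through unchanged
# Involution: applying the transformation twice gives the identity.
for g in range(10):
    assert -(-(g + 1) + 1) == g
```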
>
>
>> I succeeded in getting the coordinates, but maybe not in the right order. I
>> reused DMPlexWriteCoordinates_HDF5_Static, which creates a Vec of vertex
>> coordinates, but I couldn't understand how the order of the vertex
>> coordinates is linked (or not) to DMPlexWriteTopology_Vertices_HDF5_Static.
>> The number of local coordinates is also strange, and does not correspond to
>> the cell topology.
>>
>
> It matches the order of the global numbering.
>
>   Thanks
>
>     Matt
>
>
>> Thanks for the help!
>>
>> Regards,
>>
>> Yann
>>
>>
>> I usually put
>>
>>   DMViewFromOptions(dm, NULL, "-dm_view")
>>
>> Then
>>
>>   -dm_view hdf5:mesh.h5
>>
>>   Thanks,
>>
>>     Matt
>>
>>
>>> However, for parallel runs, we need the node numbering in the global
>>> ordering, since when we distribute the mesh we only have local nodes,
>>> and thus local numbering.
>>>
>>> It seems that we should use DMGetLocalToGlobalMapping (we are using
>>> Fortran with PETSc 3.8p3). However, we get the runtime error:
>>>
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: DM can not create LocalToGlobalMapping
>>>
>>> Is this the right way to do it?
>>>
>>> Many thanks,
>>>
>>> Regards,
>>>
>>> Yann
>>>
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/ <http://www.caam.rice.edu/%7Emk51/>
>>
>>
>>
>
>

