[petsc-users] local to global mapping for DMPlex

Yann JOBIC yann.jobic at univ-amu.fr
Wed Dec 20 15:30:19 CST 2017


That works just fine. In case it is useful for someone, here is the 
methodology:

I got the cell-vertex connectivity from the function 
DMPlexWriteTopology_Vertices_HDF5_Static in 
petsc/src/dm/impls/plex/plexhdf5.c. The interesting result is in cellIS.

The vertex coordinates come from the function 
DMPlexWriteCoordinates_HDF5_Static in the same file. The global number 
of a vertex is just its position in that Vec plus the number of vertices 
owned by the previous ranks.

The global cell numbering is trivial: we use the function 
DMPlexGetCellNumbering.
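
For reference, here is a rough C sketch of those calls (untested, error 
checking abbreviated, variable names are mine; the coordinate offset logic 
assumes the usual contiguous PETSc ownership ranges):

  IS             globalCells, globalVertices;
  Vec            coords;
  PetscInt       low, high, cdim;
  PetscErrorCode ierr;

  /* Global cell numbers: one entry per locally owned cell */
  ierr = DMPlexGetCellNumbering(dm, &globalCells);CHKERRQ(ierr);
  ierr = ISView(globalCells, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  /* Global vertex numbers: a negative entry n marks a vertex owned by
     another rank, and its true global number is -(n+1) (see Matthew's
     explanation below) */
  ierr = DMPlexGetVertexNumbering(dm, &globalVertices);CHKERRQ(ierr);
  ierr = ISView(globalVertices, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  /* Global coordinate Vec; as described above, the global number of a
     locally owned vertex should be its position in the local block of
     this Vec plus the offset of the previous ranks, i.e. low/cdim */
  ierr = DMGetCoordinateDim(dm, &cdim);CHKERRQ(ierr);
  ierr = DMGetCoordinates(dm, &coords);CHKERRQ(ierr);
  ierr = VecGetOwnershipRange(coords, &low, &high);CHKERRQ(ierr);
  ierr = VecView(coords, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);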

The global numbering is quite strange, but it is consistent. Here are some 
examples for a 2x2-cell mesh (4 vertices per cell).
I attach a text file for easier reading.
On 1 processor, we have:
6-----7-----8
|     |     |
|  2  |  3  |
|     |     |
3-----4-----5
|     |     |
|  0  |  1  |
|     |     |
0-----1-----2

On 2 processors, we have:
0-----1-----2
|     |     |
|  0  |  1  |
|     |     |
6-----7-----8
|     |     |
|  2  |  3  |
|     |     |
3-----4-----5
Cells 0 and 1 are on proc 0; 2 and 3 are on proc 1.

On 4 processors, we have:
0-----1-----2
|     |     |
|  0  |  1  |
|     |     |
7-----8-----4
|     |     |
|  3  |  2  |
|     |     |
5-----6-----3

One cell per proc: cell 0 on proc 0, cell 1 on proc 1, ...

The attached text file (open it with notepad++) contains all the ISView 
output, to aid understanding.

Matthew, many thanks for the help!

Regards,

Yann

On 20/12/2017 at 17:40, Matthew Knepley wrote:
> On Wed, Dec 20, 2017 at 11:08 AM, Yann JOBIC <yann.jobic at univ-amu.fr> wrote:
>
>     On 12/19/2017 05:50 PM, Matthew Knepley wrote:
>>     On Tue, Dec 19, 2017 at 11:40 AM, Yann JOBIC <yann.jobic at univ-amu.fr> wrote:
>>
>>         Hello,
>>
>>         We want to extract the cell connectivity from a DMPlex. We
>>         have no problem for a sequential run.
>>
>>
>>     Do you want it on disk? If so, you can just DMView() for HDF5.
>>     That outputs the connectivity in a global numbering.
>>     I can show you the calls I use inside if you want.
>     I looked at the code, specifically
>     DMPlexWriteTopology_Vertices_HDF5_Static.
>     cellIS should have what I want.
>     However, it seems that this is not the case. Am I looking at the
>     right spot in the code?
>
>
> cellIS has the entire connectivity. You sound like you want 
> cell-vertex only. You could get that by just
>
>   DMPlexUninterpolate(dm, &udm);
>   <get your cellIS from udm>
>   DMDestroy(&udm);
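>
> Here, getting the cell-vertex list from udm could look roughly like this
> (a sketch, error checks omitted):
>
>   const PetscInt *cone;
>   PetscInt        cStart, cEnd, vStart, vEnd, c, coneSize, v;
>
>   DMPlexGetHeightStratum(udm, 0, &cStart, &cEnd); /* cells */
>   DMPlexGetDepthStratum(udm, 0, &vStart, &vEnd);  /* vertices */
>   for (c = cStart; c < cEnd; ++c) {
>     DMPlexGetConeSize(udm, c, &coneSize);
>     DMPlexGetCone(udm, c, &cone);
>     for (v = 0; v < coneSize; ++v) {
>       /* cone[v] - vStart is the local index of the v-th vertex of cell c */
>     }
>   }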
>
>     I also looked at DMPlexGetCellNumbering, which does exactly what I
>     want for the global numbering of cells, even if the ordering is
>     different for different numbers of processors.
>
>
> Yes.
>
>     When I use DMPlexGetVertexNumbering, I get negative values, which
>     are correctly handled by
>     DMPlexWriteTopology_Vertices_HDF5_Static, but I really don't
>     understand that part.
>
>
> There are no shared cells in your partition, so you get all positive 
> numbers. You have shared vertices, so the negative numbers are for 
> vertices
> you do not own. If the negative number is n, the corresponding global 
> number is -(n+1). Notice that the transformation is 1-1 and its square 
> is I.
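>
> For instance, a small sketch of that conversion (variable names are
> illustrative):
>
>   IS              vIS;
>   const PetscInt *vnum;
>   PetscInt        n, v, gv;
>
>   DMPlexGetVertexNumbering(dm, &vIS);
>   ISGetLocalSize(vIS, &n);
>   ISGetIndices(vIS, &vnum);
>   for (v = 0; v < n; ++v) {
>     gv = (vnum[v] < 0) ? -(vnum[v] + 1) : vnum[v]; /* true global number */
>   }
>   ISRestoreIndices(vIS, &vnum);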
>
>     I succeeded in getting the coordinates, but maybe not in the right
>     order. I reused DMPlexWriteCoordinates_HDF5_Static, which creates a
>     Vec of vertex coordinates, but I couldn't understand how the order
>     of the vertex coordinates is linked (or not) to
>     DMPlexWriteTopology_Vertices_HDF5_Static. The number of local
>     coordinates also seems strange, and does not correspond to the cell
>     topology.
>
>
> It matches the order of the global numbering.
>
>   Thanks
>
>     Matt
>
>     Thanks for the help!
>
>     Regards,
>
>     Yann
>
>
>>     I usually put
>>
>>       DMViewFromOptions(dm, NULL, "-dm_view")
>>
>>     Then
>>
>>       -dm_view hdf5:mesh.h5
>>
>>       Thanks,
>>
>>         Matt
>>
>>         However, for parallel runs we need the node numbering in the
>>         global ordering, since after distributing the mesh we only
>>         have local nodes, and thus local numbering.
>>
>>         It seems that we should use DMGetLocalToGlobalMapping (we are
>>         using Fortran with PETSc 3.8p3). However, we get the runtime
>>         error:
>>
>>         [0]PETSC ERROR: No support for this operation for this object
>>         type
>>         [0]PETSC ERROR: DM can not create LocalToGlobalMapping
>>
>>         Is this the right way to do it?
>>
>>         Many thanks,
>>
>>         Regards,
>>
>>         Yann
>>
>>
>>
>>
>>     -- 
>>     What most experimenters take for granted before they begin their
>>     experiments is infinitely more interesting than any results to
>>     which their experiments lead.
>>     -- Norbert Wiener
>>
>>     https://www.cse.buffalo.edu/~knepley/
>
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/

-------------- next part --------------
6-----7-----8
|     |     |
|  2  |  3  |
|     |     |
3-----4-----5
|     |     |
|  0  |  1  |
|     |     |
0-----1-----2

Coordinates:
rank 0  vertex 0 : 0.  0.
rank 0  vertex 1 : 0.5 0.
rank 0  vertex 2 : 1.  0.
rank 0  vertex 3 : 0.  0.5
rank 0  vertex 4 : 0.5 0.5
rank 0  vertex 5 : 1.  0.5
rank 0  vertex 6 : 0.  1.
rank 0  vertex 7 : 0.5 1.
rank 0  vertex 8 : 1.  1.

CellIS:
IS Object: 1 MPI processes
  type: general
Number of indices in set 16
0 0
1 1
2 4
3 3
4 1
5 2
6 5
7 4
8 3
9 4
10 7
11 6
12 4
13 5
14 8
15 7
Cells Global numbering:
IS Object: 1 MPI processes
  type: general
Number of indices in set 4
0 0
1 1
2 2
3 3


0-----1-----2
|     |     |
|  0  |  1  |
|     |     |
6-----7-----8
|     |     |
|  2  |  3  |
|     |     |
3-----4-----5

Coordinates:
rank 0  vertex 0 : 0.  1.
rank 0  vertex 1 : 0.5 1.
rank 0  vertex 2 : 1.  1.
rank 1  vertex 3 : 0.  0.
rank 1  vertex 4 : 0.5 0.
rank 1  vertex 5 : 1.  0.
rank 1  vertex 6 : 0.  0.5
rank 1  vertex 7 : 0.5 0.5
rank 1  vertex 8 : 1.  0.5

CellIS:
IS Object: 2 MPI processes
  type: general
[0] Number of indices in set 8
[0] 0 6
[0] 1 7
[0] 2 1
[0] 3 0
[0] 4 7
[0] 5 8
[0] 6 2
[0] 7 1
[1] Number of indices in set 8
[1] 0 3
[1] 1 4
[1] 2 7
[1] 3 6
[1] 4 4
[1] 5 5
[1] 6 8
[1] 7 7


Cells Global numbering:
IS Object: 2 MPI processes
  type: general
[0] Number of indices in set 2
[0] 0 0
[0] 1 1
[1] Number of indices in set 2
[1] 0 2
[1] 1 3



0-----1-----2
|     |     |
|  0  |  1  |
|     |     |
7-----8-----4
|     |     |
|  3  |  2  |
|     |     |
5-----6-----3

Coordinates:
rank 0  vertex 0 : 0.  1.
rank 1  vertex 1 : 0.5 1.
rank 1  vertex 2 : 1.  1.
rank 2  vertex 3 : 1.  0.
rank 2  vertex 4 : 1.  0.5
rank 3  vertex 5 : 0.  0.
rank 3  vertex 6 : 0.5 0.
rank 3  vertex 7 : 0.  0.5
rank 3  vertex 8 : 0.5 0.5

CellIS:
IS Object: 4 MPI processes
  type: general
[0] Number of indices in set 4
[0] 0 7
[0] 1 8
[0] 2 1
[0] 3 0
[1] Number of indices in set 4
[1] 0 8
[1] 1 4
[1] 2 2
[1] 3 1
[2] Number of indices in set 4
[2] 0 6
[2] 1 3
[2] 2 4
[2] 3 8
[3] Number of indices in set 4
[3] 0 5
[3] 1 6
[3] 2 8
[3] 3 7

Cells Global numbering:
IS Object: 4 MPI processes
  type: general
[0] Number of indices in set 1
[0] 0 0
[1] Number of indices in set 1
[1] 0 1
[2] Number of indices in set 1
[2] 0 2
[3] Number of indices in set 1
[3] 0 3

