[petsc-users] Questions about DMPlex

Edoardo Centofanti edoardo.centofanti01 at universitadipavia.it
Thu Aug 29 05:09:25 CDT 2024


Dear PETSc users,

I am tinkering with DMPlex. I am trying to import a .msh file with some
labels on it (the mesh is 3D; some labels mark points within a 3D volume,
others mark points lying on a surface for boundary conditions). This is the
"PhysicalNames" section in the preamble of my .msh file (each entry lists
the dimension, the physical tag, and the name):

$PhysicalNames
6
2 4 "surf1"
2 5 "surf2"
2 6 "boundary"
3 1 "vol1"
3 2 "vol2"
3 3 "vol3"
$EndPhysicalNames

I import the mesh through a function that should create a mesh distributed
across all the MPI processes:

PetscErrorCode ImportMsh(const char meshname[], DM *dm) {
    PetscErrorCode ierr;
    DM distributedDM = NULL;

    // Import the mesh from an external Gmsh file; this call creates the
    // DMPlex itself, so no prior DMCreate()/DMSetType() is needed (an
    // earlier DMCreate'd object would just be leaked here).
    // PETSC_TRUE asks for an interpolated mesh (edges and faces created).
    ierr = DMPlexCreateGmshFromFile(PETSC_COMM_WORLD, meshname, PETSC_TRUE, dm); CHKERRQ(ierr);

    // Distribute the mesh across processors (overlap 0)
    ierr = DMPlexDistribute(*dm, 0, NULL, &distributedDM); CHKERRQ(ierr);
    if (distributedDM) {
        ierr = DMDestroy(dm); CHKERRQ(ierr);
        *dm = distributedDM;
    }

    // View the DMPlex
    ierr = DMView(*dm, PETSC_VIEWER_STDOUT_WORLD); CHKERRQ(ierr);
    return ierr;
}
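
For completeness, a minimal driver exercising this function could look like
the following (just a sketch; "mesh.msh" is a placeholder file name):

#include <petscdmplex.h>

int main(int argc, char **argv) {
    PetscErrorCode ierr;
    DM dm;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
    ierr = ImportMsh("mesh.msh", &dm); CHKERRQ(ierr);  // placeholder file name
    /* ... work with the distributed mesh ... */
    ierr = DMDestroy(&dm); CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
}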

The output of DMView with 1 processor is the following:

DM Object: DM_0x12a623ae0_0 1 MPI process
  type: plex
DM_0x12a623ae0_0 in 3 dimensions:
  Number of 0-cells per rank: 730
  Number of 1-cells per rank: 4010
  Number of 2-cells per rank: 6100
  Number of 3-cells per rank: 2819
Labels:
  celltype: 4 strata with value/size (0 (730), 6 (2819), 3 (6100), 1 (4010))
  depth: 4 strata with value/size (0 (730), 1 (4010), 2 (6100), 3 (2819))
  Cell Sets: 3 strata with value/size (2 (311), 3 (322), 1 (2186))
  Face Sets: 3 strata with value/size (6 (924), 4 (516), 5 (4))

While for 2 processes I get:

DM Object: Parallel Mesh 2 MPI processes
  type: plex
Parallel Mesh in 3 dimensions:
  Number of 0-cells per rank: 625 713
  Number of 1-cells per rank: 2792 3187
  Number of 2-cells per rank: 3546 3759
  Number of 3-cells per rank: 1410 1409
Labels:
  depth: 4 strata with value/size (0 (625), 1 (2792), 2 (3546), 3 (1410))
  celltype: 4 strata with value/size (0 (625), 1 (2792), 3 (3546), 6 (1410))
  Cell Sets: 3 strata with value/size (1 (777), 2 (311), 3 (322))
  Face Sets: 3 strata with value/size (4 (516), 5 (4), 6 (247))

*First question:* Where are the labels I gave in the .msh file? My
interpretation is that they are the numbers in brackets: for example,
6 (924) means that 924 face points are marked with value 6, which
corresponds to "boundary" in my .msh file. However, in the 2-process DMView
I get a different number of elements. My intuition is that only the label
counts of proc 0 are printed.
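
As a sanity check, this is the kind of thing I have in mind: summing the
local stratum sizes over the ranks (a minimal, untested sketch; the label
name "Face Sets" and the value 6 are the ones from the DMView output, and
note that with overlap 0 points shared between ranks may be counted more
than once, so the sum can exceed the sequential count):

DMLabel  faceSets;
PetscInt nLocal = 0, nGlobal = 0;

ierr = DMGetLabel(dm, "Face Sets", &faceSets); CHKERRQ(ierr);
ierr = DMLabelGetStratumSize(faceSets, 6, &nLocal); CHKERRQ(ierr);  // local points marked 6
ierr = MPIU_Allreduce(&nLocal, &nGlobal, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD); CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD, "value 6: %" PetscInt_FMT " points summed over ranks\n", nGlobal); CHKERRQ(ierr);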

*Second question:* Suppose I wanted to fill a matrix only in the entries
corresponding to the nodes of the elements marked with 5, namely "surf2".
How can I do that?

So far, I have used
ierr = DMPlexGetDepthStratum(dm, 2, &pStart, &pEnd); CHKERRQ(ierr);

to get pStart and pEnd for the face stratum (depth 2).

Then I looped over p from pStart to pEnd to select the faces marked
with 5 (I cannot access the vertices directly, since they do not seem to be
marked at all).
For each selected face, I used
ierr = DMPlexGetConeSize(dm, p, &coneSize1); CHKERRQ(ierr);
ierr = DMPlexGetCone(dm, p, &cone); CHKERRQ(ierr);

to retrieve the edges of the face, then called GetConeSize and GetCone
again to reach the vertices, and from those I built an IS with the points I
need (using ISSortRemoveDups to remove duplicates). But here I am stuck:
printing this IS gives the right number of vertices, but the numbering is
local, so it changes with the number of processes and carries an offset
that depends on each rank's local DAG.
I wonder if there exists a less cumbersome way to perform this task...
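
For reference, here is the same traversal written with
DMPlexGetTransitiveClosure instead of nested cone calls (an untested
sketch; the label name "Face Sets" and the value 5 are taken from the
DMView output, and the actual collection of the vertices is left as a
placeholder):

DMLabel         label;
IS              faceIS;
const PetscInt *faces;
PetscInt        nFaces, vStart, vEnd, f;

ierr = DMGetLabel(dm, "Face Sets", &label); CHKERRQ(ierr);
ierr = DMLabelGetStratumIS(label, 5, &faceIS); CHKERRQ(ierr);        // faces marked with 5
ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd); CHKERRQ(ierr);  // vertex point range
if (faceIS) {                                                        // NULL if this rank has none
    ierr = ISGetLocalSize(faceIS, &nFaces); CHKERRQ(ierr);
    ierr = ISGetIndices(faceIS, &faces); CHKERRQ(ierr);
    for (f = 0; f < nFaces; ++f) {
        PetscInt *closure = NULL, nClosure, i;
        ierr = DMPlexGetTransitiveClosure(dm, faces[f], PETSC_TRUE, &nClosure, &closure); CHKERRQ(ierr);
        for (i = 0; i < nClosure; ++i) {
            const PetscInt point = closure[2*i];  // closure comes as (point, orientation) pairs
            if (point >= vStart && point < vEnd) {
                // 'point' is a vertex of a marked face: collect it here (e.g. into an IS)
            }
        }
        ierr = DMPlexRestoreTransitiveClosure(dm, faces[f], PETSC_TRUE, &nClosure, &closure); CHKERRQ(ierr);
    }
    ierr = ISRestoreIndices(faceIS, &faces); CHKERRQ(ierr);
    ierr = ISDestroy(&faceIS); CHKERRQ(ierr);
}

Even so, this still leaves me with local point numbers; I suppose they
would have to be mapped to a global numbering (perhaps with
DMPlexCreatePointNumbering, which I have not tried).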

Thank you in advance,
Edoardo