[petsc-users] dmplex extruded layers

Matthew Knepley knepley at gmail.com
Fri Dec 3 13:32:26 CST 2021


On Wed, Nov 24, 2021 at 5:45 PM Bhargav Subramanya <
bhargav.subramanya at kaust.edu.sa> wrote:

> On Wed, Nov 24, 2021 at 8:59 PM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Wed, Nov 24, 2021 at 12:27 PM Bhargav Subramanya <
>> bhargav.subramanya at kaust.edu.sa> wrote:
>>
>>> Dear Matt and Mark,
>>>
>>> Thank you very much for your reply. Your inputs are very useful to me.
>>>
>>> On Mon, Nov 22, 2021 at 9:38 PM Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> On Mon, Nov 22, 2021 at 12:10 PM Bhargav Subramanya <
>>>> bhargav.subramanya at kaust.edu.sa> wrote:
>>>>
>>>>> Dear All,
>>>>>
>>>>> I have a prismatic mesh generated by extruding a base icosahedral
>>>>> spherical surface mesh. The mesh is distributed over 2 processes after
>>>>> extrusion. The dofs for my problem are defined only on the simplices of
>>>>> the spherical layers.
>>>>>
>>>>> 1. For a given spherical layer, I want to get the points, which are
>>>>> basically simplices, lying on that layer only. DMPlexGetHeightStratum
>>>>> returns the points on all the spherical layers. I could probably use
>>>>> DMPlexFindVertices (based on the radius of the spherical layer) and
>>>>> DMPlexGetSimplexOrBoxCells. Could you suggest a better way to retrieve
>>>>> the points (which are simplices) on a given spherical layer of the
>>>>> extruded mesh?
>>>>>
>>>>
>>>> DMPlexGetHeightStratum() refers to height in the Hasse diagram, which
>>>> is a DAG, not to height in the mesh. For example, height 0 points are the
>>>> cells, height 1 points are the faces, etc.
>>>>
>>>> I believe the default numbering for extrusion (in the main branch) is
>>>> that all vertices produced from a given vertex are numbered in order. This
>>>> would mean that if v were the vertex point number, then
>>>>
>>>>   (v - vStart) % Nl == l
>>>>
>>>> where Nl is the number of layers and l is the layer of that vertex. It
>>>> is also the same for triangles. So if you want to segregate each shell,
>>>> after extrusion, make a label that gives triangles this marker, and then
>>>> use DMPlexLabelComplete(). Then after refinement you will still have the
>>>> shells labeled correctly.
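>>>>
>>>> A rough sketch of that labeling step could look like this (assuming the
>>>> layer-major numbering above carries over to the triangle faces; the
>>>> function name and the modulo index are just illustrative and may need
>>>> adjusting for your mesh):
>>>>
>>>>   static PetscErrorCode LabelShells(DM dm, PetscInt Nl)
>>>>   {
>>>>     DMLabel        label;
>>>>     PetscInt       fStart, fEnd, f;
>>>>     PetscErrorCode ierr;
>>>>
>>>>     PetscFunctionBeginUser;
>>>>     ierr = DMCreateLabel(dm, "shell");CHKERRQ(ierr);
>>>>     ierr = DMGetLabel(dm, "shell", &label);CHKERRQ(ierr);
>>>>     /* Triangles are height-1 points (faces) of the extruded prismatic mesh */
>>>>     ierr = DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd);CHKERRQ(ierr);
>>>>     for (f = fStart; f < fEnd; ++f) {
>>>>       DMPolytopeType ct;
>>>>
>>>>       ierr = DMPlexGetCellType(dm, f, &ct);CHKERRQ(ierr);
>>>>       if (ct != DM_POLYTOPE_TRIANGLE) continue; /* skip quad side faces */
>>>>       ierr = DMLabelSetValue(label, f, (f - fStart) % Nl);CHKERRQ(ierr);
>>>>     }
>>>>     /* Complete the label over closures so edges and vertices are marked too */
>>>>     ierr = DMPlexLabelComplete(dm, label);CHKERRQ(ierr);
>>>>     PetscFunctionReturn(0);
>>>>   }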
>>>>
>>>> I would be happy to help you make an example that does this. It seems
>>>> cool.
>>>>
>>>
>>> ----- Sure, I am interested in making an example of it. And I really
>>> appreciate your help.
>>>
>>>>
>>>>
>>>>> 2. I am able to visualize the entire mesh using dm_view. Is there a
>>>>> way to get the mesh file for the local dm from a specific process?
>>>>>
>>>>
>>>> You can use -dm_partition_view, which outputs a field with the process
>>>> number. Then use Clip inside ParaView and clip to the process number you
>>>> want, or just view that field so each process has a different color.
>>>>
>>>>
>>>>> 3. One of my previous emails might have got lost, as I mistakenly
>>>>> attached a large mesh file to it, so I am repeating the question here.
>>>>> DMPlexExtrude gives the following error after distributing the base 2D
>>>>> spherical mesh. Neither refinement nor extrusion after DM distribution
>>>>> works for me. In fact, I also tried src/dm/impls/plex/tutorials/ex10.c.
>>>>> Although the mesh is distributed after the extrusion in ex10.c (which
>>>>> works fine for me), when I tried to distribute before extrusion I got the
>>>>> following error. Could you please suggest where I might be making a
>>>>> mistake?
>>>>>
>>>>
>>>> So you want to distribute the mesh before extruding. For that small
>>>> example (using the main branch), I run
>>>>
>>>>   mpiexec -n 3 ./meshtest -dm_plex_shape sphere -dm_refine_pre 2
>>>> -dm_distribute -dm_partition_view -dm_view hdf5:mesh.h5 -dm_extrude 3
>>>>
>>>
>>> ----- I pulled the code from the main branch and reconfigured my PETSc.
>>> I am now finally able to reproduce the mesh that you attached. I can do
>>> parallel mesh refinement followed by extrusion, and I use the following
>>> code to check that. However, I still have one problem. Attached are the
>>> extruded mesh files.
>>>
>>> 1. The mesh doesn't seem to be refined properly near the interfaces
>>> between processes. I then used an -init_dm_distribute_overlap of 1, and
>>> that fixed the problem. I hope this is the right way to do it.
>>>
>>
>> Hmm, something is wrong here. We should track this down and fix it. I do
>> not see this. Could you tell me how to reproduce the problem?
>>
>
> ----- I mainly wanted to check whether the mesh was distributed before
> the extrusion. I do this by specifying different prefix options, as shown in
> the code below. I think using different prefix options is probably causing
> the problem.
>
>  #include <petsc.h>
>
>  int main(int argc, char **argv)
>  {
>    DM             dm;
>    PetscBool      distributed;
>    PetscErrorCode ierr;
>
>    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
>    ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);
>    ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);
>    ierr = PetscObjectSetOptionsPrefix((PetscObject) dm, "init_");CHKERRQ(ierr);
>    ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
>
>    ierr = DMPlexIsDistributed(dm, &distributed);CHKERRQ(ierr);
>    ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "%d \n", distributed);CHKERRQ(ierr);
>    ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
>
>    ierr = PetscObjectSetOptionsPrefix((PetscObject) dm, NULL);CHKERRQ(ierr);
>    ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
>    ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
>    ierr = DMDestroy(&dm);CHKERRQ(ierr);
>    ierr = PetscFinalize();
>    return ierr;
>  }
>
> The command line options are:  mpiexec -n 3 ./cavity_flow.out
> -init_dm_plex_shape sphere -init_dm_plex_sphere_radius 1.0
> -init_dm_refine_pre 2 -init_dm_distribute -dm_refine 2 -dm_extrude 3
> -dm_partition_view -dm_view hdf5:mesh.h5
>

Okay, I ran exactly this code and command on 'main', and it does work for
me:

master *:~/Downloads/tmp/hpc1$ /PETSc3/petsc/apple/bin/mpiexec -n 3
./meshtest -init_dm_plex_shape sphere -init_dm_plex_sphere_radius 1.0
-init_dm_refine_pre 2 -init_dm_distribute -init_dm_view -dm_refine 2
-dm_extrude 3 -dm_partition_view_no -dm_view -malloc_debug 0
DM Object: sphere (init_) 3 MPI processes
  type: plex
sphere in 2 dimensions:
  0-cells: 66 66 67
  1-cells: 171 173 172
  2-cells: 106 108 106
Labels:
  depth: 3 strata with value/size (0 (66), 1 (171), 2 (106))
  celltype: 3 strata with value/size (0 (66), 1 (171), 3 (106))
1
1
1
DM Object: sphere 3 MPI processes
  type: plex
sphere in 3 dimensions:
  0-cells: 3588 3636 3604
  1-cells: 13059 (2691) 13271 (2727) 13087 (2703)
  2-cells: 14560 (7776) 14820 (7908) 14572 (7788)
  3-cells: 5088 5184 5088
Labels:
  celltype: 6 strata with value/size (5 (7776), 9 (5088), 0 (3588), 1
(10368), 3 (6784), 2 (2691))
  depth: 4 strata with value/size (0 (3588), 1 (13059), 2 (14560), 3 (5088))
WARNING! There are options you set that were not used!
WARNING! could be spelling mistake, etc!
There is one unused database option. It is:
Option left: name:-dm_partition_view_no (no value)


>>> 2. The parallel mesh refinement doesn't seem to take care of the spherical
>>> geometry (shown in the attached mesh file). Could you please suggest how to
>>> fix it?
>>>
>>
>> Okay, this is more subtle. Let me explain how things work inside.
>>
>> 1) When you refine, it inserts new cells/faces/edges/vertices. When we
>> insert a new vertex, we ask what the coordinates should be. If no special
>> information is there, we just average the surrounding vertices.
>>
>> 2) Then we allow a second step that processes all coordinates in the mesh
>> after refinement.
>>
>> For the sphere, we use 2) to make all points stick to the implicit
>> surface (radius R). We cannot do this for the extruded mesh, so we turn it
>> off.
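>>
>> If you want to restore the spherical geometry by hand, a rough sketch of
>> that projection step, applied to the 2D surface mesh after refinement and
>> before extrusion (the function name is just illustrative), would be:
>>
>>   static PetscErrorCode SnapToSphere(DM dm, PetscReal R)
>>   {
>>     Vec            coordinates;
>>     PetscScalar   *coords;
>>     PetscInt       dim, n, i, d;
>>     PetscErrorCode ierr;
>>
>>     PetscFunctionBeginUser;
>>     ierr = DMGetCoordinateDim(dm, &dim);CHKERRQ(ierr);
>>     ierr = DMGetCoordinatesLocal(dm, &coordinates);CHKERRQ(ierr);
>>     ierr = VecGetLocalSize(coordinates, &n);CHKERRQ(ierr);
>>     ierr = VecGetArray(coordinates, &coords);CHKERRQ(ierr);
>>     for (i = 0; i < n; i += dim) {
>>       PetscReal norm = 0.0;
>>
>>       for (d = 0; d < dim; ++d) norm += PetscSqr(PetscRealPart(coords[i+d]));
>>       norm = PetscSqrtReal(norm);
>>       /* Scale each vertex back onto the implicit surface of radius R */
>>       if (norm > 0.0) for (d = 0; d < dim; ++d) coords[i+d] *= R/norm;
>>     }
>>     ierr = VecRestoreArray(coordinates, &coords);CHKERRQ(ierr);
>>     /* Reset the coordinates so any cached global vector is rebuilt */
>>     ierr = DMSetCoordinatesLocal(dm, coordinates);CHKERRQ(ierr);
>>     PetscFunctionReturn(0);
>>   }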
>>
>
> ---- thanks for this explanation.
>
>
>> What makes the most sense to me is to do
>>
>
>> 1) Serial mesh refinement to get enough cells to distribute evenly
>>
>> 2) Mesh distribution
>>
>> 3) Mesh refinement
>>
>> 4) Extrusion
>>
>> This should preserve the geometry and scale well. Does that sound right?
>>
>
> -- Yes, this sounds right, and this must preserve the geometry. In fact, I
> was under the assumption that I was doing the exact same thing by using
> these options:
> mpiexec -n 3 ./cavity_flow.out -init_dm_plex_shape sphere
> -init_dm_plex_sphere_radius 1.0 -init_dm_refine_pre 2 -init_dm_distribute
> -dm_refine 2 -dm_extrude 3 -dm_partition_view -dm_view hdf5:mesh.h5
> Do you think these options are not doing things in the order 1, 2, 3, 4 as
> you specified? If not, could you suggest the options I need to use to get
> that order?
>

Yes, I am getting the surface of the sphere with this.
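
For reference, a rough sketch of how those prefixed options map onto steps
1-4 above (illustrative only; the actual processing happens inside
DMSetFromOptions()):

  ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);
  ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);
  ierr = PetscObjectSetOptionsPrefix((PetscObject) dm, "init_");CHKERRQ(ierr);
  /* 1) serial sphere mesh + refinement and 2) distribution:
   *    -init_dm_plex_shape sphere -init_dm_refine_pre 2 -init_dm_distribute */
  ierr = DMSetFromOptions(dm);CHKERRQ(ierr);

  ierr = PetscObjectSetOptionsPrefix((PetscObject) dm, NULL);CHKERRQ(ierr);
  /* 3) parallel refinement and 4) extrusion: -dm_refine 2 -dm_extrude 3 */
  ierr = DMSetFromOptions(dm);CHKERRQ(ierr);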

  Thanks,

    Matt


>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> command line options:
>>> -init_dm_plex_shape sphere -init_dm_plex_sphere_radius 1.0
>>> -init_dm_refine_pre 2 -init_dm_distribute -dm_refine 2 -dm_partition_view
>>> -dm_view hdf5:mesh.h5 -dm_extrude 3
>>>
>>>  ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);
>>>  ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);
>>>  ierr = PetscObjectSetOptionsPrefix((PetscObject) dm, "init_");CHKERRQ(ierr);
>>>  ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
>>>
>>>  ierr = DMPlexIsDistributed(dm, &distributed);CHKERRQ(ierr);
>>>  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "%d \n", distributed);CHKERRQ(ierr);
>>>  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
>>>
>>>  ierr = PetscObjectSetOptionsPrefix((PetscObject) dm, NULL);CHKERRQ(ierr);
>>>  ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
>>>  ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
>>>
>>>
>>>> and I get the attached picture.
>>>>
>>>>   Thanks,
>>>>
>>>>     Matt
>>>>
>>>>
>>>>> [0]PETSC ERROR: --------------------- Error Message
>>>>> --------------------------------------------------------------
>>>>> [0]PETSC ERROR: Object is in wrong state
>>>>> [0]PETSC ERROR: This DMPlex is distributed but its PointSF has no
>>>>> graph set
>>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>>>> [0]PETSC ERROR: Petsc Release Version 3.16.1, unknown
>>>>> [0]PETSC ERROR: ./cavity_flow.out on a arch-darwin-c-debug named
>>>>> kl-21859 by subrambm Mon Nov 22 19:47:14 2021
>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>>>>> --with-fc=gfortran --download-mpich --download-fblaslapack
>>>>> --download-superlu_dist --download-hypre --download-fiat
>>>>> --download-generator --download-triangle --download-tetgen --download-chaco
>>>>> --download-make -download-boost --download-cmake --download-ml
>>>>> --download-mumps=https://bitbucket.org/petsc/pkg-mumps.git
>>>>> --download-mumps-commit=v5.4.1-p1 --download-scalapack --download-ptscotch
>>>>> --download-hdf5 --force
>>>>> [0]PETSC ERROR: #1 DMPlexCheckPointSF() at
>>>>> /Users/subrambm/petsc/src/dm/impls/plex/plex.c:8554
>>>>> [0]PETSC ERROR: #2 DMPlexOrientInterface_Internal() at
>>>>> /Users/subrambm/petsc/src/dm/impls/plex/plexinterpolate.c:595
>>>>> [0]PETSC ERROR: #3 DMPlexInterpolate() at
>>>>> /Users/subrambm/petsc/src/dm/impls/plex/plexinterpolate.c:1357
>>>>> [0]PETSC ERROR: #4 DMPlexExtrude() at
>>>>> /Users/subrambm/petsc/src/dm/impls/plex/plexcreate.c:1543
>>>>> [0]PETSC ERROR: #5 CreateMesh() at ex10.c:161
>>>>> [0]PETSC ERROR: #6 main() at ex10.c:180
>>>>> [0]PETSC ERROR: PETSc Option Table entries:
>>>>> [0]PETSC ERROR: -dm_plex_extrude_layers 3
>>>>> [0]PETSC ERROR: -dm_view vtk:mesh.vtk
>>>>> [0]PETSC ERROR: -init_dm_plex_dim 2
>>>>> [0]PETSC ERROR: -petscpartitioner_type simple
>>>>> [0]PETSC ERROR: -srf_dm_distribute
>>>>> [0]PETSC ERROR: -srf_dm_refine 0
>>>>> [0]PETSC ERROR: ----------------End of Error Message -------send
>>>>> entire error message to petsc-maint at mcs.anl.gov----------
>>>>>
>>>>> Thanks,
>>>>> Bhargav
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>
>>
>>
>>
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/