[petsc-users] dmplex extrude in parallel

Matthew Knepley knepley at gmail.com
Thu Nov 11 12:41:27 CST 2021


On Thu, Nov 11, 2021 at 1:17 PM Bhargav Subramanya <
bhargav.subramanya at kaust.edu.sa> wrote:

> Dear Matt,
>
> I just realized that I used PETSC_COMM_SELF instead of PETSC_COMM_WORLD
> while performing the above test. I am sorry for that. After fixing it, I
> have the following 3 cases:
>
> 1. With the command line options below, where I simply create the
> spherical surface mesh, everything is fine. The mesh gets distributed in
> parallel, as verified with DMPlexIsDistributed (sketched after the command
> line below).
> /home/subrambm/petsc/arch-linux-c-debug/bin/mpiexec -n 2 ./cavity_flow.out
> -dm_plex_dim 2 -dm_plex_simplex 1 -dm_plex_shape sphere
> -dm_plex_sphere_radius 1.0 -dm_refine_pre 2 -dm_view vtk:mesh.vtk
> -dm_distribute 1
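>
> For reference, the check is roughly the following sketch (assuming the
> DMPlexIsDistributed() interface):
>
>   PetscBool distributed;
>   ierr = DMPlexIsDistributed(dm, &distributed);CHKERRQ(ierr);
>   ierr = PetscPrintf(PETSC_COMM_WORLD, "mesh distributed? %s\n", distributed ? "yes" : "no");CHKERRQ(ierr);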
>
> -------------------------------------------------------------------
> 2. This case also seems to work. However, the mesh file data is bad. Do
> you think it's because of the vtk file format?
> -dm_plex_shape sphere -dm_plex_dim 2 -dm_plex_simplex 1
> -dm_plex_sphere_radius 1.0 -dm_refine_pre 3 -dm_extrude_layers 4
> -dm_extrude_thickness 0.1 -dm_extrude_column_first 0 -dm_distribute 1
> -dm_view vtk:mesh.vtk
>

It should work, but that would be the first place I would check for a bug.
I would use

  -dm_view hdf5:mesh.h5

and then

  $PETSC_DIR/lib/petsc/bin/petsc_gen_xdmf.py mesh.h5

to make mesh.xmf, which can be loaded by ParaView.
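For example (a sketch reusing your case 2 options above; mesh.h5 and mesh.xmf
are just placeholder names):

  mpiexec -n 2 ./cavity_flow.out -dm_plex_dim 2 -dm_plex_simplex 1 -dm_plex_shape sphere \
    -dm_plex_sphere_radius 1.0 -dm_refine_pre 3 -dm_extrude_layers 4 -dm_extrude_thickness 0.1 \
    -dm_extrude_column_first 0 -dm_distribute 1 -dm_view hdf5:mesh.h5
  $PETSC_DIR/lib/petsc/bin/petsc_gen_xdmf.py mesh.h5

and then open mesh.xmf in ParaView.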


> -------------------------------------------------------------------
>
> 3. This is the case I want: distribute the mesh, refine it in
> parallel, and then extrude it.
> However, after distribution, the mesh does not seem to be refined, as I get
> the following error:
> -dm_plex_shape sphere -dm_plex_dim 2 -dm_plex_simplex 1
> -dm_plex_sphere_radius 1.0 -dm_refine_pre 3 -dm_distribute 1 -dm_refine 1
>

The problem here is that you are refining the extruded mesh, not refining
the surface mesh. There must be something switched around in your file.
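For reference, a minimal sketch of the ordering I mean (let DMSetFromOptions()
refine and distribute the surface mesh, then extrude by hand) is below. The
layer count and thickness are placeholders, and it assumes the same
DMPlexExtrude() calling sequence as your code further down in this thread:

#include <petsc.h>

int main(int argc, char **argv)
{
  DM             dm, dm3D = NULL;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);
  ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);
  /* -dm_plex_shape sphere -dm_refine_pre 3 -dm_distribute 1 -dm_refine 1
     act here, on the surface mesh, before any extrusion */
  ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
  /* extrude the refined, distributed surface mesh by hand
     (4 layers and thickness 0.1 are placeholder values) */
  ierr = DMPlexExtrude(dm, 4, 0.1, PETSC_FALSE, NULL, PETSC_TRUE, &dm3D);CHKERRQ(ierr);
  if (dm3D) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dm3D;}
  ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

With your case 3 options, all refinement then happens on the surface mesh
before the extrusion.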

  Thanks,

     Matt


> Ignoring PCI device with non-16bit domain.
> Pass --enable-32bits-pci-domain to configure to support such devices
> (warning: it would break the library ABI, don't enable unless really
> needed).
> Ignoring PCI device with non-16bit domain.
> Pass --enable-32bits-pci-domain to configure to support such devices
> (warning: it would break the library ABI, don't enable unless really
> needed).
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Nonconforming object sizes
> [1]PETSC ERROR: The output section point (0) closure size 9 != dual space
> dimension 12 at height 0 in [0, 0]
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.16.1, unknown
> [1]PETSC ERROR: ./cavity_flow.out on a arch-linux-c-debug named kw60970 by
> subrambm Thu Nov 11 21:13:27 2021
> [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --download-mpich --download-fblaslapack
> --download-superlu_dist --download-hypre --download-fiat
> --download-generator --download-triangle --download-tetgen --download-chaco
> --download-make -download-boost --download-cmake --download-ml
> --download-mumps --download-scalapack --download-parmetis --download-metis
> --download-ptscotch --download-hdf5
> [1]PETSC ERROR: #1 DMProjectLocal_Generic_Plex() at
> /home/subrambm/petsc/src/dm/impls/plex/plexproject.c:762
> [1]PETSC ERROR: #2 DMProjectFieldLocal_Plex() at
> /home/subrambm/petsc/src/dm/impls/plex/plexproject.c:933
> [1]PETSC ERROR: #3 DMProjectFieldLocal() at
> /home/subrambm/petsc/src/dm/interface/dm.c:8863
> [1]PETSC ERROR: #4 DMPlexRemapGeometry() at
> /home/subrambm/petsc/src/dm/impls/plex/plexgeometry.c:3270
> [1]PETSC ERROR: #5 DMSetFromOptions_Plex() at
> /home/subrambm/petsc/src/dm/impls/plex/plexcreate.c:3064
> [1]PETSC ERROR: #6 DMSetFromOptions() at
> /home/subrambm/petsc/src/dm/interface/dm.c:902
> [1]PETSC ERROR: #7 main() at meshtest.c:12
> [1]PETSC ERROR: PETSc Option Table entries:
> [1]PETSC ERROR: -dm_distribute 1
> [1]PETSC ERROR: -dm_plex_dim 2
> [1]PETSC ERROR: -dm_plex_shape sphere
> [1]PETSC ERROR: -dm_plex_simplex 1
> [1]PETSC ERROR: -dm_plex_sphere_radius 1.0
> [1]PETSC ERROR: -dm_refine 1
> [1]PETSC ERROR: -dm_refine_pre 3
> [1]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1
>
> Thanks,
> Bhargav
>
> On Wed, Nov 10, 2021 at 4:33 PM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Wed, Nov 10, 2021 at 8:26 AM Bhargav Subramanya <
>> bhargav.subramanya at kaust.edu.sa> wrote:
>>
>>> Dear Matt,
>>>
>>> Thanks a lot for the reply. I am now able to generate the prismatic mesh
>>> properly.
>>>
>>
>> Cool.
>>
>>
>>> In the case of mpiexec -n 2 ./meshtest -dm_plex_shape sphere
>>> -dm_refine_pre 3 -dm_distribute -dm_refine 2 followed by DMExtrude(), where I am
>>> ordering the extruded cells on the layers first: is the mesh extruded in
>>> parallel in this case?
>>>
>>
>> Yes.
>>
>>
>>> The first time I distributed the mesh, it was only the spherical surface
>>> mesh. After DMPlexExtrude(), the mesh changes. Do I need to redistribute the
>>> mesh?
>>>
>>
>> Probably not. The balance factor, meaning the relative sizes of the
>> partitions, will not change. However, if you have a lot of layers, you may
>> want to. It is easy: just call DMPlexDistribute() again.
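>>
>> For example, a minimal sketch of that (reusing the DMPlexDistribute()
>> pattern from your code below, with dm3D as the extruded mesh) would be:
>>
>>   DM dmDist = NULL;
>>   ierr = DMPlexDistribute(dm3D, 0, NULL, &dmDist);CHKERRQ(ierr);
>>   if (dmDist) {
>>     ierr = DMDestroy(&dm3D);CHKERRQ(ierr);
>>     dm3D = dmDist;
>>   }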
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> Thanks,
>>> Bhargav
>>>
>>> On Wed, Nov 10, 2021 at 12:56 AM Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> On Tue, Nov 9, 2021 at 9:54 AM Bhargav Subramanya <
>>>> bhargav.subramanya at kaust.edu.sa> wrote:
>>>>
>>>>> Dear All,
>>>>>
>>>>> I want to generate a prismatic mesh from a base spherical surface
>>>>> mesh. I first refine the base icosahedron sequential mesh to some extent,
>>>>> distribute the mesh, continue the refinement in parallel, project the mesh
>>>>> to a unit sphere, and then finally extrude the mesh. The following is the
>>>>> code I am using. As the extrusion is performed in parallel, the extruded
>>>>> mesh seems to be broken as shown in the attached figure. May I know how to
>>>>> get an intact extruded mesh in parallel? Also, is it possible to make mesh
>>>>> refinement respect the spherical surface geometry, without having to
>>>>> project it using a function as shown below?
>>>>>
>>>>
>>>> I think you can do most of what you want from the command line. Here is
>>>> a simple code:
>>>>
>>>> #include <petsc.h>
>>>>
>>>> int main(int argc, char **argv)
>>>> {
>>>>   DM             dm;
>>>>   PetscErrorCode ierr;
>>>>
>>>>   ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
>>>>   ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);
>>>>   ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);
>>>>   ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
>>>>   ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
>>>>   ierr = DMDestroy(&dm);CHKERRQ(ierr);
>>>>   ierr = PetscFinalize();
>>>>   return ierr;
>>>> }
>>>>
>>>> that I run using
>>>>
>>>>   mpiexec -n 2 ./meshtest -dm_plex_shape sphere -dm_refine_pre 3
>>>> -dm_extrude 5 -dm_plex_transform_extrude_thickness 0.1 -dm_distribute
>>>> -dm_view hdf5:mesh.h5
>>>>
>>>> and get the attached picture. I am refining only before distribution
>>>> here, since refinement after distribution would refine the extruded mesh,
>>>> which is not what you want.
>>>> If you want refinement after distribution, you could use this code with
>>>>
>>>>   mpiexec -n 2 ./meshtest -dm_plex_shape sphere -dm_refine_pre
>>>> 3 -dm_distribute -dm_refine 2
>>>>
>>>> and then call DMExtrude() by hand after that. Alternatively, I could
>>>> make a pre-extrusion refinement option.
>>>>
>>>>   Thanks,
>>>>
>>>>      Matt
>>>>
>>>>
>>>>> Thanks,
>>>>> Bhargav
>>>>>
>>>>> /* refine the sequential mesh first */
>>>>> for (int r = 0; r < numRefinePre; ++r) {
>>>>>   DM rdm = NULL;
>>>>>   DMRefine(dm, MPI_COMM_WORLD, &rdm);
>>>>>   DMDestroy(&dm);
>>>>>   dm = rdm;
>>>>> }
>>>>>
>>>>> /* project to a unit sphere */
>>>>> ierr = ProjectToUnitSphere(dm);CHKERRQ(ierr);
>>>>>
>>>>> /* create and distribute DM */
>>>>> ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
>>>>> if (dmDist) {
>>>>>   DMDestroy(&dm);
>>>>>   dm = dmDist;
>>>>> }
>>>>>
>>>>> /* refine the mesh in parallel */
>>>>> for (int r = 0; r < numRefine; ++r) {
>>>>>   DM rdm = NULL;
>>>>>   DMRefine(dm, MPI_COMM_WORLD, &rdm);
>>>>>   DMDestroy(&dm);
>>>>>   dm = rdm;
>>>>> }
>>>>>
>>>>> /* project to a unit sphere */
>>>>> ierr = ProjectToUnitSphere(dm);CHKERRQ(ierr);
>>>>>
>>>>> ierr = DMPlexExtrude(dm, numLayers-1, radialThickness, PETSC_FALSE,
>>>>>                      NULL, PETSC_TRUE, &dm3D);CHKERRQ(ierr);
>>>>> if (dm3D) {
>>>>>   DMDestroy(&dm);
>>>>>   dm = dm3D;
>>>>> }
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>> https://www.cse.buffalo.edu/~knepley/
>>>>
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

