[petsc-users] dmplex extrude in parallel

Matthew Knepley knepley at gmail.com
Wed Nov 10 07:33:27 CST 2021


On Wed, Nov 10, 2021 at 8:26 AM Bhargav Subramanya <
bhargav.subramanya at kaust.edu.sa> wrote:

> Dear Matt,
>
> Thanks a lot for the reply. I am now able to generate the prismatic mesh
> properly.
>

Cool.


> In the case of mpiexec -n 2 ./meshtest -dm_plex_shape sphere
> -dm_refine_pre 3 -dm_distribute -dm_refine 2 and DMExtrude(), where I am
> ordering the extruded cells on the layers first, is the mesh extruded in
> parallel in this case?
>

Yes.


> The first time I distributed the mesh, it was only the spherical surface
> mesh. After dmplex extrude, the mesh changes. Do I need to redistribute the
> mesh again?
>

Probably not. The balance factor, meaning the relative sizes of the
partitions, will not change. However, if you have a lot of layers, you may
want to. It is easy: just call DMPlexDistribute() again.
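
For reference, a minimal sketch of that redistribution step, following the
same DMPlexDistribute() pattern as the quoted code below (dm here is the
already-extruded DM; the overlap of 0 and the default partitioner are
assumptions):

  DM dmDist = NULL;

  /* redistribute the extruded mesh: overlap 0, default partitioner */
  ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm   = dmDist;
  }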

  Thanks,

     Matt


> Thanks,
> Bhargav
>
> On Wed, Nov 10, 2021 at 12:56 AM Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> On Tue, Nov 9, 2021 at 9:54 AM Bhargav Subramanya <
>> bhargav.subramanya at kaust.edu.sa> wrote:
>>
>>> Dear All,
>>>
>>> I want to generate a prismatic mesh from a base spherical surface mesh.
>>> I first refine the base icosahedron sequential mesh to some extent,
>>> distribute the mesh, continue the refinement in parallel, project the mesh
>>> to a unit sphere, and then finally extrude the mesh. The following is the
>>> code I am using. As the extrusion is performed in parallel, the extruded
>>> mesh seems to be broken as shown in the attached figure. May I know how to
>>> get an intact extruded mesh in parallel? Also, is it possible to make mesh
>>> refinement respect the spherical surface geometry, without having to
>>> project it using a function as shown below?
>>>
>>
>> I think you can do most of what you want from the command line. Here is a
>> simple code:
>>
>> #include <petsc.h>
>>
>> int main(int argc, char **argv)
>> {
>>   DM             dm;
>>   PetscErrorCode ierr;
>>
>>   ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
>>   ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);
>>   ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);
>>   ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
>>   ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
>>   ierr = DMDestroy(&dm);CHKERRQ(ierr);
>>   ierr = PetscFinalize();
>>   return ierr;
>> }
>>
>> that I run using
>>
>>   mpiexec -n 2 ./meshtest -dm_plex_shape sphere -dm_refine_pre 3
>> -dm_extrude 5 -dm_plex_transform_extrude_thickness 0.1 -dm_distribute
>> -dm_view hdf5:mesh.h5
>>
>> and get the attached picture. I am refining only before distribution here,
>> since refinement after distribution would refine the extruded mesh, which
>> is not what you want.
>> If you want refinement after distribution, you could run the same code with
>>
>>   mpiexec -n 2 ./meshtest -dm_plex_shape sphere -dm_refine_pre
>> 3 -dm_distribute -dm_refine 2
>>
>> and then call DMExtrude() by hand after that. Alternatively, I could make
>> a pre-extrusion refinement option.
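>>
>> For concreteness, a minimal sketch of that by-hand extrusion step, reusing
>> the DMPlexExtrude() call from your code below (numLayers, radialThickness,
>> and dm3D are your names; the exact argument list may differ between PETSc
>> versions):
>>
>>   DM dm3D = NULL;
>>
>>   /* extrude the refined surface mesh into numLayers-1 prism layers */
>>   ierr = DMPlexExtrude(dm, numLayers-1, radialThickness, PETSC_FALSE,
>>                        NULL, PETSC_TRUE, &dm3D);CHKERRQ(ierr);
>>   if (dm3D) {
>>     ierr = DMDestroy(&dm);CHKERRQ(ierr);
>>     dm = dm3D;
>>   }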
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> Thanks,
>>> Bhargav
>>>
>>> /* refine the sequential mesh first */
>>> for (int r = 0; r < numRefinePre; ++r) {
>>>   DM rdm = NULL;
>>>   DMRefine(dm, MPI_COMM_WORLD, &rdm);
>>>   DMDestroy(&dm);
>>>   dm = rdm;
>>> }
>>>
>>> /* project to a unit sphere */
>>> ierr = ProjectToUnitSphere(dm);CHKERRQ(ierr);
>>>
>>> /* create and distribute DM */
>>> ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
>>> if (dmDist) {
>>>   DMDestroy(&dm);
>>>   dm = dmDist;
>>> }
>>>
>>> /* refine the mesh in parallel */
>>> for (int r = 0; r < numRefine; ++r) {
>>>   DM rdm = NULL;
>>>   DMRefine(dm, MPI_COMM_WORLD, &rdm);
>>>   DMDestroy(&dm);
>>>   dm = rdm;
>>> }
>>>
>>> /* project to a unit sphere */
>>> ierr = ProjectToUnitSphere(dm);CHKERRQ(ierr);
>>>
>>> /* extrude the surface mesh into a prismatic mesh */
>>> ierr = DMPlexExtrude(dm, numLayers-1, radialThickness, PETSC_FALSE,
>>>                      NULL, PETSC_TRUE, &dm3D);CHKERRQ(ierr);
>>> if (dm3D) {
>>>   DMDestroy(&dm);
>>>   dm = dm3D;
>>> }
>>>
>>>
>>
>>
>>
>>
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/