[petsc-users] What does DMPlexDistribute actually do?

Matthew Knepley knepley at gmail.com
Sun Mar 25 18:29:50 CDT 2018


On Sun, Mar 25, 2018 at 6:32 PM, Ellen M. Price <ellen.price at cfa.harvard.edu> wrote:

> I am trying to understand some unusual behavior (at least, given my
> current understanding) in DMPlexDistribute. I have created a hex mesh
> and distributed it using the following snippet:
>
> /* "-refine" is an option set at runtime */
> PetscOptionsGetInt(NULL, NULL, "-refine", &refine, NULL);
> ierr = DMPlexCreateHexCylinderMesh(PETSC_COMM_WORLD, refine,
>     DM_BOUNDARY_NONE, &dmplex); CHKERRQ(ierr);
> DMView(dmplex, PETSC_VIEWER_STDOUT_WORLD);
>
> /* this is from the examples */
> ierr = DMPlexDistribute(dmplex, 0, NULL, &dmplex_dist); CHKERRQ(ierr);
> if (dmplex_dist)
> {
>     DMDestroy(&dmplex);
>     dmplex = dmplex_dist;
> }
> DMView(dmplex, PETSC_VIEWER_STDOUT_WORLD);
>
> So I do a view operation before and after the distribute call to see how
> the DM is structured. I do not understand what happens next:
>
> $ mpirun -n 4 ./myprogram -refine 2
>
> DM Object: 4 MPI processes
>   type: plex
> DM_0x1f24d50_0 in 3 dimensions:
>   0-cells: 445 445 445 445
>   1-cells: 1196 1196 1196 1196
>   2-cells: 1072 1072 1072 1072
>   3-cells: 320 320 320 320
> Labels:
>   depth: 4 strata with value/size (0 (445), 1 (1196), 2 (1072), 3 (320))
> DM Object: Parallel Mesh 4 MPI processes
>   type: plex
> Parallel Mesh in 3 dimensions:
>   0-cells: 445 445 445 445
>   1-cells: 1196 1196 1196 1196
>   2-cells: 1072 1072 1072 1072
>   3-cells: 320 320 320 320
> Labels:
>   depth: 4 strata with value/size (0 (445), 1 (1196), 2 (1072), 3 (320))
>
> No matter how many processors I choose, every processor has a copy of
> all 320 cells (at this refinement level), and I see similar behavior at
> other refinement levels. The only thing that changes is the name of the
> DM, to "Parallel Mesh". This is not what I would have expected given the
> description of DMPlexDistribute in the manual; I thought the cells would
> be split up between all available processors.
>
> I am also viewing this mesh in VTK and have noticed that the file size
> of the output scales with the number of processors, as if it is really
> writing each "copy" of the mesh and data stored in it to one big file.
> Again, not what I expected.
>
> Can someone clear up what is going on?


It definitely does not work that way. Can you send the whole code you are
running? It's hard to tell where the problem might be from snippets. You
could also try running an example. For instance,

  cd $PETSC_DIR/src/snes/examples/tutorials
  make ex13

master *$:/PETSc3/petsc/petsc-dev/src/snes/examples/tutorials$ ./ex13 -cells 8,8 -dm_view

DM Object: Mesh 1 MPI processes
  type: plex
Mesh in 2 dimensions:
  0-cells: 81
  1-cells: 208
  2-cells: 128
Labels:
  Face Sets: 1 strata with value/size (1 (32))
  marker: 1 strata with value/size (1 (64))
  depth: 3 strata with value/size (0 (81), 1 (208), 2 (128))

master *$:/PETSc3/petsc/petsc-dev/src/snes/examples/tutorials$ $PETSC_DIR/arch-master-debug/bin/mpiexec -n 4 ./ex13 -cells 8,8 -dm_view

DM Object: Mesh 4 MPI processes
  type: plex
Mesh in 2 dimensions:
  0-cells: 26 26 27 25
  1-cells: 57 57 58 56
  2-cells: 32 32 32 32
Labels:
  Face Sets: 1 strata with value/size (1 (8))
  marker: 1 strata with value/size (1 (17))
  depth: 3 strata with value/size (0 (26), 1 (57), 2 (32))

master *$:/PETSc3/petsc/petsc-dev/src/snes/examples/tutorials$ $PETSC_DIR/arch-master-debug/bin/mpiexec -n 7 ./ex13 -cells 8,8 -dm_view

DM Object: Mesh 7 MPI processes
  type: plex
Mesh in 2 dimensions:
  0-cells: 17 16 16 17 17 17 19
  1-cells: 34 33 33 35 34 35 35
  2-cells: 18 18 18 19 18 19 18
Labels:
  Face Sets: 1 strata with value/size (1 (5))
  marker: 1 strata with value/size (1 (11))
  depth: 3 strata with value/size (0 (17), 1 (34), 2 (18))
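
For reference, here is a minimal standalone version of the create/distribute/view
pattern from the snippet quoted above. This is only a sketch, not the poster's
actual program: the DMPlexCreateHexCylinderMesh, DMPlexDistribute, and DMView
calls are copied from the snippet, and the surrounding main()/PetscInitialize
scaffolding is filled in.

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM             dm, dmDist;
    PetscInt       refine = 1;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

    /* "-refine" is read from the command line, as in the snippet */
    ierr = PetscOptionsGetInt(NULL, NULL, "-refine", &refine, NULL); CHKERRQ(ierr);

    /* Create the hex cylinder mesh and view it before distribution */
    ierr = DMPlexCreateHexCylinderMesh(PETSC_COMM_WORLD, refine,
                                       DM_BOUNDARY_NONE, &dm); CHKERRQ(ierr);
    ierr = DMView(dm, PETSC_VIEWER_STDOUT_WORLD); CHKERRQ(ierr);

    /* Distribute with zero overlap; the output DM is NULL on one process */
    ierr = DMPlexDistribute(dm, 0, NULL, &dmDist); CHKERRQ(ierr);
    if (dmDist) {
      ierr = DMDestroy(&dm); CHKERRQ(ierr);
      dm   = dmDist;
    }
    ierr = DMView(dm, PETSC_VIEWER_STDOUT_WORLD); CHKERRQ(ierr);

    ierr = DMDestroy(&dm); CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

After distribution, the second DMView should report different per-rank cell
counts, as in the ex13 runs above; identical counts on every rank would mean
the mesh really was not partitioned.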

  Thanks,

    Matt


>
> Ellen Price
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/