[petsc-users] What does DMPlexDistribute actually do?
Ellen M. Price
ellen.price at cfa.harvard.edu
Sun Mar 25 17:32:26 CDT 2018
I am trying to understand some unusual behavior (at least, given my
current understanding) in DMPlexDistribute. I have created a hex mesh
and distributed it using the following snippet:
/* "-refine" is an option set at runtime */
PetscOptionsGetInt(NULL, NULL, "-refine", &refine, NULL);
ierr = DMPlexCreateHexCylinderMesh(PETSC_COMM_WORLD, refine,
DM_BOUNDARY_NONE, &dmplex); CHKERRQ(ierr);
DMView(dmplex, PETSC_VIEWER_STDOUT_WORLD);
/* this is from the examples */
ierr = DMPlexDistribute(dmplex, 0, NULL, &dmplex_dist); CHKERRQ(ierr);
if (dmplex_dist)
{
DMDestroy(&dmplex);
dmplex = dmplex_dist;
}
DMView(dmplex, PETSC_VIEWER_STDOUT_WORLD);
So I view the DM before and after the distribute call to see how it is
structured, and I do not understand what happens next:
$ mpirun -n 4 ./myprogram -refine 2
DM Object: 4 MPI processes
type: plex
DM_0x1f24d50_0 in 3 dimensions:
0-cells: 445 445 445 445
1-cells: 1196 1196 1196 1196
2-cells: 1072 1072 1072 1072
3-cells: 320 320 320 320
Labels:
depth: 4 strata with value/size (0 (445), 1 (1196), 2 (1072), 3 (320))
DM Object: Parallel Mesh 4 MPI processes
type: plex
Parallel Mesh in 3 dimensions:
0-cells: 445 445 445 445
1-cells: 1196 1196 1196 1196
2-cells: 1072 1072 1072 1072
3-cells: 320 320 320 320
Labels:
depth: 4 strata with value/size (0 (445), 1 (1196), 2 (1072), 3 (320))
No matter how many processors I choose, every processor has a copy of
all 320 cells (at this refinement level); I see similar behavior at
other refinement levels. The only thing that changes is the name of the
DM, to "Parallel Mesh". This is not what I would have expected given the
description of DMPlexDistribute in the manual; I thought the cells would
be split up between all available processors.
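In case it helps, here is a sketch of the check I would use to confirm
whether the cells really were split up; I am assuming the usual
DMPlexGetHeightStratum interface, where the height-0 points of a Plex
are the cells local to each rank:

```c
/* Sketch: print the local cell count on each rank (dmplex is the DM
   from the snippet above). Height 0 in a Plex is the cell stratum. */
PetscInt    cStart, cEnd;
PetscMPIInt rank;

ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
ierr = DMPlexGetHeightStratum(dmplex, 0, &cStart, &cEnd); CHKERRQ(ierr);
ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] local cells: %D\n",
                               rank, cEnd - cStart); CHKERRQ(ierr);
ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT); CHKERRQ(ierr);
```

If the mesh were actually distributed, I would expect these per-rank
counts to sum to 320 across the four ranks, not to each be 320.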
I am also viewing this mesh in VTK and have noticed that the file size
of the output scales with the number of processors, as if it is really
writing each "copy" of the mesh and data stored in it to one big file.
Again, not what I expected.
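For completeness, the VTK output is produced along these lines (a
sketch; the filename "mesh.vtu" is just a placeholder, not necessarily
exactly what I run):

```c
/* Sketch: write the DM to a VTK file ("mesh.vtu" is hypothetical) */
PetscViewer viewer;
ierr = PetscViewerVTKOpen(PETSC_COMM_WORLD, "mesh.vtu", FILE_MODE_WRITE,
                          &viewer); CHKERRQ(ierr);
ierr = DMView(dmplex, viewer); CHKERRQ(ierr);
ierr = PetscViewerDestroy(&viewer); CHKERRQ(ierr);
```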
Can someone clear up what is going on?
Ellen Price