<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Sun, Mar 25, 2018 at 7:44 PM, Ellen M. Price <span dir="ltr"><<a href="mailto:ellen.price@cfa.harvard.edu" target="_blank">ellen.price@cfa.harvard.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Here is the full minimal (non)-working example and the problematic<br>
output. I'm using PETSc 3.8.0.<br></blockquote><div><br></div><div>Its not Distribute that is broken, its the cylinder mesh thing. It was written only in serial, so it makes the same</div><div>mesh on every process. Here is an example with a box that works.</div><div><br></div><div>
master *$:/PETSc3/petsc/petsc-dev/src/snes/examples/tutorials$ /PETSc3/petsc/petsc-dev/arch-master-debug/bin/mpiexec -n 2 ./tester -refine 5 -petscpartitioner_type simple
DM Object: 2 MPI processes
  type: plex
DM_0x84000004_0 in 2 dimensions:
  0-cells: 36 0
  1-cells: 60 0
  2-cells: 25 0
Labels:
  depth: 3 strata with value/size (0 (36), 1 (60), 2 (25))
  Face Sets: 4 strata with value/size (4 (5), 2 (5), 1 (5), 3 (5))
  marker: 1 strata with value/size (1 (40))
DM Object: Parallel Mesh 2 MPI processes
  type: plex
Parallel Mesh in 2 dimensions:
  0-cells: 22 21
  1-cells: 34 32
  2-cells: 13 12
Labels:
  Face Sets: 3 strata with value/size (1 (5), 2 (2), 4 (3))
  marker: 1 strata with value/size (1 (21))
  depth: 3 strata with value/size (0 (22), 1 (34), 2 (13))
I will fix the cylinder mesh for you. It should be in the dev branch this week.

Thanks for reporting that,

    Matt
> #include <petscdm.h>
> #include <petscdmplex.h>
>
> int main(int argc, char *argv[])
> {
>     PetscInt refine = 0;
>     PetscErrorCode ierr;
>     DM dmplex, dmplex_dist;
>
>     /* initialize */
>     PetscInitialize(&argc, &argv, NULL, NULL);
>
>     /* get the refinement level and create the mesh */
>     PetscOptionsGetInt(NULL, NULL, "-refine", &refine, NULL);
>     ierr = DMPlexCreateHexCylinderMesh(PETSC_COMM_WORLD, refine,
>                                       DM_BOUNDARY_NONE, &dmplex); CHKERRQ(ierr);
>     DMView(dmplex, PETSC_VIEWER_STDOUT_WORLD);
>     ierr = DMPlexDistribute(dmplex, 0, NULL, &dmplex_dist); CHKERRQ(ierr);
>     if (dmplex_dist)
>     {
>         DMDestroy(&dmplex);
>         dmplex = dmplex_dist;
>     }
>     DMView(dmplex, PETSC_VIEWER_STDOUT_WORLD);
>     DMDestroy(&dmplex);
>
>     /* finalize */
>     PetscFinalize();
>
>     return 0;
> }
>
> $ mpirun -n 2 bin/mwe -refine 2
>
> DM Object: 2 MPI processes
>   type: plex
> DM_0x1bde850_0 in 3 dimensions:
>   0-cells: 445 445
>   1-cells: 1196 1196
>   2-cells: 1072 1072
>   3-cells: 320 320
> Labels:
>   depth: 4 strata with value/size (0 (445), 1 (1196), 2 (1072), 3 (320))
> DM Object: Parallel Mesh 2 MPI processes
>   type: plex
> Parallel Mesh in 3 dimensions:
>   0-cells: 445 445
>   1-cells: 1196 1196
>   2-cells: 1072 1072
>   3-cells: 320 320
> Labels:
>   depth: 4 strata with value/size (0 (445), 1 (1196), 2 (1072), 3 (320))
>
> $ mpirun -n 4 bin/mwe -refine 2
>
> DM Object: 4 MPI processes
>   type: plex
> DM_0x20c7790_0 in 3 dimensions:
>   0-cells: 445 445 445 445
>   1-cells: 1196 1196 1196 1196
>   2-cells: 1072 1072 1072 1072
>   3-cells: 320 320 320 320
> Labels:
>   depth: 4 strata with value/size (0 (445), 1 (1196), 2 (1072), 3 (320))
> DM Object: Parallel Mesh 4 MPI processes
>   type: plex
> Parallel Mesh in 3 dimensions:
>   0-cells: 445 445 445 445
>   1-cells: 1196 1196 1196 1196
>   2-cells: 1072 1072 1072 1072
>   3-cells: 320 320 320 320
> Labels:
>   depth: 4 strata with value/size (0 (445), 1 (1196), 2 (1072), 3 (320))
>
> Ellen Price
>
> On 03/25/2018 07:29 PM, Matthew Knepley wrote:
> > On Sun, Mar 25, 2018 at 6:32 PM, Ellen M. Price
> > <ellen.price@cfa.harvard.edu> wrote:
> >
> > I am trying to understand some unusual behavior (at least, given my
> > current understanding) in DMPlexDistribute. I have created a hex mesh
> > and distributed it using the following snippet:
> >
> >     /* "-refine" is an option set at runtime */
> >     PetscOptionsGetInt(NULL, NULL, "-refine", &refine, NULL);
> >     ierr = DMPlexCreateHexCylinderMesh(PETSC_COMM_WORLD, refine,
> >                                       DM_BOUNDARY_NONE, &dmplex); CHKERRQ(ierr);
> >     DMView(dmplex, PETSC_VIEWER_STDOUT_WORLD);
> >
> >     /* this is from the examples */
> >     ierr = DMPlexDistribute(dmplex, 0, NULL, &dmplex_dist); CHKERRQ(ierr);
> >     if (dmplex_dist)
> >     {
> >         DMDestroy(&dmplex);
> >         dmplex = dmplex_dist;
> >     }
> >     DMView(dmplex, PETSC_VIEWER_STDOUT_WORLD);
> >
> > So I do a view operation before and after the distribute call to see how
> > the DM is structured. I do not understand what happens next:
> >
> > $ mpirun -n 4 ./myprogram -refine 2
> >
> > DM Object: 4 MPI processes
> >   type: plex
> > DM_0x1f24d50_0 in 3 dimensions:
> >   0-cells: 445 445 445 445
> >   1-cells: 1196 1196 1196 1196
> >   2-cells: 1072 1072 1072 1072
> >   3-cells: 320 320 320 320
> > Labels:
> >   depth: 4 strata with value/size (0 (445), 1 (1196), 2 (1072), 3 (320))
> > DM Object: Parallel Mesh 4 MPI processes
> >   type: plex
> > Parallel Mesh in 3 dimensions:
> >   0-cells: 445 445 445 445
> >   1-cells: 1196 1196 1196 1196
> >   2-cells: 1072 1072 1072 1072
> >   3-cells: 320 320 320 320
> > Labels:
> >   depth: 4 strata with value/size (0 (445), 1 (1196), 2 (1072), 3 (320))
> >
> > No matter what I choose for the number of processors, every processor
> > has a copy of all 320 cells (at this refinement level). Similar behavior
> > occurs at other refinement levels. The only thing that changes is the name
> > of the DM, to "Parallel Mesh". This is not what I would have expected
> > given the description of DMPlexDistribute in the manual; I thought the
> > cells would be split up between all available processors.
> >
> > I am also viewing this mesh in VTK and have noticed that the file size
> > of the output scales with the number of processors, as if it is really
> > writing each "copy" of the mesh and the data stored in it to one big file.
> > Again, not what I expected.
> >
> > Can someone clear up what is going on?
> >
> > It definitely does not work that way. Can you send the whole code you
> > are running? It's hard to tell where the problem might be from snippets.
> > You could also try running an example. For instance,
> >
> > cd $PETSC_DIR/src/snes/examples/tutorials
> > make ex13
> >
> > master *$:/PETSc3/petsc/petsc-dev/src/snes/examples/tutorials$ ./ex13
> > -cells 8,8 -dm_view
> >
> > DM Object: Mesh 1 MPI processes
> >   type: plex
> > Mesh in 2 dimensions:
> >   0-cells: 81
> >   1-cells: 208
> >   2-cells: 128
> > Labels:
> >   Face Sets: 1 strata with value/size (1 (32))
> >   marker: 1 strata with value/size (1 (64))
> >   depth: 3 strata with value/size (0 (81), 1 (208), 2 (128))
> >
> > master *$:/PETSc3/petsc/petsc-dev/src/snes/examples/tutorials$
> > $PETSC_DIR/arch-master-debug/bin/mpiexec -n 4 ./ex13 -cells 8,8 -dm_view
> >
> > DM Object: Mesh 4 MPI processes
> >   type: plex
> > Mesh in 2 dimensions:
> >   0-cells: 26 26 27 25
> >   1-cells: 57 57 58 56
> >   2-cells: 32 32 32 32
> > Labels:
> >   Face Sets: 1 strata with value/size (1 (8))
> >   marker: 1 strata with value/size (1 (17))
> >   depth: 3 strata with value/size (0 (26), 1 (57), 2 (32))
> >
> > master *$:/PETSc3/petsc/petsc-dev/src/snes/examples/tutorials$
> > $PETSC_DIR/arch-master-debug/bin/mpiexec -n 7 ./ex13 -cells 8,8 -dm_view
> >
> > DM Object: Mesh 7 MPI processes
> >   type: plex
> > Mesh in 2 dimensions:
> >   0-cells: 17 16 16 17 17 17 19
> >   1-cells: 34 33 33 35 34 35 35
> >   2-cells: 18 18 18 19 18 19 18
> > Labels:
> >   Face Sets: 1 strata with value/size (1 (5))
> >   marker: 1 strata with value/size (1 (11))
> >   depth: 3 strata with value/size (0 (17), 1 (34), 2 (18))
> >
> > Thanks,
> >
> >     Matt
> >
> > Ellen Price
> >
> > --
> > What most experimenters take for granted before they begin their
> > experiments is infinitely more interesting than any results to which
> > their experiments lead.
> > -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/