<div class="moz-cite-prefix">Brilliant, thanks a lot.<br>
<br>
Michael<br>
<br>
On 18/11/13 16:08, Matthew Knepley wrote:<br>
</div>

On Mon, Nov 18, 2013 at 9:47 AM, Matthew Knepley <knepley@gmail.com> wrote:
<blockquote class="gmail_quote" style="margin:0px 0px 0px
0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">
<div dir="ltr">
<div class="gmail_extra">
<div class="gmail_quote">
<div class="im">On Mon, Nov 18, 2013 at 3:24 AM,
Michael Lange <span dir="ltr"><<a
moz-do-not-send="true"
href="mailto:michael.lange@imperial.ac.uk"
target="_blank">michael.lange@imperial.ac.uk</a>></span>
wrote:<br>
<blockquote class="gmail_quote" style="margin:0px
0px 0px
0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">
<div text="#000000" bgcolor="#FFFFFF">
<div>Hi Matt,<br>
<br>
I think there is a misunderstanding here. I
am referring to the case where
DMPlexDistribute() is run with overlap=1
(which is not the case in SNES ex12) and
vertices in the overlap/halo region are
assigned to the wrong rank. This can lead to
a case where a proc may own a vertex that is
not in its original (non-overlapping)
partition, although the attached cell is not
owned and will be marked as "ghost" by
DMPlexConstructGhostCells().<br>
</div>
</div>
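
The case described above can be reproduced with a small driver along the following lines. This is only a sketch: the mesh is built entirely from command-line options, and the two-argument overlap form of DMPlexDistribute() shown here has changed between PETSc versions, so the actual plex_overlap test may look different.

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm, dmDist = NULL;
  PetscSF        pointSF;
  PetscInt       overlap = 1;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = PetscOptionsGetInt(NULL, NULL, "-overlap", &overlap, NULL);CHKERRQ(ierr);

  /* Build the serial mesh from command-line options (e.g. a small box mesh) */
  ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);
  ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);
  ierr = DMSetFromOptions(dm);CHKERRQ(ierr);

  /* Distribute with the requested overlap; older releases also take a partitioner name here */
  ierr = DMPlexDistribute(dm, overlap, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}

  /* -dm_view prints the parallel mesh; the point SF shows which local points are rooted elsewhere */
  ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
  ierr = DMGetPointSF(dm, &pointSF);CHKERRQ(ierr);
  ierr = PetscSFView(pointSF, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Run on two ranks with -overlap 1, the PetscSFView() output lists, for each rank, which local points are rooted on the other rank, which is where any mis-assigned overlap vertices show up.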

Your fix is now merged to next:

  https://bitbucket.org/petsc/petsc/branch/knepley/fix-plex-partition-overlap

   Matt
<blockquote class="gmail_quote" style="margin:0px 0px 0px
0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">
<div dir="ltr">
<div class="gmail_extra">
<div class="gmail_quote">
<div class="im">
<blockquote class="gmail_quote" style="margin:0px
0px 0px
0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">
<div text="#000000" bgcolor="#FFFFFF">

To illustrate this, I have attached an example consisting of a unit square with 3 faces in each dimension and a section with only vertex dofs. If run with two ranks, rank 1 will own all of its vertices (13 roots), whereas rank 0 only owns the vertices that are not in the overlap/halo of rank 1 (3 roots). My understanding is that, since the original partition splits the square along its diagonal, the vertex distribution should be 10 to 6 (the square with 3 faces per side has a 4 x 4 grid of 16 vertices), with the 4 diagonal vertices assigned to rank 1 and all other vertices assigned according to the original partition. Is this correct, or am I missing something here?

I have simplified the example so that I can easily do things in my head. Now we just have 2 faces per side. Here is the run with overlap = 0:

next *$:/PETSc3/petsc/petsc-dev$ /PETSc3/petsc/petsc-dev/arch-c-exodus-next/bin/mpiexec -host localhost -n 2 /PETSc3/petsc/petsc-dev/arch-c-exodus-next/lib/plex_overlap-obj/plex_overlap -dm_view -overlap 0
Parallel Mesh in 2 dimensions:
  0-cells: 6 6
  1-cells: 9 9
  2-cells: 4 4
Labels:
  depth: 3 strata of sizes (6, 9, 4)
  exterior_facets: 1 strata of sizes (4)
  marker: 2 strata of sizes (9, 3)
PetscSF Object: 2 MPI processes
  type: basic
  sort=rank-order
  [0] Number of roots=19, leaves=5, remote ranks=1
  [0] 4 <- (1,6)
  [0] 5 <- (1,8)
  [0] 7 <- (1,9)
  [0] 10 <- (1,13)
  [0] 11 <- (1,17)
  [1] Number of roots=19, leaves=0, remote ranks=0

Each partition gets 4 cells and 6 vertices since it is split along the diagonal. The overlap region contains the 3 vertices and 2 faces that lie on the diagonal, and they are all owned by proc 1. (In the SF dump, a line like "[0] 4 <- (1,6)" says that local point 4 on rank 0 is a leaf whose root is point 6 on rank 1, i.e. rank 1 owns it.)

Now if we run with an overlap of 1:

next *$:/PETSc3/petsc/petsc-dev$ /PETSc3/petsc/petsc-dev/arch-c-exodus-next/bin/mpiexec -host localhost -n 2 /PETSc3/petsc/petsc-dev/arch-c-exodus-next/lib/plex_overlap-obj/plex_overlap -dm_view -overlap 1
Parallel Mesh in 2 dimensions:
  0-cells: 8 8
  1-cells: 13 13
  2-cells: 6 6
Labels:
  depth: 3 strata of sizes (8, 13, 6)
  exterior_facets: 1 strata of sizes (6)
  marker: 2 strata of sizes (13, 5)
PetscSF Object: 2 MPI processes
  type: basic
  sort=rank-order
  [0] Number of roots=27, leaves=19, remote ranks=1
  [0] 0 <- (1,1)
  [0] 2 <- (1,4)
  [0] 6 <- (1,7)
  [0] 7 <- (1,8)
  [0] 8 <- (1,9)
  [0] 9 <- (1,10)
  [0] 10 <- (1,11)
  [0] 11 <- (1,12)
  [0] 12 <- (1,13)
  [0] 14 <- (1,17)
  [0] 15 <- (1,18)
  [0] 16 <- (1,19)
  [0] 17 <- (1,20)
  [0] 18 <- (1,21)
  [0] 19 <- (1,22)
  [0] 20 <- (1,23)
  [0] 21 <- (1,24)
  [0] 25 <- (1,25)
  [0] 26 <- (1,26)
  [1] Number of roots=27, leaves=2, remote ranks=1
  [1] 3 <- (0,1)
  [1] 5 <- (0,5)

Each process gets 2 more cells (the ones whose faces lie on the diagonal), 2 more vertices and 4 more edges. This is correct. The two overlap cells are ghost for proc 1, but the 4 edges and 2 vertices are owned. So you are correct, I need to mark all those overlap points as "unownable" by the original process.

  Thanks for finding this,

    Matt
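
A quick way to check the resulting ownership is to count, on each rank, the vertices that are not leaves of the point SF (leaves are the local points rooted on another rank). A minimal sketch, assuming the helper name is made up and the point SF has already been set up by DMPlexDistribute():

/* Count locally owned vertices: vertices of the local DM that are not
   leaves of the point SF (i.e. not rooted on another rank). */
static PetscErrorCode CountOwnedVertices(DM dm, PetscInt *nOwned)
{
  PetscSF         sf;
  PetscInt        vStart, vEnd, nroots, nleaves, l, v;
  const PetscInt *ilocal;
  PetscBool      *isLeaf;
  PetscErrorCode  ierr;

  PetscFunctionBegin;
  ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); /* depth 0 = vertices */
  ierr = DMGetPointSF(dm, &sf);CHKERRQ(ierr);
  ierr = PetscSFGetGraph(sf, &nroots, &nleaves, &ilocal, NULL);CHKERRQ(ierr);
  if (nroots < 0) {*nOwned = vEnd - vStart; PetscFunctionReturn(0);} /* no graph: serial run, everything is local */
  ierr = PetscCalloc1(nroots, &isLeaf);CHKERRQ(ierr);
  for (l = 0; l < nleaves; ++l) isLeaf[ilocal ? ilocal[l] : l] = PETSC_TRUE;
  *nOwned = 0;
  for (v = vStart; v < vEnd; ++v) if (!isLeaf[v]) ++(*nOwned);
  ierr = PetscFree(isLeaf);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

For the 3-faces-per-side square from the earlier message this should report 6 vertices on rank 0 and 10 on rank 1 once the overlap points are excluded from ownership, matching the split Michael expects.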

Many thanks for all your help
Michael

On 16/11/13 13:54, Matthew Knepley wrote:

On Sat, Nov 16, 2013 at 7:22 AM, Michael Lange <michael.lange@imperial.ac.uk> wrote:

Hi,

I notice that, when creating the point SF for the parallel partition in DMPlexDistribute, cells are assigned to procs according to the original partition but vertices aren't. Was this done by design or is this a bug?

If this were true, there would be no communication for the P1 test of SNES ex12. Here is running it with -interpolate 1 and -dm_view ::ascii_info_detail:

PetscSF Object: 2 MPI processes
  type: basic
  sort=rank-order
  [0] Number of roots=19, leaves=5, remote ranks=1
  [0] 4 <- (1,6)
  [0] 5 <- (1,8)
  [0] 7 <- (1,9)
  [0] 10 <- (1,13)
  [0] 11 <- (1,17)
  [1] Number of roots=19, leaves=0, remote ranks=0
  [0] Roots referenced by my leaves, by rank
  [0] 1: 5 edges
  [0] 4 <- 6
  [0] 5 <- 8
  [0] 7 <- 9
  [0] 10 <- 13
  [0] 11 <- 17
  [1] Roots referenced by my leaves, by rank

So there are 3 vertices and 2 edges in the point SF.

   Matt
<blockquote class="gmail_quote"
style="margin:0px 0px 0px
0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">In
case it is a bug, I have
attached a patch that fixes this
by using the closure of the
original partition instead.<br>
<br>
Thanks and kind regards<span><font
color="#888888"><br>
Michael<br>
</font></span></blockquote>
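
The "closure of the original partition" idea can be sketched roughly as follows (an illustration of the approach, not the attached patch; the helper name and the inClosure[] array are made up): for each cell that the partitioner assigned to a rank, gather the transitive closure of that cell, and only let a rank claim ownership of points that lie in its closed set.

/* Illustration only: mark every point in the transitive closure of the
   cells originally assigned to this rank.  inClosure[] must be allocated
   over the whole local chart and initialized to PETSC_FALSE. */
static PetscErrorCode MarkPartitionClosure(DM dm, PetscInt ncells, const PetscInt cells[], PetscBool inClosure[])
{
  PetscInt       c, cl;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  for (c = 0; c < ncells; ++c) {
    PetscInt *closure = NULL, closureSize;

    ierr = DMPlexGetTransitiveClosure(dm, cells[c], PETSC_TRUE, &closureSize, &closure);CHKERRQ(ierr);
    /* closure[] holds (point, orientation) pairs, so the points sit at even indices */
    for (cl = 0; cl < closureSize; ++cl) inClosure[closure[2*cl]] = PETSC_TRUE;
    ierr = DMPlexRestoreTransitiveClosure(dm, cells[c], PETSC_TRUE, &closureSize, &closure);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}

Points that fall in the closed set of more than one rank (for instance the diagonal vertices) still need a tie-breaking rule; in the overlap=0 run above they all ended up on rank 1.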

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
   -- Norbert Wiener