<div dir="ltr"><div dir="ltr">On Mon, Mar 1, 2021 at 4:41 PM Cameron Smith <<a href="mailto:smithc11@rpi.edu">smithc11@rpi.edu</a>> wrote:</div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Thank you. That makes sense.<br>
<br>
Using DMPlexPointLocalRef from petsc master worked for me; time to debug.<br></blockquote><div><br></div><div>Let me know if it turns out that Plex is making an assumption that is non-intuitive, or if you need something</div><div>that would make conversion easier.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<a href="https://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/DMPLEX/DMPlexPointLocalRef.html" rel="noreferrer" target="_blank">https://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/DMPLEX/DMPlexPointLocalRef.html</a><br>
<br>
-Cameron<br>
<br>
On 2/26/21 8:32 AM, Matthew Knepley wrote:<br>
> On Thu, Feb 25, 2021 at 4:57 PM Cameron Smith <smithc11@rpi.edu> wrote:
>
> Hello,
>
> Bringing this thread back from the dead...
>
> We made progress with creating a distributed dmplex that matches our
> source mesh partition, and now we need help writing values into a
> vector created from the dmplex object.
>
> As discussed previously, we have created a DMPlex instance using:
>
> DMPlexCreateFromCellListPetsc(...)
> DMGetPointSF(...)
> PetscSFSetGraph(...)
>
> which gives us a distribution of mesh vertices and elements in the DM
> object that matches the element-based partition of our unstructured
> mesh.
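>
> For concreteness, a minimal sketch of that sequence; the mesh arrays
> (cells, coords) and the SF graph data (nroots, nleaves, ilocal,
> iremote) stand in for our application's data:
>
> DM      dm;
> PetscSF sf;
> DMPlexCreateFromCellListPetsc(comm, dim, numCells, numVerts, numCorners,
>                               PETSC_FALSE /* no interpolation */,
>                               cells, dim, coords, &dm);
> DMGetPointSF(dm, &sf);
> // each leaf is a non-owned point; iremote[i] names its (rank, point) owner
> PetscSFSetGraph(sf, nroots, nleaves, ilocal, PETSC_COPY_VALUES,
>                 iremote, PETSC_COPY_VALUES);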
>
> We then mark mesh vertices on the geometric model boundary using
> DMSetLabelValue(...) and a map from our mesh vertices to dmplex points
> (created during dmplex definition of vtx coordinates).
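>
> Roughly, with a hypothetical mesh-to-plex map and a label name of our
> choosing:
>
> for (PetscInt i = 0; i < numBdryVtx; ++i) {
>   const PetscInt point = meshVtxToPoint[bdryVtx[i]];
>   DMSetLabelValue(dm, "boundary", point, 1); // mark with value 1
> }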
>
> Following this, we create a section for vertices:
>
> > DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);
> > PetscSectionCreate(PetscObjectComm((PetscObject) dm), &s);
> > PetscSectionSetNumFields(s, 1);
> > PetscSectionSetFieldComponents(s, 0, 1);
> > PetscSectionSetChart(s, vStart, vEnd);
> > for (PetscInt v = vStart; v < vEnd; ++v) {
> >   PetscSectionSetDof(s, v, 1);
> >   PetscSectionSetFieldDof(s, v, 0, 1);
> > }
> > PetscSectionSetUp(s);
> > DMSetLocalSection(dm, s);
> > PetscSectionDestroy(&s);
> > DMGetGlobalSection(dm, &s); // update the global section
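>
> (As a quick sanity check with one dof per vertex, the local section's
> storage size should equal the local vertex count; ls and n are
> hypothetical names here:)
>
> > DMGetLocalSection(dm, &ls);
> > PetscSectionGetStorageSize(ls, &n); // expect n == vEnd - vStart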
>
> We then try to write values into a local Vec for the on-process
> vertices (roots and leaves in sf terms) and hit an ordering problem.
> Specifically, we make the following sequence of calls:
>
> DMGetLocalVector(dm, &bloc);
> VecGetArrayWrite(bloc, &bwrite);
> // for loop to write values into bwrite
> VecRestoreArrayWrite(bloc, &bwrite);
> DMLocalToGlobal(dm, bloc, INSERT_VALUES, b);
> DMRestoreLocalVector(dm, &bloc);
>
> There is an easy way to get diagnostics here. For the local vector,
>
> DMGetLocalSection(dm, &s);
> PetscSectionGetOffset(s, v, &off);
>
> will give you the offset into the array you got from VecGetArrayWrite()
> for that vertex. You can get this wrapped up using DMPlexPointLocalRef(),
> which is what I tend to use for this type of stuff.
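>
> For example, a minimal sketch of filling the local vector that way
> (the per-vertex value is a placeholder):
>
> DMGetLocalVector(dm, &bloc);
> VecGetArrayWrite(bloc, &bwrite);
> for (PetscInt v = vStart; v < vEnd; ++v) {
>   PetscScalar *val;
>   DMPlexPointLocalRef(dm, v, bwrite, &val); // val = &bwrite[off] for v
>   *val = 1.0; // placeholder for the application's per-vertex value
> }
> VecRestoreArrayWrite(bloc, &bwrite);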
>
> For the global vector,
>
> DMGetGlobalSection(dm, &gs);
> PetscSectionGetOffset(gs, v, &off);
>
> will give you the offset into the portion of the global array that is
> stored on this process. If you do not own the values for this vertex,
> the number is negative, and it is actually -(i+1) if the index i is
> the valid one on the owning process.
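>
> A small sketch of decoding that convention:
>
> PetscSectionGetOffset(gs, v, &off);
> if (off >= 0) {
>   // this process owns v; off indexes our piece of the global array
> } else {
>   PetscInt ownerOff = -(off + 1); // the valid offset on the owning rank
>   // do not write v into the global Vec here; another process owns it
> }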
>
> Visualizing Vec 'b' and the original mesh in ParaView tells us that
> the dmplex topology and geometry (the vertex coordinates) are correct,
> but that the order in which we write values is wrong (not total
> garbage... but clearly shifted).
>
> We do not make any guarantees that global orders match local orders.
> However, by default we number up global unknowns in rank order,
> leaving out the dofs that are not owned.
>
> Does this make sense?
>
> Thanks,
>
> Matt
>
> Is there anything obviously wrong in the approach described above? I
> suspect the section creation is wrong and/or we don't understand the
> order of entries in the array returned by VecGetArrayWrite.
>
> Please let us know if other info is needed. We are happy to share the
> relevant source code.
>
> Thank-you,
> Cameron
>
>
> > On 8/25/20 8:34 AM, Cameron Smith wrote:
> >> On 8/24/20 4:57 PM, Matthew Knepley wrote:
> >> On Mon, Aug 24, 2020 at 4:27 PM Jed Brown <jed@jedbrown.org> wrote:
> >>
> >> Cameron Smith <smithc11@rpi.edu> writes:
> >>
> >> > We made some progress with star forest creation but still have
> >> > work to do.
> >> >
> >> > We revisited DMPlexCreateFromCellListParallelPetsc(...) and got it
> >> > working by sequentially partitioning the vertex coordinates across
> >> > processes to satisfy the 'vertexCoords' argument. Specifically,
> >> > rank 0 has the coordinates for vertices with global id 0:N/P-1,
> >> > rank 1 has N/P:2*(N/P)-1, and so on (N is the total number of
> >> > global vertices and P is the number of processes).
> >> >
> >> > The consequences of the sequential partition of vertex coordinates
> >> > in subsequent solver operations are not clear. Does it make process
> >> > i responsible for computations and communications associated with
> >> > global vertices i*(N/P):(i+1)*(N/P)-1? We assumed it does and
> >> > wanted to confirm.
> >>
> >> Yeah, in the sense that the corners would be owned by the rank you
> >> place them on.
> >>
> >> But many methods, especially high-order, perform assembly via a
> >> non-overlapping partition of elements, in which case the
> >> "computations" happen where the elements are (with any required
> >> vertex data for the closure of those elements being sent to the
> >> rank handling the element).
> >>
> >> Note that a typical pattern would be to create a parallel DMPlex
> >> with a naive distribution, then repartition/distribute it.
> >>
> >> As Jed says, CreateParallel() just makes the most naive partition
> >> of vertices because we have no other information. Once the mesh is
> >> made, you call DMPlexDistribute() again to reduce the edge cut.
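> >>
> >> A minimal sketch of that redistribution step (overlap 0; error
> >> checking omitted):
> >>
> >> DM dmDist = NULL;
> >> DMPlexDistribute(dm, 0, NULL, &dmDist); // repartition to cut fewer edges
> >> if (dmDist) {                           // NULL when nothing was moved
> >>   DMDestroy(&dm);
> >>   dm = dmDist;
> >> }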
> >>
> >> Thanks,
> >>
> >> Matt
> >>
> >
> > Thank you.
> >
> > This is being used for a PIC code with low-order 2d elements whose
> > mesh is partitioned to minimize communication during particle
> > operations. This partition will not be ideal for the field solve
> > using petsc, so we're exploring alternatives that will require
> > minimal data movement between the two partitions. Towards that,
> > we'll keep pursuing the SF creation.
> >
> > -Cameron
> >
> >
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/