[petsc-users] creation of parallel dmplex from a partitioned mesh

Matthew Knepley knepley at gmail.com
Mon Aug 24 15:57:49 CDT 2020


On Mon, Aug 24, 2020 at 4:27 PM Jed Brown <jed at jedbrown.org> wrote:

> Cameron Smith <smithc11 at rpi.edu> writes:
>
> > We made some progress with star forest creation but still have work to
> > do.
> >
> > We revisited DMPlexCreateFromCellListParallelPetsc(...) and got it
> > working by sequentially partitioning the vertex coordinates across
> > processes to satisfy the 'vertexCoords' argument. Specifically, rank 0
> > has the coordinates for vertices with global id 0:N/P-1, rank 1 has
> > N/P:2*(N/P)-1, and so on (N is the total number of global vertices and P
> > is the number of processes).
> >
> > The consequences of the sequential partition of vertex coordinates for
> > subsequent solver operations are not clear.  Does it make process i
> > responsible for the computations and communications associated with global
> > vertices i*(N/P):(i+1)*(N/P)-1?  We assumed it does and wanted to
> > confirm.
>
> Yeah, in the sense that the corners would be owned by the rank you place
> them on.
>
> But many methods, especially high-order, perform assembly via
> non-overlapping partition of elements, in which case the "computations"
> happen where the elements are (with any required vertex data for the
> closure of those elements being sent to the rank handling the element).
>
> Note that a typical pattern would be to create a parallel DMPlex with a
> naive distribution, then repartition/distribute it.
>
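For concreteness, a minimal sketch of the split described above (not from the
thread; N, nLocal, vStart, and vEnd are made-up names, and N is assumed to be
divisible by the number of processes P, as in the description). The nLocal
value and the matching slice of coordinates are what go into the numVertices
and vertexCoords arguments of DMPlexCreateFromCellListParallelPetsc():

  PetscInt    N = 1000;          /* total number of global vertices (made up)     */
  PetscMPIInt rank, size;
  PetscInt    nLocal, vStart, vEnd;

  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);
  nLocal = N / size;             /* vertices whose coordinates this rank supplies */
  vStart = rank * nLocal;        /* first global vertex id on this rank           */
  vEnd   = vStart + nLocal - 1;  /* last global vertex id on this rank            */
  /* vertexCoords then holds the coordinates of global vertices vStart..vEnd,
     while cells[] keeps referring to vertices by their global numbers.       */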

As Jed says, DMPlexCreateFromCellListParallelPetsc() just makes the most
naive partition of the vertices because we have no other information. Once
the mesh is made, you call DMPlexDistribute() to reduce the edge cut.
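
For reference, that second step might look like the following sketch (not from
the thread; dm stands for the naively partitioned mesh returned by
DMPlexCreateFromCellListParallelPetsc(), and the overlap of 0 is just an
example):

  DM             dmDist      = NULL;
  PetscSF        migrationSF = NULL;
  PetscErrorCode ierr;

  /* Repartition the naively distributed mesh; pass an overlap > 0 if the
     discretization needs ghost cells. */
  ierr = DMPlexDistribute(dm, 0, &migrationSF, &dmDist);CHKERRQ(ierr);
  if (dmDist) {    /* NULL when no redistribution was performed */
    ierr = PetscSFDestroy(&migrationSF);CHKERRQ(ierr);
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm   = dmDist;
  }

The partitioner used by DMPlexDistribute() can also be chosen at run time,
e.g. -petscpartitioner_type parmetis if PETSc was configured with ParMETIS.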

  Thanks,

     Matt

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/