[petsc-users] creation of parallel dmplex from a partitioned mesh

Jed Brown jed at jedbrown.org
Mon Aug 24 15:27:21 CDT 2020


Cameron Smith <smithc11 at rpi.edu> writes:

> We made some progress with star forest creation but still have work to do.
>
> We revisited DMPlexCreateFromCellListParallelPetsc(...) and got it 
> working by sequentially partitioning the vertex coordinates across 
> processes to satisfy the 'vertexCoords' argument. Specifically, rank 0 
> has the coordinates for vertices with global id 0:N/P-1, rank 1 has 
> N/P:2*(N/P)-1, and so on (N is the total number of global vertices and P 
> is the number of processes).
>
> The consequences of the sequential partition of vertex coordinates for 
> subsequent solver operations are not clear.  Does it make process i 
> responsible for computations and communications associated with global 
> vertices i*(N/P):(i+1)*(N/P)-1 ?  We assumed it does and wanted to confirm.

Yeah, in the sense that the corners would be owned by the rank you place them on.

But many methods, especially high-order, perform assembly via non-overlapping partition of elements, in which case the "computations" happen where the elements are (with any required vertex data for the closure of those elements being sent to the rank handling the element).

Note that a typical pattern would be to create a parallel DMPlex with a naive distribution, then repartition/distribute it.
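A minimal sketch of that pattern (not compiled here; it assumes the cell and vertex arrays have been filled locally per the block layout above, and the exact argument list of DMPlexCreateFromCellListParallelPetsc should be checked against the manual page for the PETSc version in use):

```c
#include <petscdmplex.h>

/* Sketch: create a DMPlex from a naively partitioned cell list, then
   repartition with the configured partitioner (e.g. ParMETIS) and
   migrate the mesh.  dim, numCells, cells, numVertices, NVertices,
   numCorners, spaceDim, and vertexCoords are assumed to be provided
   by the application. */
PetscErrorCode CreateAndRedistribute(MPI_Comm comm, PetscInt dim, PetscInt numCells,
                                     PetscInt numVertices, PetscInt NVertices,
                                     PetscInt numCorners, const PetscInt cells[],
                                     PetscInt spaceDim, const PetscReal vertexCoords[],
                                     DM *dmOut)
{
  DM      dm, dmDist;
  PetscSF vertexSF;

  PetscCall(DMPlexCreateFromCellListParallelPetsc(comm, dim, numCells, numVertices,
            NVertices, numCorners, PETSC_TRUE, cells, spaceDim, vertexCoords,
            &vertexSF, NULL, &dm));
  /* Repartition and migrate; dmDist is NULL if nothing moved. */
  PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
  if (dmDist) {
    PetscCall(DMDestroy(&dm));
    dm = dmDist;
  }
  *dmOut = dm;
  return PETSC_SUCCESS;
}
```

The initial distribution only needs to be consistent, not balanced; DMPlexDistribute is where the real element partition gets decided.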
