[petsc-users] creation of parallel dmplex from a partitioned mesh
Cameron Smith
smithc11 at rpi.edu
Mon Aug 24 15:18:33 CDT 2020
We made some progress with star forest creation but still have work to do.
We revisited DMPlexCreateFromCellListParallelPetsc(...) and got it
working by sequentially partitioning the vertex coordinates across
processes to satisfy the 'vertexCoords' argument. Specifically, rank 0
has the coordinates for vertices with global ids 0:N/P-1, rank 1 has
N/P:2*(N/P)-1, and so on (N is the total number of global vertices and P
is the number of processes).
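For reference, here is a minimal sketch of that layout (the helper and
its names are ours, not from PETSc; we also let the last rank absorb
any remainder when P does not divide N):

#include <petscsys.h>

/* Sketch: the block of global vertex ids whose coordinates this rank
   supplies in the 'vertexCoords' argument. Names are ours. */
static void VertexCoordBlock(PetscInt N, PetscMPIInt P, PetscMPIInt rank,
                             PetscInt *start, PetscInt *nLocal)
{
  const PetscInt chunk = N / P;                   /* block size N/P          */
  *start  = (PetscInt)rank * chunk;               /* first global id here    */
  *nLocal = (rank == P - 1) ? N - *start : chunk; /* last rank gets the rest */
}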
The consequences of this sequential partition of vertex coordinates for
subsequent solver operations are not clear to us. Does it make process i
responsible for the computations and communications associated with
global vertices i*(N/P):(i+1)*(N/P)-1? We assumed it does and wanted to
confirm.
Thank-you,
Cameron
On 8/13/20 11:43 AM, Cameron Smith wrote:
> Thank you for the quick reply and the info. We'll give it a shot and
> respond if we hit any snags.
>
> -Cameron
>
> On 8/13/20 9:54 AM, Matthew Knepley wrote:
>> On Thu, Aug 13, 2020 at 9:38 AM Cameron Smith <smithc11 at rpi.edu> wrote:
>>
>> Hello,
>>
>> We have a partitioned mesh from which we want to create a DMPlex that
>> preserves our distribution of elements (i.e., the assignment of
>> elements to processes) and of vertices 'interior' to a process (i.e.,
>> vertices not on the inter-process boundary).
>>
>> We were trying to use DMPlexCreateFromCellListParallelPetsc() or
>> DMPlexBuildFromCellListParallel() and found that the vertex ownership
>> (roots in the returned vertex SF) appears to be sequentially assigned
>> to processes based on global vertex id. In general, this will not
>> match our mesh distribution. As we understand it, to subsequently set
>> vertex coordinates (or other vertex data) we would have to use a star
>> forest (SF) communication API to send the data to the correct process.
>> Is that correct?
>>
>> Alternatively, if we create a DMPlex object from the elements that
>> exist on each process using DMPlexCreateFromCellList(), and then
>> create an SF from the mesh vertices on inter-process boundaries
>> (using the mapping from local to global vertex ids provided by our
>> mesh library), could we then associate the DMPlex objects with the
>> SF? Is it as simple as calling DMSetPointSF()?
>>
>>
>> Yes. If you have all the distribution information, this is the easiest
>> thing to do.
>>
>> If manually defining the PointSF is a way forward, we would like some
>> help understanding its definition; i.e., which entities become roots
>> and which become leaves. In DMPlexBuildFromCellListParallel()
>>
>>
>> Short explanation of SF:
>>
>> SF stands for Star-Forest. It is a star graph because you have a
>> single root that points to multiple leaves. It is
>> a forest because you have several of these stars. We use this
>> construct in many places in PETSc, and where it
>> is used determines the semantics of the indices.
>>
>> The DMPlex point SF is an SF in which root indices are "owned" mesh
>> points and leaf indices are "ghost" mesh
>> points. You can take any set of local Plexes and add an SF to make
>> them a parallel Plex.
>>
>> The SF is constructed with one-sided data. Locally, each process
>> specifies two things:
>>
>> 1) The root space: The set of indices [0, Nr) which refers to
>> possible roots on this process. For the pointSF, this is [0, Np) where
>> Np is the number of local mesh points.
>>
>> 2) The leaves: Each leaf is a pair (local mesh point lp, remote
>> mesh point rp) which says that local mesh point lp is a "ghost" of
>> remote point rp. The remote point is
>> given by (rank r, local mesh point rlp) where rlp is the local
>> mesh point number on process r.
>>
>> With this, the Plex will automatically create all the other structures
>> it needs.
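>>
>> As a concrete sketch (the array names are hypothetical; in your case
>> the ghost list and owners would come from your mesh library's
>> local-to-global mapping), building and attaching the point SF might
>> look like:
>>
>> #include <petscdmplex.h>
>>
>> /* Sketch: attach a manually built point SF to a local Plex.
>>    'nGhost', 'ghostPoints', 'ownerRanks', and 'ownerPoints' are
>>    hypothetical inputs from the application's mesh library. */
>> static PetscErrorCode AttachPointSF(DM dm, PetscInt nGhost,
>>                                     PetscInt ghostPoints[],
>>                                     const PetscInt ownerRanks[],
>>                                     const PetscInt ownerPoints[])
>> {
>>   PetscSF        sf;
>>   PetscSFNode   *remote;
>>   PetscInt       pStart, pEnd, i;
>>   PetscErrorCode ierr;
>>
>>   PetscFunctionBeginUser;
>>   /* Root space: all Np local mesh points, [0, Np) */
>>   ierr = DMPlexGetChart(dm, &pStart, &pEnd);CHKERRQ(ierr);
>>   ierr = PetscSFCreate(PetscObjectComm((PetscObject)dm), &sf);CHKERRQ(ierr);
>>   ierr = PetscMalloc1(nGhost, &remote);CHKERRQ(ierr);
>>   for (i = 0; i < nGhost; ++i) {
>>     remote[i].rank  = ownerRanks[i];   /* owning process               */
>>     remote[i].index = ownerPoints[i];  /* point number on that process */
>>   }
>>   /* Leaves: (local ghost point, remote owner point) pairs */
>>   ierr = PetscSFSetGraph(sf, pEnd - pStart, nGhost, ghostPoints,
>>                          PETSC_COPY_VALUES, remote, PETSC_OWN_POINTER);CHKERRQ(ierr);
>>   ierr = DMSetPointSF(dm, sf);CHKERRQ(ierr);
>>   ierr = PetscSFDestroy(&sf);CHKERRQ(ierr);
>>   PetscFunctionReturn(0);
>> }
>>
>> DMSetPointSF() takes its own reference to the SF, so you can destroy
>> your local handle afterward.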
>>
>>
>> https://gitlab.com/petsc/petsc/-/blob/753428fdb0644bc4cb7be6429ce8776c05405d40/src/dm/impls/plex/plexcreate.c#L2875-2899
>>
>>
>> the PointSF appears to contain roots for elements and vertices and
>> leaves for owned vertices on the inter-process boundary. Is that
>> correct?
>>
>>
>> No, the leaves are ghost vertices. They point back to the owner.
>>
>> Thanks,
>>
>> Matt
>>
>> Thank-you,
>> Cameron
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which
>> their experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/