[petsc-users] Consistent domain decomposition between DMDA and DMPLEX

Swarnava Ghosh swarnava89 at gmail.com
Fri Mar 22 18:40:19 CDT 2019


Hi Mark and Matt,

Thank you for your responses.
 "They may have elements on the unstructured mesh that intersect with any
number of processor domains on the structured mesh. But the unstructured
mesh vertices are in the structured mesh set of vertices"
Yes, that is correct. We would want a vertex partitioning.

Sincerely,
Swarnava

On Fri, Mar 22, 2019 at 4:08 PM Mark Adams <mfadams at lbl.gov> wrote:

> Matt,
> I think they want a vertex partitioning. They may have elements on the
> unstructured mesh that intersect with any number of processor domains on
> the structured mesh. But the unstructured mesh vertices are in the
> structured mesh set of vertices. They want the partition of the
> unstructured mesh vertices (i.e., matrices) to be slaved to the partitioning
> of the structured mesh.
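>
> For example, mapping a global grid node (i, j) back to the DMDA rank that
> owns it can be done with DMDAGetOwnershipRanges(). A minimal sketch, assuming
> a 2D DMDA (the helper name OwnerOfNode is only for illustration, not PETSc
> API):
>
>   /* Return the rank owning global node (i,j); DMDA ranks are laid out
>      x-fastest over an m x n process grid. */
>   static PetscErrorCode OwnerOfNode(DM da, PetscInt i, PetscInt j, PetscMPIInt *rank)
>   {
>     const PetscInt *lx, *ly;
>     PetscInt        m, n, px = 0, py = 0, sum;
>     PetscErrorCode  ierr;
>
>     PetscFunctionBeginUser;
>     ierr = DMDAGetInfo(da, NULL, NULL, NULL, NULL, &m, &n, NULL, NULL, NULL, NULL, NULL, NULL, NULL);CHKERRQ(ierr);
>     ierr = DMDAGetOwnershipRanges(da, &lx, &ly, NULL);CHKERRQ(ierr);
>     for (sum = lx[px]; i >= sum; sum += lx[++px]); /* process column containing i */
>     for (sum = ly[py]; j >= sum; sum += ly[++py]); /* process row containing j    */
>     *rank = (PetscMPIInt)(py*m + px);
>     PetscFunctionReturn(0);
>   }
>
> Each unstructured cell could then be assigned to, say, the owner of its first
> vertex, which is what a shell partitioner (see below) would consume.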
> Do I have that right, Swarnava?
> Mark
>
> On Fri, Mar 22, 2019 at 6:56 PM Matthew Knepley via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> On Thu, Mar 21, 2019 at 8:20 PM Swarnava Ghosh via petsc-users <
>> petsc-users at mcs.anl.gov> wrote:
>>
>>> Dear PETSc users and developers,
>>>
>>> I am new to DMPLEX and have a question about setting up a consistent
>>> domain decomposition of two meshes in PETSc.
>>> I have a structured finite difference grid, managed through DMDA. I have
>>> another unstructured finite element mesh managed through DMPLEX. Now all
>>> the nodes in the unstructured finite element mesh also belong to the set of
>>> nodes in the structured finite difference mesh (but not necessarily
>>> vice-versa), and the number of nodes in the DMPLEX mesh is less than the number
>>> of nodes in the DMDA mesh. How can I guarantee a consistent domain
>>> decomposition of the two meshes? By consistent, I mean that if a process
>>> has a set of nodes P from DMDA, and the same process has the set of nodes Q
>>> from DMPLEX, then Q is a subset of P.
>>>
>>
>> Okay, this is not hard. DMPlexDistribute() basically distributes
>> according to a cell partition. You can use PetscPartitionerShell() to stick
>> in whatever cell partition you want. You can see me doing this here:
>>
>>
>> https://bitbucket.org/petsc/petsc/src/e2aefa968a094f48dc384fffc7d599a60aeeb591/src/dm/impls/plex/examples/tests/ex1.c#lines-261
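>>
>> For concreteness, a minimal sketch of that approach (here dm, dmDist, and
>> the owner[] array, which gives the target rank of each local cell as derived
>> from the DMDA decomposition, are assumptions rather than PETSc-provided):
>>
>>   PetscPartitioner part;
>>   PetscInt         cStart, cEnd, c, r, *sizes, *points, *off;
>>   PetscMPIInt      nranks;
>>
>>   ierr = MPI_Comm_size(PETSC_COMM_WORLD, &nranks);CHKERRQ(ierr);
>>   ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* height 0 = cells */
>>   ierr = PetscCalloc3(nranks, &sizes, nranks, &off, cEnd-cStart, &points);CHKERRQ(ierr);
>>   for (c = cStart; c < cEnd; ++c) sizes[owner[c]]++;             /* cells per target rank */
>>   for (r = 1; r < nranks; ++r)    off[r] = off[r-1] + sizes[r-1];
>>   for (c = cStart; c < cEnd; ++c) points[off[owner[c]]++] = c;   /* cells grouped by rank */
>>   ierr = DMPlexGetPartitioner(dm, &part);CHKERRQ(ierr);
>>   ierr = PetscPartitionerSetType(part, PETSCPARTITIONERSHELL);CHKERRQ(ierr);
>>   ierr = PetscPartitionerShellSetPartition(part, nranks, sizes, points);CHKERRQ(ierr);
>>   ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
>>   ierr = PetscFree3(sizes, off, points);CHKERRQ(ierr);
>>
>> The shell partitioner hands DMPlexDistribute() exactly the cell-to-rank
>> assignment you computed, so the Plex cells (and with them their vertices)
>> land on the ranks dictated by the DMDA decomposition.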
>>
>> Will that work for you?
>>
>>   Thanks,
>>
>>     Matt
>>
>>
>>> I look forward to your response.
>>>
>>> Sincerely,
>>> Swarnava
>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>

