[petsc-users] Domain decomposition using DMPLEX

Danyang Su danyang.su at gmail.com
Tue Nov 26 12:24:22 CST 2019


On 2019-11-26 10:18 a.m., Matthew Knepley wrote:
> On Tue, Nov 26, 2019 at 11:43 AM Danyang Su <danyang.su at gmail.com> wrote:
>
>     On 2019-11-25 7:54 p.m., Matthew Knepley wrote:
>>     On Mon, Nov 25, 2019 at 6:25 PM Swarnava Ghosh <swarnava89 at gmail.com> wrote:
>>
>>         Dear PETSc users and developers,
>>
>>         I am working with dmplex to distribute a 3D unstructured mesh
>>         made of tetrahedrons in a cuboidal domain. I have a few queries:
>>         1) Is there any way of ensuring load balancing based on the
>>         number of vertices per MPI process?
>>
>>
>>     You can now call DMPlexRebalanceSharedPoints() to try to get a
>>     better balance of vertices.
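>>
>>     For example, a rough, untested sketch (dmDist here stands for the DM
>>     returned by DMPlexDistribute(), since shared points only exist on a
>>     distributed mesh; the exact argument list may differ slightly between
>>     PETSc versions):
>>
>>       PetscBool success;
>>       /* entityDepth = 0 balances vertices; the two PetscBool arguments
>>          are useInitialGuess and parallel */
>>       ierr = DMPlexRebalanceSharedPoints(dmDist, 0, PETSC_FALSE, PETSC_TRUE,
>>                                          &success);CHKERRQ(ierr);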
>
>     Hi Matt,
>
>     I just want to follow up on whether this new function can help solve
>     the "Strange Partition in PETSc 3.11" problem I mentioned before.
>     Would you please let me know when I should call this function?
>     Right before DMPlexDistribute?
>
> This is not the problem. I believe the problem is that you are 
> partitioning hybrid cells, and the way we handle
> them internally changed, which I think screwed up the dual mesh for 
> partitioning in your example. I have been
> sick, so I have not gotten to your example yet, but I will.

I hope you get well soon. The mesh is not hybrid; it contains only prism 
cells, layer by layer, but the height of the prisms varies significantly.

Thanks,

Danyang

>
>   Sorry about that,
>
>     Matt
>
>     call DMPlexCreateFromCellList
>
>     call DMPlexGetPartitioner
>
>     call PetscPartitionerSetFromOptions
>
>     call DMPlexDistribute
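>
>     A minimal C sketch of that sequence (the Fortran interface mirrors these
>     calls; numCells, numVertices, numCorners and the cells/coords arrays are
>     placeholders for the application's own mesh data):
>
>     #include <petscdmplex.h>
>
>     static PetscErrorCode CreateAndDistribute(MPI_Comm comm, PetscInt numCells,
>                                               PetscInt numVertices,
>                                               PetscInt numCorners,
>                                               const int cells[],
>                                               const double coords[], DM *dmOut)
>     {
>       DM               dm, dmDist;
>       PetscPartitioner part;
>       PetscErrorCode   ierr;
>
>       PetscFunctionBeginUser;
>       ierr = DMPlexCreateFromCellList(comm, 3, numCells, numVertices,
>                                       numCorners, PETSC_TRUE /* interpolate */,
>                                       cells, 3, coords, &dm);CHKERRQ(ierr);
>       /* the partitioner type can then be chosen at run time,
>          e.g. with -petscpartitioner_type parmetis */
>       ierr = DMPlexGetPartitioner(dm, &part);CHKERRQ(ierr);
>       ierr = PetscPartitionerSetFromOptions(part);CHKERRQ(ierr);
>       ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
>       if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
>       *dmOut = dm;
>       PetscFunctionReturn(0);
>     }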
>
>     Thanks,
>
>     Danyang
>
>>         2) As the global domain is cuboidal, is the resulting domain
>>         decomposition also cuboidal on every MPI process? If not, is
>>         there a way to ensure this? For example, in DMDA, the default
>>         domain decomposition for a cuboidal domain is cuboidal.
>>
>>
>>     It sounds like you do not want something that is actually
>>     unstructured. Rather, it seems like you want to
>>     take a DMDA-type mesh and split it into tets. You can get a
>>     cuboidal decomposition of a hex mesh easily.
>>     Call DMPlexCreateBoxMesh() with one cell for every process,
>>     distribute, and then uniformly refine. This
>>     will not quite work for tets since the mesh partitioner will tend
>>     to violate that constraint. You could:
>>
>>       a) Prescribe the distribution yourself using the Shell
>>     partitioner type
>>
>>     or
>>
>>       b) Write a refiner that turns hexes into tets
>>
>>     We already have a refiner that turns tets into hexes, but we
>>     never wrote the other direction because it was not clear
>>     that it was useful.
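>>
>>     For the hex case, the sequence might look roughly like this (a sketch;
>>     the faces[] counts assume 8 ranks, the refinement level is only
>>     illustrative, and the DMPlexCreateBoxMesh() argument list is the one
>>     from recent PETSc releases):
>>
>>     #include <petscdmplex.h>
>>
>>     int main(int argc, char **argv)
>>     {
>>       DM             dm, dmDist, dmRef;
>>       PetscInt       faces[3] = {2, 2, 2}; /* one hex cell per rank on 8 ranks */
>>       PetscInt       r, nrefine = 2;
>>       PetscErrorCode ierr;
>>
>>       ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
>>       ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 3, PETSC_FALSE /* hexes */,
>>                                  faces, NULL, NULL, NULL, PETSC_TRUE,
>>                                  &dm);CHKERRQ(ierr);
>>       ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
>>       if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
>>       /* uniform refinement preserves the cuboidal per-rank decomposition */
>>       ierr = DMPlexSetRefinementUniform(dm, PETSC_TRUE);CHKERRQ(ierr);
>>       for (r = 0; r < nrefine; ++r) {
>>         ierr = DMRefine(dm, PETSC_COMM_WORLD, &dmRef);CHKERRQ(ierr);
>>         ierr = DMDestroy(&dm);CHKERRQ(ierr);
>>         dm   = dmRef;
>>       }
>>       ierr = DMDestroy(&dm);CHKERRQ(ierr);
>>       ierr = PetscFinalize();
>>       return ierr;
>>     }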
>>
>>       Thanks,
>>
>>          Matt
>>
>>         Sincerely,
>>         SG
>>
>>
>>
>>     -- 
>>     What most experimenters take for granted before they begin their
>>     experiments is infinitely more interesting than any results to
>>     which their experiments lead.
>>     -- Norbert Wiener
>>
>>     https://www.cse.buffalo.edu/~knepley/
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/ 